Generative AI: LLMs in Business
Generative AI offers enormous productivity benefits for individuals and organizations. Many of us use ChatGPT for writing assistance, summarizing papers, brainstorming, and more.
For business applications, generative AI begins with large language models (LLMs). LLMs carry a great deal of general knowledge, but they do not know everything.
The key question is how to equip an LLM with knowledge specific to an organization and/or application domain. At present there are two main methods:
- Retrieval augmented generation (RAG): RAG consults an authoritative knowledge base outside the model's training data before generating a response. Rather than returning standard, generic answers, a RAG-based system can sift through FAQs, customer orders, documentation, use cases, and company blogs, and deliver answers tailored to each specific scenario and client (see the first sketch after this list).
- Fine-tuned models: Adapt the LLM to your task by continuing its training on a task-specific dataset (see the second sketch below).
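The following is a minimal RAG sketch, not a production implementation: it uses TF-IDF retrieval over a few illustrative documents, and the `call_llm()` function is a hypothetical stand-in for whichever LLM endpoint an organization actually uses.

```python
# Minimal RAG sketch: retrieve relevant documents, then ground the LLM prompt in them.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative knowledge base: FAQ entries, documentation snippets, blog excerpts.
documents = [
    "Orders placed before 2 pm ship the same business day.",
    "The standard warranty covers manufacturing defects for 24 months.",
    "Enterprise customers can request a dedicated support channel.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for the organization's LLM endpoint."""
    return f"[LLM response grounded in a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    # Build a prompt that restricts the model to the retrieved context.
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

print(answer("How long is the warranty?"))
```

The retrieval step is the part tailored to the organization; in practice it is usually backed by an embedding model and a vector store rather than TF-IDF, but the prompt-grounding pattern is the same.

The second sketch shows task-specific fine-tuning with the Hugging Face `transformers` Trainer, assuming a tiny illustrative dataset of in-house question-answer pairs and a small base model (`distilgpt2`) so the example stays self-contained; fine-tuning a large production LLM typically uses the same pattern with parameter-efficient methods such as LoRA.

```python
# Minimal causal-LM fine-tuning sketch: adapt a base model to task-specific text.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Hypothetical task-specific data, e.g. internal supply-chain Q&A pairs.
data = {"text": [
    "Q: What is our standard lead time? A: Ten business days.",
    "Q: Who approves expedited shipments? A: The logistics planning team.",
]}
dataset = Dataset.from_dict(data)

model_name = "distilgpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # distilgpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=64)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
# mlm=False gives the standard next-token (causal) language-modeling objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="supply-chain-lm", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()  # continues training the base model on the task-specific dataset
```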
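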
This could pave the way for a generative AI based system that shares supply chain knowledge across the company's sales, finance, product design, engineering, planning, procurement, and logistics teams, benefiting all areas of the business.