What Managers Need to Know about LLMs: Unlocking the Power of Transformative AI

The emergence of Large Language Models (LLMs) built on the foundation of transformers has proven to be a transformational event for businesses and industries worldwide. It seems everyone is talking about ChatGPT and other foundation models. Predictions about the impact of LLMs range from euphoria to skepticism to existential concern about what it means for the future.

But taking a step back from the excitement and fears, managers and C-suite executives would do well to recognize the significance of LLM technology and to understand how vendors can equip their employees with tools and resources to serve customers better, reduce workloads, and improve overall business outcomes. Companies that lag in their adoption may see their business disrupted by those willing to forge ahead, even though the landscape is still shifting.

Vendors introduce new models, features, and tools almost daily. In the face of such rapid change, the primary challenges are, first, to understand the core capabilities of foundation models; second, to appreciate the technologies designed to tailor LLMs’ output to be industry-specific or use-case-specific; and, third and most importantly, to determine the best way to engage employees to take advantage of the tools offered by top vendors.

The Transformer Revolution

LLMs are the product of a revolutionary approach to machine learning known as transformers. These models are constructed by training on massive volumes of data through a procedure focused on predicting the next token in a sequence. While it seems counterintuitive that a system focused on simply predicting the next word in a sentence could have much value, the process has produced astonishing results. Transformers have led to the development of computers that not only appear to comprehend human language and thought, but that also communicate with us seamlessly and conversationally.
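The next-token objective described above can be illustrated in miniature. The toy "model" below simply counts which word follows which in a tiny corpus; real transformers learn these patterns with neural networks over billions of tokens, but the training goal, predicting the next token given the ones before it, is the same idea (the corpus and code here are illustrative only):

```python
from collections import Counter, defaultdict

# A toy bigram "language model": predict the next word from counts
# of word pairs in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word after `word`, or None if unseen."""
    counts = next_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # → cat ("cat" follows "the" more often than any other word)
```

Scaled up enormously, this simple objective is what gives LLMs their apparent fluency.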

Just in terms of what LLMs mean for Natural Language Processing (NLP), these transformer capabilities have eliminated the labor-intensive preparatory work conversational designers have been tasked with for decades. In the past, handling customer queries such as “What’s your return policy?” required designers to meticulously prepare an NLP system, creating specific “intents” and extensive training data. This meant anticipating various ways customers might phrase their questions. For instance, “Can I send this back?” or “How do I return this?” were just a few examples. If a customer’s query didn’t align with the prepared intent, the system would often respond with a frustrating “Sorry, I don’t understand.”
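The brittleness of the old intent-based approach can be sketched as follows. Production systems used statistical matching rather than the exact string lookup shown here, but the failure mode was the same: queries outside the hand-prepared training phrases fell through to a fallback (all names and phrases below are illustrative):

```python
# Hand-enumerated training phrases for a single "return policy" intent.
RETURN_POLICY_INTENT = {
    "what's your return policy?",
    "can i send this back?",
    "how do i return this?",
}

def classify(query: str) -> str:
    """Toy intent classifier: exact match against prepared phrases."""
    if query.lower().strip() in RETURN_POLICY_INTENT:
        return "return_policy"
    return "fallback"  # i.e., "Sorry, I don't understand."

print(classify("How do I return this?"))            # → return_policy
print(classify("Is it OK to ship it back to you?")) # → fallback (unseen phrasing)
```

LLMs remove the need to enumerate phrasings up front, because the model already generalizes across ways of asking the same question.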

The Nimble Power of Transformers

Transformer-driven LLMs and Generative AI (GenAI) are also powerful because their capabilities are so broad and not limited to “just” breakthroughs in NLP. Prior to this technology, businesses would need to acquire a whole suite of tools for handling their contact centers–or partner with a vendor specialized in providing all the capabilities needed.

Now pre-packaged offerings by companies such as OpenAI, grounded in foundation models such as OpenAI’s GPT-4 and related machine learning systems, can deliver a full range of capabilities, including, at a high level:

  • Natural Language Processing and Understanding (NLU/NLP)
  • Natural Language Generation (NLG)
  • Automatic Speech Recognition (ASR)
  • Text-to-Speech (TTS)
  • Sentiment Analysis
  • Image detection and understanding
  • Image generation
  • Reasoning and problem solving
  • Computer programming / code generation

With this full suite of pre-built capabilities, even small development teams can create applications that would once be unimaginable–such as chatbots that interact with customers, understand and respond accurately to their questions, carry out transactions on their behalf, communicate with live contact center agents, and summarize the details of customer interactions.

But how can LLMs be tailored to address specific business queries, even if they were not initially trained on the nuances of a company’s products, services, and policies?

The Magic of Embeddings and Vectors

The answer lies in providing LLMs with corporate data sources in a format they can comprehend. LLMs conceptualize the world and the relationships between words and concepts in a mathematical space called vector space. Fortunately, there’s a simple process for translating any document into vectors an LLM can understand. The process consists of calling a function to generate embeddings for the document and storing the mathematical results in a vector database. Yet from a user or developer’s point of view, they are simply uploading documents, often through the sort of “drag and drop” interface made familiar by email or file transfer programs for decades.
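The embed-and-store flow can be sketched in a few lines. The word-count "embedding" and in-memory list below are illustrative stand-ins; production systems call a provider's embedding API and store the vectors in a dedicated vector database, but the shape of the process is the same: embed each document, store the vector, then search by similarity:

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a sparse word-count vector (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

vector_db = []  # toy "vector database": (document, vector) pairs

def upload(document):
    """What happens behind a drag-and-drop upload: embed the document and store the vector."""
    vector_db.append((document, embed(document)))

def search(query, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(vector_db, key=lambda dv: cosine(q, dv[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

upload("our return policy allows returns within 30 days")
upload("shipping is free on orders over 50 dollars")
print(search("what is your return policy"))  # → ['our return policy allows returns within 30 days']
```

The user sees only the upload step; the embedding and vector storage happen behind the scenes.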

Two popular methods of providing LLMs with corporate knowledge are “Retrieval Augmented Generation” (RAG) and “Fine-Tuning”. Both approaches involve uploading corporate documents, such as knowledge bases, product descriptions, user manuals, and customer conversation transcripts, to a resource that can then represent the data in a manner understandable by an LLM.

RAG combines the strengths of information retrieval and content generation, ensuring that LLMs can fetch and generate relevant information in real-time, making them a potent tool for knowledge-intensive tasks. When RAG is used, no additional upfront model training is required.
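The RAG flow can be shown in miniature: retrieve the most relevant documents for a question, then place them in the prompt so the LLM answers from them. The word-overlap retriever and prompt layout below are illustrative placeholders, not a specific vendor's API; in practice retrieval runs against a vector database and the prompt is sent to an LLM:

```python
KNOWLEDGE_BASE = [
    "Returns are accepted within 30 days of purchase",
    "Standard shipping takes 3 to 5 business days",
]

def retrieve(question, k=1):
    """Stand-in retriever: rank documents by words shared with the question."""
    words = set(question.lower().split())
    return sorted(KNOWLEDGE_BASE,
                  key=lambda d: -len(words & set(d.lower().split())))[:k]

def build_prompt(question):
    """Augment the question with retrieved context before sending it to an LLM."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

prompt = build_prompt("how many days do i have for returns")
```

Because the documents are fetched at query time, the knowledge base can be updated without retraining anything.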

Fine-Tuning makes LLMs even more effective in specific business contexts. Rather than retrieving documents at query time, fine-tuning adjusts the model itself to cater to domain-specific queries. It refines the model’s responses, ensuring they align with a company’s business and operational needs. To leverage fine-tuning, developers continue training a base model on curated examples drawn from corporate data, typically pairs of sample prompts and the ideal responses to them.
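Fine-tuning data is usually prepared as a file of example conversations. The chat-style "messages" JSONL layout below mirrors common provider formats, but the exact schema varies by vendor, so treat this as an illustrative sketch and check your provider's documentation:

```python
import json

# Example prompt/response pairs drawn from corporate knowledge.
examples = [
    {"messages": [
        {"role": "user", "content": "What is your return window?"},
        {"role": "assistant", "content": "You can return items within 30 days of purchase."},
    ]},
    {"messages": [
        {"role": "user", "content": "Do you ship internationally?"},
        {"role": "assistant", "content": "Yes, we ship to most countries; see our shipping page."},
    ]},
]

# One JSON object per line: the typical shape of a fine-tuning dataset upload.
jsonl = "\n".join(json.dumps(example) for example in examples)
```

A few hundred to a few thousand such examples are often enough to shift a model's tone and answers toward a company's own policies.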

Virtual Assistant Vendors: Still Essential

The burning question for managers is, “Now that we have LLMs, does my company even need a virtual assistant vendor anymore?” For most companies, the answer is most likely yes. While LLMs have transformed the landscape and empowered developers, integrating these capabilities with existing enterprise systems, such as contact center solutions, can be a challenge.

Top Enterprise Intelligent Assistant vendors, such as those we reviewed in our recent 2023 Conversational AI Intelliview: Decision-Makers Guide to Enterprise Intelligent Assistants, are quickly integrating powerful LLM capabilities into their products. These upgrades provide businesses that already license their solutions with powerful new tools to improve customer service while automating and streamlining tasks for employees.

For example, many of the vendors we covered in the report already leverage LLMs to provide incredibly quick and accurate summaries of voice calls with customers. Some use LLMs to generate possible answers to customer questions. Others help companies easily upload their corporate knowledge into a vector database to make it available to the LLM for queries. Many are already leveraging LLMs to augment and improve existing NLP technology.

It’s certainly possible for in-house development teams to use tools provided by OpenAI, Google, Amazon and others to build some of their own LLM-powered applications. In fact, it probably makes sense for these teams to at least tinker to understand the concepts behind this powerful technology. But partnering with an established intelligent assistant provider is most likely a faster path to implementing the breakthrough benefits that promise to improve customer service, streamline processes, and accelerate positive business outcomes.

Technology Matters: It Helps to Have Your Own Blueprint

Managers must also consider whether they need to care about the underlying technology employed by their enterprise intelligent assistant vendor. In our view the answer is “yes”. As we mentioned above, the leading providers are well-aware of the seismic shift that has occurred in human-to-computer conversational interfaces with the advent of transformer-based LLMs.

We recommend working with vendors who have already begun using LLMs in their offerings and have a roadmap for continued integration as new capabilities emerge. Failing to do so may result in unnecessary labor and outdated NLP systems, depriving your customers and employees of the advantages offered by this remarkable transformer technology.

In a world where technology can rapidly reshape business operations, adoption of LLMs based on transformers is poised to have great impact. Intelligent assistant vendors have a crucial role to play in bridging the gap between transformative AI and seamless integration into your business operations. Staying competitive while embracing cutting-edge technologies is becoming a necessity. It starts with making the best use of tools that engage employees who can apply subject matter expertise toward driving desired business outcomes.



Categories: Intelligent Assistants