Much Ado About Amazon Q

[Lead Analyst Dan Miller contributed to this post]

Amazon Web Services’ (AWS’s) annual global gathering, re:Invent, coincided with the one-year anniversary of OpenAI’s release of ChatGPT, as well as OpenAI’s near-death experience. No wonder AWS was feeling pressure to highlight the advancements it has been making in the realm of GenAI. Even though the Seattle-based parent company had long pursued its own breakthrough in Natural Language Understanding (NLU), sponsoring the Alexa Prize for years and investing heavily in NLU research, it was OpenAI that dropped ChatGPT onto the world one year ago.

ChatGPT amazed everyone because it was truly conversational. It had (and has) the ability to understand natural language, generate responses, and tweak those responses based on suggestions from its users.

Microsoft, with its close alliance with OpenAI and Satya Nadella’s determination to always stay ahead, quickly integrated OpenAI’s advanced GenAI into Bing. The Colossus of Redmond then began rolling out Microsoft Copilot features for enterprise users and introduced powerful GenAI features into its Microsoft Digital Contact Center Platform. Google has its Duet AI offering, which leverages its own GenAI solutions to empower enterprise users of Google Workspace products, and its Contact Center AI for customer experience.

Amazon introduced its Bedrock service back in April, offering access to a variety of foundation models that developers can leverage to build their own GenAI-powered applications safely within the AWS ecosystem. It added intelligent assistants to the mix in August with the debut of Agents for Amazon Bedrock. But where, we wondered, was Amazon’s more complete response to the capabilities offered by its competitors in the enterprise cloud employee productivity and contact center spaces?
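To make the Bedrock model-choice idea concrete, here is a minimal boto3 sketch of invoking one foundation model through the Bedrock runtime. The model ID (Amazon’s Titan Text Express) and the token limit are illustrative choices, not a recommendation; swapping in a different Bedrock model would mean changing the model ID and request body to match that model’s schema.

```python
import json


# Illustrative choice: Amazon's own Titan Text Express model.
MODEL_ID = "amazon.titan-text-express-v1"


def build_titan_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON request body that Titan text models expect."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })


def generate(prompt: str) -> str:
    """Send a prompt to the chosen foundation model via the Bedrock runtime.

    Requires AWS credentials, a configured region, and Bedrock model access.
    """
    import boto3  # AWS SDK for Python; imported here so the helper above stays offline-testable

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_titan_request(prompt),
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]
```

The point of the abstraction is that an application keeps one `generate()` entry point while the model behind it can be swapped per task, which is exactly the flexibility Bedrock is selling.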

Amazon Q Unveiled at re:Invent 2023

We finally got our answer at re:Invent 2023, where AWS CEO Adam Selipsky announced Amazon Q. The AI assistant, powered by any of the GenAI models available within Bedrock, offers benefits both for back-office employees and for customer-experience-focused employees through Amazon Connect.

For employees in the back office, Q can answer questions leveraging corporate knowledge from CRM and other systems, automate tasks such as creating service tickets, or help write marketing copy and blog posts.

Q is also trained to answer tough questions about managing AWS infrastructure and creating or optimizing computer code. Since AWS is filled with features and the documentation is dense and often difficult to follow, having an AI-powered assistant like Q to show you step-by-step instructions is a great time saver.

But since we focus our research on intelligent assistance in the contact center, let’s turn our attention to how Q and other GenAI tools function within Amazon Connect.

GenAI in Amazon Connect for Improved Customer Support

The transformative power of GenAI is now obvious. Virtually every Contact Center solution provider is quickly integrating the most useful, and least risky, capabilities into their offerings.

Call summarization, intent identification, and suggested responses are on their way to commodity status. Amazon Connect now provides all these capabilities and has given each GenAI-supported feature a different product name. Taken together, they add up to a powerful set of capabilities.

Amazon Q in Connect: AI Assistant for the Contact Center Agent

“I am Amazon Q, your AI assistant! As I listen to the conversation I will provide suggestions.”

Q acts as a capable assistant to customer service agents, helping them quickly grasp the customer’s issue while suggesting possible responses and courses of action to resolve the problem or request. Q starts by understanding the customer’s intent, a remarkable feat, but table stakes these days. It then leverages corporate information to deliver real-time, accurate responses to whatever issue prompted the customer’s voice call, chat, or text message. Armed with such an assistant, a call center agent should be able to handle more incoming calls while improving overall accuracy of responses.

Amazon’s announcement indicates that Connect customers can trial Q for free until March 1, 2024, with easy activation on the Amazon Connect console.

Amazon Connect Contact Lens: Call Summarization

Amazon Connect’s Contact Lens debuted more than a year ago as an analytics and workforce optimization service. With the addition of Q, Amazon now offers post-contact summarization under the Contact Lens name. It uses GenAI to condense customer interactions into coherent summaries for quick reference and improved service. This capability also empowers managers to efficiently monitor and enhance contact quality and agent performance.

Amazon Lex Assisted Slot Resolution: Enhancing Chatbot Accuracy

Amazon has also introduced a new assisted slot resolution feature in its Lex (Natural Language Processing) service. Using GenAI, it provides more accurate understanding of slot values, such as numerical values, city names, or product names. These improvements, which should make it easier for chatbots, voicebots, and IVR systems powered by Lex to understand customers, were made possible by the advanced reasoning capabilities of LLMs.
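For a sense of where slot resolution surfaces in practice, here is a hedged boto3 sketch that sends one utterance to a Lex V2 bot and pulls out the slot values Lex resolved. The bot ID, alias ID, and slot names are placeholders; the response-parsing helper works on the standard Lex V2 `RecognizeText` response shape.

```python
from typing import Dict, Optional


def resolved_slots(response: dict) -> Dict[str, Optional[str]]:
    """Extract each slot's interpreted value from a Lex V2 RecognizeText response."""
    slots = response.get("sessionState", {}).get("intent", {}).get("slots") or {}
    return {
        name: (slot or {}).get("value", {}).get("interpretedValue")
        for name, slot in slots.items()
    }


def ask_bot(text: str, session_id: str) -> Dict[str, Optional[str]]:
    """Send one utterance to a deployed Lex V2 bot and return its resolved slots.

    Requires AWS credentials and a deployed bot; the IDs below are placeholders.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("lexv2-runtime")
    response = client.recognize_text(
        botId="EXAMPLEBOTID",      # placeholder bot ID
        botAliasId="TSTALIASID",   # placeholder alias ID
        localeId="en_US",
        sessionId=session_id,
        text=text,
    )
    return resolved_slots(response)
```

The improvement the new feature promises shows up in the `interpretedValue` field: a misspelled or loosely phrased city or product name is more likely to come back normalized to a valid slot value.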

Customer Profiles with LLMs: Personalized Experiences Made Easier

Amazon employs multiple Large Language Models (LLMs) to facilitate the efficient creation of unified customer profiles. When contact center administrators integrate customer data from diverse backend systems, Customer Profiles streamlines the process by interpreting data formats, comprehending content representation, and establishing unified, accurate customer profiles. These enriched profiles empower contact center managers to deliver highly personalized customer experiences; quick and efficient profile setup allows for more tailored interactions, meeting customer expectations and fostering satisfaction.
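The underlying mechanics can be illustrated with a hedged boto3 sketch: a per-source field mapping (which, per Amazon’s description, the LLMs help infer) translates one backend system’s record into the standard Customer Profiles fields before the profile is created. The source field names and domain name are hypothetical.

```python
from typing import Dict

# Hypothetical mapping from one backend CRM's field names to standard
# Customer Profiles fields; in Amazon's service, LLMs help infer mappings
# like this one per source system.
FIELD_MAP = {
    "fname": "FirstName",
    "lname": "LastName",
    "cell": "PhoneNumber",
    "mail": "EmailAddress",
}


def to_profile_fields(raw_record: Dict[str, str]) -> Dict[str, str]:
    """Translate a raw backend record into Customer Profiles field names."""
    return {FIELD_MAP[k]: v for k, v in raw_record.items() if k in FIELD_MAP}


def upsert_profile(domain_name: str, raw_record: Dict[str, str]) -> str:
    """Create a unified profile in an existing Customer Profiles domain.

    Requires AWS credentials and an existing domain; returns the new profile's ID.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("customer-profiles")
    response = client.create_profile(
        DomainName=domain_name,
        **to_profile_fields(raw_record),
    )
    return response["ProfileId"]
```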

The Potential of a Multi-Model Approach

With the past as prologue, deep-pocketed hyperscalers will continue to invest in ever-larger Large Language Models. The race for LLM leadership is an expensive proposition, and one that Amazon has elected not to enter. Instead, Amazon Bedrock is a managed service that enables developers to choose the most appropriate foundation model for the task at hand.

Amazon’s own family of foundation models, Titan, is just one of many options that an enterprise developer can choose for a particular task. The company is hedging its bets by continuing to offer tailored versions of Titan and allowing its customers to opt for its home-grown resources. This approach is yet another example of the winning cloud strategy that Amazon and AWS have pursued over the years. In the world of Contact Center as a Service (CCaaS), for example, almost every leading CCaaS provider (think NICE, Genesys, Five9, Cisco, and others) is hosted in the AWS Cloud. So is Amazon Connect, obviously. In competitive situations Amazon has many paths to a “win”. If Amazon Connect is chosen, that’s a clear win. But a loss to a CCaaS provider running its software in the AWS Cloud is also a win. It is a product posture that Amazon has grown very accustomed to.

Enter the fast-changing, and expensive, world of building and maintaining LLMs. There are already hundreds of companies vying to offer domain-specific language models. The number of companies with the resources to keep up with OpenAI/Microsoft’s GPT series or Google’s PaLM is much smaller. Amazon’s offer-your-own-and-host-the-leaders strategy is a winner in the CCaaS realm and is worth pursuing in the LLM world.

Categories: Intelligent Assistants, Articles