Enterprise Connect Meets Gen “G” (for Generative)

Judging by the keynote presentations and the discussions in and out of the breakout sessions at Enterprise Connect 2023, pre-trained Large Language Models (LLMs) and Generative AI have eclipsed all else in the pantheon of purchase criteria for Customer Experience, Contact Center, collaboration, and Unified Communications IT infrastructure. As enterprise decision makers plan their investment strategies for cloud-based resources, they must come to grips with the fact that “full-stack” solutions now include a Conversational AI layer, with special attention to Generative AI and LLMs.

Speakers and exhibitors like Amazon, Cisco, Genesys, Google, Microsoft, NICE, Vonage, Verint, and Zoom (among others) highlighted how well their enterprise customers are integrating Conversational AI (most often a flavor of OpenAI’s GPT-x) into the routines of both customers and employees. My core takeaway, and my advice to them, is to treat “Conversational AI” as a commodity and to come to grips with these four aspects of the new reality:

  • Generative AI is Everywhere: Hundreds of millions of people have had the chance to ask tools like DALL-E or Hugging Face for pictures, or ChatGPT, Bard, and others for prose. End users are able to discover and define their own use cases and, in doing so, are learning what the models do well and where they fail.
  • AI is my “Copilot”: It is no coincidence that the “Copilot” metaphor appears in both Microsoft’s and Adobe’s trade names for Generative AI-based services, and GitHub Copilot has been a go-to resource for developers since June 2021. As background, Opus Research’s first sighting of this phenomenon was SAP’s use of the identical name for its “deep linking assistant” in 2017; Amelia (then IPsoft) used the same name for its agent-assist service in 2019.
  • Use cases have been years in the making: The rapid introduction of call summarization, intent recognition and categorization, bot building and tuning, and next-best-action recommendations traces its lineage to technologies with roots in the 1960s. Deep Neural Networks, Natural Language Processing, Machine Learning, and Semantic Search are deeply embedded in the API-based offerings of OpenAI, Google, NVIDIA, and others that solutions providers have had years to work with.
  • All implementations are flawed: With hundreds of millions of users, the ranks of sophisticated and influential users are growing. Each has a tale to tell of gaps in an LLM’s knowledge base, errors in responses, and outright “hallucinations.” Such known flaws create large business opportunities for entrepreneurial firms, as well as individual subject matter experts, who develop services or platforms to overcome defects in training data or errors in output.

There are already dozens of use cases for Generative AI and LLMs in large enterprises. Post-call analytics and summarization have a built-in ROI for contact center managers. Both development and ongoing training of chatbots and voicebots are being greatly accelerated by judicious use of AI-infused resources. The deployment of personal assistants or copilots for sales, HR/onboarding, and repetitive administrative tasks (like scheduling a meeting) will continue to drive widespread acceptance. The genie is out of its bottle.
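To make the post-call summarization use case concrete, here is a minimal sketch of how a contact center application might frame a transcript for an LLM. The prompt wording, the example transcript, and the commented-out model call are illustrative assumptions, not any vendor’s actual integration.

```python
# Hypothetical sketch: preparing a contact-center call transcript
# for LLM-based post-call summarization. The prompt and model name
# below are assumptions for illustration only.

def build_summary_prompt(transcript: str) -> list:
    """Build a chat-style message list asking an LLM to summarize a call,
    capturing caller intent, resolution, and follow-up actions."""
    return [
        {"role": "system",
         "content": ("You summarize contact-center calls. Capture the "
                     "caller's intent, the resolution, and any follow-ups.")},
        {"role": "user",
         "content": "Transcript:\n" + transcript + "\n\nSummary:"},
    ]

if __name__ == "__main__":
    transcript = (
        "Agent: Thanks for calling. How can I help?\n"
        "Caller: My invoice was charged twice this month.\n"
        "Agent: I see the duplicate. I've issued a refund."
    )
    messages = build_summary_prompt(transcript)
    # With credentials in place, these messages could be sent to a hosted
    # model, e.g. (assumed integration, not executed here):
    # response = openai.ChatCompletion.create(
    #     model="gpt-3.5-turbo", messages=messages)
    print(len(messages))
```

The value for contact centers lies in the system instruction: constraining the model to intent, resolution, and follow-ups makes the summary consistent enough to feed downstream analytics.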
