How Chat Analytics Differs from Voice Analytics

[Editor’s note: This guest post is co-authored by Tony Medrano, CEO of RapportBoost.AI and Jeremy Watkin, Director of Customer Experience at FCR]

Today’s contact centers are overflowing with insights about how to deliver a better customer experience. There’s just one problem: the best insights are housed in customer conversations (phone call recordings and chat and email transcripts), and it has traditionally been incredibly time-consuming or costly to garner actionable insights from them. These conversations remain largely untapped resources, and within them lies a treasure trove of information that can fuel business success.

Thanks to advances in artificial intelligence, specifically natural language processing and machine learning, the tide is beginning to turn as voice and chat analytics platforms become more accessible, affordable, and insightful. With many new players in the market, it makes sense to spend a few moments discussing how voice and chat analytics are similar and different, while also highlighting some of their key benefits.

A chat analytics platform and a voice analytics platform are like a screwdriver and a hammer. While both are tools aimed at solving a similar problem, they often arrive at the solution in very different ways. Just like hand tools engineered for specific applications, voice and chat analytics platforms are engineered to optimize specific customer support channels.

Voice Analytics
First, let’s look at voice analytics platforms. Key players in the market include i2x, Cogito, and Gong.io. These tools gain insights and generate recommendations by analyzing both the text of call transcripts and the features of the human voice. There are currently three main approaches to voice or speech analytics:

  • Phonetic – Listening to the call for specific keywords and phrases like “I’m upset” or “I want to cancel my account” (a simple keyword-spotting sketch follows this list).
  • Full transcription – Converting the entire call into text for ease of reading and deeper analysis. This is more resource-intensive but also more insightful, giving access to every word spoken on the call.
  • Extended speech emotion recognition – Understanding and analyzing not only what is said on a call but how it’s said. The emotion behind a statement helps clarify not just what a customer said, but what they meant.
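
To make the first approach concrete, here’s a minimal keyword-spotting sketch in Python. It assumes the call audio has already been converted to text by a speech-to-text engine; the watch phrases and labels are purely illustrative, not any vendor’s actual configuration.

```python
# Hypothetical watch list: phrase -> label. Real platforms manage these
# lists per business, per campaign, and per language.
WATCH_PHRASES = {
    "i'm upset": "frustration",
    "i want to cancel my account": "churn risk",
    "speak to a manager": "escalation",
}

def flag_phrases(transcript: str):
    """Return (phrase, label) pairs found in a call transcript."""
    text = transcript.lower()
    return [(p, label) for p, label in WATCH_PHRASES.items() if p in text]

print(flag_phrases("Honestly, I'm upset and I want to cancel my account."))
# [("i'm upset", 'frustration'), ('i want to cancel my account', 'churn risk')]
```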

In addition to these approaches, some voice analytics platforms analyze the human voice for energy, tone, tenseness, volume, and pace — regardless of what language is actually being spoken.

Chat Analytics
Chat analytics platforms, like RapportBoost, analyze the data generated by a company’s live chat conversations to uncover the drivers of a successful conversation. And the great news is that chat conversations are already converted into text — eliminating the need for transcription.

Using machine learning and natural language processing, RapportBoost analyzes the conversation between customer and agent and generates company-specific recommendations for agents about their communication style. Imagine being able to tie often subtle patterns, like formality (or the lack of it), reassurance, message length, cadence, and word choice, back to key business objectives like customer satisfaction, closed sales, and first contact resolution.
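
As an illustration of the kind of surface features a chat analytics tool might extract before tying them to outcomes, here’s a small Python sketch. The feature set (agent message length, question use, emoji use, reply delay) and the message format are hypothetical examples, not RapportBoost’s actual model.

```python
from statistics import mean

def chat_features(messages):
    """messages: chronological list of dicts like
    {"sender": "agent" | "customer", "text": str, "ts": float}.
    Assumes at least one agent message."""
    agent_msgs = [m for m in messages if m["sender"] == "agent"]
    # Seconds the agent takes to reply after a customer message.
    reply_delays = [
        m["ts"] - prev["ts"]
        for prev, m in zip(messages, messages[1:])
        if prev["sender"] == "customer" and m["sender"] == "agent"
    ]
    return {
        "avg_agent_msg_words": mean(len(m["text"].split()) for m in agent_msgs),
        "agent_question_rate": mean("?" in m["text"] for m in agent_msgs),
        "agent_emoji_rate": mean(any(ch in "😀🙂👍🎉" for ch in m["text"]) for m in agent_msgs),
        "avg_reply_delay_sec": mean(reply_delays) if reply_delays else 0.0,
    }
```

Features like these, computed across thousands of chats, are what get correlated with outcomes such as customer satisfaction or a closed sale.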

Building a model that drives better outcomes
Building a voice analytics or chat analytics platform requires significant software development and data science resources. And all too often these tools can sit on the shelf if they aren’t built with the right outcomes in mind.

When done right, the system can analyze the thousands of variables that make up a conversation. It can take note of the actions or words used by agents that are most likely to change the outcome of a phone call or chat: the actions that result in a successful sale. The success of any analytics platform lies in identifying the variables that affect conversation outcomes, determining their impact, and fueling efficient coaching so agents can effectively lead their interactions with customers and prospects.
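
As a rough sketch of what “determining their impact” can look like in practice, the snippet below fits a simple logistic regression (via scikit-learn) on synthetic conversation features and inspects the learned weights. Real platforms use far richer models and real outcome data; the feature names and data here are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
feature_names = ["reply_delay_sec", "avg_msg_len", "reassurance_count", "question_rate"]

# Placeholder data: one row of features per conversation; y = 1 if it ended in a sale.
X = rng.normal(size=(500, len(feature_names)))
y = (X[:, 2] - 0.5 * X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
for name, weight in sorted(zip(feature_names, model.coef_[0]), key=lambda p: -abs(p[1])):
    print(f"{name:20s} {weight:+.2f}")  # larger |weight| suggests stronger influence on the outcome
```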

Beyond the text: Comparing the variables that change outcomes
It’s important to note that the variables that affect the outcomes of chat and voice conversations, while important in each, are different, because the spoken word is different from the written word. People use fundamentally different language on different channels. The challenge, regardless of channel, is to move beyond the words themselves to understand tone, emotion, and cadence in alignment with customer expectations. Here are three key variables that are important to watch.

Tone
Both chat and voice offer unique opportunities for expressing tone. Chat relies on grammar, punctuation, word choice, timing, abbreviations, and the occasional emoji. The human voice, on the other hand, conveys tone through volume, rate of speech, and variations in pitch. Voice analytics company Cogito learns by analyzing thousands of phone conversations to understand the tone and patterns that lead to success. They then provide guidance to contact center agents in real time on critical skills like turn-taking (AKA not talking over the customer), tone, empathy, mimicry, and tenseness.

Cadence
Also known as turn-taking, cadence comes naturally (for most) during face-to-face or video conversations thanks to non-verbal cues like facial expressions and body language. It’s not quite so easy via phone and chat. Voice and chat applications mitigate this problem in different ways. On the voice side, i2x and Gong.io analyze recorded calls to recommend an optimized talk-to-listen ratio for customer support agents. For chat, RapportBoost analyzes conversations to recommend optimal formality, message length, reassurance, question type, and timing between messages.
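
To show what cadence metrics like these might boil down to, here are two small Python helpers: a talk-to-listen ratio computed from diarized call segments, and the median delay before an agent’s chat replies. The input formats are assumptions made for the sake of the example, not any platform’s actual schema.

```python
from statistics import median

def talk_to_listen_ratio(segments):
    """segments: list of (speaker, start_sec, end_sec) tuples from a diarized call."""
    agent = sum(end - start for spk, start, end in segments if spk == "agent")
    customer = sum(end - start for spk, start, end in segments if spk == "customer")
    return agent / customer if customer else float("inf")

def median_reply_delay(messages):
    """messages: chronological list of (sender, ts_sec) chat events."""
    delays = [
        ts - prev_ts
        for (prev_sender, prev_ts), (sender, ts) in zip(messages, messages[1:])
        if prev_sender == "customer" and sender == "agent"
    ]
    return median(delays) if delays else None
```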

Patterns of speech
Patterns of speech can vary significantly between customer support channels like chat and voice. On phone calls, agents are often required to communicate significantly more information and detail to customers. In contrast, customers in chat are already on an internet-connected device, so agents can link them to helpful information like product pages and the company knowledge base. Speech patterns like volume, tone, and speed of delivery require extra care over the phone, especially when sharing critical information with customers.

Also regarding speech patterns, it’s important to compare the potential for spontaneous interaction. Chat is de facto linear: only one message can be sent at a time, and a visual cue often indicates when one party is typing, sometimes even allowing agents to see what the customer is typing. Phone calls, on the other hand, have greater potential for interruption and improvisation. i2x helps agents stay on track by providing a “don’t say” list. Gong.io overcomes this challenge by building a playbook based on top agents’ questions and pricing conversations.

Each customer support channel, whether it’s voice, email, or chat, holds unique potential for optimization. Voice and chat analytics platforms like those mentioned in this article continue to make huge gains in this area. As you evaluate the tools that are right for your business, it’s important to adopt a platform that aligns with your specific channel mix. It’s also essential to understand how the variables in communication differ across support channels in order to analyze tone, cadence, and speech patterns. Once these are identified and understood, they fuel a coaching and continuous improvement process that will drive your business toward your desired outcomes and beyond.

———–

Guest Post Authors:

  • Tony Medrano is CEO of RapportBoost.AI, a leading conversational sales analysis platform for brands that use chat, SMS and messenger tools to engage customers. Tony received his MBA and JD from Stanford, M.A. from Columbia and B.A. from Harvard.
  • Jeremy Watkin is the Director of Customer Experience at FCR, a leading provider of outsourced call center and business process solutions. He has more than 17 years of experience as a customer service and experience professional.


