Ask Yourself: “Am I Wiretapping Calls With My Own Customers?”

In a case called Gils v. Patagonia, a law firm that specializes in “high-stakes contingency fee and class action litigation” is representing an individual (and her cohort of California teleshoppers) who claims that calls to Patagonia (yes, the super environmentally-conscious outfitter) are “intercepted, listened to, recorded, and used by an undisclosed third party….” In this case, the third party is contact center as a service (CCaaS) provider Talkdesk.

Like Verint, NICE, Five9, Amazon (with Amazon Connect), and about a dozen of its other CCaaS peers, Talkdesk has integrated generative AI (GenAI) into its service offerings in response to market demand for resources that help agents quickly understand a caller’s intent, recognize when conversations are going sideways (when a caller is upset), and extract other insights that ensure better customer service, starting with compliance with the guidelines that every agent (and virtual agent) should follow.

There is an even larger group of technology providers who supply companies with resources that capture and transcribe conversations and then use “AI” to summarize the content of a call, detect and suggest any follow-up actions that need to be taken, and provide other fodder for GenAI-infused platforms to work their magic. Entire workflows are being designed to speed up the processes involved in booking appointments, composing follow-up emails, and sharing insights with product or marketing teams that can do something about them. Anyone who has participated in a Zoom call or Teams meeting where the AI assistant or Copilot has summarized an hours-long discussion into a quickly reviewable five-paragraph document might share my opinion that this use of the technology is beneficial and nobody has been injured in its production.
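To make that workflow concrete, here is a minimal sketch of the kind of post-call pipeline described above. Everything in it is hypothetical: the analyze_call function, the prompt wording, and the complete callable (a stand-in for whichever GenAI endpoint a vendor actually uses) are invented for illustration, not drawn from any vendor’s API.

```python
# Hypothetical post-call pipeline: summarize a transcript and pull out follow-up
# actions with a GenAI model. The prompt, the parsing, and the `complete` callable
# (a stand-in for whatever completion endpoint a vendor actually uses) are all
# invented for illustration.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class CallInsights:
    summary: str
    follow_up_actions: List[str]


SUMMARY_PROMPT = (
    "Summarize this customer call in five short paragraphs, then list any "
    "follow-up actions (appointments to book, emails to send) after an "
    "'Actions:' marker, one per line.\n\nTranscript:\n{transcript}"
)


def analyze_call(transcript: str, complete: Callable[[str], str]) -> CallInsights:
    """Run the summarize-and-extract step on a single transcript."""
    raw = complete(SUMMARY_PROMPT.format(transcript=transcript))
    # Naive parse: everything after the 'Actions:' marker is treated as the action list.
    summary, _, actions_block = raw.partition("Actions:")
    actions = [line.strip("- ").strip() for line in actions_block.splitlines() if line.strip()]
    return CallInsights(summary=summary.strip(), follow_up_actions=actions)


if __name__ == "__main__":
    # Canned completion so the sketch runs without any vendor API or model.
    fake_complete = lambda prompt: "Caller wants to return a jacket.\nActions:\n- Email a prepaid return label"
    print(analyze_call("Agent: How can I help? Caller: I need to return a jacket.", fake_complete))
```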

I shouldn’t conflate internal use of AI assistants with those being employed in customer care, because employees, for the most part, are aware that they are being monitored constantly. It’s almost a condition of employment. That said, there are already “preambles” to every call into a brand stating “This call may be monitored for security and training purposes,” which, I would argue, covers the training of domain-specific language models kept within a company’s firewalls and employed to provide a better-quality customer experience. I know it’s a very pale flavor of disclosure and does not equate to a fully transparent description of the treatment of personally identifiable information (PII), but what Patagonia is doing, in my opinion, is not “intercepting” calls into its own contact center and then sharing them with Talkdesk, any more than an enterprise is intercepting conversations among its employees and sharing them with Microsoft, Zoom, Vonage, or another “third party” for some nefarious purpose.

Instead, and this is what disclosures in the form of a preamble or privacy statement should say: “[Our company] saves transcripts of all conversations in compliance with applicable laws and for training purposes, including the training of AI resources that improve our quality of service.” It is an offshoot of call recording, not the product of illegal wiretapping.

Running Afoul of the California Invasion of Privacy Act (CIPA)

Contingency-only law firms are opportunistic in picking their battles. It helps when there is a large “class” to represent. In this case they are talking about every California resident who has called a customer support line that is hooked up to a CCaaS platform. It also helps to have a statute like the California Invasion of Privacy Act (CIPA), which is crafted to describe prohibited activities in language that is subject to interpretation, like cataloguing a broad spectrum of devices and resources that can be used to help companies “automate tasks or mine customer data.”

The brief calls Talkdesk a “data intelligence company” rather than a CCaaS provider. This term is used quite purposefully to demonstrate that common practices are in violation of the California Invasion of Privacy Act, which asserts that businesses are breaking the law when they “intercept, record, and analyze” conversations with their customers without proper consent. The word “intercept” has a history and is a loaded term. It has been used most broadly to refer to “lawful intercept,” the use of facilities in telecommunications and telephone networks to “selectively wiretap individual subscribers.” That is not what is going on here.

This Case Should Not Pass a Smell Test, but Take It as a Warning

The plaintiff’s logic is flawed. Her lawyers accuse Patagonia of wiretapping its own phone calls so that a third party, Talkdesk, can use them for its own purposes without gaining the explicit consent of a caller. Among the nefarious ends that Talkdesk seeks to achieve are rapid recognition of a customer’s intent in order to provide prompt responses; sentiment detection, to help agents be more effective; and automated quality monitoring to improve services. That hardly amounts to unlawful access to data to be used by a third party “for its own purposes.”

Just about every solution provider and consultant advises clients to craft “preambles” designed to inform customers that “calls are recorded for training and security purposes.” Now that asynchronous conversations take place across multiple channels over spans of time, an audio prompt at the beginning of a phone call is already inadequate. Regulators and legislators would prefer fully transparent descriptions of the measures taken to protect customer privacy and explicit consent surrounding specific uses.
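As a rough illustration of what “explicit consent surrounding specific uses” could mean in practice, here is a sketch of per-purpose consent gating. It assumes nothing about any vendor’s platform; ConsentRecord, allow, permits, and maybe_transcribe are invented names, and the transcription step is a placeholder.

```python
# A sketch of per-purpose consent gating, assuming nothing about any vendor's
# platform: ConsentRecord, allow, permits, and maybe_transcribe are invented names,
# and the transcription step is a placeholder.
from dataclasses import dataclass, field
from typing import Optional, Set

# Specific uses a caller could consent to individually rather than via a blanket preamble.
PURPOSES = {"recording", "transcription", "ai_training", "third_party_analytics"}


@dataclass
class ConsentRecord:
    caller_id: str
    granted: Set[str] = field(default_factory=set)

    def allow(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"Unknown purpose: {purpose}")
        self.granted.add(purpose)

    def permits(self, purpose: str) -> bool:
        return purpose in self.granted


def maybe_transcribe(consent: ConsentRecord, audio: bytes) -> Optional[str]:
    # Transcribe only when the caller consented to that specific use.
    if not consent.permits("transcription"):
        return None
    return "<transcript placeholder>"  # real speech-to-text omitted from this sketch


if __name__ == "__main__":
    consent = ConsentRecord(caller_id="caller-123")
    consent.allow("recording")             # caller accepted the recording preamble only
    print(maybe_transcribe(consent, b""))  # -> None: no consent for transcription
```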

History tells us that there will be many more lawsuits. Fellow analyst Sheila McGee-Smith provides an excellent rundown of measures that brands and solution providers should consider in this post. Here is my own list (organized for the three concerned parties):

  • For brands (like Patagonia): Tailor preambles and privacy notifications to describe precisely and transparently how they are capturing and analyzing conversational content.
  • For solution providers (like Talkdesk): Bake privacy protection into solution architectures and workflows (see the redaction sketch after this list), and take pains to provide customers with a set of “best practices” for informing their customers about the ways the audio and text associated with their conversations are used to train resources that improve the service they receive.
  • For customers: Don’t sue! Demand more control over how your conversational data and personal information are used or shared on your behalf. Get ready for the new age of GenAI-informed CX.
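Here is a minimal sketch (assuming nothing about Talkdesk’s actual architecture) of one way a solution provider could bake privacy protection into the pipeline: redact obvious PII from transcripts before they are stored or used to train models. The redact function and the regexes are illustrative only; a real deployment would rely on a proper PII-detection service.

```python
# Illustrative PII redaction before transcripts feed any storage or AI training.
# The patterns below catch only the most obvious cases and are not exhaustive.
import re

REDACTIONS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b(?:\+?1[ .-]?)?\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def redact(transcript: str) -> str:
    """Replace matched PII with labeled placeholders before any downstream AI use."""
    for label, pattern in REDACTIONS.items():
        transcript = pattern.sub(f"[{label} REDACTED]", transcript)
    return transcript


if __name__ == "__main__":
    print(redact("Call me back at 415-555-0134 or email jane.doe@example.com."))
    # -> "Call me back at [PHONE REDACTED] or email [EMAIL REDACTED]."
```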

It will help when a brand can demonstrate that it offers a better experience thanks to GenAI. It will also be a great leap forward when and if both brands and solution providers become truly transparent with their customers, to the point where they can demonstrate that permitting the recording, transcription, and analysis of conversational content is worth it for the better service quality gained.

 



Categories: Intelligent Assistants