Accusations of “Controversial Speech Recognition” Will Chill Introduction of “Voice First” Services

Spotify has stepped up its forays into the world of Automated Speech Recognition (ASR) and, in the process, exposed the deeply conflicted nature of global understanding and acceptance of rich Conversational Intelligence. In early April, “Hey Spotify” debuted to much positive fanfare. It is an example of a custom wake word that provides hands-free access to voice-based search and command of Spotify features. After invoking the service by saying “Hey Spotify,” a subscriber can tap into its recommendation algorithms and machine learning by saying something like “Play ‘My Daily Drive’ Mix,” and off they go.
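The interaction pattern is easy to picture in code. What follows is a minimal sketch of a wake-word gate, assuming utterances arrive as text transcripts; the phrase matching and the toy “play” handler are invented for illustration and are not Spotify’s actual implementation.

# Minimal sketch of a wake-word gate over incoming transcripts.
# The wake phrase and the "play" command handling are illustrative only.

WAKE_PHRASE = "hey spotify"

def handle_utterance(transcript: str) -> str:
    """Ignore speech until the wake phrase appears, then treat the rest as a command."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_PHRASE):
        return "ignored"  # device stays dormant until the wake word is heard
    command = text[len(WAKE_PHRASE):].strip(" ,.")
    if command.startswith("play"):
        return "playing: " + command[len("play"):].strip()
    return "unrecognized command: " + command

print(handle_utterance("Hey Spotify, play My Daily Drive"))
# -> playing: my daily drive

In a real deployment, of course, the wake-word detector runs on raw audio on the device itself, and only the speech that follows it is sent onward for full ASR and intent recognition.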

So far so good.

Yet the formal launch of “Hey Spotify” occurred within a few days of the U.S. Patent and Trademark Office granting patent #10,891,984, pertaining to the “identification of taste attributes from an audio signal.” The patent filing describes a series of instances in which the content of spoken input, along with how it is said, can be used to make for a surprisingly good audio experience. It is one that takes into account the gender, emotional state, age, and dialect or accent of the speaker, as well as his or her location, including the physical attributes and audio characteristics of that location. Responses to the spoken input can also be personalized by taking into account previously indicated preferences, listening practices, and recommendations of friends.
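To make concrete the kind of system the patent contemplates, here is a toy sketch in which inferred speaker attributes re-rank a candidate playlist. Every value here, from the attribute labels to the tracks and the scoring rule, is invented for illustration and does not reflect the patent’s actual method.

# Toy illustration: inferred speaker attributes condition a recommendation ranking.
# Attribute labels, tracks, and scoring are all hypothetical.

from dataclasses import dataclass

@dataclass
class SpeakerContext:
    emotion: str      # e.g., inferred from vocal prosody: "upbeat", "sad"
    environment: str  # e.g., inferred from background audio: "car", "party"

TRACKS = [
    {"title": "Road Anthem", "mood": "upbeat", "setting": "car"},
    {"title": "Rainy Ballad", "mood": "sad", "setting": "home"},
    {"title": "Floor Filler", "mood": "upbeat", "setting": "party"},
]

def rank(tracks, ctx: SpeakerContext):
    """Score each track by how well it matches the inferred listening context."""
    def score(track):
        return (track["mood"] == ctx.emotion) + (track["setting"] == ctx.environment)
    return sorted(tracks, key=score, reverse=True)

for track in rank(TRACKS, SpeakerContext(emotion="upbeat", environment="car")):
    print(track["title"])
# "Road Anthem" ranks first because it matches both the inferred mood and the car environment.

It is precisely this step, inferring emotion and environment from the voice itself rather than from declared preferences, that drew the objections described below.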

This far, not so good.

On April 2, Access Now, a non-profit organization concerned with human rights in the digital age, wrote a letter to Daniel Ek, CEO and co-founder of Spotify, urging him to abandon immediately any plans to implement these “surveillance” technologies. Spotify did not respond directly to Access Now’s entreaty, but cited a letter it had previously sent the organization stating that “Spotify has never implemented the technology described in the patent in any of our products and we have no plans to do so.”

So much for a good idea.

A Threat to Empathetic Customer Care

Through Access Now’s lens, applying a technology that detects an individual’s emotion puts “Spotify in a dangerous position of power in relation to people using the service. Spotify has an incentive to manipulate a person’s emotions in a way that encourages them to continue listening to content on its platform — which could look like playing on a person’s depression to keep them depressed.” This framing arrives contemporaneously with the drafting of a European Commission White Paper on AI that, among other things, seeks to ban the use of artificial intelligence to “manipulate human behavior.” The Brookings Institution equated this to China’s “social scoring” programs for individual conduct. But the report is subject to interpretation, and Access Now’s concern about reinforcing music selections could be deemed to fall into this category.

Initiatives by the EU and Access Now raise a caution flag for all solution providers in the digital commerce and customer care domains as they incorporate Conversational AI into their products and services. For years, we’ve been told that the line of demarcation is between live agents, who are capable of empathy and sympathy, and intelligent virtual agents, which stop at understanding and intent recognition. Neither is assigned the role of “manipulation,” although it could be said that the “m” word is core to helping individuals accomplish their goals quickly and happily.

Attempting to draft laws and regulations that prevent AI-based manipulation is a fool’s errand. More importantly, it will have a chilling effect on the introduction of Conversational AI applications that are truly helpful to individuals in their roles as prospects, customers, clients, or members. Companies are either ethical or they are not. The fact that Spotify can suggest an artist or song that I might want to hear based on my tastes, preferences, past selections, or current mood is exactly why I chose the service. Trusted retailers, hotel chains, airlines, pharmacies, healthcare providers, and others are entitled to do the same.

We have to be thoughtful as we strike a balance between empathetic handling of calls for personal assistance and blanket condemnation of such efforts as blatant manipulation of individuals’ emotions.
