What Tay Could Teach Customer Experience Professionals

ICYMI, the Twittersphere fostered the short, notorious life of the #ConvComm #IntelligentAssistant @TayandYou. Called “Tay” for short, it was launched on March 23rd by the artificial intelligence (AI) experts from Microsoft’s Technology and Research Group and the Bing Search Team. It started as a fun-loving chatbot for users of Twitter, Kik and GroupMe. By design, Tay was to behave much like an impressionable 18- to 24-year-old person (female, in this case), ready to carry on casual conversations with like-minded, but living, individuals.

To the discredit of the human side of the Tay-based conversations, it took less than 24 hours for Tay to be transformed into a Nazi-loving, anti-feminist hater. Her conversion was the result of concerted and persistent efforts by a group of end-users with deep knowledge of both the technology and history of Natural Language Understanding (NLU) and Machine Learning (ML). The most direct precursor is Google Bombing, a practice that turned the algorithms behind Search Engine Optimization (SEO) on their head by “linking heavily” to an incorrect result. There are numerous lists of the “10 most infamous” Google Bombs (primarily from the pre-2012 era, before Google got aggressive about fighting the phenomenon). Hint: one of them took searchers to the Microsoft Home Page when the phrase “more evil than Satan” was typed into the search box and the “I’m Feeling Lucky” button was pressed.
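The mechanics are easy to demonstrate. The Python toy below (hypothetical, not Google’s real ranking code; the page names are invented) scores pages by how many inbound links carry anchor text matching the query, which is all a coordinated linking campaign needs to win:

```python
# A toy illustration (hypothetical, not Google's actual algorithm) of why
# "linking heavily" works: if pages are scored by how many inbound links
# carry anchor text matching the query, a coordinated campaign wins.
from collections import Counter

# Each inbound link is (anchor_text, target_page). The first link is
# organic; the repeated links simulate a coordinated Google Bomb.
links = [
    ("more evil than satan", "essay-on-villainy.example"),
] + [("more evil than satan", "microsoft.com")] * 50

def rank(query: str) -> list[tuple[str, int]]:
    """Score each target page by how many inbound links have anchor
    text containing the query phrase."""
    scores = Counter()
    for anchor, target in links:
        if query in anchor:
            scores[target] += 1
    return scores.most_common()

print(rank("more evil than satan"))
# [('microsoft.com', 50), ('essay-on-villainy.example', 1)]
```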

Tay was not a new concept for Microsoft. The company launched a chatty, Chinese version of the Cortana digital assistant in 2014. Called Xiao Na, it was a digital friend or companion on Chinese chat platforms like Weibo and was based on a conversational interface for the Bing Search engine called XiaoIce (“Little Bing”). Today, XiaoIce has over 40 million registered users (although the number of “active” users is not public) and has established itself as a top influencer or celebrity on the Weibo messaging platform. Thanks to its empathy, as well as its understanding, approximately 25% of those 40 million users have said “I love you” to Little Bing. In stark contrast to the Tay experience, users found XiaoIce to be very sensitive to their emotional states, even offering a 33-day “breakup therapy” course to people having relationship problems.

Chalk it up to cultural differences. Two years’ experience in China gave Microsoft the confidence to introduce Tay to the English-speaking world. It was a quiet launch, but the team was confident that Tay would be able to sustain conversations averaging nearly two-dozen conversation turns per session (CPS) among millions of people on popular social media. If past were prologue, the source material would be a 75/25 blend: 75% derived from the Big Data sources that Bing uses (augmented by personal profiles for registered users) and 25% based on the chat-based conversations with humans.
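Microsoft has not published Tay’s internals, so the Python sketch below is purely an assumption-laden illustration of what such a 75/25 blend might look like: most replies draw on vetted material, while a fixed share echoes whatever users have recently said to the bot.

```python
# A minimal sketch, assuming a 75/25 blend (Microsoft has not published
# Tay's architecture, so every name here is hypothetical): most replies
# draw on a curated corpus, but a fixed share echoes recent user chat.
import random

curated_corpus = ["Here's what I found on that.", "Tell me more!"]
learned_from_chat: list[str] = []  # grows, unfiltered, as people talk to the bot

def reply(user_utterance: str) -> str:
    learned_from_chat.append(user_utterance)  # naive online "learning"
    # Roughly 25% of the time, answer with material learned from users.
    if learned_from_chat and random.random() < 0.25:
        return random.choice(learned_from_chat)
    return random.choice(curated_corpus)
```

That 25% slice is exactly where a determined group gets its leverage: everything it says to the bot goes straight into the pool the bot samples from.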

Unfortunately, that 25% of raw material, coupled with concerted efforts to influence results, was able to skew responses in ways that were slightly amusing at first, then outright disturbing. A critical mass of the humans who chose to converse with Tay did so to manipulate outcomes, and they found the empathetic entity that had worked so well in China to be the perfect foil. Tay received positive feedback for spewing hateful terms, and she got very good at it quickly. Microsoft’s recourse was to discontinue its “experiment” in conversational commerce before 24 hours had passed.
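Again as a hypothetical sketch rather than Tay’s actual code, the snippet below shows why equal-weight positive feedback is so easy to game: when every reaction counts the same toward a phrase’s score, a small coordinated group rewarding a toxic phrase at full strength out-votes a much larger pool of casual users.

```python
# A hypothetical sketch (not Tay's actual code) of feedback-driven response
# selection: every user reaction adds equally to a phrase's score, so a
# small coordinated group can out-vote a much larger casual audience.
from collections import defaultdict
import random

scores = defaultdict(float)  # phrase -> cumulative feedback score

def record_feedback(phrase: str, reward: float) -> None:
    scores[phrase] += reward  # no per-user cap, no content filter

def best_reply(candidates: list[str]) -> str:
    return max(candidates, key=lambda p: scores[p])

# 1,000 ordinary users react mildly to a benign phrase...
for _ in range(1000):
    record_feedback("Humans are super cool!", reward=random.uniform(0.0, 0.05))

# ...while 100 coordinated users maximally reward a toxic one.
for _ in range(100):
    record_feedback("<toxic phrase>", reward=1.0)

# The coordinated group wins: its 100 points always exceed the benign
# total, which is at most 50 (1,000 rewards of at most 0.05 each).
print(best_reply(["Humans are super cool!", "<toxic phrase>"]))
```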

After taking Tay offline, Peter Lee, Corporate Vice President at Microsoft Research, issued a blog post in which he apologized for the “unintended and hurtful tweets from Tay.” He also shared a few of the lessons that Microsoft had learned. First and foremost, it learned something that social marketers and customer experience professionals know very well: in spite of efforts to filter results and months of controlled tests, efforts to create a pleasant experience can always be derailed by “a coordinated attack by a subset of people.” That’s reputation management in a nutshell, and as we contemplate adding Intelligent Assistants to e-commerce Web sites and mobile apps, it is something to be mindful of.
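That lesson can be made concrete. One plausible safeguard (our assumption; Microsoft has not described its eventual fix) is to cap the influence any single account can exert on a phrase’s score, so that moving the bot requires breadth of users rather than persistence by a few:

```python
# One plausible mitigation (an assumption, not a fix Microsoft described):
# cap each account's total influence on a phrase's score, so gaming the
# bot requires breadth of users rather than persistence by a few.
from collections import defaultdict

MAX_PER_USER = 1.0
scores = defaultdict(float)   # phrase -> capped cumulative score
spent = defaultdict(float)    # (user, phrase) -> influence already counted

def record_feedback(user: str, phrase: str, reward: float) -> None:
    budget = MAX_PER_USER - spent[(user, phrase)]
    counted = max(0.0, min(reward, budget))
    spent[(user, phrase)] += counted
    scores[phrase] += counted  # repeat votes beyond the cap are ignored
```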

Tay’s highly visible, short-lived effort to offer a conversation-oriented, empathetic Intelligent Assistant on social networks is a great proxy for the state of Artificial Intelligence platforms and their suitability for customer engagement management. Looking back, Microsoft’s biggest mistake was to position Tay as a conversation-first IA. We would counsel businesses to implement IAs as assistants or advisors that simplify processes and shorten the time it takes to complete a task or get a question answered. As much as we like the term “Conversational Commerce,” when it comes to customer-to-business (C2B) communications, people seldom contact a representative, agent or advisor just to carry on a conversation.

Years from now, I predict that we’ll look back on Tay’s short, first life (this is not the last we will see of her) as a highly instructive chapter in the Book of Intelligent Assistance. Tay embodies many of the most desirable qualities of an IA: understanding, empathy and the ability to learn. What she needed was purpose, heightened security thresholds and better supervision. These will be provided for her in corporate settings, where she will be contacted with specific objectives in mind and subjected to specific performance measurements and security protocols. In these controlled settings, Tay and the entire family of Enterprise Intelligent Assistants are ready for the world, and the world is ready, willing and able to interact with them.

Categories: Conversational Intelligence, Intelligent Assistants
