Emotionally Intelligent Digital Assistants Can Understand Your Feelings

How important will emotional intelligence be for the coming generation of enterprise intelligent assistants? Recent advances and commentary suggest that an assistant's ability to understand the emotional state of the people it deals with could be a key differentiator.

Andrew Moore wrote a piece in Scientific American describing “Why 2016 Could Be a Watershed Year for Emotional Intelligence–in Machines.” Shelly Fan recently published “Anticipating Your Needs: Emotional Intelligence Is the Key to Smarter Gadgets” on Singularity Hub.

Both Moore and Fan write enthusiastically about how much more effective and helpful smart devices will be once they understand our feelings. Both authors envision virtual psychiatrists that can diagnose depression, marketers that can better gauge how well campaigns are resonating, and teachers who get real-time feedback on which lesson plans are engaging and which are putting pupils to sleep.

What are some of the technologies required to give intelligent assistants and other smart devices emotional intelligence? The most readily accessible data points are probably the words that people use. The W3C has developed an Emotion Markup Language (EmotionML) that provides a standard way to represent and annotate emotional states across a broad spectrum of categories and dimensions. Companies such as Heartbeat Technologies are also mapping how specific words and phrases reveal underlying emotional states.
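To make the word-based approach concrete, here is a minimal sketch of keyword-based emotion detection. The lexicon is a made-up sample for illustration, not the W3C EmotionML vocabulary or any vendor's actual mapping:

```python
# Toy keyword-based emotion detector (illustrative only; the lexicon
# below is an invented sample, not a real emotion vocabulary).
EMOTION_LEXICON = {
    "furious": "anger", "annoyed": "anger",
    "thrilled": "joy", "delighted": "joy",
    "worried": "fear", "anxious": "fear",
}

def detect_emotions(text: str) -> dict:
    """Count emotion signals in a piece of text, word by word."""
    counts = {}
    for word in text.lower().split():
        word = word.strip(".,!?")          # drop trailing punctuation
        emotion = EMOTION_LEXICON.get(word)
        if emotion:
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts

print(detect_emotions("I am furious and annoyed, frankly anxious too."))
# {'anger': 2, 'fear': 1}
```

Real systems go far beyond single-keyword matching (handling negation, context, and intensity), but the basic idea of mapping surface language to emotional categories is the same.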

The next most probable target is biometric data that can be gleaned from wearables. Tracking changes in a person’s heart rate can be a reliable indicator of emotional states. A jump in heart rate, coupled with the information that the wearer is stationary and in the office, could signal the wearer is under stress.
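The heart-rate scenario above amounts to a simple rule. A hedged sketch of that rule follows; the threshold is invented for illustration, not a clinical value:

```python
# Rule-of-thumb stress check based on the scenario in the text:
# a sharp heart-rate jump while the wearer is stationary.
# The 25 bpm threshold is an assumption for this sketch.
def looks_stressed(resting_hr: int, current_hr: int,
                   is_stationary: bool) -> bool:
    """Flag a possible stress signal when heart rate jumps
    without any corresponding physical activity."""
    jump = current_hr - resting_hr
    return is_stationary and jump >= 25

print(looks_stressed(resting_hr=65, current_hr=95, is_stationary=True))   # True
print(looks_stressed(resting_hr=65, current_hr=95, is_stationary=False))  # False
```

A production system would fuse more signals (location, calendar context, skin conductance) and learn thresholds per wearer, but the logic of cross-referencing biometrics with activity state is the core idea.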

Zeroing in on facial expressions seems to be the holy grail of determining a human's emotional state. Moore notes that camera technology has evolved to the point where even smartphone cameras can pick up tiny signals from the mouth, eyes, and eyebrows that provide strong clues to what a person is feeling. Laser technology might even be deployed to read facial expressions in the future.

The best human call center agents are the ones who quickly tune in to a caller's emotional state. Today only sharp human agents are savvy enough to calm down a disgruntled customer and prevent them from posting a flaming tweet or review. Will enterprise intelligent assistants have the tools to rein in angry customers in the near future? The possibility is certainly alluring.



Categories: Conversational Intelligence, Intelligent Assistants, Articles