AUSTIN—There have never been more ways to interact with technology—touch screens, voice prompts, even the plain old keyboard. But all of these interfaces are essentially command driven. The system does what it is told without any context about your emotional state. It doesn’t know if you’re happy, angry, depressed, or anxious.
“Emotions matter; they influence everything,” Affectiva cofounder and CEO Rana el Kaliouby told me here at SXSW 2018. Everything from how we learn to how well we communicate is shaped by our emotional state. If a machine can’t detect emotion, a huge amount of information is lost.
“About 55 percent of inferring your mental states is done through your face and gestures,” el Kaliouby says. “Another 38 percent is how you are saying it, your tone and your inflection. Only 7 percent is on your choice of words. That means most of this industry is ignoring 93 percent of the data available.”
Affectiva is mining that data and using it to enhance products; it has analyzed more than 6 million faces from people in 87 countries.
How do you train machines to read human emotions? First, you need large data sets—photos of faces spanning different ethnic backgrounds and ages, each labeled with the emotion being expressed. Affectiva employs a small army of high-EQ (emotional quotient) workers to help build and verify that database.
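The pipeline described above—human-labeled examples in, an emotion classifier out—can be sketched in a few lines. This is a toy illustration, not Affectiva's actual system: the 2-D feature vectors stand in for real face images, the labels and numbers are invented, and a simple nearest-centroid rule stands in for a deep model.

```python
from statistics import mean

# Human labelers assign an emotion to each example. The two hypothetical
# features here might be something like "mouth-corner lift" and "brow lowering".
labeled_data = [
    ((0.9, 0.1), "happy"),
    ((0.8, 0.2), "happy"),
    ((0.1, 0.9), "angry"),
    ((0.2, 0.8), "angry"),
]

def train(examples):
    """Compute one centroid (mean feature vector) per emotion label."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: tuple(mean(dim) for dim in zip(*vecs))
            for label, vecs in by_label.items()}

def predict(centroids, features):
    """Classify a new face as the emotion with the closest centroid."""
    def sq_dist(label):
        return sum((a - b) ** 2 for a, b in zip(features, centroids[label]))
    return min(centroids, key=sq_dist)

model = train(labeled_data)
print(predict(model, (0.85, 0.15)))  # prints "happy"
```

The point of the sketch is the division of labor the article describes: humans supply the labels, and the model only generalizes from them—which is why coverage across ages and ethnic backgrounds in the training set matters.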
Some of the applications for emotional machines are obvious. As call centers automate, the online bots that handle consumer calls will need to recognize rising frustration. The global consumer robot market was worth $3.6 billion in 2016, and layering in emotional intelligence could help it grow even faster. But there are more mundane applications as well.
Watch the full interview with el Kaliouby above.
For more Fast Forward with Dan Costa, subscribe to the podcast. On iOS, download Apple’s Podcasts app, search for “Fast Forward” and subscribe. On Android, download the Stitcher Radio for Podcasts app via Google Play.