How AI understands emotion

By: Suzanne Guillette

The way computers know if you’re happy or sad


In science fiction, robots have had emotions for a long time. The film Her celebrates the romantic relationship between a heartbroken Joaquin Phoenix and Samantha, his virtual assistant who has the capacity to develop “love” using artificial intelligence (AI). The Hitchhiker’s Guide to the Galaxy makes numerous gags about the maudlin Marvin the Paranoid Android.

Thanks to recent breakthroughs in AI, emotional engagement with computers is going mainstream, and the reality is quite different from the way it’s depicted in movies and books. Several industries now employ virtual humans and AI that respond in real time to users’ emotional cues.

Consider Lia, a digital assistant created by the startup company Soul Machines, who has a distinct personality and the ability to react to vocal and facial expressions. If Lia is displayed at a hotel’s “front desk” kiosk, and a guest vents frustration about his or her room, Lia will respond with empathy, understanding, and a practical solution, such as a room change. 

Recently Lia was on display at Verizon’s New York 5G Lab as part of an exhibit about AI technology.

Lia’s creator, Soul Machines, is developing digital humans, complete with digital brains, who are portrayed by actual humans. (Lia is played by New Zealand actress Shushila Takao.) Verizon’s 5G Labs showcase innovators like Soul Machines to explore how 5G networks support cutting-edge technology that contributes to the betterment of society.

Having the speed and bandwidth of a 5G connection is critical to ensuring that digital interactions feel humanized. In human-to-human engagement, the brain rapidly identifies and processes data points such as tone and non-verbal cues. In digital-to-human engagement, mimicking human-like interactions requires 5G’s bandwidth and speed.

Soul Machines gives its digital assistants much higher fidelity, making human-machine interaction more natural and empathetic. Digital humans like Lia are a realistic progression from earlier interactive digital characters, like Clippy, the Microsoft Office cartoon assistant that popped up when it sensed users might need help.

Virtual humans have demonstrated a keen ability to create positive social and emotional bonds for mental health patients. Researchers at the University of Southern California’s Institute for Creative Technologies, for example, found that soldiers returning from combat were more likely to open up about their mental health—and potential symptoms of post-traumatic stress disorder—to a virtual human than to a real-life therapist. In the elderly population, those who struggle with depression and social isolation benefit from human-like interactions that give them an uplifting sense of connection.

Emotional AI also stands to revolutionize business. In banking, a virtual teller can respond to word usage and syntax to assess whether a customer is happy or upset. In healthcare, a digital doctor will connect symptoms and causes to help diagnose a patient’s problem. In human resources, a virtual trainer may lead new employees through the onboarding process, “seeing” furrowed brows via analyzed images and reacting appropriately.
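At its simplest, assessing mood from word usage can be done with a keyword lexicon. The sketch below is purely illustrative and is not any vendor’s actual method; real systems use trained models that also weigh syntax, tone and context. The word lists and labels here are invented for the example:

```python
# Illustrative mood check based on word usage alone.
# The lexicons below are invented for this example; production
# systems use trained sentiment models, not hand-built word lists.
POSITIVE = {"thanks", "great", "happy", "perfect", "appreciate"}
NEGATIVE = {"frustrated", "angry", "wrong", "upset", "terrible"}

def assess_mood(utterance: str) -> str:
    """Return 'happy', 'upset' or 'neutral' from keyword counts."""
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "happy"
    if score < 0:
        return "upset"
    return "neutral"

print(assess_mood("I am frustrated, my card was charged twice!"))  # upset
print(assess_mood("Thanks, the new account setup was great!"))     # happy
```

A virtual teller would feed each customer utterance through a classifier like this (in practice a far richer one) and adjust its response, for example escalating to a human agent when the mood turns negative.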

Virtual beings can also teach children with autism how to navigate emotions in themselves and others, says Jared Peters, co-founder of Expressive, a startup that develops emotionally responsive avatars. A former educator, Peters explains that autistic children often learn best with visual support. Expressive is developing a digital teacher who picks up on emotional expression and offers reinforcement and reward.

“The idea is to have these software-stylized human characters act as bridges to social communication between peers,” says Peters. For example, virtual assistants can help anticipate the need for two people to meet and set up an appointment.

Expressive is also looking into creating digital caregivers for Alzheimer’s patients to lighten the burden on human caregivers. Such virtual caregivers might monitor when patients need help from doctors. “The potential to help others is vast,” Peters says.

These digital humans shouldn’t replace human interactions, he adds, but give people a chance to “scratch that social itch” and improve lives, in small and big ways.

For more information, see:

Soul Machines

Expressive

5G Labs

About the author:

Suzanne Guillette has previously written about technology for Verizon and her work has appeared in O Magazine, Quartz and the Rumpus. She lives in New York City.
