The Case for Emotionally Intelligent AI



When you’re having a really bad day, chances are good your friends are going to notice, even if you don’t say a word. They’ll notice your facial expression … your sagging body language … a lack of energy or joy in your voice. Most of us prefer to be around people who notice these things—people who really understand us. And that’s why emotional intelligence (EQ) has become such an important spoke in the wheel of AI development.

Whether we realize it or not, most of the communication we engage in daily is nonverbal. Subtext, tone, and body language all play a role in communicating what we think and feel. As we enter an age when AI robots are helping us at home, in our cars, on our phones, and in customer service, it’s increasingly important to focus on building AI with emotional intelligence so it can truly help us in the way we need it to. After all, a personal robot assistant who only understands what we say will never notice if we’re tired, cold, hungry, or lonely—unless we state those things out loud.

As noted by Rana el Kaliouby, cofounder and CEO of Affectiva, an MIT spinout that sells emotion recognition technology, “If it’s interfacing with a human, it needs social and emotional skills.” By adding EQ to AI, we can improve how we work—and live—in the new digital landscape. We can also create better customer experiences all around. The following are a few reasons AI companies are increasingly focusing on the development of strong EQ.

Words Aren’t Everything

Best-selling author Gary Marcus has spent a lot of time promoting the “anti-hype” of AI, largely for one reason: “Intelligence is a multi-dimensional variable,” he says. “There are lots of things that go into it.” But right now, AI is focusing on just one—language. And that leaves a lot of room for miscommunication, misunderstanding, and missed opportunity. Armed with better ways of understanding human nuance and physical communication, AI will be better prepared to respond to more complex human needs.

We Can’t Form Relationships Without EQ

I have a friend who started online dating. In her profile, she indicated that her perfect partner has a high EQ. Why was that important to her? For many of us, it’s difficult to form relationships with people who can’t sense or understand our emotions. And since much of today’s consumerism is based on relationships, that makes EQ especially important. Nothing is more frustrating than speaking to someone who has no empathy or feeling. When we’re frustrated with a customer service agent, we want to feel that they understand how upset we are. If they parrot back a line from their customer service textbook, we’ll know it. And we likely won’t buy from them again. High EQ in our AI chatbots and service agents will go a long way toward improving customer satisfaction.

AI Can Respond Faster—and Better—With a Bigger Arsenal of Understanding

The more tools our AI systems have to understand their human counterparts, the better. If they are limited to voice and text recognition, a lot of information can slip through the cracks, delaying a timely solution. By empowering AI to understand changes in pitch, lulls in conversation, and facial expressions, we’re giving it the key to truly understanding human behavior and how to respond to it. That means happier, more satisfied customers.
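To make that concrete, here is a minimal, hypothetical sketch of one common approach—fusing emotion estimates from several modalities into a single read. The per-modality scores are hard-coded stand-ins for the output of real speech, language, and vision models; no specific vendor’s API is assumed.

```python
# A minimal, hypothetical sketch of "late fusion": combining emotion
# estimates from text, voice, and facial cues into one overall read.
# The scores below are placeholders for the output of real models.
from dataclasses import dataclass

@dataclass
class ModalityScore:
    modality: str     # e.g. "text", "voice", "face"
    label: str        # e.g. "neutral", "frustrated", "happy"
    confidence: float

def fuse(scores: list[ModalityScore]) -> str:
    """Pick the emotion label with the highest confidence-weighted total."""
    totals: dict[str, float] = {}
    for s in scores:
        totals[s.label] = totals.get(s.label, 0.0) + s.confidence
    return max(totals, key=totals.get)

# The words alone read as neutral, but rising pitch, long pauses, and a
# tense expression tip the fused estimate toward frustration -- the nuance
# a words-only system would miss.
print(fuse([
    ModalityScore("text", "neutral", 0.55),
    ModalityScore("voice", "frustrated", 0.70),
    ModalityScore("face", "frustrated", 0.60),
]))
```

In practice, each confidence score would come from a trained model, and the fusion step could be anything from a weighted vote like this one to a learned classifier.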

At the moment, messaging apps are better equipped than voice-enabled apps for detecting emotions simply because they have richer cues to work with. But moving forward, especially with the growth of conversational systems, there will likely be an increased emphasis on improving voice-enabled apps as well.

So how close are we to developing robots that truly “get” human beings? The jury is still out. As one writer explained, it’s easier to generate an AI’s human “persona” than it is to create a human “personality.” A persona looks and acts like a human but never fully engages with the subtler aspects of being human. Right now, AI has demonstrated only the most basic forms of EQ. They’re impressive, but there is still a long way to go.

Additional Resources on This Topic:

IoT and AI: Improving Customer Satisfaction
Time for Chatbots to Get Smart

This article was first published on Forbes. 

Daniel Newman is the Principal Analyst of Futurum Research and the CEO of Broadsuite Media Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise. From Big Data to IoT to Cloud Computing, Newman makes the connections between business, people, and tech that are required for companies to benefit most from their technology projects, which leads to his ideas regularly being cited in CIO.com, CIO Review, and hundreds of other sites across the world. A five-time best-selling author, most recently of “Building Dragons: Digital Transformation in the Experience Economy,” Daniel is also a Forbes, Entrepreneur, and Huffington Post contributor. An MBA and graduate adjunct professor, Daniel Newman is a Chicago native, and his speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
