How Far Away Are Empathetic Machines?

Posted November 18, 2020

Artificial empathy may prove to be the most important advance in AI systems in the coming decade. The ability to understand and act on emotional context could transform the automated experience of many consumers from frustrated tolerance to delight.

What Is Empathy?

Empathy is a part of emotional intelligence. It is the ability to understand or feel what another person is experiencing within their frame of reference.

Empathy can be broken down into two main parts:

  • Cognitive empathy – understanding the other person’s frame of reference.
  • Affective empathy – the ability to respond with appropriate emotion.

When we talk about empathy, we often talk about it in terms of how we feel, how we react, and how we internalise the emotional stimuli that we receive. This is different to sentiment analysis, which aims to detect positive, neutral or negative feelings. Empathy looks to recognise a much broader spectrum of feelings, typically including anger, disgust, fear, happiness, sadness and surprise.
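To make the distinction concrete, here is a deliberately simplified sketch contrasting the coarse polarity labels of sentiment analysis with the broader emotion categories listed above. The keyword lexicons and function names are invented for illustration; real systems use trained models, not word lookups:

```python
# Toy lexicons for illustration only; production systems learn these
# associations from labelled data rather than hand-written word lists.
SENTIMENT_LEXICON = {"great": "positive", "terrible": "negative", "furious": "negative"}
EMOTION_LEXICON = {
    "furious": "anger", "disgusting": "disgust", "scared": "fear",
    "delighted": "happiness", "miserable": "sadness", "astonished": "surprise",
}

def classify_sentiment(text: str) -> str:
    """Coarse polarity: positive, negative or neutral."""
    for word in text.lower().split():
        if word in SENTIMENT_LEXICON:
            return SENTIMENT_LEXICON[word]
    return "neutral"

def classify_emotion(text: str) -> str:
    """Broader label set covering the six basic emotions."""
    for word in text.lower().split():
        if word in EMOTION_LEXICON:
            return EMOTION_LEXICON[word]
    return "neutral"

utterance = "I am furious about this charge"
print(classify_sentiment(utterance))  # negative
print(classify_emotion(utterance))    # anger
```

The same utterance yields only "negative" from the sentiment view, but "anger" from the emotion view: a signal specific enough to drive a different response.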

Why Is Empathy Such a Crucial Factor in AI’s Evolution?

Imagine two restaurants that serve the same food at the same price, but restaurant one has great waiting staff. Which one would you choose to return to? And where would you leave the largest tip? Our experience is materially influenced by how things are presented to us and how well people, or machines, react to our context.

A great waiter or waitress knows when you want help with the menu and when you want to be left alone. Humans are complicated and their context changes frequently – in a world where more and more interactions are automated, emotions play a vital role and go beyond a static customer profile. This is where machines with empathy can provide a material improvement in outcome and differentiate a product.

Numerous solutions already promise empathetic capabilities, leveraging context, NLP and deep learning neural networks to garner useful information and feed empathy engines within AI systems.

As the frequency of interaction with, and dependency on, machines increases (whether chatbots or intelligent call agents), we miss human contact and “the machine” misses out on vital information that would require EQ as opposed to IQ. Numerous examples demonstrate that this lack of empathy, and therefore of understanding, is the biggest frustration with human-to-machine interaction. This, in turn, leads to dissatisfied customers, terminated transactions and, ultimately, churn.

As AI continues to improve its efficacy in extracting meaning from text and speech, we are also seeing leaps in AI’s ability to demonstrate empathy and, through that additional layer of understanding, to make better, more accurate decisions about how to interact with us.

Let’s take the example of a customer service chatbot that can speak to customers as a person would. This empathetic chatbot will be able to ask the scripted questions with appropriate inflection and tone and, based on the customer’s responses, will be able to interpret complex emotions such as frustration or anger to better respond to the customer’s needs.

By understanding the emotional context, the chatbot might decide to forward the customer on to a live service agent or jump the customer to the head of the queue.

When an empathetic chatbot detects happiness and satisfaction from the customer, it has meaningful evidence of satisfaction that goes significantly beyond the usual NPS score, and might better target additional sales or offers.
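The escalation and targeting decisions described above could be sketched as simple routing logic. The emotion scores, thresholds and action names below are hypothetical, standing in for the output of whichever emotion-detection engine a real deployment uses:

```python
def route_customer(emotion_scores: dict) -> str:
    """Decide the next step from per-emotion confidence scores (0.0-1.0).

    In a real system these scores would come from an emotion-detection
    model; here they are supplied directly for illustration.
    """
    if emotion_scores.get("anger", 0.0) > 0.7:
        return "escalate_to_live_agent"   # strong frustration: hand over
    if emotion_scores.get("sadness", 0.0) > 0.6 or emotion_scores.get("fear", 0.0) > 0.6:
        return "priority_queue"           # distressed: jump the queue
    if emotion_scores.get("happiness", 0.0) > 0.7:
        return "offer_upsell"             # satisfied: safe to make an offer
    return "continue_script"              # no strong signal: carry on

print(route_customer({"anger": 0.85}))                   # escalate_to_live_agent
print(route_customer({"happiness": 0.9, "anger": 0.1}))  # offer_upsell
```

In practice the thresholds themselves would be tuned against business outcomes such as retention and conversion, rather than fixed by hand.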

What Are the Challenges?

Providing a machine with effective empathy is not without its challenges. Labelling is a significant one: crowdsourcing services like Mechanical Turk are unlikely to meet compliance hurdles, and established datasets are relatively few and limited (ISEAR, one of the largest, contains around 7,600 sentences covering seven categories of emotion).

Another significant concern is the risk of racial bias. A study reported in Harvard Business Review showed that emotion-analysis technology assigns more negative emotions to people of certain ethnicities than to others.

In spite of these ethical concerns, the case for machines that benefit from empathy leans towards the positive: put simply, empathy in machines will lead to better, more tailored outcomes.

What Does the Market Say?

A recent study that Forestreet delivered for a large UK bank highlighted the growth in solutions offering features that are driven by an understanding of empathy. Companies like IBM have made great progress with their Tone Analyzer which is being integrated into numerous call centres and chatbots.

NICE apply behavioural and emotional analysis to route calls within their inContact product. Behavioural Signals centre their solutions around emotional interpretation driven by advanced AI and modelling the intentions of callers from the emotions identified.

Investment in this area is also growing, with both traditional investment houses and corporate VCs backing the technology. Funding has increased noticeably in the last five years as the field has gained traction and broken out of the realms of academia.


Academics broadly agree that the majority of indicators relating to emotion are non-verbal, which provides challenges for non-video based sources.

However, existing solutions have proven that leveraging tone and intonation offer additional richness when trying to understand and interact with customers.

With investment in this area growing and use cases ranging from banking to medical care already apparent, it is likely that empathetic machines will play a large role in our future. As with the IVR, machine empathy is almost certain to be adopted across a broad range of industries with varying degrees of success.

The time may not be far off when we see jobs, similar to the coaches who train soft skills in contact centre staff, where the main work is teaching empathy to machines.

See Forestreet in action.