
Emotion AI: How Technology Takes A Human Face

#Machine Learning

In our daily interactions, we use countless nonverbal cues, such as facial expressions, intonation, gestures, and posture, to communicate our emotions and feelings. Effective communication depends on reading and decoding all these hidden signals. That has always been an easy task for humans, but can intelligent technology do the same?

What is Emotional AI?

Artificial intelligence has already penetrated many areas of our daily lives and business activities, and it is now approaching the sphere of human emotions and feelings.

Artificial intelligence (AI) is a field of computer science that focuses on building machines that can replicate human behavior. AI has been one of the top tech trends of the last decade, and it is not going to lose ground in the technology market in the coming years. Studies have shown that 41% of consumers believe that AI will improve their lives in some way, and 77% of the devices we use already have AI features in one form or another.

Emotion AI, or emotional artificial intelligence, is a subset of AI that enables computer systems and algorithms to recognize and interpret human emotions by tracking facial expressions, body language, or speech. Emotion AI can be thought of as a tool for more natural interaction between machines and humans: it can analyze subtle cues in a person's facial micro-expressions, voice patterns, and gesticulation, and respond to them in a human-like way.

The global emotion detection and recognition market is expected to grow from USD 21.6 billion in 2019 to USD 56.0 billion by 2024. By software tool, facial expression recognition technology is projected to grow at the highest rate during the forecast period.

Global Emotion Detection And Recognition Market

Facial Emotion Recognition Software

Developments in computer vision and machine learning have made facial emotion recognition much more accurate and accessible to a general audience. Facial expression recognition, or face computing, is a sub-field of image processing. In the most common setup, it allows companies to detect the emotions of people passing by their cameras. It may be used for marketing purposes, in healthcare, in robotics, and in any other field that requires an in-depth understanding of the human emotional response to certain activities. Facial recognition technology can also be applied in many security scenarios, including access control, authentication, and payment verification, as well as in interviews or interrogations.

How does it work?

Emotion AI for face detection measures facial expressions using any optical sensor, such as a standard web or smartphone camera, detecting a human face in real time, in a pre-recorded video, or in images. Computer vision algorithms identify the key points of an individual's face: the eyes, the tip of the nose, the eyebrows, the corners of the mouth, etc., and track their movement to decode emotions. By comparing the gathered data to a vast library of template images, facial expression detection software can determine a person's feelings from the combination of facial expressions. Advanced emotion AI solutions like those provided by Affectiva or Kairos can measure the following emotion metrics: joy, sadness, anger, contempt, disgust, fear, and surprise. Additional software features may include facial identification and verification, age and gender detection, ethnicity detection, multi-face detection, and much more.
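The key-point idea above can be illustrated with a toy sketch. Real systems use computer vision models to locate dozens of facial landmarks and deep networks to classify them; here the landmarks are hard-coded (x, y) pairs and the rules are deliberately simplistic, purely to show how geometric cues map to emotion labels:

```python
# Toy sketch of landmark-based expression decoding. Landmark names and
# thresholds are invented for illustration; y grows downward, as in images.

def classify_expression(landmarks):
    """Map a few geometric cues from facial key points to a coarse label."""
    mouth_center_y = landmarks["mouth_center"][1]
    # Mouth corners sitting above the mouth center suggest a smile.
    corner_lift = mouth_center_y - (
        landmarks["mouth_left"][1] + landmarks["mouth_right"][1]
    ) / 2
    # Eyebrows raised far above the eyes suggest surprise.
    brow_raise = landmarks["eye_left"][1] - landmarks["brow_left"][1]

    if brow_raise > 15:
        return "surprise"
    if corner_lift > 5:
        return "joy"
    if corner_lift < -5:
        return "sadness"
    return "neutral"

smiling = {
    "brow_left": (30, 40), "eye_left": (32, 50),
    "mouth_left": (28, 82), "mouth_right": (52, 82),
    "mouth_center": (40, 90),
}
print(classify_expression(smiling))  # -> joy
```

Production systems replace these hand-written rules with classifiers trained on large libraries of labeled face images, but the input is the same: relative positions and movements of facial key points.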

Voice Emotion Recognition Software

Recognizing emotion from speech has become the next stage of natural language processing, adding new value to human-computer interaction. Voice emotion recognition software processes audio files containing human speech and analyzes not what is said, but how it is said: it extracts paralinguistic features and observes changes in tone, loudness, tempo, and voice quality, interpreting these as human emotions and also distinguishing gender, age, etc. Voice analysis and emotion detection are already used by major brands in many industries, including market research, call centers, social robotics, healthcare, and many more.

How does it work?

Voice emotion recognition software works similarly to facial emotion recognition. The underlying technology uses machine learning algorithms (deep learning with Python, convolutional neural networks in Keras/TensorFlow, and other deep learning approaches) to recognize emotional states from acoustic speech and to measure, with high accuracy, whether the speaker is happy, sad, surprised, angry, or in a neutral state of mind. Nemesysco developed a technology named Layered Voice Analysis (LVA) to detect stress and deception in speech by leveraging more than 150 uncontrollable vocal biomarkers, tracing the genuine emotion of the speaker regardless of their language and tone of voice. The insights this technology provides are invaluable for customer experience management, forensic science, security and fraud protection in banking and insurance, and many other industries.
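To make "paralinguistic features" concrete, here is a minimal sketch of two classic low-level descriptors: frame energy (a loudness cue) and zero-crossing rate (a rough pitch/noisiness proxy). A synthetic sine tone stands in for recorded speech; real systems compute many such features, or full spectrograms, and feed them into deep networks:

```python
import math

def rms_energy(samples):
    """Root-mean-square loudness of a frame of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of consecutive sample pairs that change sign."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(samples) - 1)

# One second of a 440 Hz sine wave at 16 kHz stands in for a speech frame.
rate, freq = 16000, 440.0
tone = [math.sin(2 * math.pi * freq * n / rate) for n in range(rate)]

print(round(rms_energy(tone), 3))          # ~0.707 for a full-scale sine
print(round(zero_crossing_rate(tone), 4))  # ~2 * 440 / 16000 = 0.055
```

An angry utterance, for instance, tends to show higher energy and faster tempo than a sad one; a trained classifier learns such correlations from labeled recordings.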

Multimodal Emotion Recognition: How Far Can AI Go?

According to the 7-38-55 rule of personal communication, words account for only 7% of how we perceive another person's affective state, while tone of voice accounts for 38% and body language for 55%.

Logically, emotionally intelligent machines will need to capture all verbal and nonverbal cues, using face or voice or both, to estimate the emotional state of a person precisely.
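One simple way to combine modalities is late fusion: each modality produces its own emotion scores, which are then averaged with per-modality weights. The sketch below uses the 7-38-55 split as the weights; the scores themselves are hypothetical, and a real system would obtain them from separate text, audio, and vision models:

```python
# Minimal late-fusion sketch with the 7-38-55 weights. Modality names,
# scores, and weights here are illustrative, not from any real system.

WEIGHTS = {"words": 0.07, "voice": 0.38, "face": 0.55}

def fuse(scores_by_modality):
    """Weighted average of emotion probability dicts, one per modality."""
    emotions = next(iter(scores_by_modality.values())).keys()
    return {
        e: sum(WEIGHTS[m] * s[e] for m, s in scores_by_modality.items())
        for e in emotions
    }

scores = {
    "words": {"joy": 0.2, "anger": 0.8},   # transcript sounds harsh...
    "voice": {"joy": 0.7, "anger": 0.3},   # ...but the tone is light...
    "face":  {"joy": 0.9, "anger": 0.1},   # ...and the face is smiling.
}
fused = fuse(scores)
print(max(fused, key=fused.get))  # -> joy
```

Note how the nonverbal channels outvote the words, which is exactly the behavior the 7-38-55 rule predicts for sarcastic or ambiguous speech.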

Most emotion AI developers agree that the main goal of multimodal emotion recognition is to make human-machine communication more natural. However, there is a lot of controversy around this topic. Do we really want our emotions to be machine-readable? Let's leave this question to data ethics, though. Today we shall focus on some positive examples of emotion AI applications.

Uses For Emotion AI Technology 


Emotional support. Nurse bots can remind older patients to take their medication and 'talk' with them every day to monitor their overall wellbeing. 

Mental health treatment. Emotion AI-powered chatbots can imitate a therapist or a counselor and help automate talk therapy and make it more accessible. There are also mood tracking apps like Woebot that help people manage mental health through short daily chat conversations, mood tracking, games, curated videos, etc. Another example of AI-powered technology for mental health is a wearable olfactory display developed by MIT Media Lab. It can track the wearer's cardio-respiratory information and release different scent combinations when needed to treat certain psychological problems, such as stress or anxiety.

AI as a medical assistant. Emotion AI can assist doctors with diagnosis and intervention and help provide better care. Applications like Affectiva allow doctors to measure a patient's heart rate without a wearable sensor: the program tracks color changes in the person's face with every heartbeat.

Emotionally Responsive Virtual Assistants

Fully virtual digital humans are designed not simply to answer questions, like Siri or Alexa, but to look and act like humans: to show emotions, have unique personalities, learn, and hold real conversations.



Marketing

Understanding consumers' emotional responses to brand content is crucial for reaching marketing goals.

Advertising research. Emotion is the core of effective advertising: the shift from negative to positive emotions can ultimately increase sales. Emotional AI-powered solutions like Affdex by Affectiva allow marketers to remotely measure consumer emotional responses to ads, videos, and TV shows and better evaluate their relevance. 

Personalization. A better understanding of human emotional responses to marketing campaigns enables brands to deliver the right content through the right channel at the right time.

Public Service

Surveillance. Cameras in public places can detect people's facial expressions and gauge the general mood of the population. China, the world's largest surveillance market, is attempting to predict crimes by using AI to monitor the emotional state of its citizens.

Insurance. Emotion AI technologies allow companies to conduct risk assessments and detect fraud in insurance claims in real time, using both voice and facial recognition.

Banks and financial institutions. Credit risk assessment, fraud intention detection, immediate fact verification, risk scoring. Emotion AI can also be used to offer personalized payment experiences, set up biometric face recognition ATMs, etc.

Law enforcement. Emotion detection technology enables real-time analysis of suspects' reactions during interrogations, as well as analysis of audio and video recordings. Additionally, these techniques can be used during recruitment for sensitive job roles.

Humanoid Robots

The most prominent example of how advanced AI can be is the 'citizen robot' Sophia by Hanson Robotics. Human-like robots can be engaged in customer service: receptionist robots welcome guests at one of the hotels in Tokyo. Robotic avatars, like the T-HR3 by Toyota, can mimic the movements of their human operators. Educational robots such as SoftBank Robotics' Pepper and NAO can read emotions, allow customized teaching activities, and serve as an effective solution for inclusive education. And there are plenty of other examples.

AI-Based Recognition Services by Evergreen

Evergreen has experience implementing AI-based recognition services in several clients' projects, and we know how to use cutting-edge technology to bring innovation to your business.

Our team has introduced a facial recognition system for Kreditmarket, one of the leaders in the consumer credit market of Ukraine. The solution we created detects a person's presence in a photo and compares the detected face with pictures already stored in the system. It helps streamline the loan granting process and reduce manual labor, as well as identify potential fraudsters and assess the related risks.
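The compare-against-stored-pictures step is typically done by reducing each face image to an embedding vector and comparing vectors by cosine similarity. The sketch below is hypothetical (the actual Kreditmarket implementation is not described here): the tiny vectors and client names are made up, and a real system would produce embeddings with a trained face recognition model:

```python
import math

# Hypothetical face-matching sketch: embeddings and gallery are invented.

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query, gallery, threshold=0.9):
    """Return the stored identity most similar to `query`, or None."""
    name, score = max(
        ((n, cosine_similarity(query, emb)) for n, emb in gallery.items()),
        key=lambda item: item[1],
    )
    return name if score >= threshold else None

gallery = {
    "client_42": [0.9, 0.1, 0.3],
    "client_77": [0.1, 0.8, 0.5],
}
print(best_match([0.88, 0.12, 0.31], gallery))  # -> client_42
```

The threshold controls the trade-off between false accepts (fraud risk) and false rejects (customer friction), which is exactly the tuning such a system needs in a lending context.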

Another example of Evergreen's AI solutions is an API service for passport recognition based on neural networks. We have created a fully secure and reliable electronic pass recognition system that can read the required fields from a photo, then analyze and convert the data into text at a 96% accuracy level or above. To learn more about the capabilities of our service and its use cases, please contact us.

Neural network training allowed us to create an object recognition system for one of our clients' projects. Our solution detects an object in a photo in a matter of seconds, even under conditions such as poor lighting, an awkward shooting angle, or a cluttered background.

It is impossible to describe all the capabilities of artificial intelligence and emotion AI in one article. Despite the controversy, it is a promising technology that will reshape our lives in the near future, in ways that will definitely surprise us.

Are you interested in developing a project using artificial intelligence to deliver innovation to your business, improve your customer experience management, or optimize your business processes? Would you like to create a product that requires the use of recognition technologies? Send us a message or fill in the form, and our specialists will be happy to support your initiatives.

The images used in this article are taken from open sources and are used as illustrations.