Have You Ever Read Your Clients’ Emotions Through the Monitor?

Codezilla
  • Categories Blog
  • Reading Time 4-Minute Read

Check out how some of the big brands are using facial emotion reading APIs!

Have you ever thought you could go beyond your monitor when talking to your audience? How about testing reactions to campaigns or ads? It may sound a bit like science fiction, but it’s easier than you might think.

How Emotion-Reading APIs Change the Game

Once merely a research subject, emotion recognition is now a 20-billion-dollar industry, says The Guardian. It can now stand as an individual tool for AI marketing, with its own benefits, such as…

  • Imagine a world where product testing yields accurate and complete results. Emotion-reading APIs can make this possible. Disney has tested plenty of films this way, including “Star Wars: The Force Awakens” and “Zootopia”, according to the Financial Times. Even Coca-Cola and Intel have tested how audiences respond to ads.
  • Brands can use them to accomplish their goals. Ford, BMW and Kia Motors have said they want to use the technology to assess driver alertness, according to the same source.
  • When you know a customer’s emotions, you get access to unique data that allows you to improve user experience.
  • Go beyond digital and test your outdoor billboards, just like London’s Piccadilly Circus did to analyze reactions to ads.
  • Big campaigns can benefit from tools that send a civic message. The UK’s Lincolnshire Police uses emotion recognition to identify suspicious people.
  • If you own a store, you can determine which focal points attract customers and whether they respond positively to the design.
  • Emotion-reading APIs are easy to integrate into your app and can be useful for security or for gathering CX data.
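To give a sense of how light that integration can be, here is a minimal Python sketch. The response shape below is hypothetical — a stand-in for whichever emotion-reading API you choose; real providers differ in endpoints, authentication and field names, but most return a confidence score per basic emotion for each detected face.

```python
import json

# Hypothetical API response: one score per basic emotion, per detected face.
# Real providers use different field names; treat this as an illustration only.
SAMPLE_RESPONSE = json.dumps({
    "faces": [
        {"emotions": {"joy": 0.81, "surprise": 0.07, "anger": 0.02,
                      "fear": 0.01, "sadness": 0.05, "disgust": 0.04}}
    ]
})

def dominant_emotions(response_text: str) -> list:
    """Return the highest-scoring emotion for each detected face."""
    data = json.loads(response_text)
    return [max(face["emotions"], key=face["emotions"].get)
            for face in data["faces"]]

print(dominant_emotions(SAMPLE_RESPONSE))  # ['joy']
```

In a real integration, `SAMPLE_RESPONSE` would come from an HTTP call to the provider; the parsing step stays this simple.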

Some of the Questions Brands Might Ask

There are six basic emotions that these readers are trained to recognize – joy, surprise, anger, fear, sadness and disgust. We react to all of them, and our faces show each one.

Do people react the same way? A study published on ScienceDirect, which also took a neutral state of mind into consideration, led to surprising results. Emotion recognition reached a spectacular 96% accuracy, while classification accuracy was a satisfactory 73%. The tool was able to recognize the emotions. Yet the researchers admit that classification accuracy may be affected by the fact that people react to a lesser or greater extent when feeling authentic emotions.
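To see what a classification-accuracy figure like 73% means in practice, here is a purely illustrative sketch: predicted emotion labels are compared against the emotions participants actually reported, and accuracy is the share that match. The labels below are invented for the example and are not data from the study.

```python
# Invented labels, for illustration only: what participants reported
# feeling versus what a classifier predicted for the same moments.
reported  = ["joy", "anger", "fear", "joy", "sadness", "surprise", "joy", "disgust"]
predicted = ["joy", "anger", "fear", "joy", "joy",     "surprise", "joy", "sadness"]

# Classification accuracy = fraction of predictions matching the report.
matches = sum(r == p for r, p in zip(reported, predicted))
accuracy = matches / len(reported)
print(f"classification accuracy: {accuracy:.0%}")  # 75%
```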

How can the database be accurate? Let’s take an example. Affectiva, the first business to market “artificial emotional intelligence”, works with 7.5 million faces from 87 countries to support its results.

Can people lie? Since recognition relies on algorithms, people can deliberately “fake” what they are feeling. However, when you are watching a two-hour movie, you might find it difficult to fake your reactions consistently.

Limitations of Facial Emotion Readers

The Microsoft Case

There was a fuss both when Microsoft released its Perceived Emotion Recognition feature in the Face API and when the tech giant announced the tool would no longer be available to the public. The first limitation comes from the brand itself:

“It is important to note, however, that facial expressions alone may not necessarily represent the internal states of people.”

In June 2022, Microsoft Corp. announced it would limit access to some facial recognition services and retire others entirely. It also removed the Face API’s ability to identify “attributes such as gender, age, smile, facial hair, hair and makeup”.

The company also restricted the use of software that mimics voices through AI.

The Google Case

Alphabet Inc. blocked AI features that analyze emotions, in an effort to avoid cultural insensitivity.

Google decided to block 13 planned emotion categories and to review existing ones, such as sorrow and joy. It opted instead to focus on detecting frowning and smiling, without mapping them to emotions.

Individual Studies

Some researchers claim that emotion technology is biased. Why? Lauren Rhue of the University of Maryland found that emotion-analysis tech assigns more negative emotions to Black men’s faces than to white men’s.


Tech and Marketing Remain Friends

Marketers can benefit from facial emotion-reading APIs by integrating them into products and services. However, users must first agree to be analyzed by AI. Still, the technology has a long, bright road ahead, and emotions might just be the next data you trust when analyzing results.