The Definitive Case For Emotion AI
Intelligence Derived from Facial Coding Yields Concrete Outcomes.
The basis for using technology to analyze human emotions from facial cues was inspired by Paul Ekman’s research into facial expressions and emotion in the 1960s and 70s. Interest in, and solutions based on, the scientific link between emotion and facial expressions have continued ever since Ekman’s early published work. Debate and scrutiny are the foundation of scientific inquiry, and scientific conclusions and inferences are built on observation and data. Such is the case with facial coding and Emotion AI technology.
Recently, publications like Nature and The Verge have taken a close look at the merits and misconceptions of facial coding and emotion AI. On one hand, you have influential scientists, including Ekman himself, who have shown that facial expressions can be analyzed to detect concrete aspects of a person’s emotional experience.
On the other hand, you have academics like Northeastern University Professor Lisa Feldman Barrett, who argue that facial expressions are not reliable indicators of a person’s inner emotional state.
Clarity is often hidden amid the noise of misunderstanding, misinformation, and conflation. We should defer to observation, data, and truth.
It is in that spirit that companies like mine, Realeyes, use technology to gather data and observations from the real world at scale and apply validated conclusions to real-world use cases. We use front-facing cameras and panels of adult, opt-in participants to anonymously measure their facial cues as they view video content or interact with applications on their devices.
We’ve built the world’s richest database of facial coding data for detecting attention and key emotions in this specific context: 620 million emotion AI labels to date. Moreover, Realeyes’ technology and algorithms continuously improve in their ability to accurately detect attention and key emotions.
Indeed, there are some critics of emotion AI who say emotion detection technologies lack context. A person who is crying might be happy or sad, depending on the circumstances. Does emotion AI offer a universal, conclusive diagnosis of a single person’s subjective, inner experience, such as in cognitive behavioral therapy? No, of course not.
However, technology makes emotion detection reliable, scalable, and invaluable in many important use cases. In our case, that is understanding human response and consumer experience in the specific context of video content and interactive applications.
These data empower our customers (marketers and publishers) to better understand which elements of their video and interactive content are likely to resonate with their audiences, so they can build better, more engaging experiences and create more smiles. Simply put, it informs decisions and drives better business results and better experiences.
The usefulness of emotion AI is not limited to measuring human response and detecting emotions at face value. Improving detection capabilities enables the data to more accurately predict key behaviors, such as video completion rates or future social sharing of content.
In fact, we can predict the in-market view-through rate of an ad across different websites with 90% accuracy, all from a single pre-launch emotion detection study. This is happening today. Even if attention and emotion data are not interesting to you in themselves, they yield intelligence that is highly predictive of other behavioral outcomes that do matter.
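As a rough illustration of how aggregate emotion features could feed such a predictive model, here is a minimal sketch. The features, synthetic data, and linear model are all assumptions for illustration only; they are not Realeyes’ actual pipeline or accuracy methodology:

```python
# Hypothetical sketch: predicting an ad's view-through rate (VTR) from
# emotion-AI features aggregated per ad. Per-frame attention and happiness
# scores in [0, 1] are assumed; all names and data here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def ad_features(frames):
    """Collapse per-frame (attention, happiness) scores into per-ad features."""
    attention = frames[:, 0]
    happiness = frames[:, 1]
    return np.array([attention.mean(), happiness.max(), 1.0])  # 1.0 = bias term

# Synthetic training set: 50 ads, each with 30 frames of (attention, happiness).
ads = [rng.uniform(0, 1, size=(30, 2)) for _ in range(50)]
X = np.array([ad_features(a) for a in ads])

# Assume (for the sketch) that VTR rises with mean attention and peak happiness.
true_w = np.array([0.5, 0.3, 0.1])
y = X @ true_w + rng.normal(0, 0.005, size=len(ads))  # small measurement noise

# Ordinary least squares fit of the linear model.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict VTR for a new, unseen ad from its pre-launch emotion study alone.
new_ad = rng.uniform(0, 1, size=(30, 2))
predicted_vtr = float(ad_features(new_ad) @ w)
print(round(predicted_vtr, 3))
```

The design point is simply that per-frame emotion signals are compressed into a handful of per-ad features, and an in-market outcome is regressed on those features, so a single pre-launch study can produce a forecast.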
The marketing and media industries, where we focus our applications of emotion detection, are flooded with companies claiming to provide “data-driven insights.” These industries are challenged by an addiction to self-reported survey research that is severely biased toward professional survey responders and toward survey designs that yield the answers researchers want to hear.
Emotion detection yields passive, objective measurements that are immune to these deficiencies: observational, naturally occurring, and very granular. (For instance, try asking a survey responder to identify the exact frame in an ad that made her pay attention or laugh.) The science and data get stronger every day, and a growing body of companies across industries has embraced the practice of informing and optimizing video content with emotion AI and facial coding.
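That frame-level granularity can be sketched in a few lines; the smile scores, frame rate, and variable names below are hypothetical, not any vendor’s actual output format:

```python
# Illustrative only: locating the exact frame where a viewer's smile peaked,
# assuming a facial-coding model emits one smile probability per video frame.
smile_scores = [0.02, 0.05, 0.04, 0.31, 0.78, 0.64, 0.12]  # hypothetical values
fps = 25  # assumed frame rate of the ad

# The frame with the highest smile score, and its timestamp in seconds.
peak_frame = max(range(len(smile_scores)), key=smile_scores.__getitem__)
peak_time = peak_frame / fps
print(peak_frame, round(peak_time, 2))  # prints: 4 0.16
```

No survey responder can report their reaction at this resolution; a passive per-frame signal pinpoints it directly.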
The science delivers value in the form of better business outcomes and better experiences for people, and that’s validated by more than 150 of our own clients, including major brands and media companies like Mars Inc., WarnerMedia, and Teads. And like any science-driven company, we welcome scrutiny and collaboration so we can continuously validate and improve the science, innovate, and raise the bar.