Computer Maps 21 Distinct Emotional Expressions - Even "Happily Disgusted"

Posted on April 4, 2014

Researchers at The Ohio State University have found a way for computers to recognize 21 distinct facial expressions - even expressions for complex or seemingly contradictory emotions such as "happily disgusted" or "sadly angry."

In the current issue of the Proceedings of the National Academy of Sciences, they report more than tripling the number of documented facial expressions that researchers can now use for cognitive analysis.

"We've gone beyond facial expressions for simple emotions like 'happy' or 'sad.' We found a strong consistency in how people move their facial muscles to express 21 categories of emotions," said Aleix Martinez, a cognitive scientist and associate professor of electrical and computer engineering at Ohio State. "That is simply stunning. That tells us that these 21 emotions are expressed in the same way by nearly everyone, at least in our culture."

The resulting computational model will help map emotion in the brain with greater precision than ever before, and perhaps even aid the diagnosis and treatment of mental conditions such as autism and post-traumatic stress disorder (PTSD).

Since at least the time of Aristotle, scholars have tried to understand how and why our faces betray our feelings - from happy to sad, and the whole range of emotions beyond. Today, the question has been taken up by cognitive scientists who want to link facial expressions to emotions in order to track the genes, chemicals, and neural pathways that govern emotion in the brain.

Until now, cognitive scientists have confined their studies to six basic emotions - happy, sad, fearful, angry, surprised and disgusted - mostly because the facial expressions for them were thought to be self-evident, Martinez explained.

But deciphering a person's brain functioning with only six categories is like painting a portrait with only primary colors, Martinez said: it can provide an abstracted image of the person, but not a true-to-life one.

What Martinez and his team have done is more than triple the color palette - with a suite of emotional categories that can be measured by the proposed computational model and applied in rigorous scientific study.
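The study builds on the idea that each emotion category corresponds to a consistent pattern of facial muscle movements, commonly coded as Action Units (AUs). As a rough illustration only (not the authors' actual model), a compound category such as "happily disgusted" can be sketched as combining the muscle movements of its basic components; the specific AU sets below are simplified assumptions, and the matching rule is a hypothetical one chosen for clarity.

```python
# Illustrative sketch: compound emotion categories as combinations of
# facial Action Units (AUs). AU assignments are simplified assumptions,
# not the study's published definitions.

# Prototypical AUs for four basic emotions (simplified subsets)
BASIC_AUS = {
    "happy":     {6, 12},       # cheek raiser, lip corner puller
    "sad":       {1, 4, 15},    # inner brow raiser, brow lowerer, lip corner depressor
    "angry":     {4, 7, 24},    # brow lowerer, lid tightener, lip presser
    "disgusted": {9, 10, 17},   # nose wrinkler, upper lip raiser, chin raiser
}

def compound_aus(a, b):
    """Model a compound emotion as the union of its components' AU sets."""
    return BASIC_AUS[a] | BASIC_AUS[b]

def classify(observed_aus, categories):
    """Pick the category whose AU set best overlaps the observed AUs (Jaccard)."""
    def jaccard(proto):
        return len(observed_aus & proto) / len(observed_aus | proto)
    return max(categories, key=lambda name: jaccard(categories[name]))

# Four basic categories plus two compound ones
categories = dict(BASIC_AUS)
categories["happily disgusted"] = compound_aus("happy", "disgusted")
categories["sadly angry"] = compound_aus("sad", "angry")

observed = {6, 9, 10, 12}  # smile-related AUs together with disgust-related AUs
print(classify(observed, categories))  # -> happily disgusted
```

In this toy setup, a face showing both smile AUs and disgust AUs overlaps the compound "happily disgusted" prototype more than either basic category alone, which mirrors the article's point that compound expressions are distinct, measurable patterns rather than ambiguous blends.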

"In cognitive science, we have this basic assumption that the brain is a computer. So we want to find the algorithm implemented in our brain that allows us to recognize emotion in facial expressions," he said. "In the past, when we were trying to decode that algorithm using only those six basic emotion categories, we were having tremendous difficulty. Hopefully with the addition of more categories, we'll now have a better way of decoding and analyzing the algorithm in the brain."


Source material from Ohio State University