
We Have to Stop Doing AI Emotion Recognition

#artificialintelligence

Emotion recognition is a branch of artificial intelligence that aims to identify emotions from human faces. Over the last decade it has drawn growing interest in both academia and industry, and the market is expected to reach $85 billion by 2025. Its applications are numerous, and most are at the very least ethically questionable: it lets employers evaluate potential employees by scoring them on empathy or emotional intelligence, among other traits, and it helps teachers remotely monitor students' engagement in school or while they do classwork at home.


Don't look now: why you should be worried about machines reading your emotions

The Guardian

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, when it began testing a new surveillance program called Screening of Passengers by Observation Techniques, or Spot for short. While developing the program, the agency consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them onto corresponding emotions. This method was used to train "behavior detection officers" to scan faces for signs of deception.


Machines are getting freakishly good at recognizing human emotions

Digital Trends

#artificialintelligence

Until very recently, we've had to interact with computers on their own terms. To use them, humans had to learn inputs designed to be understood by the computer, whether that meant typing commands or clicking icons with a mouse. The rise of A.I. voice assistants like Siri and Alexa makes it possible for machines to understand humans as they would ordinarily interact in the real world. Now researchers are reaching for the next Holy Grail: computers that can understand emotions. Whether it's Arnold Schwarzenegger's T-800 robot in Terminator 2 or Data, the android character in Star Trek: The Next Generation, the inability of machines to understand and properly respond to human emotions has long been a common sci-fi trope.


Empathy: The Killer App for Artificial Intelligence

#artificialintelligence

After studying an isolated tribe in Papua New Guinea, which was still living in the preliterate state it had been in since the Stone Age, Ekman believed he had found the blueprint for a set of universal human emotions and related expressions that crossed cultures and were present in all humans. A decade later he created the Facial Action Coding System (FACS), a comprehensive tool for objectively measuring facial movement. Ekman's work has been used by the FBI and police departments to identify the seeds of violent behavior in nonverbal expressions of sentiment. He has also developed the online Atlas of Emotions at the behest of the Dalai Lama. And today his research is being used to teach computer systems how to feel.
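To give a sense of how FACS-style output tends to be consumed by software, here is a minimal illustrative sketch in Python: it maps a set of detected Action Unit (AU) codes to one of the basic emotions using commonly cited prototype AU combinations. The AU-to-emotion table and the hypothetical detector output are assumptions for illustration only, not Ekman's own procedure or the API of any particular library.

```python
# Illustrative sketch: mapping FACS Action Units (AUs) to basic emotions.
# The AU combinations below follow commonly cited prototypes (e.g. AU6 + AU12
# for happiness); real systems use far richer rules and learned models.

# Prototype AU combinations for six basic emotions (assumed for illustration).
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},          # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},       # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},    # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},    # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":   {9, 15},          # nose wrinkler + lip corner depressor
    "fear":      {1, 2, 4, 5, 20}, # brow raisers/lowerer + lid raiser + lip stretcher
}

def classify_emotion(detected_aus: set[int]) -> str:
    """Return the emotion whose prototype AUs best overlap the detected AUs."""
    best_label, best_score = "neutral", 0.0
    for label, prototype in EMOTION_PROTOTYPES.items():
        overlap = len(prototype & detected_aus) / len(prototype)
        if overlap > best_score:
            best_label, best_score = label, overlap
    return best_label

if __name__ == "__main__":
    # Hypothetical detector output: a face showing AU6 and AU12.
    print(classify_emotion({6, 12}))   # -> "happiness"
```

The overlap-scoring rule here is deliberately naive; it only illustrates why critics argue that reducing facial movement to a handful of emotion labels is a fragile inference.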


The Ethics of AI and Emotional Intelligence - The Partnership on AI

#artificialintelligence

The experimental use of AI spread across sectors and moved beyond the internet into the physical world. Stores used AI perceptions of shoppers' moods and interest to display personalized public ads. Schools used AI to quantify student joy and engagement in the classroom. Employers used AI to evaluate job applicants' moods and emotional reactions in automated video interviews and to monitor employees' facial expressions in customer service positions. It was a year notable for increasing criticism and governance of AI related to emotion and affect.