Emotion recognition is a branch of artificial intelligence that aims to identify emotions from human faces. In the last decade, it has seen increased interest in both academia and industry, and the market is expected to grow to $85 billion by 2025. It has several applications, most of them at the very least ethically questionable. It allows employers to evaluate potential employees by scoring them on empathy or emotional intelligence, among other traits. It helps teachers remotely monitor students' engagement in schools or while they do classwork at home.
Human faces pop up on a screen, hundreds of them, one after another. Some have their eyes stretched wide, others show lips clenched. Some have eyes squeezed shut, cheeks lifted and mouths agape. For each one, you must answer this simple question: is this the face of someone having an orgasm or experiencing sudden pain? Psychologist Rachael Jack and her colleagues recruited 80 people to take this test as part of a study in 2018.
Kate Crawford (pictured), a principal researcher at Microsoft and author of Atlas of AI (2021), warns in Nature that the COVID-19 pandemic "is being used as a pretext to push unproven artificial-intelligence tools into workplaces and schools." The software is touted as able to read the "six basic emotions" via analysis of facial expressions: During the pandemic, technology companies have been pitching their emotion-recognition software for monitoring workers and even children remotely. Take, for example, a system named 4 Little Trees. Developed in Hong Kong, the program claims to assess children's emotions while they do classwork. It maps facial features to assign each pupil's emotional state to a category such as happiness, sadness, anger, disgust, surprise or fear. It also gauges 'motivation' and forecasts grades.
Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis the US Transportation Security Administration (TSA) put to the test in 2003, when it began trialing a new surveillance program called Screening of Passengers by Observation Techniques, or Spot for short. While developing the program, the agency consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them onto corresponding emotions. This method was used to train "behavior detection officers" to scan faces for signs of deception.
As artificial intelligence is used to make more decisions about our lives, engineers have sought out ways to make it more emotionally intelligent. That means automating some of the emotional tasks that come naturally to humans -- most notably, looking at a person's face and knowing how they feel. To achieve this, tech companies like Microsoft, IBM, and Amazon all sell what they call "emotion recognition" algorithms, which infer how people feel based on facial analysis. For example, if someone has a furrowed brow and pursed lips, it means they're angry. If their eyes are wide, their eyebrows are raised, and their mouth is stretched, it means they're afraid, and so on.
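The feature-to-emotion mapping described above can be sketched as a toy rule-based classifier. To be clear, the feature names, templates, and scoring below are illustrative assumptions for this article, not any vendor's actual model (commercial systems use trained neural networks, though critics argue they encode similarly rigid mappings):

```python
# Toy sketch of the mapping the article describes: observed facial
# features are matched against hand-written emotion templates.
# All names and templates here are illustrative assumptions.

EMOTION_TEMPLATES = {
    "anger": {"furrowed_brow", "pursed_lips"},
    "fear": {"wide_eyes", "raised_eyebrows", "stretched_mouth"},
    "happiness": {"lifted_cheeks", "upturned_mouth"},
}

def infer_emotion(observed_features: set) -> str:
    """Return the emotion whose template best overlaps the observed features."""
    scores = {
        emotion: len(template & observed_features) / len(template)
        for emotion, template in EMOTION_TEMPLATES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(infer_emotion({"furrowed_brow", "pursed_lips"}))                     # anger
print(infer_emotion({"wide_eyes", "raised_eyebrows", "stretched_mouth"}))  # fear
print(infer_emotion({"neutral_gaze"}))                                     # neutral
```

The brittleness of such a lookup, where the same furrowed brow might signal anger, concentration, or bright sunlight, is precisely what critics of emotion recognition point to.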