What a simple question with so many complex answers, yet most of us answer with "Good." Beyond cultural habit, why do we default to a one-word expression to describe our feelings? We feel a range of emotions, but to name them we must first learn what they are and how to express them accurately. We've already discussed the six universal emotions: joy, surprise, sadness, anger, disgust, and fear (with contempt sometimes counted as a seventh), but many other emotions stem from them. Robert Plutchik, a well-known psychologist who studied emotions, mapped these relationships in his wheel of emotions.
Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called Screening of Passengers by Observation Techniques, or Spot for short. While developing the program, the TSA consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them onto corresponding emotions. This method was used to train "behavior detection officers" to scan faces for signs of deception.
As artificial intelligence is used to make more decisions about our lives, engineers have sought out ways to make it more emotionally intelligent. That means automating some of the emotional tasks that come naturally to humans -- most notably, looking at a person's face and knowing how they feel. To achieve this, tech companies like Microsoft, IBM, and Amazon all sell what they call "emotion recognition" algorithms, which infer how people feel based on facial analysis. For example, if someone has a furrowed brow and pursed lips, it means they're angry. If their eyes are wide, their eyebrows are raised, and their mouth is stretched, it means they're afraid, and so on.
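The cue-to-emotion mapping described above can be sketched as a simple rule-based classifier. This is an illustrative toy, not how the commercial systems from Microsoft, IBM, or Amazon actually work (those use trained models over facial landmarks); the feature names and rules here are invented for the sake of the example.

```python
def infer_emotion(features: dict) -> str:
    """Map a set of detected facial cues (hypothetical boolean flags)
    to a basic emotion label, following the rules named in the text."""
    # Furrowed brow plus pursed lips -> anger
    if features.get("brow_furrowed") and features.get("lips_pursed"):
        return "anger"
    # Wide eyes, raised eyebrows, stretched mouth -> fear
    if (features.get("eyes_widened")
            and features.get("brows_raised")
            and features.get("mouth_stretched")):
        return "fear"
    # Fall back when no rule matches
    return "neutral"

print(infer_emotion({"brow_furrowed": True, "lips_pursed": True}))  # anger
```

The brittleness of this kind of one-to-one mapping is exactly what critics such as Lisa Feldman Barrett (discussed below) take issue with: the same configuration of facial cues can accompany very different internal states.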
Is emotional AI ready to be a key component of our cars and other devices? Analysts are predicting huge growth for emotional AI in the coming years, albeit with widely differing estimates. A 2018 study by Market Research Future (MRFR) predicted that the "emotional analytics" market, which includes video, speech, and facial analytics technologies among others, will be worth a whopping $25 billion globally by 2025. Tractica made a more conservative estimate in its own analysis, predicting that the "emotion recognition and sentiment analysis" market will reach $3.8 billion by 2025. Researchers at Gartner have predicted that by 2022, 10 percent of all personal electronic devices will have emotion AI capabilities, either on the device itself or via cloud-based services.
For the past 30 years, neuroscientist Lisa Barrett has been wondering what we've gotten so wrong about emotion and the brain. Sesame Street shows its itty-bitty viewers what a picture-perfect sad face looks like, and how we ought to sound when we get mad. NBA teams trying to draft the next LeBron draw upon their own set of facial cues, hoping to assess a player's character or "team chemistry." Barrett has always been bugged by this idea that our emotions should look or feel a certain way. As an eager graduate student in psychology, she set out to investigate, hoping the scientific method would point her in the right direction.