Beyond the 6 Core Emotions and Our Expressions

#artificialintelligence

"How are you?" What a simple question with so many complex answers, yet most of us answer with "Good". Culture aside, why do we default to a one-word expression to describe our feelings? We feel a wide range of emotions, but to make use of them we must learn what they are and how to express them accurately. We've already discussed the six universal emotions: joy, surprise, sadness, anger, disgust, and fear (with contempt sometimes included as a seventh), but many other emotions stem from them. Robert Plutchik, a well-known psychologist who studied emotions, created the wheel of emotions.
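One way to picture Plutchik's wheel is as a small taxonomy: eight primary emotions arranged in opposing pairs, each with milder and more intense variants. The sketch below is an illustrative data-structure view of that idea, not something taken from the article itself:

```python
# Illustrative sketch (assumption, not from the article): Plutchik's wheel
# modeled as eight primary emotions, each with a milder and a more intense
# variant, plus the opposing pairs on the wheel.

PLUTCHIK_WHEEL = {
    # primary emotion: (milder form, more intense form)
    "joy":          ("serenity", "ecstasy"),
    "trust":        ("acceptance", "admiration"),
    "fear":         ("apprehension", "terror"),
    "surprise":     ("distraction", "amazement"),
    "sadness":      ("pensiveness", "grief"),
    "disgust":      ("boredom", "loathing"),
    "anger":        ("annoyance", "rage"),
    "anticipation": ("interest", "vigilance"),
}

OPPOSITES = {
    "joy": "sadness",
    "trust": "disgust",
    "fear": "anger",
    "surprise": "anticipation",
}

def variants(primary: str) -> tuple:
    """Return the milder and more intense variants of a primary emotion."""
    return PLUTCHIK_WHEEL[primary]

if __name__ == "__main__":
    mild, intense = variants("fear")
    print(f"fear ranges from {mild} to {intense}")  # apprehension -> terror
```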


Don't look now: why you should be worried about machines reading your emotions

The Guardian

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short. While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train "behavior detection officers" to scan faces for signs of deception.


AI "emotion recognition" can't be trusted

#artificialintelligence

As artificial intelligence is used to make more decisions about our lives, engineers have sought out ways to make it more emotionally intelligent. That means automating some of the emotional tasks that come naturally to humans -- most notably, looking at a person's face and knowing how they feel. To achieve this, tech companies like Microsoft, IBM, and Amazon all sell what they call "emotion recognition" algorithms, which infer how people feel based on facial analysis. For example, if someone has a furrowed brow and pursed lips, it means they're angry. If their eyes are wide, their eyebrows are raised, and their mouth is stretched, it means they're afraid, and so on.
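The mapping the excerpt describes can be pictured as a simple lookup from observed facial features to an emotion label. The sketch below is a deliberately naive, hypothetical illustration of that rule-based idea; it is not how Microsoft, IBM, or Amazon actually implement their commercial systems, which are far more complex and, as the article argues, contested:

```python
# Toy, hypothetical sketch of a feature-to-emotion rule (assumption for
# illustration only; no vendor's real "emotion recognition" algorithm).

from dataclasses import dataclass

@dataclass
class FaceFeatures:
    furrowed_brow: bool = False
    pursed_lips: bool = False
    wide_eyes: bool = False
    raised_eyebrows: bool = False
    stretched_mouth: bool = False

def infer_emotion(face: FaceFeatures) -> str:
    """Map observed facial features to an emotion label using toy rules."""
    if face.furrowed_brow and face.pursed_lips:
        return "anger"
    if face.wide_eyes and face.raised_eyebrows and face.stretched_mouth:
        return "fear"
    return "neutral"  # fall-through when no rule matches

print(infer_emotion(FaceFeatures(furrowed_brow=True, pursed_lips=True)))  # anger
```

The brittleness of rules like these is exactly why researchers caution that such inferences can't be trusted: the same expression can signal very different feelings depending on the person and the context.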


What's the State of Emotional AI?

#artificialintelligence

Is emotional AI ready to be a key component of our cars and other devices? Analysts are predicting huge growth for emotional AI in the coming years, albeit with widely differing estimates. A 2018 study by Market Research Future (MRFR) predicted that the "emotional analytics" market, which includes video, speech, and facial analytics technologies among others, would be worth a whopping $25 billion globally by 2025. Tractica made a more conservative estimate in its own analysis, but still predicted that the "emotion recognition and sentiment analysis" market would reach $3.8 billion by 2025. Researchers at Gartner have predicted that by 2022, 10 percent of all personal electronic devices will have emotion AI capabilities, either on the device itself or via cloud-based services.


Emotional Intelligence Needs a Rewrite - Issue 51: Limits

Nautilus

You've probably met people who are experts at mastering their emotions and understanding the emotions of others. When all hell breaks loose, somehow these individuals remain calm. They know what to say and do when their boss is moody or their lover is upset. It's no wonder that emotional intelligence was heralded as the next big thing in business success, potentially more important than IQ, when Daniel Goleman's bestselling book, Emotional Intelligence, arrived in 1995. After all, whom would you rather work with--someone who can identify and respond to your feelings, or someone who has no clue? Whom would you rather date?