Understanding the Hidden Bias in Emotion-Reading AIs

#artificialintelligence

Facial recognition technology has progressed to the point where it can now interpret emotions from facial expressions. This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to help with hiring decisions, while other programs scan faces in crowds to identify threats to public safety. Unfortunately, this technology struggles to interpret the emotions of black faces.
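The article does not describe how such a disparity would be quantified; one simple, commonly used audit is to compare the average negative-emotion scores a classifier assigns to matched photo sets from different groups. The sketch below assumes per-face scores (e.g. an `anger` probability) are already available in a table; the column names and numbers are hypothetical, for illustration only.

```python
import pandas as pd

# Hypothetical per-face scores from an emotion classifier; in a real audit these
# would come from running a commercial API over matched, similarly posed portraits.
scores = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B"],
    "anger":    [0.04, 0.06, 0.05, 0.12, 0.15, 0.10],
    "contempt": [0.02, 0.03, 0.02, 0.08, 0.07, 0.09],
})

# Mean negative-emotion score per group, and the gap between groups.
by_group = scores.groupby("group")[["anger", "contempt"]].mean()
print(by_group)
print("Anger gap (B - A):", by_group.loc["B", "anger"] - by_group.loc["A", "anger"])
```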


Your dog will trust you less when you're angry: Canines lose confidence in people who show a negative attitude

Daily Mail - Science & tech

When humans have a bad attitude, their canine companions might not be so quick to follow their lead. In a recent study, researchers conducted a series of experiments to see how human emotions affect a dog's response to pointing cues, observing dogs as an unfamiliar adult gestured toward one of two covered bowls. Adding a positive expression to the gesture did not increase a dog's ability to locate a treat, but the dogs hesitated before exploring when responding to a person with a negative disposition.


Exploiting Emotion on Reviews for Recommender Systems

AAAI Conferences

Review history is widely used by recommender systems to infer users' preferences and surface potential interests from huge volumes of data, but its sparsity raises serious concerns about data sparsity and cold-start problems. Psychology and sociology research has shown that emotion is a strong indicator of users' preferences. Meanwhile, with the fast development of online services, users are willing to express emotion in reaction to others' reviews, which makes emotion information pervasively available. Moreover, recent research shows that the number of emotion reactions on reviews is typically much larger than the number of reviews themselves. Incorporating emotion on reviews may therefore help alleviate the data sparsity and cold-start problems for recommender systems. In this paper, we provide a principled and mathematical way to exploit both positive and negative emotion on reviews, and propose MIRROR, a novel framework exploiting eMotIon on Reviews for RecOmmendeR systems from both global and local perspectives. Empirical results on real-world datasets demonstrate the effectiveness of our proposed framework, and further experiments are conducted to understand how emotion on reviews contributes to it.
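The abstract does not reproduce MIRROR's objective, so the snippet below is only a rough sketch of the general idea under stated assumptions: a plain matrix-factorization recommender in which the volume of positive and negative emotion attached to a review scales the confidence of that observation and nudges the target rating. The toy data, variable names, and weighting scheme are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Hypothetical toy data: (user, item, rating, pos_emotions, neg_emotions).
# The emotion counts stand in for reactions other users left on the review.
interactions = [
    (0, 0, 5.0, 12, 1),
    (0, 2, 2.0, 0, 7),
    (1, 1, 4.0, 3, 0),
    (2, 0, 1.0, 1, 9),
    (2, 2, 4.5, 6, 2),
]

n_users, n_items, k = 3, 3, 4
rng = np.random.default_rng(0)
P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors

alpha, lr, reg = 0.1, 0.01, 0.02

for epoch in range(200):
    for u, i, r, pos, neg in interactions:
        # Confidence grows with how much emotion a review attracted,
        # so emotionally "loud" reviews pull the factors harder.
        conf = 1.0 + alpha * np.log1p(pos + neg)
        # Net sentiment nudges the target rating up or down slightly.
        target = r + 0.5 * np.tanh(alpha * (pos - neg))
        pu, qi = P[u].copy(), Q[i].copy()
        err = target - pu @ qi
        P[u] += lr * (conf * err * qi - reg * pu)
        Q[i] += lr * (conf * err * pu - reg * qi)

print("Predicted score for user 1, item 0:", P[1] @ Q[0])
```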


Researchers improve AI emotion classification by combining speech and facial expression data

#artificialintelligence

Systems that can classify a person's emotion from their voice and facial tics alone are a longstanding goal of some AI researchers. Firms like Affectiva, which recently launched a product that scans drivers' faces and voices to monitor their mood, are moving the needle in the right direction. But considerable challenges remain, owing to nuances in speech and muscle movements. Researchers at the University of Science and Technology of China in Hefei claim to have made progress, though. In a paper published on the preprint server Arxiv.org this week ("Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition"), they describe an AI system that can recognize a person's emotional state with state-of-the-art accuracy on a popular benchmark.
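The teaser does not detail the paper's architecture; the sketch below only illustrates the core fusion step named in the title, factorized (low-rank) bilinear pooling of an audio and a video feature vector, in PyTorch. The attention-guided component is omitted, and the layer sizes, emotion-class count, and input encoders are assumptions for illustration, not the authors' configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedBilinearPooling(nn.Module):
    """Low-rank bilinear fusion of two modality vectors (minimal sketch)."""
    def __init__(self, audio_dim, video_dim, out_dim, rank=4):
        super().__init__()
        # Project each modality to out_dim * rank, multiply element-wise,
        # then sum-pool over the rank dimension to get an out_dim fusion vector.
        self.audio_proj = nn.Linear(audio_dim, out_dim * rank)
        self.video_proj = nn.Linear(video_dim, out_dim * rank)
        self.out_dim, self.rank = out_dim, rank

    def forward(self, a, v):
        joint = self.audio_proj(a) * self.video_proj(v)           # (B, out_dim*rank)
        joint = joint.view(-1, self.out_dim, self.rank).sum(-1)   # sum-pool over rank
        # Power normalization + L2 normalization, common for bilinear pooling.
        joint = torch.sign(joint) * torch.sqrt(torch.abs(joint) + 1e-8)
        return F.normalize(joint, dim=-1)

# Hypothetical feature sizes; a real system would use learned audio/video encoders.
fusion = FactorizedBilinearPooling(audio_dim=128, video_dim=256, out_dim=64)
classifier = nn.Linear(64, 7)  # e.g. seven emotion categories (an assumption)

audio_feat = torch.randn(8, 128)   # batch of pooled audio embeddings
video_feat = torch.randn(8, 256)   # batch of pooled facial-frame embeddings
logits = classifier(fusion(audio_feat, video_feat))
print(logits.shape)  # torch.Size([8, 7])
```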


Is facial recognition tech RACIST? Expert says AI assigns more negative emotions to black men's faces

Daily Mail

Facial recognition technology has progressed to the point where it can now interpret emotions from facial expressions. This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to help with hiring decisions, while other programs scan faces in crowds to identify threats to public safety. Unfortunately, this technology struggles to interpret the emotions of black faces.