Facial recognition technology has progressed to the point where it can now interpret emotions from facial expressions. This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to help with hiring decisions, while other programs scan faces in crowds to identify threats to public safety. Unfortunately, this technology struggles to interpret the emotions of Black faces.
Sentiment analysis is already widely used by companies to gauge consumer mood toward a product or brand in the digital world. In the offline world, however, users also interact with brands and products in retail stores, showrooms, and similar settings, and automatically measuring their reactions there has remained a challenging task. Emotion detection from facial expressions using AI can be a viable way to measure consumers' engagement with content and brands automatically. In this post, we will discuss how such technology can be used to solve a variety of real-world use cases effectively. Car manufacturers around the world, for instance, are increasingly focused on making cars more personal and safer to drive.
Amazon announced a breakthrough from its AI experts on Monday: their algorithms can now read fear on your face, at a cost of $0.001 per image, or less if you process more than 1 million images. The news sparked interest because Amazon is at the center of a political tussle over the accuracy and regulation of facial recognition. Amazon sells a facial-recognition service, part of a suite of image-analysis features called Rekognition, to customers that include police departments. Another Rekognition service tries to discern the gender of faces in photos. The company said Monday that the gender feature had been improved, apparently in response to research showing it was much less accurate for people with darker skin.
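To make the output of such a service concrete, here is a minimal sketch of picking the dominant emotion out of a face-analysis response. The dictionary below is hand-made for illustration and merely shaped like Rekognition's detect_faces output (a list of emotion labels with confidence scores); the confidence numbers are invented, not real API output.

```python
def dominant_emotion(face_detail):
    """Return the (type, confidence) pair with the highest confidence score."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hand-made example response, shaped like a detect_faces result.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 22.5},
                {"Type": "FEAR", "Confidence": 61.3},
                {"Type": "SURPRISED", "Confidence": 16.2},
            ]
        }
    ]
}

for face in sample_response["FaceDetails"]:
    label, conf = dominant_emotion(face)
    print(f"{label}: {conf:.1f}%")  # FEAR: 61.3%
```

Note that the service reports a confidence score per emotion rather than a single verdict; it is the caller's choice to treat the top-scoring label as "the" emotion.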
As artificial intelligence is used to make more decisions about our lives, engineers have sought ways to make it more emotionally intelligent. That means automating some of the emotional tasks that come naturally to humans, most notably looking at a person's face and knowing how they feel. To achieve this, tech companies like Microsoft, IBM, and Amazon all sell what they call "emotion recognition" algorithms, which infer how people feel based on facial analysis. For example, if someone has a furrowed brow and pursed lips, the algorithm concludes they're angry. If their eyes are wide, their eyebrows are raised, and their mouth is stretched, it concludes they're afraid, and so on.
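The cue-to-emotion logic described above can be sketched as a toy rule table. Real systems infer these labels from trained models over facial landmarks, not hard-coded rules; the cue names and mappings here are illustrative assumptions taken from the examples in the text.

```python
# Each rule pairs a set of facial cues with the emotion label it implies.
RULES = [
    ({"furrowed_brow", "pursed_lips"}, "angry"),
    ({"wide_eyes", "raised_eyebrows", "stretched_mouth"}, "afraid"),
]

def infer_emotion(observed_cues):
    """Return the first emotion whose full cue set is observed, else 'neutral'."""
    observed = set(observed_cues)
    for cues, emotion in RULES:
        if cues <= observed:  # every required cue was detected
            return emotion
    return "neutral"

print(infer_emotion(["furrowed_brow", "pursed_lips"]))  # angry
print(infer_emotion(["wide_eyes", "raised_eyebrows", "stretched_mouth"]))  # afraid
```

The brittleness of this mapping is exactly what critics of emotion recognition point to: the same facial configuration can accompany very different inner states.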
By 2022, IDC predicts, 30 percent of enterprises will use interactive conversational speech technologies to power customer engagement, and affective computing will see a 25 percent jump in real-world applications. "It's not necessarily going to be everywhere," Sutherland says. "But we do expect to see a pickup in terms of moving from experimentation to actual production." As researchers and private companies teach machines to recognize differences in vocal inflection, facial expressions, and other cues, experts say the field is ripe for business applications. Companies are already using emotion AI for market research and political polling.