Facial recognition technology is being tested by businesses and governments for everything from policing to employee timesheets. Even more granular results are on the way, the companies behind the technology promise: automatic emotion recognition could soon help robots understand humans better, or detect road rage in drivers. But experts warn that algorithms attempting to infer emotion from facial expressions may rest on uncertain science. The claims appear in the annual report (pdf) of the AI Now Institute, a nonprofit that studies the impact of AI on society. The report also includes recommendations for regulating AI and for greater transparency in the industry.
For many artificial intelligence (AI) researchers, a long-standing goal is a system that can identify human emotion from voice and facial expressions. While some facial-scanning technology is available, reliably identifying emotional states remains a long way off, given the subtle nuances of both speech and facial muscle movement. Researchers at the University of Science and Technology of China in Hefei believe they have made a breakthrough. Their paper, "Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition," describes a system that achieves state-of-the-art accuracy on a popular benchmark. In it, the researchers write, "Automatic emotion recognition (AER) is a challenging task due to the abstract concept and multiple expressions of emotion. Inspired by this cognitive process in human beings, it's natural to simultaneously utilize audio and visual information in AER … The whole pipeline can be completed in a neural network."
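To give a rough sense of the fusion technique the paper's title refers to, here is a minimal NumPy sketch of factorized bilinear pooling: rather than computing a full (and expensive) bilinear interaction between the audio and visual feature vectors, both are projected through low-rank matrices, multiplied element-wise, and sum-pooled. All dimensions and matrices below are illustrative placeholders, not the authors' actual architecture, which also includes attention and is trained end-to-end.

```python
import numpy as np

rng = np.random.default_rng(0)

def factorized_bilinear_pooling(a, v, U, V, k):
    """Fuse an audio vector `a` and a visual vector `v`.

    The full bilinear weight tensor is factorized into low-rank
    projections U and V: project each modality, take the element-wise
    product, then sum-pool every `k` consecutive entries.
    """
    joint = (U.T @ a) * (V.T @ v)              # low-rank bilinear interaction
    pooled = joint.reshape(-1, k).sum(axis=1)  # sum-pool groups of k
    return pooled / np.linalg.norm(pooled)     # l2-normalize the fused vector

# Hypothetical embeddings, e.g. from audio and facial-frame encoders.
audio = rng.standard_normal(64)
video = rng.standard_normal(128)
k, out_dim = 4, 8
U = rng.standard_normal((64, out_dim * k))
V = rng.standard_normal((128, out_dim * k))
fused = factorized_bilinear_pooling(audio, video, U, V, k)
print(fused.shape)  # (8,)
```

In the real model, the fused vector would feed a classifier over emotion categories, and U and V would be learned rather than random.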
Researchers at the University of California San Diego have devised an artificial intelligence (AI) tool that predicts loneliness in adults with 94% accuracy. The tool used natural language processing (NLP) technology developed by IBM to process large amounts of unstructured natural speech and text, and it analyzed factors such as cognition, mobility, sleep, and physical activity to understand the aging process. It is an example of how AI could be built into devices to detect mental health conditions. Market research firm Gartner predicts that by 2022, your personal device will know more about your emotional state than your own family members do.
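The general pattern behind such tools can be illustrated with a toy sketch: extract simple numeric features from a speech transcript, then pass them through a linear (logistic) model. Everything here is a made-up illustration of the approach in general; the actual study used IBM's NLP tooling and validated loneliness scales, not these invented word lists or weights.

```python
import math
import re

# Hypothetical cue words for illustration only.
LONELY_CUES = {"alone", "lonely", "nobody", "isolated", "miss"}
SOCIAL_CUES = {"friends", "family", "together", "visit", "we"}

def extract_features(transcript):
    """Turn raw interview text into a small numeric feature vector."""
    words = re.findall(r"[a-z']+", transcript.lower())
    total = max(len(words), 1)
    return [
        sum(w in LONELY_CUES for w in words) / total,  # loneliness cues
        sum(w in SOCIAL_CUES for w in words) / total,  # social cues
        sum(w == "i" for w in words) / total,          # first-person focus
    ]

def loneliness_score(features, weights=(8.0, -6.0, 3.0), bias=-1.0):
    """Logistic model: maps features to a probability-like score in (0, 1)."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

score = loneliness_score(extract_features(
    "I feel alone most days; nobody comes to visit and I miss my friends."))
```

A production system would learn the features and weights from labeled data rather than hand-coding them, but the shape of the pipeline, text in, risk score out, is the same.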
"How are you?" What a simple question with so many complex answers, yet most of us reply with "Good." Cultural habits aside, why do we default to a one-word expression of our feelings? We feel a range of emotions, but to use them we must learn what they are and how to express them correctly. We've already discussed the six universal emotions: joy, surprise, sadness, anger, disgust, and fear (with contempt sometimes added as a seventh), but many other emotions stem from them. Robert Plutchik, a well-known psychologist who studied emotion, created the wheel of emotions.
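Plutchik's wheel lends itself to a simple data structure: eight primary emotions arranged in a circle, with each adjacent pair blending into a "primary dyad" (for example, disgust and anger blend into contempt). A minimal sketch, following the standard presentation of his model:

```python
# Plutchik's eight primary emotions, in wheel order.
PRIMARY = ["joy", "trust", "fear", "surprise",
           "sadness", "disgust", "anger", "anticipation"]

# Primary dyads: the blend of each adjacent pair on the wheel.
DYADS = {
    ("joy", "trust"): "love",
    ("trust", "fear"): "submission",
    ("fear", "surprise"): "awe",
    ("surprise", "sadness"): "disappointment",
    ("sadness", "disgust"): "remorse",
    ("disgust", "anger"): "contempt",
    ("anger", "anticipation"): "aggressiveness",
    ("anticipation", "joy"): "optimism",
}

def blend(a, b):
    """Look up the dyad for two adjacent primary emotions, if any."""
    return DYADS.get((a, b)) or DYADS.get((b, a))

print(blend("anger", "disgust"))  # contempt
```

Note that contempt, the "sometimes seventh" universal emotion above, appears here as a blend of two primaries rather than a primary itself.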