Amazon and Microsoft claim AI can read human emotions. Experts say the science is shaky

#artificialintelligence

Facial recognition technology is being tested by businesses and governments for everything from policing to employee timesheets. Even more granular results are on their way, promise the companies behind the technology: automatic emotion recognition could soon help robots understand humans better, or detect road rage in car drivers. But experts warn that the facial-recognition algorithms attempting to interpret facial expressions may rest on uncertain science. The claims appear in the annual report (pdf) of the AI Now Institute, a nonprofit that studies the impact of AI on society. The report also includes recommendations for regulating AI and for greater transparency in the industry.


Micro-Facial Expression Recognition in Video Based on Optimal Convolutional Neural Network (MFEOCNN) Algorithm

arXiv.org Artificial Intelligence

Facial expression is one of the most important cues for human emotion recognition: people use facial expressions to convey their emotional states. Recognizing facial expressions, however, remains a challenging and interesting problem in computer vision. The main objective of the proposed approach is recognizing micro-facial expressions in video sequences. For efficient recognition, the method uses an optimal convolutional neural network, with the CK+ dataset as input. First, the input image is preprocessed with adaptive median filtering. From the preprocessed output, three feature types are extracted: geometric features, Histogram of Oriented Gradients (HOG) features, and Local Binary Pattern (LBP) features. The novelty of the method is that a Modified Lion Optimization (MLO) algorithm selects the optimal subset of these features; it converges rapidly and reaches an effective overall solution in a short computational time. Finally, recognition is performed by a convolutional neural network (CNN). The performance of the proposed MFEOCNN method is analysed in terms of false measures and recognition accuracy. This kind of emotion recognition is mainly used in medicine, marketing, e-learning, entertainment, law, and monitoring. Simulations show that the proposed approach achieves a maximum recognition accuracy of 99.2% with a minimal Mean Absolute Error (MAE), compared against existing methods: Micro-Facial Expression Based Deep-Rooted Learning (MFEDRL), Convolutional Neural Network with Lion Optimization (CNN+LO), and a CNN without optimization. The proposed method is simulated in MATLAB.
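The abstract's pipeline (filter-based preprocessing, then LBP/HOG/geometric features feeding a classifier) can be roughly illustrated. The sketch below is a minimal NumPy stand-in, not the authors' MATLAB implementation: it uses a plain median filter in place of the adaptive variant, computes only the LBP histogram of the three feature types, and all function names are my own.

```python
import numpy as np

def median_filter(img, k=3):
    """Plain (non-adaptive) median filter, a simplified stand-in
    for the paper's adaptive median filtering preprocessing step."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def lbp_histogram(img):
    """8-neighbour Local Binary Pattern histogram (256 bins),
    one of the three feature types the paper extracts."""
    c = img[1:-1, 1:-1]  # interior pixels (centres)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, (di, dj) in enumerate(shifts):
        nb = img[1 + di: img.shape[0] - 1 + di,
                 1 + dj: img.shape[1] - 1 + dj]
        codes |= (nb >= c).astype(np.int32) << bit  # threshold against centre
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()  # normalised feature vector

# Toy usage on a random patch standing in for a face frame
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(32, 32)).astype(float)
features = lbp_histogram(median_filter(frame))
print(features.shape)  # (256,)
```

In the paper, feature vectors like this would be pruned by the MLO selection step and then classified by a CNN; here the snippet stops at feature extraction.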


Can AI Map Your Emotions?

#artificialintelligence

For many artificial intelligence (AI) researchers, a final goal is a system that can identify human emotion from voice and facial expressions. While some facial scanning technology is available, there is still a long way to go in properly identifying emotional states, owing to the complexity and nuance of both speech and facial muscle movement. Researchers at the University of Science and Technology of China, in Hefei, believe they have made a breakthrough. Their paper, "Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition," describes how an AI system can recognize human emotion, demonstrating state-of-the-art accuracy on a popular benchmark. In the paper, the researchers write: "Automatic emotion recognition (AER) is a challenging task due to the abstract concept and multiple expressions of emotion. Inspired by this cognitive process in human beings, it's natural to simultaneously utilize audio and visual information in AER … The whole pipeline can be completed in a neural network."
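The paper's title names its fusion mechanism: attention-guided factorized bilinear pooling over audio and video features. A bare-bones NumPy sketch of the factorized bilinear pooling core (low-rank projection of each modality, elementwise product, sum-pooling over the rank dimension) is shown below. The dimensions and random weights are illustrative assumptions, and the attention mechanism and surrounding network are omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed toy dimensions (not taken from the paper)
d_audio, d_video, rank, d_out = 64, 128, 16, 8

# Random projections standing in for learned parameters
U = rng.standard_normal((d_audio, rank * d_out))  # audio projection
V = rng.standard_normal((d_video, rank * d_out))  # video projection

def factorized_bilinear_pool(a, v):
    """Low-rank bilinear fusion: project each modality, take the
    elementwise product, then sum-pool over the rank dimension."""
    joint = (a @ U) * (v @ V)                    # shape (rank * d_out,)
    z = joint.reshape(d_out, rank).sum(axis=1)   # pool over rank -> (d_out,)
    # Common post-processing: signed square root + L2 normalisation
    z = np.sign(z) * np.sqrt(np.abs(z))
    return z / (np.linalg.norm(z) + 1e-12)

audio_feat = rng.standard_normal(d_audio)
video_feat = rng.standard_normal(d_video)
fused = factorized_bilinear_pool(audio_feat, video_feat)
print(fused.shape)  # (8,)
```

The low-rank factorization keeps the parameter count linear in the feature dimensions, versus the quadratic cost of a full bilinear (outer-product) fusion.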


What It Takes To Be Human (Paid Post by UBS From NYTimes.com)

#artificialintelligence

Decades have passed since Simon first explored the psychology of human cognition; today AI is more and more present in our lives, be it via customer service or pure entertainment. No matter what its application, the Holy Grail of any successful AI project is its ability to achieve seamless interaction with humans. And at the core is AI's capability to recognize and react to emotions. But first, what are the basic human emotions, and why are they so important? Identifying the key types – and number – of human emotions was tough even for Aristotle who, in the 4th century B.C., identified the following 14: confidence, anger, friendship, fear, calm, unkindness, shame, shamelessness, pity, kindness, indignation, emulation, enmity and envy.
