Researchers Take Computers a Step Closer to Perceiving Human Emotions

#artificialintelligence

Humanity is in a constant race to make computers and other devices smarter and to enable them to carry out elaborate tasks that, until now, only humans could perform. One such ability is perceiving human emotions. To date, only humans have been able to detect and gauge the emotions of the people around them and act accordingly in a given environment. Now, however, computers may be able to do the same, and quite efficiently: researchers at the MIT Media Lab have built machine learning models that can "read" facial expressions to infer human emotions.


Mars "emotions" study shows which ads sell with 75% accuracy Netimperative - latest digital marketing news

#artificialintelligence

A study by Realeyes and Mars, Incorporated has revealed that emotion measurement technology can distinguish ads that deliver high sales lift from those that deliver zero or low lift with 75% accuracy. The study involved 149 ads across 35 brands and 22,334 people in six countries. Realeyes measured how people felt while they watched the ads by using artificial intelligence to analyse their facial expressions through their webcams (with their consent). The study was designed in collaboration with the Mars Marketing Laboratory at the Ehrenberg-Bass Institute for Marketing Science. Realeyes' emotion data was cross-referenced with Mars, Incorporated's known sales-lift data for each ad to investigate the relationship between emotions and sales performance.
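Reduced to its essentials, the analysis step is a supervised cross-referencing exercise: per-ad emotion scores become features, known sales lift becomes the label, and a classifier is scored on how well it separates high-lift from zero/low-lift ads. The sketch below illustrates that shape only, using synthetic data; it is not Realeyes' system, and the feature layout, labels, and logistic-regression stand-in are all assumptions.

```python
# Toy illustration (not Realeyes' method) of cross-referencing per-ad
# emotion scores with known sales-lift labels and checking how well the
# emotion data alone separates high-lift from zero/low-lift ads.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_ads = 149                                   # number of ads in the study
emotion_scores = rng.random((n_ads, 6))       # e.g. mean happiness, surprise, ... per ad (synthetic)
high_sales_lift = rng.integers(0, 2, n_ads)   # 1 = high lift, 0 = zero/low lift (synthetic)

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, emotion_scores, high_sales_lift, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")  # the study reports ~0.75 with real data
```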


Why is AI Considered a Misfit to Read Human Emotions?

#artificialintelligence

AI has taken hold across industries and business ecosystems thanks to its capacity to accelerate automation and provide business intelligence. Disruptive technologies such as artificial intelligence, machine learning, and blockchain have enabled companies to create better user experiences and accelerate business growth. Emotional AI is a relatively recent development; its proponents claim that AI systems can read facial expressions and analyze human emotions. This method is also known as affect recognition technology. Recently, Article 19, a British human rights organization, published a report documenting the increasing use of AI-based emotion recognition technology in China by law enforcement authorities, corporate bodies, and the state itself.


Robotic AI Helping Autistic Kids Read Emotions

#artificialintelligence

Children on the autism spectrum sometimes work with robots that help them better distinguish emotions. Now, scientists at MIT are using artificial intelligence to make sure these robots understand the children they're working with. When therapists use robots with children who have Autism Spectrum Disorder (ASD), they do so to model how to understand emotions and to help the children socialize more broadly. A study in 2012 identified several "potential advantages to using interactive robots" with ASD children. Robots like Milo or NAO can walk, talk and even mimic human facial expressions.


Micro-Facial Expression Recognition in Video Based on Optimal Convolutional Neural Network (MFEOCNN) Algorithm

arXiv.org Artificial Intelligence

Facial expression is one of the most important cues for human emotion recognition: people convey their emotional states through their faces. Recognizing facial expressions, however, remains a challenging and interesting problem in computer vision. The objective of the proposed approach is to recognize micro-facial expressions in video sequences. For efficient recognition, the proposed method uses an optimal convolutional neural network, with the CK+ dataset as input. First, the input image is preprocessed with adaptive median filtering. From the preprocessed output, geometric features, Histogram of Oriented Gradients (HOG) features, and Local Binary Pattern (LBP) features are extracted. The novelty of the proposed method is that the optimal features are selected from the extracted features using a Modified Lion Optimization (MLO) algorithm, which converges quickly and reaches a good overall solution in a short computational time. Finally, recognition is performed by a Convolutional Neural Network (CNN). The performance of the proposed MFEOCNN method is analysed in terms of error measures and recognition accuracy. This kind of emotion recognition is mainly used in medicine, marketing, e-learning, entertainment, law, and monitoring. Simulations show that the proposed approach achieves a maximum recognition accuracy of 99.2% with a minimal Mean Absolute Error (MAE). These results are compared with existing methods: Micro-Facial Expression Based Deep-Rooted Learning (MFEDRL), Convolutional Neural Network with Lion Optimization (CNN+LO), and a Convolutional Neural Network (CNN) without optimization. The proposed method is simulated in MATLAB.
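The abstract describes a pipeline rather than giving code, so the following is a minimal Python sketch of the overall shape only: median-filter preprocessing, handcrafted HOG and LBP features, feature selection, and a learned classifier. The library calls are standard scipy/scikit-image/scikit-learn stand-ins, not the authors' implementation; in particular, SelectKBest replaces the Modified Lion Optimization step, an MLPClassifier replaces the paper's CNN, and the geometric features are omitted.

```python
# Rough sketch of the pipeline shape described in the abstract:
# median filtering -> handcrafted features (HOG, LBP) ->
# feature selection -> classifier.  All components are generic
# stand-ins for the paper's adaptive filter, MLO selector, and CNN.
import numpy as np
from scipy.ndimage import median_filter
from skimage.feature import hog, local_binary_pattern
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline


def extract_features(image: np.ndarray) -> np.ndarray:
    """Denoise a grayscale face crop, then concatenate HOG and LBP features."""
    denoised = median_filter(image, size=3)  # plain (non-adaptive) median filter
    hog_feat = hog(denoised, orientations=9,
                   pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    lbp = local_binary_pattern(denoised, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([hog_feat, lbp_hist])


def build_classifier(n_selected: int = 200):
    """Feature selection followed by a small neural classifier (CNN stand-in)."""
    return make_pipeline(
        SelectKBest(f_classif, k=n_selected),
        MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500),
    )


# Usage (face_crops: list of grayscale face images, labels: emotion classes,
# e.g. drawn from CK+):
#   X = np.stack([extract_features(img) for img in face_crops])
#   clf = build_classifier().fit(X, labels)
```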