Modelling Individual Negative Emotion Spreading Process with Mobile Phones

AAAI Conferences

Individual mood is important for physical and emotional well-being, creativity, and working memory. However, due to the lack of long-term, real-world daily tracking data at the individual level, most current work focuses on the population level or on short-term studies of small groups. An overlooked yet important task is to uncover the sentiment-spreading mechanism at the individual level from daily behavioral data. This paper studies this task by raising a fundamental question that the literature has not yet sufficiently answered: given a social network, how does sentiment spread? Current individual-level network spreading models assume that a person can infect others only after he or she has been infected. Considering the characteristics of negative emotion spreading at the individual level, we relax this assumption and propose an individual negative emotion spreading model. In this paper, we propose a Graph-Coupled Hidden Markov Sentiment Model for modeling the propagation of infectious negative sentiment locally within a social network. Using the MIT Social Evolution dataset as an example, the experimental results verify the efficacy of our techniques on real-world data.
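The abstract does not spell out the model mechanics, but the general graph-coupled hidden Markov idea can be sketched as follows: each individual carries a hidden binary mood state whose transitions are coupled to the states of his or her neighbors, and only noisy observations (such as self-reports or behavioral proxies) are visible. The Python below is a minimal, illustrative simulation of such a process; the parameter names and rates are assumptions, not values from the paper.

```python
import random

# Minimal sketch of a graph-coupled hidden Markov spreading process.
# Each node has a hidden binary mood state (0 = neutral, 1 = negative);
# its transition probability is coupled to its neighbors' states.
# All rates below are illustrative placeholders.

def step(states, graph, base_rate=0.02, peer_rate=0.15, recovery=0.3):
    """Advance every individual's hidden state by one time slice."""
    new_states = {}
    for node, neighbors in graph.items():
        negative_peers = sum(states[n] for n in neighbors)
        if states[node] == 1:
            # A negative individual recovers with some probability.
            new_states[node] = 0 if random.random() < recovery else 1
        else:
            # Infection probability grows with the number of negative neighbors.
            p = 1 - (1 - base_rate) * (1 - peer_rate) ** negative_peers
            new_states[node] = 1 if random.random() < p else 0
    return new_states

def emit(states, true_report=0.8):
    """Noisy observations (e.g., self-reported mood) given hidden states."""
    return {node: s if random.random() < true_report else 1 - s
            for node, s in states.items()}

# Toy social network as an adjacency list of who interacts with whom.
graph = {"a": ["b", "c"], "b": ["a"], "c": ["a", "d"], "d": ["c"]}
states = {"a": 1, "b": 0, "c": 0, "d": 0}

for t in range(5):
    states = step(states, graph)
    print(t, emit(states))
```

In the paper's setting, inference would run in the opposite direction: estimating the hidden mood states and coupling parameters from observed daily behavior rather than simulating forward from known parameters.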


AI Series - Part Two - Programming Emotions

#artificialintelligence

Well done on kick-starting the Azure Cognitive Services Emotion API. Remember that the Emotion API (Project Oxford) is still in the "Preview" stage, so not all of your images are guaranteed to work (I tried about 10 happiness emotion images and only 1 got processed).
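For readers who want to try the same thing, the preview-era Emotion API was a simple REST endpoint that took an image URL plus a subscription key and returned per-face emotion scores. The sketch below assumes a westus endpoint and a placeholder key; both the region and the key are illustrative and should be replaced with the values from your own Azure portal.

```python
import requests

# Hypothetical call to the preview-era Emotion API; the endpoint, region,
# and key below are placeholders -- check your Azure portal for real values.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def recognize_emotion(image_url):
    """Send an image URL and return the per-face emotion scores, if any."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()  # list of detected faces with emotion scores

if __name__ == "__main__":
    for face in recognize_emotion("https://example.com/happy.jpg"):
        print(face.get("scores", face))
```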


Can AI Map Your Emotions?

#artificialintelligence

The final step for many artificial intelligence (AI) researchers is the development of a system that can identify human emotion from voice and facial expressions. While some facial-scanning technology is available, there is still a long way to go in properly identifying emotional states, given the complexity and nuance of both speech and facial muscle movement. Researchers at the University of Science and Technology of China, in Hefei, believe that they have made a breakthrough. Their paper, "Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition," describes how an AI system can recognize human emotion, reporting state-of-the-art accuracy on a popular benchmark. In the paper, the researchers say, "Automatic emotion recognition (AER) is a challenging task due to the abstract concept and multiple expressions of emotion. Inspired by this cognitive process in human beings, it's natural to simultaneously utilize audio and visual information in AER … The whole pipeline can be completed in a neural network."
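Factorized bilinear pooling, the fusion step named in the paper's title, approximates a full bilinear (outer-product) interaction between audio and video features by projecting both modalities into a shared factor space, multiplying element-wise, and sum-pooling over the factors. The PyTorch sketch below illustrates that general idea only; the feature dimensions, the attention mechanism, and the training details of the authors' model are assumptions and are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedBilinearFusion(nn.Module):
    """Toy audio-video fusion via factorized bilinear pooling."""

    def __init__(self, audio_dim=128, video_dim=512,
                 out_dim=64, num_factors=4, num_emotions=7):
        super().__init__()
        # Project each modality into a shared (out_dim * num_factors) space.
        self.audio_proj = nn.Linear(audio_dim, out_dim * num_factors)
        self.video_proj = nn.Linear(video_dim, out_dim * num_factors)
        self.out_dim = out_dim
        self.num_factors = num_factors
        self.classifier = nn.Linear(out_dim, num_emotions)

    def forward(self, audio, video):
        # Element-wise product in the factor space approximates the full
        # bilinear (outer-product) interaction at a fraction of the cost.
        joint = self.audio_proj(audio) * self.video_proj(video)
        # Sum-pool over the factor dimension.
        joint = joint.view(-1, self.out_dim, self.num_factors).sum(dim=2)
        # Signed square root + L2 normalization, as is common after
        # bilinear pooling.
        joint = torch.sign(joint) * torch.sqrt(joint.abs() + 1e-8)
        joint = F.normalize(joint, dim=1)
        return self.classifier(joint)

model = FactorizedBilinearFusion()
logits = model(torch.randn(4, 128), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 7])
```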


Scientist Claims to be On the Verge of Making An AI That 'Feels' True Emotions

#artificialintelligence

AI has been making great strides in the past few years, beating humans at our own games and augmenting or even replacing human-controlled systems. However, some are still not impressed with these developments and feel more should be done. Such is the view of Professor Alexi Samsonovich, who announced that Russia "is on the verge" of a major AI milestone: robots that can feel human emotion. The announcement was made during the 2016 Annual International Conference on Biologically Inspired Cognitive Architectures (BICA) in New York City. Specifically, Samsonovich pointed to free-thinking machines capable of feeling and understanding human emotions, understanding narratives and thinking in those narratives, and actively learning on their own.