Affective Computing and Applications of Image Emotion Perceptions

AAAI Conferences

Images can convey rich semantics and evoke strong emotions in viewers. My PhD thesis focuses on image emotion computing (IEC), which aims to predict the emotions that given images evoke in viewers. The development of IEC is greatly constrained by two main challenges: the affective gap and subjective evaluation. Previous works mainly focused on bridging the affective gap by finding features that express emotions better, such as elements-of-art based features and shape features. Depending on the emotion representation model, either categorical emotion states (CES) or dimensional emotion space (DES), three different tasks are traditionally performed in IEC: affective image classification, regression, and retrieval. The state-of-the-art methods for these three tasks are image-centric, focusing on the dominant emotion perceived by the majority of viewers. In my PhD thesis, I plan to answer the following questions: (1) Compared to the low-level elements-of-art based features, can we find higher-level features that are more interpretable and have a stronger link to emotions? (2) Are the emotions that an image evokes in viewers subjective and different across viewers? If so, how can we tackle user-centric emotion prediction? (3) For image-centric emotion computing, can we predict the emotion distribution instead of only the dominant emotion category?
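
The contrast between dominant-emotion classification and emotion-distribution prediction can be sketched as follows. This is a minimal illustration, not the thesis's actual model: the eight category names follow the Mikels CES taxonomy common in the IEC literature, and the logits are placeholder scores standing in for the output of some trained predictor.

```python
import math

# Categorical emotion states (CES); the Mikels taxonomy is assumed here.
CES = ["amusement", "awe", "contentment", "excitement",
       "anger", "disgust", "fear", "sadness"]

def softmax(logits):
    """Turn raw per-category scores into a probability distribution."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def predict_dominant(logits):
    """Classical image-centric task: only the dominant emotion category."""
    return CES[max(range(len(logits)), key=logits.__getitem__)]

def predict_distribution(logits):
    """Distribution prediction: one probability per emotion category,
    reflecting that different viewers may perceive different emotions."""
    return dict(zip(CES, softmax(logits)))

# Placeholder scores for one image (not real model output).
logits = [2.1, 0.3, 1.7, 0.9, -1.2, -0.8, 0.1, 0.4]
dominant = predict_dominant(logits)
dist = predict_distribution(logits)
print(dominant)                      # the single majority label
print({k: round(v, 3) for k, v in dist.items()})  # the full distribution
```

The distribution keeps the information that the classification task discards: how much probability mass falls on the non-dominant categories.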


Personality, Affect and Emotion Taxonomy for Socially Intelligent Agents

AAAI Conferences

In this article, we describe an Affective Knowledge Representation (AKR) scheme that represents emotion schemata for use in the design of a variety of socially intelligent artificial agents.


A deep learning technique for context-aware emotion recognition

#artificialintelligence

A team of researchers at Yonsei University and École Polytechnique Fédérale de Lausanne (EPFL) has recently developed a new technique that recognizes emotions by analyzing people's faces in images together with contextual features. They presented and outlined their deep learning-based architecture, called CAER-Net, in a paper pre-published on arXiv. For several years, researchers worldwide have been trying to develop tools for automatically detecting human emotions by analyzing images, videos or audio clips. These tools could have numerous applications, for instance improving robot-human interactions or helping doctors to identify signs of mental or neurological disorders (e.g., based on atypical speech patterns, facial features, etc.). So far, the majority of techniques for recognizing emotions in images have been based on the analysis of people's facial expressions, essentially assuming that these expressions best convey humans' emotional responses.
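
The core context-aware idea can be illustrated with a toy two-stream fusion. This is a hedged sketch of the general approach, not a reproduction of CAER-Net: the score vectors, labels, and the fixed `alpha` weight (which in the real architecture would be a learned attention mechanism) are all illustrative assumptions.

```python
def fuse(face_scores, context_scores, alpha):
    """Convex combination of per-emotion scores from two streams.

    alpha in [0, 1] plays the role of an attention weight:
    alpha = 1 trusts only the face, alpha = 0 only the scene context.
    """
    assert len(face_scores) == len(context_scores)
    assert 0.0 <= alpha <= 1.0
    return [alpha * f + (1 - alpha) * c
            for f, c in zip(face_scores, context_scores)]

labels = ["happy", "neutral", "sad"]  # illustrative categories
# The face alone is ambiguous (nearly flat scores), but the surrounding
# scene strongly suggests a happy situation.
face = [0.34, 0.33, 0.33]
context = [0.80, 0.15, 0.05]
fused = fuse(face, context, alpha=0.4)
print(labels[max(range(len(fused)), key=fused.__getitem__)])
```

When the facial evidence is weak, the context stream tips the decision, which is exactly the failure mode of face-only methods that context-aware approaches target.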


Emotions for Strategic Real-Time Systems

AAAI Conferences

Strategic real-time systems hold great potential and their applications are growing, although they are currently most prevalent in video games, military training, and military planning. We propose a paradigm that advances current systems by introducing emotions into the simulated agents that make decisions and solve situations cooperatively. By utilizing emotional reactions and communication, we hope to advance these systems so that their decision processes better mimic human behavior. Since our system allows agents to share emotions with nearby agents, it utilizes both internal and external emotional control.
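
The internal/external control idea can be sketched as a simple emotion-contagion update. This is a hypothetical illustration under stated assumptions, not the paper's model: the `Agent` class, the single `fear` state, the sharing radius, and the blend rate are all invented for the example.

```python
import math

class Agent:
    """Simulated agent with one internal emotional state (fear)."""

    def __init__(self, x, y, fear=0.0):
        self.x, self.y = x, y
        self.fear = fear  # internal emotional control: state in [0, 1]

    def distance(self, other):
        return math.hypot(self.x - other.x, self.y - other.y)

    def share(self, agents, radius=5.0, rate=0.5):
        """External emotional control: blend toward the average fear of
        agents within `radius` (illustrative contagion rule)."""
        nearby = [a.fear for a in agents
                  if a is not self and self.distance(a) <= radius]
        if nearby:
            self.fear += rate * (sum(nearby) / len(nearby) - self.fear)

# A calm agent next to two frightened squadmates picks up part of
# their fear after one sharing step.
squad = [Agent(0, 0, fear=0.0), Agent(1, 0, fear=0.8), Agent(2, 0, fear=0.8)]
squad[0].share(squad)
print(round(squad[0].fear, 2))
```

A shared emotional state like this could then feed into the agents' cooperative decision process, so reactions propagate through a group rather than staying local to one agent.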