Emotion


Time to regulate AI that interprets human emotions

#artificialintelligence

During the pandemic, technology companies have been pitching their emotion-recognition software for monitoring workers and even children remotely. Take, for example, a system named 4 Little Trees. Developed in Hong Kong, the program claims to assess children's emotions while they do classwork. It maps facial features to classify each pupil's emotional state into categories such as happiness, sadness, anger, disgust, surprise and fear. It also gauges 'motivation' and forecasts grades.
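To make the described pipeline concrete, here is a minimal sketch of the kind of facial-feature-to-category mapping such systems perform. Everything in it, from the feature dimensionality to the linear scoring weights, is a hypothetical placeholder, not the vendor's actual model.

```python
# Illustrative sketch: facial features in, a categorical emotion label out.
# All weights and dimensions are hypothetical placeholders.
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "surprise", "fear"]

def classify_emotion(face_features: np.ndarray, weights: np.ndarray) -> str:
    """Map a facial-feature vector to one of six emotion categories
    via a softmax over linear scores (a stand-in for the real model)."""
    scores = weights @ face_features          # one score per emotion
    probs = np.exp(scores - scores.max())     # numerically stable softmax
    probs /= probs.sum()
    return EMOTIONS[int(np.argmax(probs))]

# Usage with dummy data: 68 facial landmarks flattened to 136 features.
rng = np.random.default_rng(0)
features = rng.normal(size=136)
weights = rng.normal(size=(len(EMOTIONS), 136))  # hypothetical trained weights
print(classify_emotion(features, weights))
```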


Ethics Sheet for Automatic Emotion Recognition and Sentiment Analysis

arXiv.org Artificial Intelligence

The importance and pervasiveness of emotions in our lives make affective computing a tremendously important and vibrant line of work. Systems for automatic emotion recognition (AER) and sentiment analysis can be facilitators of enormous progress (e.g., in improving public health and commerce) but also enablers of great harm (e.g., for suppressing dissidents and manipulating voters). Thus, it is imperative that the affective computing community actively engage with the ethical ramifications of its creations. In this paper, I have synthesized and organized information from AI Ethics and Emotion Recognition literature to present fifty ethical considerations relevant to AER. Notably, the sheet fleshes out assumptions hidden in how AER is commonly framed and in the choices often made regarding the data, method, and evaluation. Special attention is paid to the implications of AER on privacy and social groups. The objective of the sheet is to facilitate and encourage more thoughtfulness on why to automate, how to automate, and how to judge success well before the building of AER systems. Additionally, the sheet acts as a useful introductory document on emotion recognition (complementing survey articles).


Fusion with Hierarchical Graphs for Multimodal Emotion Recognition

arXiv.org Artificial Intelligence

Automatic emotion recognition (AER) based on enriched multimodal inputs, including text, speech, and visual cues, is crucial in the development of emotionally intelligent machines. Although complex modality relationships have been proven effective for AER, they are still largely underexplored because previous works predominantly relied on various fusion mechanisms with simply concatenated features to learn multimodal representations for emotion classification. This paper proposes a novel hierarchical fusion graph convolutional network (HFGCN) model that learns more informative multimodal representations by considering the modality dependencies during the feature fusion procedure. Specifically, the proposed model fuses multimodal inputs using a two-stage graph construction approach and encodes the modality dependencies into the conversation representation. We verified the interpretability of the proposed method by projecting the emotional states to a 2D valence-arousal (VA) subspace. Extensive experiments showed the effectiveness of our proposed model for more accurate AER, yielding state-of-the-art results on two public datasets, IEMOCAP and MELD.
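As a rough illustration of the fusion idea (not the authors' code), the sketch below treats per-utterance text, audio, and visual features as nodes of a small graph and lets one graph-convolution step mix them before classification. The feature dimension, the fully connected adjacency pattern, and the seven-class head are assumptions chosen to resemble MELD-style labels.

```python
# A minimal sketch of graph-based multimodal fusion: unimodal features
# become graph nodes, and a GCN layer propagates information across
# modalities so the pooled representation encodes their dependencies.
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Degree-normalized neighborhood aggregation, then a shared projection.
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.linear(adj @ x / deg))

# Three modality nodes per utterance: text, audio, visual (dim 64 assumed).
text, audio, visual = (torch.randn(1, 64) for _ in range(3))
nodes = torch.cat([text, audio, visual], dim=0)   # (3, 64)
adj = torch.ones(3, 3)                            # fully connected modality graph
gcn = SimpleGCNLayer(64)
fused = gcn(nodes, adj).mean(dim=0)               # pooled fused representation
logits = nn.Linear(64, 7)(fused)                  # e.g. 7 MELD emotion classes
print(logits.shape)                               # torch.Size([7])
```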


The Importance of Emotional AI in Business in the 21st Century

#artificialintelligence

The world is moving towards a future dependent on artificial intelligence; therefore, the need for emotional intelligence is now greater than ever. Our relationship with advanced technologies is growing increasingly complex. Mobiles, tablets, and laptops have become an inherent part of our lives. We rely on technology like never before. Emotional intelligence matters in business management because it helps guide how AI is used in a business and how marketing efforts are analyzed for a better customer experience. Large and small organizations alike are adopting AI technologies to amplify their growth and surpass prior business results.


How are you feeling? AI wants to know

#artificialintelligence

How are you feeling today? This is the question that a new generation of artificial intelligence is getting to grips with. Referred to as emotional AI, these technologies use a variety of advanced methods including computer vision, speech recognition and natural language processing to gauge human emotion and respond accordingly. Prof Alan Smeaton, lecturer and researcher in the School of Computing, Dublin City University (DCU), and founding director of the Insight Centre for Data Analytics, is working on the application of computer vision to detect a very specific state: inattention. Necessity is the mother of invention: Help Me Watch was developed at DCU during the pandemic in response to student feedback on the challenges of online lectures.
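As a crude illustration of computer-vision inattention detection (an assumed proxy, not the Help Me Watch implementation), the sketch below flags possible inattention when OpenCV's stock frontal-face detector finds no face for roughly a second of webcam frames. The thresholds are arbitrary assumptions.

```python
# If no frontal face is detected for several consecutive frames, assume
# the viewer has looked away. A deliberately simple stand-in detector.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def is_attentive(frame) -> bool:
    """Return True if at least one frontal face is visible in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

cap = cv2.VideoCapture(0)
misses = 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    misses = 0 if is_attentive(frame) else misses + 1
    if misses > 30:            # ~1 second at 30 fps with no face in view
        print("possible inattention")
        misses = 0
cap.release()
```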


New architectural installation responds to emotions

#artificialintelligence

Morphogenesis Lab, a team specialized in interactive architectural systems, developed Wisteria, an emotive intelligent installation that responds concurrently to people's emotions based on biological and neurological data. Visitors can change the color and form of the installation using their brain activity and emotions. Artificial intelligence, wearable technology, sensory environments, and adaptive architecture were integrated to create an emotional bond between a space and its occupants. The space is filled with a forest of cylindrical fabric shrouds suspended from the ceiling. Upon sensing the presence of an occupant, the shrouds, actuated by a programmable material called shape-memory alloy (SMA), begin to fluctuate, expanding and contracting the volume of the space in rhythm and sequence.
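An illustrative sketch of the kind of control loop described here: hypothetical valence/arousal readings (derived from biometric sensors, scaled to [-1, 1]) drive the color and the SMA contraction rhythm of a shroud. The mapping and command format below are assumptions made for illustration only.

```python
# Map hypothetical emotion readings to actuation commands for one shroud.
from dataclasses import dataclass

@dataclass
class ShroudCommand:
    hue: float          # 0..360, color of the fabric lighting
    contraction: float  # 0..1, SMA actuation intensity
    period_s: float     # seconds per expand/contract cycle

def emotion_to_command(valence: float, arousal: float) -> ShroudCommand:
    # Warmer hues for positive valence, faster rhythm for higher arousal.
    hue = 200.0 - 140.0 * valence        # blue (negative) toward warm (positive)
    contraction = 0.5 + 0.5 * arousal    # stronger actuation when aroused
    period_s = 4.0 / (1.0 + max(arousal, 0.0))
    return ShroudCommand(hue % 360.0, min(max(contraction, 0.0), 1.0), period_s)

print(emotion_to_command(valence=0.6, arousal=0.3))
```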


#IROS2020 Plenary and Keynote talks focus series #5: Nikolaus Correll & Cynthia Breazeal

Robohub

As part of our series showcasing the plenary and keynote talks from the IEEE/RSJ IROS2020 (International Conference on Intelligent Robots and Systems), this week we bring you Nikolaus Correll (Associate Professor at the University of Colorado at Boulder) and Cynthia Breazeal (Professor of Media Arts and Sciences at MIT). Nikolaus' talk is on robot manipulation, while Cynthia's is about social robots. Bio: Nikolaus Correll is an Associate Professor at the University of Colorado at Boulder. He obtained his MS in Electrical Engineering from ETH Zürich and his PhD in Computer Science from EPF Lausanne in 2007. From 2007 to 2009 he was a post-doc at MIT's Computer Science and Artificial Intelligence Lab (CSAIL).


Emotional Intelligence: A Hidden Key to Career and Workplace Success

#artificialintelligence

When the term "intelligence" comes up in regular conversation, most of us associate it with a person's capacity to acquire knowledge and new skills. This type of intelligence can be measured with IQ, which helps us determine whether the test taker is closer to a Stephen Hawking or a Lloyd Christmas on the smarts scale. And certainly, given no other data, a hiring manager would likely prefer someone on the Hawking end of the spectrum. But while IQ is useful, it's also clear that emotional intelligence (EQ) can be a difference maker in any professional role. Have you ever met an entrepreneur with so much empathy and awareness that they can read people in every situation and relate?


CoMPM: Context Modeling with Speaker's Pre-trained Memory Tracking for Emotion Recognition in Conversation

arXiv.org Artificial Intelligence

As the use of interactive machines grows, the task of Emotion Recognition in Conversation (ERC) has become more important. If machine-generated sentences reflect emotion, more human-like, sympathetic conversations are possible. Since emotion recognition in conversation is inaccurate if the previous utterances are not taken into account, many studies incorporate the dialogue context to improve performance. We introduce CoMPM, a context embedding module (CoM) combined with a pre-trained memory module (PM) that tracks the speaker's previous utterances within the context, and show that the pre-trained memory significantly improves the final accuracy of emotion recognition. We experimented on both multi-party datasets (MELD, EmoryNLP) and dyadic datasets (IEMOCAP, DailyDialog), showing that our approach achieves competitive performance on all datasets.
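A minimal sketch (an assumption-laden illustration, not the authors' code) of the two-part idea: a context module encodes the dialogue so far, a memory module tracks the current speaker's previous utterance embeddings, and the two are combined before emotion classification. The GRU context encoder and mean-pooled memory below stand in for the paper's pre-trained components; dimensions and class count are arbitrary.

```python
# Context module (CoM stand-in) + speaker memory (PM stand-in) for ERC.
import torch
import torch.nn as nn

DIM, N_EMOTIONS = 64, 7

context_encoder = nn.GRU(DIM, DIM, batch_first=True)   # encodes dialogue context
classifier = nn.Linear(2 * DIM, N_EMOTIONS)

def predict(context_utts: torch.Tensor, speaker_history: torch.Tensor):
    """context_utts: (1, T, DIM) embeddings of all utterances so far;
    speaker_history: (K, DIM) the current speaker's past utterance embeddings."""
    _, h = context_encoder(context_utts)                # final hidden state (1, 1, DIM)
    memory = speaker_history.mean(dim=0, keepdim=True)  # pooled speaker memory (1, DIM)
    fused = torch.cat([h.squeeze(0), memory], dim=-1)   # (1, 2*DIM)
    return classifier(fused)                            # emotion logits

logits = predict(torch.randn(1, 5, DIM), torch.randn(2, DIM))
print(logits.shape)   # torch.Size([1, 7])
```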


Artificial emotional intelligence: a safer, smarter future with 5G and emotion recognition

#artificialintelligence

With the advent of 5G communication technology and its integration with AI, we are looking at the dawn of a new era in which people, machines, objects, and devices are connected like never before. This smart era, the aftermath of a technological revolution, will be characterized by smart facilities and services such as self-driving cars, smart UAVs, and intelligent healthcare. But the flip side of this revolution is that AI itself can be used to attack or threaten the security of 5G-enabled systems, which can, in turn, greatly compromise their reliability. It is therefore imperative to investigate such potential security threats and explore countermeasures before a smart world is realized.