Perception and expression of emotion are key factors in the success of dialogue systems and conversational agents. However, this problem has not been studied in large-scale conversation generation so far. In this paper, we propose the Emotional Chatting Machine (ECM), which can generate responses appropriate not only in content (relevant and grammatical) but also in emotion (emotionally consistent). To the best of our knowledge, this is the first work that addresses the emotion factor in large-scale conversation generation. ECM addresses the factor using three new mechanisms that respectively (1) model the high-level abstraction of emotion expressions by embedding emotion categories, (2) capture the change of implicit internal emotion states, and (3) use explicit emotion expressions with an external emotion vocabulary. Experiments show that the proposed model can generate responses appropriate not only in content but also in emotion.
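The three mechanisms named in the abstract can be sketched in miniature. This is an illustrative toy, not ECM's actual implementation: the embedding values, the multiplicative decay standing in for ECM's learned read/write gates, and the scalar mixing gate `alpha` are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_EMOTIONS, EMB_DIM, VOCAB = 6, 8, 20

# (1) Emotion category embedding: one learned vector per category
# (random initialization here; in ECM these are trained end-to-end).
emotion_embedding = rng.normal(size=(NUM_EMOTIONS, EMB_DIM))

def decoder_input(word_vec, emotion_id):
    """Concatenate the word embedding with the emotion embedding,
    so every decoding step is conditioned on the target emotion."""
    return np.concatenate([word_vec, emotion_embedding[emotion_id]])

# (2) Internal emotion state: starts "full" and is gradually spent
# as the decoder emits emotional content (a simple multiplicative
# decay stands in for the learned gating in the paper).
def decay_internal_state(state, write_gate):
    return state * (1.0 - write_gate)

# (3) External emotion vocabulary: at each step the output
# distribution mixes a generic-word distribution with an
# emotion-word distribution via a scalar gate alpha.
def mix_vocab(generic_probs, emotion_probs, alpha):
    return (1 - alpha) * generic_probs + alpha * emotion_probs

state = np.ones(EMB_DIM)
state = decay_internal_state(state, write_gate=0.3)

generic = np.full(VOCAB, 1.0 / VOCAB)
emotion = np.zeros(VOCAB)
emotion[:5] = 0.2          # probability mass concentrated on emotion words
mixed = mix_vocab(generic, emotion, alpha=0.5)
```

A decoder built this way emits emotion words early, while the internal state is high, and falls back to generic words as the state decays, which is the behavior the paper's mechanisms are designed to produce.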
Some companies in the UK are starting to use facial expression technology alongside artificial intelligence to identify the best candidates in job interviews. Applicants are filmed by phone or laptop while being asked a set of job-related questions. The AI technology is then used to analyze the responses in terms of the applicant's language, tone, and facial expressions, and its algorithms choose the best candidate by comparing each candidate's performance in the video against some 25,000 pieces of facial and linguistic information compiled from previous interviews.
Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia, and co-workers have developed a system using a genetic algorithm that gets better and better with each iteration at matching irregular ellipse fitting equations to the shape of the human mouth displaying different emotions. They have used photos of individuals from South-East Asia and Japan to train a computer to recognize the six commonly accepted human emotions -- happiness, sadness, fear, anger, disgust, surprise -- and a neutral expression. The upper and lower lips are analyzed as two separate ellipses by the algorithm. "In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers especially in the area of human emotion recognition by observing facial expression," the team explains. Earlier researchers have developed an understanding that allows emotion to be recreated by manipulating a representation of the human face on a computer screen.
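The article does not give the team's actual equations or genetic operators, but the core idea, a genetic algorithm that improves an ellipse fit generation by generation, can be sketched as follows. The target ellipse, population size, mutation scale, and generation count are all invented for the example.

```python
import math
import random

random.seed(1)

# Toy "mouth contour": points sampled from a made-up true ellipse
# (centre cx, cy and semi-axes a, b).
TRUE = (0.0, 0.0, 3.0, 1.0)

def ellipse_points(cx, cy, a, b, n=40):
    return [(cx + a * math.cos(2 * math.pi * k / n),
             cy + b * math.sin(2 * math.pi * k / n)) for k in range(n)]

POINTS = ellipse_points(*TRUE)

def fitness(params):
    """Mean squared deviation of the implicit ellipse equation
    ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1 over the contour points
    (lower is better)."""
    cx, cy, a, b = params
    if a <= 0 or b <= 0:
        return float("inf")
    return sum((((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0) ** 2
               for x, y in POINTS) / len(POINTS)

def mutate(params, scale=0.2):
    return tuple(p + random.gauss(0, scale) for p in params)

def crossover(p1, p2):
    return tuple(random.choice(pair) for pair in zip(p1, p2))

# Simple generational GA with elitism: keep the fittest half,
# refill by crossover + mutation, so the best fit never worsens.
pop = [(random.uniform(-1, 1), random.uniform(-1, 1),
        random.uniform(1, 5), random.uniform(0.5, 2)) for _ in range(40)]
initial_best = min(pop, key=fitness)

for gen in range(60):
    pop.sort(key=fitness)
    survivors = pop[:20]
    pop = survivors + [mutate(crossover(random.choice(survivors),
                                        random.choice(survivors)))
                       for _ in range(20)]

best = min(pop, key=fitness)
```

In the published system one such fit is run per lip (upper and lower), and the resulting ellipse parameters become the features from which the emotion class is recognized.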
In the early 1990s, Lisa Feldman Barrett had a problem. She was running an experiment to investigate how emotions affect self-perception, but her results seemed to be consistently wrong. She was studying for a PhD in the psychology of the self at the University of Waterloo, Ontario, Canada. As part of her research, she tested some of the textbook assumptions that she had been taught, including the assumption that people feel anxiety or depression when, despite living up to their own expectations, they do not live up to the expectations of others. But after designing and running her experiment, she discovered that her test subjects weren't distinguishing between anxiety and depression.
Decades have passed since Simon first explored the psychology of human cognition; today AI is more and more present in our lives, be it via customer service or pure entertainment. No matter what its application, the Holy Grail of any successful AI project is its ability to achieve seamless interaction with humans. And at the core is AI's capability to recognize and react to emotions. But first, what are the basic human emotions, and why are they so important? Identifying the key types – and number – of human emotions was tough even for Aristotle who, in the 4th century B.C., identified the following 14: confidence, anger, friendship, fear, calm, unkindness, shame, shamelessness, pity, kindness, indignation, emulation, enmity and envy.