Artificial intelligence that reads and responds to our emotions is the killer app of the digital economy. It will make customers and employees happier, as long as it learns to respect our boundaries. When psychologist Dr. Paul Ekman visited the Fore tribe in the highlands of Papua New Guinea in 1967, he probably didn't imagine that his work would become the foundation for some of the latest developments in artificial intelligence (AI). After studying the tribe, which had remained preliterate since the Stone Age, Ekman believed he had found the blueprint for a set of universal human emotions and related expressions, shared across cultures and present in all humans. A decade later he created the Facial Action Coding System, a comprehensive tool for objectively measuring facial movement. Ekman's work has since been used by the FBI and police departments to identify the seeds of violent behavior in nonverbal expressions of sentiment, and he developed the online Atlas of Emotions at the behest of the Dalai Lama. Today his research is being used to teach computer systems how to feel.
ABSTRACT Mirror Ritual is an interactive installation that challenges existing paradigms in our understanding of human emotion and machine perception. In contrast to prescriptive interfaces, the work's real-time affective interface engages the audience in the iterative conceptualisation of their emotional state through affectively charged, machine-generated poetry. The audience are encouraged to make sense of the mirror's poetry by framing it with respect to their recent life experiences, effectively 'putting into words' their felt emotion. This process of affect labelling and contextualisation not only works to regulate emotion, but also helps to construct the rich personal narratives that constitute human identity.
In this paper, we present a visual emulator of the emotions displayed by characters in stories. The system is based on a simplified view of the cognitive structure of emotions proposed by Ortony, Clore and Collins (the OCC model). The goal of this paper is to provide a visual platform for observing changes in the characters' emotions, and the intricate interrelationships between: 1) each character's emotions, 2) their affective relationships and actions, 3) the events that take place as the plot develops, and 4) the objects of desire that make up the emotional map of any story. The tool was tested on stories with a contrasting variety of emotional and affective environments: Othello, Twilight, and Harry Potter; in each case it behaved sensibly and in keeping with the atmosphere in which the characters were immersed.
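The kind of appraisal the abstract describes can be illustrated with a minimal sketch. This is a hypothetical simplification rather than the emulator's actual implementation: the `Character` class, the single `desirability` score, and the attitude weights are assumptions, loosely following the OCC convention that an event is appraised by its desirability for the party it affects, filtered through the appraiser's attitude toward that party.

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    name: str
    # attitude toward other characters: -1.0 (hostile) .. +1.0 (fond)
    attitudes: dict = field(default_factory=dict)
    emotions: dict = field(default_factory=dict)  # emotion label -> intensity

    def feel(self, emotion: str, intensity: float) -> None:
        # accumulate intensity so repeated events reinforce an emotion
        self.emotions[emotion] = self.emotions.get(emotion, 0.0) + intensity

def appraise_event(who: Character, desirability: float, affected: str) -> None:
    """OCC event branch: react to an event's (un)desirability for `affected`.

    A fond attitude passes the event's valence through; a hostile attitude
    would flip it (a rival's misfortune reads as positive to the appraiser).
    """
    liking = 1.0 if affected == who.name else who.attitudes.get(affected, 0.0)
    felt = desirability * liking
    if felt > 0:
        who.feel("joy" if affected == who.name else "happy-for", felt)
    elif felt < 0:
        who.feel("distress" if affected == who.name else "pity", -felt)

# tiny example loosely in the spirit of Othello:
# a misfortune befalls Desdemona, whom Othello loves
othello = Character("Othello", attitudes={"Desdemona": 0.9})
appraise_event(othello, desirability=-0.8, affected="Desdemona")
```

A full OCC treatment would add the action branch (pride, admiration, reproach) and the object branch (love, hate); the event branch above is just the piece that tracks plot developments.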
Nogueira, Pedro Alves (University of Porto) | Rodrigues, Rui (INESC-TEC, University of Porto) | Oliveira, Eugénio (University of Porto) | Nacke, Lennart E. (University of Ontario Institute of Technology)
Designing adaptive games for individual emotional experiences is a tricky task, especially when detecting a player’s emotional state in real time requires physiological sensing hardware and signal processing software. There is currently a lack of software that can identify and learn how emotional states in games are triggered. To address this problem, we developed a system capable of understanding the fundamental relations between emotional responses and their eliciting events. We propose time-evolving Affective Reaction Models (ARM), which learn new affective reactions and manage conflicting ones. These models are then meant to provide information on how a set of predetermined game parameters (e.g., enemy and item spawning, music and lighting effects) should be adapted, to modulate the player’s emotional state. In this paper, we propose and describe a framework for modulating player emotions and the main components involved in regulating players’ affective experience. We expect our technique will allow game designers to focus on defining high-level rules for generating gameplay experiences instead of having to create and test different content for each player type.
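The core loop described above can be sketched in a few lines. This is an illustrative reduction, not the authors' framework: the event names, the single-valued arousal signal, and the exponential-moving-average update are assumptions standing in for the paper's physiological sensing and conflict-management machinery.

```python
class AffectiveReactionModel:
    """Sketch of a time-evolving Affective Reaction Model (ARM).

    Learns the expected change in player arousal that each game event
    elicits, and picks the adaptation predicted to move the player
    toward a target affective state.
    """

    def __init__(self, learning_rate: float = 0.3):
        self.lr = learning_rate
        self.reactions = {}  # game event -> expected arousal change

    def observe(self, event: str, arousal_delta: float) -> None:
        # Exponential moving average: recent (possibly conflicting)
        # reactions gradually override older ones, so the model evolves
        # with the player instead of freezing its first impression.
        old = self.reactions.get(event, arousal_delta)
        self.reactions[event] = (1 - self.lr) * old + self.lr * arousal_delta

    def pick_adaptation(self, current: float, target: float, events) -> str:
        # Choose the event whose predicted effect lands closest to target.
        return min(
            events,
            key=lambda e: abs(current + self.reactions.get(e, 0.0) - target),
        )

arm = AffectiveReactionModel()
arm.observe("enemy_spawn", +0.6)   # spawning enemies raised arousal
arm.observe("calm_music", -0.4)    # calm music lowered it
# player is over-aroused (0.8) relative to the designer's target (0.5)
choice = arm.pick_adaptation(current=0.8, target=0.5,
                             events=["enemy_spawn", "calm_music"])
```

The designer-facing layer would then sit on top of `pick_adaptation`, expressing high-level rules ("keep tension near 0.5 in exploration sections") rather than hand-authoring content per player type.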