In this paper we present a novel approach to the grounded synthesis of emotional appraisal, based on a multicausal model of the appraisal process. We investigate the functional nature of emotion by implementing a robotic model in a predator/prey scenario, in which robots discriminate and anticipate outcomes through emotional appraisal. The robots evolve to react in apparently emotional ways, showing how the functionality of emotion can emerge naturally. Through this implementation we demonstrate the value of emotional appraisal as a form of anticipation, supporting the view that emotional behavior can often serve as an effective alternative to rational cognition. Our aim is to build a model that belongs simultaneously to both NCS and more classical theorizing based on cognitions and representations, and that is understandable both mechanically and subjectively from a human standpoint.
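The idea of appraisal as anticipation can be illustrated with a minimal sketch (all names, thresholds, and the distance-based appraisal function below are our own illustrative assumptions, not the paper's actual mechanism): the prey agent maps a sensed stimulus directly onto a fear value and acts on that value, pre-empting the harmful outcome rather than reasoning it out.

```python
# Hypothetical sketch: emotional appraisal as anticipation in a
# predator/prey setting. The appraisal maps predator distance onto a
# fear intensity; fear above a threshold triggers flight *before* any
# contact occurs, substituting for explicit prediction.

def appraise(distance_to_predator: float, danger_radius: float = 5.0) -> float:
    """Return a 'fear' intensity in [0, 1] that rises as the predator nears."""
    if distance_to_predator >= danger_radius:
        return 0.0
    return 1.0 - distance_to_predator / danger_radius

def select_action(fear: float, flee_threshold: float = 0.4) -> str:
    """Fear above the threshold triggers flight; otherwise keep foraging."""
    return "flee" if fear > flee_threshold else "forage"

if __name__ == "__main__":
    for distance in (10.0, 4.0, 1.0):
        fear = appraise(distance)
        print(distance, round(fear, 2), select_action(fear))
```

In the paper's setting the mapping from stimulus to appraisal is evolved rather than hand-coded as here; the sketch only shows why such a mapping counts as anticipation.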
We describe our computational Affective Knowledge Representation (AKR), a hierarchical model of affect - encompassing affect, mood, emotion, and personality - which we have developed for the design of socially intelligent agents. We also describe a script-based implementation of emotion concepts, which we applied to create an intelligent user-interface agent endowed with 1) affect-sensing capacities and 2) general affective knowledge.
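A hierarchy of this kind can be sketched as layered state with different time scales (the class names, fields, and update rules below are our own assumptions for illustration, not the AKR specification): personality is the most stable layer, mood a slower-moving background state, and emotion a short-lived, object-directed episode.

```python
# Hypothetical sketch of a hierarchical affect representation:
# personality (long-term) -> mood (medium-term) -> emotion (short-term).

from dataclasses import dataclass, field

@dataclass
class Personality:
    """Long-term disposition, e.g. a bias toward positive affect."""
    positivity_bias: float = 0.1

@dataclass
class Mood:
    """Medium-term background affect; decays toward the personality bias."""
    valence: float = 0.0

    def decay(self, personality: Personality, rate: float = 0.1) -> None:
        self.valence += rate * (personality.positivity_bias - self.valence)

@dataclass
class Emotion:
    """Short-term, object-directed affective episode."""
    label: str       # e.g. "joy", "anger"
    intensity: float
    target: str      # what the emotion is about

@dataclass
class AffectiveState:
    personality: Personality = field(default_factory=Personality)
    mood: Mood = field(default_factory=Mood)
    emotions: list = field(default_factory=list)

    def feel(self, emotion: Emotion) -> None:
        # Each emotion episode nudges the slower mood layer.
        self.emotions.append(emotion)
        sign = 1.0 if emotion.label == "joy" else -1.0
        self.mood.valence += 0.2 * emotion.intensity * sign
```

The design choice the sketch highlights is the coupling between layers: fast emotion episodes perturb mood, while mood relaxes toward the stable personality baseline.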
Emotions are an essential part of our lives: they influence how we think and behave, and how we communicate with others. Several researchers have acknowledged their importance in human thinking [Minsky 1986; Toda 1993], and recent neurological evidence seems to support these ideas [LeDoux 1996; Damasio 1994]. Recent developments in the areas of synthetic agents [Maes 1995; Blumberg 1994; Bates 1994; Elliot 1992; Reilly 1996] and affective computing [Picard 1995] have promoted the study of emotions and their influence on behavior and learning [Velasquez 1996; Kitano 1995]. Nevertheless, to date, relatively few computational models of emotion have been proposed; for a review of some of these models see [Pfeifer 1988]. This paper describes a computational model that focuses on different aspects of the generation of emotions and their influence on the behavior of synthetic agents.
The past 15 years have witnessed a rapid growth in computational models of emotion and affective architectures. Researchers in cognitive science, AI, HCI, robotics, and gaming are developing 'models of emotion' both for theoretical research regarding the nature of emotion and for a range of applied purposes: to create more believable and effective synthetic characters and robots, and to enhance human-computer interaction. Yet in spite of the many stand-alone emotion models, and the numerous affective agent and robot architectures developed to date, there is a lack of consistency and clarity regarding what exactly it means to 'model emotions'. 'Emotion modeling' can mean the dynamic generation of emotion via black-box models that map specific stimuli onto associated emotions. It can mean generating facial expressions, gestures, or movements depicting specific emotions in synthetic agents or robots.
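The first, 'black-box' sense of emotion modeling can be made concrete with a minimal sketch (the stimulus names, emotion labels, and intensities below are illustrative assumptions): a direct lookup from stimulus to emotion, with no internal appraisal process at all.

```python
# Hypothetical sketch of a black-box emotion-generation model: specific
# stimuli are mapped directly onto associated (emotion, intensity) pairs,
# with no intermediate appraisal or cognitive evaluation.

STIMULUS_TO_EMOTION = {
    "praise": ("joy", 0.8),
    "insult": ("anger", 0.7),
    "loud_noise": ("fear", 0.6),
}

def generate_emotion(stimulus: str) -> tuple:
    """Return an (emotion label, intensity) pair; unknown stimuli are neutral."""
    return STIMULUS_TO_EMOTION.get(stimulus, ("neutral", 0.0))
```

Contrasting this lookup with an appraisal-based generator (where the emotion is computed from an evaluation of the stimulus relative to the agent's goals) is exactly the kind of distinction the passage argues is often left unclear.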