While interacting with another person, our reactions and behavior are strongly affected by the emotional changes that occur over the course of the interaction. Our intrinsic affective appraisal, comprising perception, self-assessment, and affective memories of similar social experiences, drives specific reactions, in most cases regarded as appropriate, within the interaction. This paper proposes a roadmap for the development of multimodal research that aims to empower a robot with the capability to provide appropriate social responses in a Human-Robot Interaction (HRI) scenario. Our capabilities for both perceiving and reacting to the affective behavior of other persons are fine-tuned based on the observed social responses of our interaction peers. We usually perceive how others behave toward us by reading their affective behavior through the processing of audio/visual cues.
Affective events are events that impact people in positive or negative ways. When people discuss an event, they understand not only its affective polarity but also the reason the event is positive or negative. In this paper, we aim to categorize affective events based on the reasons why they are affective. We propose that an event is often affective to people because it describes or indicates the satisfaction or violation of a certain kind of human need. For example, the event "I broke my leg" affects people negatively because the need to be physically healthy is violated, whereas "I play computer games" affects people positively because the need to have fun is probably satisfied. To categorize affective events in narrative human language, we define seven common human need categories and introduce a new data set of randomly sampled affective events with manual human need annotations. In addition, we explore two types of methods, a LIWC lexicon-based method and supervised classifiers, to automatically categorize affective event expressions with respect to human needs. Experiments show that these methods achieve moderate performance on this task.
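The lexicon-based approach can be sketched as follows: content words in an event expression are matched against a lexicon that maps words to human-need categories, and the event is assigned the majority category. This is a minimal illustration only; the lexicon entries, category names, and the majority-vote rule are assumptions for demonstration, not the paper's actual LIWC-derived resource or method.

```python
from collections import Counter

# Illustrative word-to-need lexicon (hypothetical entries, not from LIWC).
NEED_LEXICON = {
    "broke": "health", "leg": "health", "sick": "health",
    "play": "leisure", "games": "leisure", "fun": "leisure",
    "fired": "finances", "paid": "finances",
}

def categorize_event(event: str) -> str:
    """Return the most frequent need category among matched words,
    or "none" when no lexicon entry matches."""
    hits = Counter(
        NEED_LEXICON[w] for w in event.lower().split() if w in NEED_LEXICON
    )
    return hits.most_common(1)[0][0] if hits else "none"
```

For instance, `categorize_event("I broke my leg")` maps "broke" and "leg" to the health category, mirroring the abstract's example of a violated physical-health need.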
Gordon, Goren (Tel Aviv University) | Spaulding, Samuel (Massachusetts Institute of Technology) | Westlund, Jacqueline Kory (Massachusetts Institute of Technology) | Lee, Jin Joo (Massachusetts Institute of Technology) | Plummer, Luke (Massachusetts Institute of Technology) | Martinez, Marayna (Massachusetts Institute of Technology) | Das, Madhurima (Massachusetts Institute of Technology) | Breazeal, Cynthia (Massachusetts Institute of Technology)
Though substantial research has been dedicated to using technology to improve education, no current methods are as effective as one-on-one tutoring. A critical, though relatively understudied, aspect of effective tutoring is modulating the student's affective state throughout the tutoring session in order to maximize long-term learning gains. We developed an integrated experimental paradigm in which children play a second-language learning game on a tablet, in collaboration with a fully autonomous social robotic learning companion. As part of the system, we measured children's valence and engagement via an automatic facial expression analysis system. These signals were combined into a reward signal that fed into the robot's affective reinforcement learning algorithm. Over several sessions, the robot played the game and personalized its motivational strategies (using verbal and non-verbal actions) to each student. We evaluated this system with 34 children in preschool classrooms over a period of two months. We found that (1) children learned new words from the repeated tutoring sessions, (2) the affective policy personalized to students over the duration of the study, and (3) students who interacted with a robot that personalized its affective feedback strategy showed a significant increase in valence, compared to students who interacted with a non-personalizing robot. This integrated system of tablet-based educational content, affective sensing, affective policy learning, and an autonomous social robot holds great promise for a more comprehensive approach to personalized tutoring.
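The affective reward loop described above can be sketched as a simple bandit-style learner: valence and engagement estimates are combined into a scalar reward, which updates a per-child value estimate for each motivational strategy. The action names, the weighting of the two signals, and the epsilon-greedy incremental update below are illustrative assumptions, not the study's actual algorithm.

```python
import random

# Hypothetical motivational strategies the robot can choose among.
ACTIONS = ["verbal_praise", "nonverbal_nod", "encouraging_gesture"]

def reward(valence: float, engagement: float,
           w_valence: float = 0.7, w_engagement: float = 0.3) -> float:
    """Combine affective signals into a scalar reward (weights assumed)."""
    return w_valence * valence + w_engagement * engagement

class AffectivePolicy:
    """Epsilon-greedy bandit over motivational strategies."""

    def __init__(self, epsilon: float = 0.1, alpha: float = 0.2):
        self.q = {a: 0.0 for a in ACTIONS}   # per-action value estimates
        self.epsilon, self.alpha = epsilon, alpha

    def select(self) -> str:
        # Explore with probability epsilon, otherwise exploit.
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(self.q, key=self.q.get)

    def update(self, action: str, r: float) -> None:
        # Move the value estimate toward the observed affective reward.
        self.q[action] += self.alpha * (r - self.q[action])
```

In use, the robot would call `select()` before a tutoring turn, observe the child's valence and engagement afterward, and call `update()` with the resulting reward, so strategies that sustain positive affect are chosen more often over sessions.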
This paper describes the design and results of a human-robot interaction study aimed at determining the extent to which affective robotic behavior can influence participants' compliance with a humanoid robot's request in the context of a mock-up search-and-rescue setting. The results of the study argue for the inclusion of affect in robotic systems, showing that nonverbal expressions of negative mood (nervousness) and fear by the robot improved the participants' compliance with its request to evacuate, causing them to respond earlier and faster.