
 Paiva, Ana


Associate Latent Encodings in Learning from Demonstrations

AAAI Conferences

We contribute a learning from demonstration approach for robots to acquire skills from multi-modal, high-dimensional data. We propose to jointly learn both the latent representations of, and the associations between, the different modalities through an adapted variational auto-encoder. The implementation and results are demonstrated in a robotic handwriting scenario, where the visual sensory input and the arm-joint writing motion are learned and coupled. We show that the latent representations successfully construct a task manifold for the observed sensor modalities. Moreover, the learned associations can be exploited to directly synthesize arm-joint handwriting motion from an image input in an end-to-end manner. The advantages of learning associative latent encodings are further highlighted with examples of inference on incomplete input images. A comparison with alternative methods demonstrates the superiority of the present approach in these challenging tasks.
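The cross-modal synthesis described above (image in, joint motion out, via a shared latent space) can be sketched as follows. This is a minimal illustrative data flow, not the paper's actual architecture: the dimensions are made up, and the linear maps stand in for the trained encoder/decoder networks of the adapted variational auto-encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper does not specify sizes here).
IMG_DIM, JOINT_DIM, LATENT_DIM = 64, 7, 2

# Linear stand-ins for each modality's trained encoder/decoder networks.
W_enc_img = rng.normal(size=(LATENT_DIM, IMG_DIM)) * 0.1
W_dec_jnt = rng.normal(size=(JOINT_DIM, LATENT_DIM)) * 0.1

def encode_image(x):
    """Map an image observation into the shared latent space."""
    return W_enc_img @ x

def decode_motion(z):
    """Synthesize an arm-joint configuration from a shared latent code."""
    return W_dec_jnt @ z

# End-to-end cross-modal synthesis: image -> shared latent -> joint motion.
image = rng.normal(size=IMG_DIM)
z = encode_image(image)
joints = decode_motion(z)
print(joints.shape)  # (7,)
```

Because both modalities are encoded into the same latent space, decoding the image's latent code with the motion decoder yields the associated handwriting motion directly.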


Me and You Together: A Study on Collaboration in Manipulation Tasks

AAAI Conferences

This paper presents an ongoing study in the area of Human-Robot Collaboration, more precisely collaborative manipulation tasks between one robot and multiple people. We study how different trajectories influence people’s perception of the robot’s goal. To achieve this, we propose an approach based on Probabilistic Motor Primitives and the notions of legibility and predictability of trajectories to create the movements the robot performs during task execution. In this approach we also propose combining legible and predictable trajectories depending on the state of the task, in order to diminish the drawbacks associated with each type of trajectory.
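The state-dependent combination of trajectory types mentioned above can be illustrated as a convex blend. This is only a sketch of the blending idea, not the paper's method: the two trajectories, the workspace, and the blending schedule are all invented for illustration, and no Probabilistic Motor Primitives machinery is involved.

```python
import numpy as np

# Two hypothetical 2-D trajectories over T time steps: a predictable
# (direct) path and a legible one that exaggerates toward the goal.
T = 5
predictable = np.linspace([0.0, 0.0], [1.0, 1.0], T)
legible = predictable + np.array([0.0, 0.3]) * np.sin(np.linspace(0, np.pi, T))[:, None]

def blend(alpha):
    """Convex combination: alpha=1 is fully legible, alpha=0 fully predictable."""
    return alpha * legible + (1.0 - alpha) * predictable

# One possible task-state schedule: favour legibility early (to communicate
# the goal), then shift toward predictability as the task nears completion.
alpha_t = np.linspace(1.0, 0.0, T)[:, None]
blended = alpha_t * legible + (1.0 - alpha_t) * predictable
```

Both trajectories share start and goal points, so the blend stays anchored at the endpoints while its shape interpolates between the two communication styles.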


Psychological Science in HRI: Striving for a More Integrated Field of Research

AAAI Conferences

Human-Robot Interaction (HRI) is a highly multidisciplinary endeavor. However, it often still appears to be an effort driven primarily by technical aims and concerns. We outline some of the major challenges for fruitful interdisciplinary collaboration in HRI, arguing for an improved integration of psychology and applied social sciences and their genuine research agendas. Based on our own disciplinary backgrounds, we discuss these issues from vantage points mostly originating in applied engineering and psychology, but also from relevant related fields such as sociology, communication sciences, philosophy, arts, and design. We take a project-case as an example to discuss grounded and practical challenges in HRI research, and to propose how a combination of artificial intelligence advances and a better conceptual definition of the role of social sciences in HRI research may prove to be beneficial. Our goal is to strengthen the impact and effectiveness of social scientists working in HRI, and thereby better prepare the field for future challenges.


The SERA Ecosystem: Socially Expressive Robotics Architecture for Autonomous Human-Robot Interaction

AAAI Conferences

Based on the development of several different HRI scenarios using different robots, we have been establishing the SERA ecosystem. SERA is composed of both a model and tools for integrating an AI agent with a robotic embodiment in human-robot interaction scenarios. We present the model and several of the reusable tools that were developed, namely Thalamus, Skene and Nutty Tracks. Finally, we exemplify how such tools and model have been used and integrated in five different HRI scenarios using the NAO, Keepon and EMYS robots. (Figure 1: Our methodology as an intersection of CGI animation, IVA and robotics techniques.)


More May Be Less: Emotional Sharing in an Autonomous Social Robot

AAAI Conferences

We report a study performed with a social robot that autonomously plays a competitive game. By relying on an emotional agent architecture (using an appraisal mechanism) the robot was built with the capabilities of emotional appraisal and thus was able to express and share its emotions verbally throughout the game. Contrary to what was expected, emotional sharing in this context seemed to damage the social interaction with the users.


"It's Amazing, We Are All Feeling It!" — Emotional Climate as a Group-Level Emotional Expression in HRI

AAAI Conferences

Emotions are a key element in all human interactions. It is well documented that individual- and group-level interactions have different emotional expressions and humans are by nature extremely competent in perceiving, adapting and reacting to them. However, when developing social robots, emotions are not so easy to cope with. In this paper we introduce the concept of emotional climate applied to human-robot interaction (HRI) to define a group-level emotional expression at a given time. By doing so, we move one step further in developing a new tool that deals with group emotions within HRI.


Expressive Lights for Revealing Mobile Service Robot State

AAAI Conferences

Autonomous mobile service robots move in our buildings, carrying out different tasks and traversing multiple floors. While moving and performing their tasks, these robots find themselves in a variety of states. Although speech is often used for communicating the robot’s state to humans, such communication can often be ineffective, due to the transient nature of speech. In this paper, we investigate the use of lights as a persistent visualization of the robot’s state in relation to both tasks and environmental factors. Programmable lights offer a large degree of choices in terms of animation pattern, color and speed. We present this space of choices and introduce different animation profiles that we consider to animate a set of programmable lights on the robot. We conduct experiments to query about suitable animations for three representative scenarios of an autonomous symbiotic service robot, CoBot. Our work enables CoBot to make its states persistently visible to the humans it interacts with.
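The animation-profile space described above (pattern, color, and speed) can be sketched as a small brightness function over time. The profile names and the example color/state pairing below are illustrative assumptions, not the paper's actual taxonomy or CoBot's implementation.

```python
import math

def profile(kind, t, speed=1.0):
    """Brightness in [0, 1] at time t (seconds) for an animation profile.

    Profile names here are hypothetical, not the paper's exact set.
    """
    phase = t * speed
    if kind == "solid":
        return 1.0
    if kind == "blink":  # hard on/off square wave
        return 1.0 if int(phase * 2) % 2 == 0 else 0.0
    if kind == "pulse":  # smooth sinusoidal "breathing"
        return 0.5 * (1.0 + math.sin(2 * math.pi * phase))
    raise ValueError(kind)

# Render one second of the "pulse" profile at 4 samples/s for one color,
# scaling each RGB channel by the current brightness.
color = (0, 120, 255)  # hypothetical "waiting for human input" state
frames = [tuple(int(c * profile("pulse", t / 4)) for c in color)
          for t in range(4)]
```

Varying the profile kind, the base color, and the `speed` parameter spans the choice space the paper explores, while keeping the light persistently visible, unlike transient speech.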


Make Way for the Robot Animators! Bringing Professional Animators and AI Programmers Together in the Quest for the Illusion of Life in Robotic Characters

AAAI Conferences

We are looking at new ways of building algorithms for synthesizing and rendering animation in social robots that can keep them as interactive as necessary, while still following the principles and practices used by professional animators. We will be studying the animation process side by side with professional animators in order to understand how these algorithms and tools can be used by animators to achieve animation capable of correctly adapting to the environment and the artificial intelligence that controls the robot. (Figure 1: Two example scenarios featuring a touch-based multimedia application, sensors, and different robots.)


An Embodied Empathic Tutor

AAAI Conferences

The EMOTE project (http://www.emote-project.eu/) is working towards the development of an empathic robot tutor to be used with the 11-14 age group and a multi-touch table. The two applications under development are a Treasure Hunt exercise designed to teach map-reading skills, and a multi-player game, Enercities-2, designed to teach aspects of sustainable urban development.


Meet Me Halfway: Eye Behaviour as an Expression of Robot's Language

AAAI Conferences

Eye contact is a crucial behaviour in human communication and therefore an essential feature in human-robot interaction. A study regarding the development of an eye behaviour model for a robotic tutor in a task-oriented environment is presented, along with a description of how our proposed model is being used to implement an autonomous robot in the EMOTE project.