Using the Pepper Robot to Support Sign Language Communication

Botta, Giulia, Botta, Marco, Gena, Cristina, Mazzei, Alessandro, Donini, Massimo, Lillo, Alberto

arXiv.org Artificial Intelligence

Social robots are increasingly being tested in public and assistive settings, but their accessibility for Deaf users remains largely underexplored. Italian Sign Language (LIS) is a fully-fledged natural language that relies on complex manual and non-manual components. Enabling robots to communicate using LIS could foster more inclusive human-robot interaction, especially in social environments such as hospitals, airports, or educational settings. This study investigates whether a commercial social robot, Pepper, can produce intelligible LIS signs and short signed LIS sentences. With the help of a Deaf student and his interpreter, an expert in LIS, we co-designed and implemented 52 LIS signs on Pepper using either manual animation techniques or a MATLAB-based inverse kinematics solver. We conducted an exploratory user study involving 12 participants proficient in LIS, both Deaf and hearing. Participants completed a questionnaire featuring 15 single-choice video-based sign recognition tasks and 2 open-ended questions on short signed sentences. Results show that the majority of isolated signs were recognized correctly, although full-sentence recognition was significantly lower due to Pepper's limited articulation and temporal constraints. Our findings demonstrate that even commercially available social robots like Pepper can perform a subset of LIS signs intelligibly, offering some opportunities for a more inclusive interaction design. Future developments should address multi-modal enhancements (e.g., screen-based support or expressive avatars) and involve Deaf users in participatory design to refine robot expressivity and usability.


The Impact of Adaptive Emotional Alignment on Mental State Attribution and User Empathy in HRI

Buracchio, Giorgia, Callegari, Ariele, Donini, Massimo, Gena, Cristina, Lieto, Antonio, Lillo, Alberto, Mattutino, Claudio, Mazzei, Alessandro, Pigureddu, Linda, Striani, Manuel, Vernero, Fabiana

arXiv.org Artificial Intelligence

The paper presents an experiment on the effects of adaptive emotional alignment between agents, considered a prerequisite for empathic communication, in Human-Robot Interaction (HRI). Using the NAO robot, we investigate the impact of an emotionally aligned, empathic dialogue on these aspects: (i) the robot's persuasive effectiveness, (ii) the user's communication style, and (iii) the attribution of mental states and empathy to the robot. In an experiment with 42 participants, two conditions were compared: one with neutral communication and another in which the robot provided responses adapted to the emotions expressed by the users. The results show that emotional alignment does not influence users' communication styles or have a persuasive effect. However, it significantly influences the attribution of mental states to the robot and its perceived empathy.


How Age Influences the Interpretation of Emotional Body Language in Humanoid Robots -- long paper version

Consoli, Ilaria, Mattutino, Claudio, Gena, Cristina, de Carolis, Berardina, Palestra, Giuseppe

arXiv.org Artificial Intelligence

There is a general consensus that body movements and postures provide important cues for identifying emotional states, particularly when facial and vocal signals are unavailable [1]. Emotional Body Language (EBL) is rapidly emerging as a significant area of research within cognitive and affective neuroscience. According to De Gelder [10], numerous valuable insights into human emotion and its neurobiological foundations have been derived from the study of facial expressions. Indeed, certain emotions are more effectively conveyed through facial expressions, while others are better communicated through body movements or a combination of both. Gestures provide observable cues that can be instrumental in recognizing and interpreting a user's emotional state, especially in the absence of verbal or facial signals.


On the usability of generative AI: Human generative AI

Ravera, Anna, Gena, Cristina

arXiv.org Artificial Intelligence

Generative AI systems are transforming content creation, but their usability remains a key challenge. This paper examines usability factors such as user experience, transparency, control, and cognitive load. Common challenges include unpredictability and difficulties in fine-tuning outputs. We review evaluation metrics like efficiency, learnability, and satisfaction, highlighting best practices from various domains. Improving interpretability, intuitive interfaces, and user feedback can enhance usability, making generative AI more accessible and effective.