Interaction
Using the Pepper Robot to Support Sign Language Communication
Botta, Giulia, Botta, Marco, Gena, Cristina, Mazzei, Alessandro, Donini, Massimo, Lillo, Alberto
Social robots are increasingly being tested in public and assistive settings, but their accessibility for Deaf users remains underexplored. Italian Sign Language (LIS) is a fully-fledged natural language that relies on complex manual and non-manual components. Enabling robots to communicate using LIS could foster more inclusive human-robot interaction, especially in social environments such as hospitals, airports, or educational settings. This study investigates whether a commercial social robot, Pepper, can produce intelligible LIS signs and short signed LIS sentences. With the help of a Deaf student and his interpreter, an expert in LIS, we co-designed and implemented 52 LIS signs on Pepper using either manual animation techniques or a MATLAB-based inverse kinematics solver. We conducted an exploratory user study involving 12 participants proficient in LIS, both Deaf and hearing. Participants completed a questionnaire featuring 15 single-choice video-based sign recognition tasks and 2 open-ended questions on short signed sentences. Results show that the majority of isolated signs were recognized correctly, although full-sentence recognition was significantly lower due to Pepper's limited articulation and temporal constraints. Our findings demonstrate that even commercially available social robots like Pepper can perform a subset of LIS signs intelligibly, offering opportunities for more inclusive interaction design. Future developments should address multi-modal enhancements (e.g., screen-based support or expressive avatars) and involve Deaf users in participatory design to refine robot expressivity and usability.
The Impact of Adaptive Emotional Alignment on Mental State Attribution and User Empathy in HRI
Buracchio, Giorgia, Callegari, Ariele, Donini, Massimo, Gena, Cristina, Lieto, Antonio, Lillo, Alberto, Mattutino, Claudio, Mazzei, Alessandro, Pigureddu, Linda, Striani, Manuel, Vernero, Fabiana
The paper presents an experiment on the effects of adaptive emotional alignment between agents, considered a prerequisite for empathic communication, in Human-Robot Interaction (HRI). Using the NAO robot, we investigate the impact of an emotionally aligned, empathic dialogue on these aspects: (i) the robot's persuasive effectiveness, (ii) the user's communication style, and (iii) the attribution of mental states and empathy to the robot. In an experiment with 42 participants, two conditions were compared: one with neutral communication and another in which the robot provided responses adapted to the emotions expressed by the users. The results show that emotional alignment does not influence users' communication styles or have a persuasive effect. However, it significantly influences the attribution of mental states to the robot and its perceived empathy.
How Age Influences the Interpretation of Emotional Body Language in Humanoid Robots -- long paper version
Consoli, Ilaria, Mattutino, Claudio, Gena, Cristina, de Carolis, Berardina, Palestra, Giuseppe
There is a general consensus that body movements and postures provide important cues for identifying emotional states, particularly when facial and vocal signals are unavailable [1]. Emotional Body Language (EBL) is rapidly emerging as a significant area of research within cognitive and affective neuroscience. According to De Gelder [10], numerous valuable insights into human emotion and its neurobiological foundations have been derived from the study of facial expressions. Indeed, certain emotions are more effectively conveyed through facial expressions, while others are better communicated through body movements or a combination of both. Gestures provide observable cues that can be instrumental in recognizing and interpreting a user's emotional state, especially in the absence of verbal or facial signals.
On the usability of generative AI: Human generative AI
Generative AI systems are transforming content creation, but their usability remains a key challenge. This paper examines usability factors such as user experience, transparency, control, and cognitive load. Common challenges include unpredictability and difficulties in fine-tuning outputs. We review evaluation metrics like efficiency, learnability, and satisfaction, highlighting best practices from various domains. Improving interpretability, intuitive interfaces, and user feedback can enhance usability, making generative AI more accessible and effective.