Perception of Emotions in Human and Robot Faces: Is the Eye Region Enough?
Mishra, Chinmaya, Skantze, Gabriel, Hagoort, Peter, Verdonschot, Rinus
– arXiv.org Artificial Intelligence
The increased interest in developing next-generation social robots has raised questions about the factors affecting the perception of robot emotions. This study investigates the impact of robot appearance (human-like, mechanical) and visible face region (full face, eye region) on human perception of robot emotions. A between-subjects user study (N = 305) was conducted in which participants were asked to identify the emotions displayed in videos of robot faces, as well as in a human baseline. Our findings reveal three insights for effective social robot face design in Human-Robot Interaction (HRI). First, robots equipped with a back-projected, fully animated face, regardless of whether they look more human-like or more mechanical, demonstrate a capacity for emotional expression comparable to that of humans. Second, the recognition accuracy of emotional expressions declines for both humans and robots when only the eye region is visible. Finally, when only the eye region is visible, robots with more human-like features significantly enhance emotion recognition.
Oct-18-2024
- Country:
- Europe > Netherlands (0.29)
- North America > United States
- California (0.28)
- Genre:
- Questionnaire & Opinion Survey (1.00)
- Research Report
- Experimental Study > Negative Result (0.46)
- New Finding (1.00)
- Industry:
- Health & Medicine (1.00)
- Leisure & Entertainment (0.68)
- Media > Film (0.93)
- Technology:
- Information Technology > Artificial Intelligence
- Cognitive Science > Emotion (1.00)
- Issues > Social & Ethical Issues (1.00)
- Robots (1.00)
- Vision > Face Recognition (1.00)