Walk along: An Experiment on Controlling the Mobile Robot 'Spot' with Voice and Gestures
Zhang, Renchi, van der Linden, Jesse, Dodou, Dimitra, Seyffert, Harleigh, Eisma, Yke Bauke, de Winter, Joost C. F.
arXiv.org Artificial Intelligence
Abstract

Robots are becoming increasingly intelligent and can autonomously perform tasks such as navigating between locations. However, human oversight remains crucial. This study compared two hands-free methods for directing mobile robots: voice control and gesture control. These methods were tested with the human stationary and walking freely. We hypothesized that walking with the robot would lead to higher intuitiveness ratings and better task performance due to increased stimulus-response compatibility, assuming humans align themselves with the robot. In a 2 × 2 within-subject design, 218 participants guided the quadrupedal robot Spot using 90° rotation and walk-forward commands. After each trial, participants rated the intuitiveness of the command mapping, and post-experiment interviews were used to gather the participants' preferences. Results showed that voice control combined with walking with Spot was the most favored and intuitive, whereas gesture control while standing caused confusion for left/right commands. Despite this, 29% of participants preferred gesture control, citing task engagement and visual congruence as reasons. An odometry-based analysis revealed that, when allowed to walk, participants aligned themselves behind Spot, particularly in the gesture control condition. In conclusion, voice control with walking produced the best outcomes. Improving physical ergonomics and adjusting gesture types could increase the effectiveness of gesture control.

Introduction

Robots have traditionally been viewed as devices designed to efficiently perform repetitive tasks, mainly in industrial settings and logistical operations. However, with the advancement of AI, robots are increasingly taking on new roles. Modern robots can understand and adapt to their surroundings, paving the way for mobile robotics.
The human-machine interface (HMI) plays a vital role in the control of mobile robots, as these robots are not yet capable of fully autonomous operation in open-ended environments (e.g., Endsley, 2017; Ezenkwu & Starkey, 2019; Hatanaka et al., 2023; Pianca & Santucci, 2023).
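The odometry-based alignment analysis mentioned in the abstract can be illustrated with a minimal sketch: given a robot pose (position plus heading) and a participant position in the same frame, one can compute the participant's bearing relative to the robot's heading and classify whether they stand behind it. The function names, coordinate convention, and the ±45° "behind" tolerance below are assumptions for illustration, not the authors' actual analysis pipeline.

```python
import math

def relative_bearing(robot_x, robot_y, robot_heading_rad, person_x, person_y):
    """Bearing of the person relative to the robot's heading, in degrees.

    Assumed convention: 0 deg = directly ahead of the robot,
    +/-180 deg = directly behind. Heading is in radians, world frame.
    """
    dx = person_x - robot_x
    dy = person_y - robot_y
    angle = math.atan2(dy, dx) - robot_heading_rad
    deg = math.degrees(angle)
    # Normalize to the interval [-180, 180)
    return (deg + 180.0) % 360.0 - 180.0

def is_behind(bearing_deg, tolerance_deg=45.0):
    """Classify the person as 'behind' if within +/-tolerance of 180 degrees.

    The tolerance is an illustrative choice, not taken from the paper.
    """
    return abs(abs(bearing_deg) - 180.0) <= tolerance_deg
```

Applied per odometry sample over a trial, the fraction of samples classified as "behind" would give a simple per-condition alignment measure of the kind the abstract describes.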
Jul-17-2024
- Country:
- Asia
- Europe
- Germany
- Italy > Lazio
- Rome (0.04)
- Netherlands > South Holland
- Delft (0.04)
- Norway > Eastern Norway
- Oslo (0.04)
- North America
- Canada > Ontario
- Toronto (0.04)
- United States
- Colorado > Boulder County
- Boulder (0.04)
- Hawaii > Honolulu County
- Honolulu (0.04)
- Massachusetts > Suffolk County
- Boston (0.04)
- Utah > Cache County
- Logan (0.04)
- Genre:
- Questionnaire & Opinion Survey (1.00)
- Research Report
- Experimental Study (1.00)
- New Finding (1.00)
- Industry:
- Government > Military (0.46)
- Health & Medicine
- Consumer Health (0.66)
- Therapeutic Area (0.68)
- Information Technology > Robotics & Automation (0.46)
- Technology:
- Information Technology > Artificial Intelligence
- Robots > Locomotion (1.00)
- Speech > Speech Recognition (0.91)
- Vision > Gesture Recognition (1.00)