Task Matters: Investigating Human Questioning Behavior in Different Household Service Tasks for Learning by Asking Robots

Yuanda Hu, Jiani Hou, Junyu Zhang, Yate Ge, Xiaohua Sun, Weiwei Guo

arXiv.org Artificial Intelligence 

Learning by Asking (LBA) enables robots to identify knowledge gaps during task execution and acquire the missing information by asking targeted questions. However, different tasks often require different types of questions, and how to adapt questioning strategies accordingly remains under-explored. This paper investigates human questioning behavior in two representative household service tasks: a Goal-Oriented task (refrigerator organization) and a Process-Oriented task (cocktail mixing). Through a human-human study involving 28 participants, we analyze the questions asked using a structured framework that encodes each question along three dimensions: acquired knowledge, cognitive process, and question form. Our results reveal that participants adapt both question types and their temporal ordering based on task structure. Goal-Oriented tasks elicited early inquiries about user preferences, while Process-Oriented tasks led to ongoing, parallel questioning of procedural steps and preferences. These findings offer actionable insights for developing task-sensitive questioning strategies in LBA-enabled robots for more effective and personalized human-robot collaboration.

Active learning has become an increasingly influential paradigm in robotics, enabling robots to iteratively query human users (oracles) for labels on informative samples during human-robot interaction. This process reduces uncertainty by enabling the robot to selectively acquire information about ambiguous or unfamiliar situations through human input [1].
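The query-selection step at the core of this active-learning loop can be sketched as uncertainty sampling: the robot asks the human about the item it is least certain about. This is a minimal illustrative sketch, not the paper's method; the item names, probability values, and function names are all hypothetical.

```python
# Illustrative sketch of uncertainty-driven query selection, the basic
# mechanism behind active learning / Learning-by-Asking. All names and
# numbers below are made up for illustration.
import math

def entropy(probs):
    """Shannon entropy of a class-probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_query(candidates):
    """Return the candidate the robot is most uncertain about,
    i.e. the most informative one to ask the human (oracle) about."""
    return max(candidates, key=lambda c: entropy(c["probs"]))

# Hypothetical example: three items the robot might ask about while
# organizing a refrigerator; probs is its belief over possible placements.
candidates = [
    {"item": "milk",   "probs": [0.90, 0.05, 0.05]},  # fairly certain
    {"item": "cheese", "probs": [0.40, 0.35, 0.25]},  # most uncertain
    {"item": "jam",    "probs": [0.70, 0.20, 0.10]},
]
query = select_query(candidates)
print(query["item"])  # -> cheese: the most ambiguous item, so ask about it
```

In a full LBA system the chosen query would then be phrased as a question (its type and timing adapted to the task, as the study above suggests), and the user's answer would update the robot's belief before the next selection round.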