HRT1: One-Shot Human-to-Robot Trajectory Transfer for Mobile Manipulation
Sai Haneesh Allu, Jishnu Jaykumar P, Ninad Khargonkar, Tyler Summers, Jian Yao, Yu Xiang
arXiv.org Artificial Intelligence
Figure: Illustrations of several tasks that our system enables a mobile robot to perform.

Abstract: We introduce a novel system for human-to-robot trajectory transfer that enables robots to manipulate objects by learning from human demonstration videos. The system consists of four modules. The first is a data collection module that captures human demonstration videos from the robot's point of view using an AR headset. The second is a video understanding module that detects objects and extracts 3D human-hand trajectories from the demonstration videos. The third module transfers a human-hand trajectory into a reference trajectory for the robot end-effector in 3D space. The last module uses a trajectory optimization algorithm to solve for a trajectory in the robot configuration space that follows the end-effector trajectory transferred from the human demonstration. Together, these modules enable a robot to watch a human demonstration video once and then repeat the same mobile manipulation task in different environments, even when objects are placed differently than in the demonstration.

Building autonomous robots that can help people perform various tasks is the dream of every roboticist. To achieve this goal, we need to enable robots to manipulate objects. Traditionally, roboticists have built manipulation systems by integrating perception, planning, and control.
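The fourth module's idea, solving for a configuration-space trajectory whose forward kinematics tracks a reference end-effector trajectory, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it assumes a hypothetical 2-link planar arm and uses simple Jacobian-transpose descent per waypoint, warm-starting each solve from the previous configuration for smoothness.

```python
import math

# Assumed link lengths for a hypothetical 2-link planar arm (illustration only).
L1, L2 = 1.0, 1.0

def fk(q):
    """Forward kinematics: joint angles -> end-effector (x, y)."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def jacobian(q):
    """Analytic Jacobian of fk with respect to the two joint angles."""
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def solve_waypoint(q, target, iters=300, step=0.1):
    """Descend 0.5*||target - fk(q)||^2 via the Jacobian transpose."""
    for _ in range(iters):
        x, y = fk(q)
        ex, ey = target[0] - x, target[1] - y
        J = jacobian(q)
        # Update dq = step * J^T e (negative gradient direction).
        q = [q[0] + step * (J[0][0] * ex + J[1][0] * ey),
             q[1] + step * (J[0][1] * ex + J[1][1] * ey)]
    return q

def track(reference, q0):
    """Solve one configuration per reference waypoint, warm-started
    from the previous solution so consecutive configurations stay close."""
    qs, q = [], list(q0)
    for p in reference:
        q = solve_waypoint(q, p)
        qs.append(list(q))
    return qs
```

A real system would optimize over the whole trajectory jointly, with collision and joint-limit constraints and a mobile base, but the warm-started per-waypoint structure conveys the core coupling between the reference end-effector path and the configuration-space solution.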
Oct-27-2025