Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing
Ying Yuan, Haichuan Che, Yuzhe Qin, Binghao Huang, Zhao-Heng Yin, Kang-Won Lee, Yi Wu, Soo-Chul Lim, Xiaolong Wang
–arXiv.org Artificial Intelligence
Executing contact-rich manipulation tasks necessitates the fusion of tactile and visual feedback. However, the distinct nature of these modalities poses significant challenges. In this paper, we introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation. Specifically, we propose Robot Synesthesia, a novel point cloud-based tactile representation inspired by human tactile-visual synesthesia. This approach allows both sensory inputs to be integrated simultaneously and seamlessly, offering richer spatial information and facilitating better reasoning about robot actions. The method, trained in simulation and then deployed to a real robot, applies to a range of in-hand object rotation tasks. We perform comprehensive ablations on how integrating vision and touch improves reinforcement learning and Sim2Real performance. Our project page is available at https://yingyuan0414.github.io/visuotactile/.
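The core idea of the representation is to render tactile contacts as points in the same 3D space as the camera point cloud, so a single point-cloud encoder can consume both modalities at once. Below is a minimal sketch of that kind of fusion, assuming each tactile sensor's 3D pose on the hand is known from forward kinematics; the function names, the modality flag, and the contact threshold are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of point-cloud "synesthesia" fusion. All names
# (sensor_positions, contact_threshold, the modality flag) are
# illustrative assumptions, not taken from the paper.
import numpy as np

def tactile_to_points(sensor_positions: np.ndarray,
                      readings: np.ndarray,
                      contact_threshold: float = 0.1) -> np.ndarray:
    """Render activated tactile sensors as 3D points in the world frame.

    sensor_positions: (N, 3) sensor locations from forward kinematics.
    readings:         (N,) normalized contact forces in [0, 1].
    Returns an (M, 3) cloud containing only the sensors in contact.
    """
    in_contact = readings > contact_threshold
    return sensor_positions[in_contact]

def fuse_visuotactile(visual_cloud: np.ndarray,
                      tactile_cloud: np.ndarray) -> np.ndarray:
    """Concatenate both modalities into one cloud, appending a binary
    modality flag so a point-cloud encoder (e.g. a PointNet-style
    network) can distinguish seen points from touched points."""
    vis = np.hstack([visual_cloud, np.zeros((len(visual_cloud), 1))])
    tac = np.hstack([tactile_cloud, np.ones((len(tactile_cloud), 1))])
    return np.vstack([vis, tac])  # (V + T, 4): xyz + modality flag

# Example: a 512-point visual cloud fused with 16 tactile sensors.
rng = np.random.default_rng(0)
visual = rng.normal(size=(512, 3))
sensors = rng.normal(size=(16, 3))
forces = rng.uniform(size=16)
fused = fuse_visuotactile(visual, tactile_to_points(sensors, forces))
print(fused.shape)  # (512 + M, 4), where M is the number of contacts
```

Keeping both modalities in one coordinate frame, rather than concatenating feature vectors, is what lets the policy reason about where touch occurs relative to what it sees.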
Dec-10-2023
- Country:
  - North America > United States
    - California > Alameda County > Berkeley (0.14)
    - Illinois > Champaign County (0.14)
- Genre:
  - Research Report (1.00)
- Industry:
  - Information Technology > Services (0.48)
- Technology:
  - Information Technology > Artificial Intelligence > Robots > Manipulation (1.00)