iTeach: Interactive Teaching for Robot Perception using Mixed Reality
Jishnu Jaykumar P, Cole Salvato, Vinaya Bomnale, Jikai Wang, Yu Xiang
arXiv.org Artificial Intelligence
We introduce iTeach, a Mixed Reality (MR) framework that improves robot perception through real-time interactive teaching. By allowing human instructors to label the robot's RGB data on the fly, iTeach increases both the accuracy and the adaptability of robot perception in new scenarios. The framework supports on-the-fly data collection and labeling, enhancing model performance and generalization. Applied to door and handle detection for household tasks, iTeach integrates a HoloLens app with an interactive YOLO model. We also introduce the IRVLUTD DoorHandle dataset. DH-YOLO, our efficient detection model, significantly improves the accuracy and efficiency of door and handle detection, highlighting the potential of MR to make robotic systems more capable and adaptive in real-world environments. The project page is available at https://irvlutd.github.io/iTeach.
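The teach-and-refine loop the abstract describes (a human labels robot RGB frames through the HoloLens app, and the detector is fine-tuned on the growing dataset) can be sketched roughly as follows. This is a minimal, hypothetical illustration: all class and method names are invented for clarity, and the real iTeach/DH-YOLO implementation is not shown here.

```python
from dataclasses import dataclass, field

@dataclass
class LabeledFrame:
    """A single robot RGB frame with human-provided labels.
    Boxes are hypothetical (class, x, y, w, h) tuples from the MR instructor."""
    image_id: str
    boxes: list

@dataclass
class InteractiveTrainer:
    """Sketch of an interactive teaching loop: newly labeled frames are
    collected on the fly, and the detector is periodically fine-tuned."""
    dataset: list = field(default_factory=list)
    updates: int = 0

    def add_labels(self, frame: LabeledFrame) -> None:
        # On-the-fly data collection: append the instructor's labels.
        self.dataset.append(frame)

    def fine_tune(self) -> int:
        # Stand-in for a YOLO-style fine-tuning pass over self.dataset;
        # the real system would update detector weights here.
        self.updates += 1
        return self.updates

# Example session: two frames labeled, one fine-tuning pass.
trainer = InteractiveTrainer()
trainer.add_labels(LabeledFrame("frame_001", [("door", 0.2, 0.3, 0.4, 0.6)]))
trainer.add_labels(LabeledFrame("frame_002", [("handle", 0.5, 0.5, 0.1, 0.1)]))
trainer.fine_tune()
print(len(trainer.dataset), trainer.updates)  # 2 1
```

The point of the loop is that labeling and training interleave, so the model adapts to new doors and handles as the instructor encounters them, rather than requiring an offline dataset pass.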
Oct-1-2024
- Country:
- North America > United States > Texas (0.04)
- Genre:
- Research Report (0.50)
- Workflow (0.70)
- Industry:
- Education > Educational Setting > Online (0.87)
- Technology:
- Information Technology > Artificial Intelligence > Robots (1.00)