MRHaD: Mixed Reality-based Hand-Drawn Map Editing Interface for Mobile Robot Navigation

Takumi Taki, Masato Kobayashi, Eduardo Iglesius, Naoya Chiba, Shizuka Shirai, Yuki Uranishi

arXiv.org Artificial Intelligence 

Abstract -- Mobile robot navigation systems are increasingly relied upon in dynamic and complex environments, yet they often struggle with map inaccuracies and the resulting inefficient path planning. This paper presents MRHaD, a Mixed Reality-based Hand-drawn Map Editing Interface that enables intuitive, real-time map modification through natural hand gestures. By integrating an MR head-mounted display with the robotic navigation system, operators can directly create hand-drawn restricted zones (HRZ), bridging the gap between 2D map representations and the real-world environment. Comparative experiments against conventional 2D editing methods demonstrate that MRHaD significantly improves editing efficiency, map accuracy, and overall usability, contributing to safer and more efficient mobile robot operation. The proposed approach provides a solid technical foundation for advancing human-robot collaboration and for establishing interaction models that support a hybrid future of robotics and human society.

I. INTRODUCTION

Recent advances in autonomous mobile robots have opened new opportunities for human-robot collaboration in various application domains, including logistics, healthcare, and public spaces [1], [2], [3]. These robots typically use pre-constructed environmental maps and dynamically adjust their paths based on real-time environmental sensing with various onboard sensors. Path planning methods are generally divided into two categories: global planning and local planning [4].
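To make the global/local distinction concrete: a global planner computes a route on the stored map before (or during) motion, and map edits such as a hand-drawn restricted zone take effect simply by marking the affected cells as occupied. The sketch below is a minimal grid-based global planner using A* search; it is an illustrative assumption, not the paper's implementation, and the grid encoding (1 = occupied, 0 = free) and function name are hypothetical.

```python
import heapq

def plan_global_path(grid, start, goal):
    """Minimal A* global planner on a 2D occupancy grid.

    grid[r][c] == 1 marks an occupied cell (e.g. a cell covered by a
    restricted zone drawn on the map); 0 is free space. Returns the
    shortest 4-connected path as a list of (row, col) cells, or None.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Priority queue entries: (f = g + h, g, cell, parent)
    open_set = [(h(start), 0, start, None)]
    came_from = {}            # cell -> parent, filled when a cell is settled
    g_cost = {start: 0}       # best known cost-so-far per cell

    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:  # already settled via a cheaper entry
            continue
        came_from[cur] = parent
        if cur == goal:       # reconstruct path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came_from):
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None  # goal unreachable on the edited map
```

A local planner, by contrast, would refine or deviate from such a path at runtime using live sensor data; restricting a region on the global map (as HRZ editing does) keeps the global planner from routing through it in the first place.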