Robot Motion Planning using One-Step Diffusion with Noise-Optimized Approximate Motions

Tomoharu Aizu, Takeru Oba, Yuki Kondo, Norimichi Ukita

arXiv.org Artificial Intelligence 

Abstract -- This paper proposes an image-based robot motion planning method using a one-step diffusion model. While diffusion models allow for high-quality motion generation, their computational cost is too expensive to control a robot in real time. To achieve high quality and efficiency simultaneously, our one-step diffusion model takes an approximately generated motion, which is predicted directly from input images. This approximate motion is optimized by additive noise provided by our novel noise optimizer. Unlike general isotropic noise, our noise optimizer adjusts the noise anisotropically depending on the uncertainty of each motion element. Our experimental results demonstrate that our method outperforms state-of-the-art methods while maintaining efficiency through one-step diffusion.

INTRODUCTION

For robot motion planning, we must compare the current state of a robot with its surrounding environment. Among the various sensors for observing the environment, including the robot itself, optical sensors such as RGB and RGB-Depth cameras are widely used because of their wide availability, wide observation ranges, and so on. We call robot motion planning that uses camera images image-based robot motion planning [4], [10], [17].
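The idea of anisotropic, uncertainty-scaled noise can be illustrated with a minimal NumPy sketch. This is a hedged toy example, not the paper's actual learned noise optimizer: the function name `add_anisotropic_noise` and the explicit per-element `uncertainty` array are hypothetical stand-ins for the quantities the proposed optimizer would produce.

```python
import numpy as np

def add_anisotropic_noise(approx_motion, uncertainty, rng=None):
    """Perturb an approximate motion with element-wise (anisotropic) Gaussian noise.

    approx_motion: (T, D) array of motion elements predicted directly from images.
    uncertainty:   (T, D) array of per-element noise scales (standard deviations);
                   larger values inject more noise, so less reliable motion
                   elements are perturbed more strongly before the one-step
                   diffusion model refines them.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Isotropic noise would use a single scalar scale; here each element
    # gets its own scale, which is what makes the noise anisotropic.
    noise = rng.standard_normal(approx_motion.shape) * uncertainty
    return approx_motion + noise

# Toy usage: 4 timesteps, 3 motion dimensions, with a different
# (hypothetical) uncertainty per dimension.
motion = np.zeros((4, 3))
sigma = np.broadcast_to(np.array([0.01, 0.1, 1.0]), motion.shape)
noisy = add_anisotropic_noise(motion, sigma)
```

With zero uncertainty the motion passes through unchanged, while high-uncertainty elements receive proportionally larger perturbations.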