Radar and Event Camera Fusion for Agile Robot Ego-Motion Estimation

Yang Lyu, Zhenghao Zou, Yanfeng Li, Xiaohu Guo, Chunhui Zhao, Quan Pan

arXiv.org Artificial Intelligence 

Abstract--Achieving reliable ego-motion estimation for agile robots, e.g., aerobatic aircraft, remains challenging because most robot sensors fail to respond to highly dynamic motions in a timely and clean manner, often resulting in measurement blurring, distortion, and delays. In this paper, we propose an IMU-free and feature-association-free framework that achieves aggressive ego-motion velocity estimation of a robot platform in highly dynamic scenarios by combining two types of exteroceptive sensors: an event camera and a millimeter-wave radar. First, we use instantaneous raw events and Doppler measurements to derive rotational and translational velocities directly. Without a sophisticated association process between measurement frames, the proposed method is more robust in texture-less and structure-less environments and more computationally efficient on edge computing devices. Then, in the back end, we propose a continuous-time state-space model that fuses the hybrid time-based and event-based measurements to estimate the ego-motion velocity in a fixed-lag smoother fashion. Finally, we validate our velometer framework extensively on self-collected experimental datasets featuring aggressive motion and HDR lighting conditions. The results indicate that our IMU-free and association-free ego-motion estimation framework can achieve reliable and efficient velocity output in challenging environments.

Reliable ego-motion estimation is fundamental to autonomous robotic platforms. Early solutions rely on GNSS/INS, while more recent SLAM-based methods integrate diverse sensors such as cameras, LiDARs, and radars, making them more adaptable and widely applicable.
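
To make the association-free idea concrete, below is a minimal sketch (not the authors' implementation) of the standard instantaneous radar ego-velocity estimate the abstract alludes to: each radar return at unit bearing d_i reports a Doppler (radial) speed r_i = -d_i . v under a static-world assumption, so the sensor's translational velocity v follows from a linear least-squares fit over a single scan. All names here are illustrative.

import numpy as np

def radar_ego_velocity(directions: np.ndarray, doppler: np.ndarray) -> np.ndarray:
    """Estimate sensor velocity v (3,) from one radar scan.

    directions : (N, 3) unit vectors from sensor to each detection.
    doppler    : (N,) measured radial speeds (positive = target receding).
    """
    # Static-world assumption: each radial speed is the projection of the
    # negated ego-velocity onto that detection's bearing, r = -d . v.
    A = -directions                      # (N, 3) design matrix
    v, *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return v

# Tiny usage example with synthetic data.
rng = np.random.default_rng(0)
v_true = np.array([1.0, -0.5, 0.2])
d = rng.normal(size=(50, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
r = -d @ v_true + 0.01 * rng.normal(size=50)
print(radar_ego_velocity(d, r))          # ~ [1.0, -0.5, 0.2]

Because the solve uses one instantaneous scan, no detection needs to be matched across frames, which is what makes this front end robust in structure-less scenes; in practice a robust variant (e.g., RANSAC) would be used to reject moving targets.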
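
The rotational component can likewise be recovered without frame-to-frame association. One common model (again a hedged sketch, not necessarily the paper's exact formulation): under pure rotation, the image-plane flow at normalized coordinates (x, y) is linear in the angular velocity w, so event-derived flow vectors yield w via least squares. Sign conventions vary with the camera model; the form below assumes a z-forward camera frame.

import numpy as np

def angular_velocity_from_flow(pts: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """pts: (N, 2) normalized image coords; flow: (N, 2) flow (u, v). Returns w (3,).

    Pure-rotation flow model (linear in w = [wx, wy, wz]):
        u =  x*y*wx - (1 + x**2)*wy + y*wz
        v = (1 + y**2)*wx -  x*y*wy - x*wz
    """
    x, y = pts[:, 0], pts[:, 1]
    # Stack the 2N x 3 interaction matrix: even rows model u, odd rows model v.
    A = np.zeros((2 * pts.shape[0], 3))
    A[0::2] = np.column_stack([x * y, -(1 + x**2), y])
    A[1::2] = np.column_stack([1 + y**2, -x * y, -x])
    b = flow.reshape(-1)                  # interleaved [u0, v0, u1, v1, ...]
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

Both front-end estimates are instantaneous, so a back-end smoother (here, the continuous-time fixed-lag formulation) is what reconciles the asynchronous event-based and scan-based measurement times.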
