A2DO: Adaptive Anti-Degradation Odometry with Deep Multi-Sensor Fusion for Autonomous Navigation

Lai, Hui, Chen, Qi, Zhang, Junping, Pu, Jian

arXiv.org Artificial Intelligence 

Central to this promise is the ability to achieve real-time, precise localization, which is crucial for navigation and collision avoidance. Odometry stands out as a pivotal technology that enables vehicles to determine their position and construct a map of the environment in real time, without the need for pre-existing maps [1]. Despite its potential, traditional odometry systems often struggle to maintain localization accuracy under challenging conditions such as low light, inclement weather, or occlusions. These scenarios underscore the pressing need for more robust SLAM solutions that can operate reliably under diverse real-world conditions.

Multi-sensor fusion effectively addresses sensor degradation by combining data from complementary sensors, including cameras, LiDARs, and IMUs: individual sensors may fail under specific conditions, such as LiDAR in rain, cameras in low light, and IMUs suffering from drift. Previous geometry-based methods such as [2], [3] perform well in many scenarios. However, their reliance on rule-based handling [4] of degraded sensor data makes these systems less effective in complex scenarios and requires significant manual calibration and tuning.
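To make the fusion idea concrete, the sketch below shows one simple (hypothetical) way to combine per-sensor odometry increments with degradation-aware weights; it is not the method proposed in this paper, and the sensor names, confidence values, and the `fuse_odometry` helper are illustrative assumptions. A sensor that detects its own degradation (e.g., a camera in low light) reports a low confidence and is down-weighted in the fused estimate.

```python
import numpy as np

def fuse_odometry(deltas, confidences):
    """Confidence-weighted fusion of per-sensor translation increments.

    deltas: dict mapping sensor name -> 3-vector translation increment.
    confidences: dict mapping sensor name -> scalar in [0, 1]; a degraded
    sensor (e.g., a camera in low light) reports a low confidence.
    """
    sensors = list(deltas)
    w = np.array([confidences[s] for s in sensors], dtype=float)
    if w.sum() == 0.0:
        raise ValueError("all sensors degraded; no valid estimate")
    w /= w.sum()                              # normalize weights
    stacked = np.stack([deltas[s] for s in sensors])
    return w @ stacked                        # weighted average increment

# Illustrative values: camera degraded by low light, LiDAR and IMU healthy.
deltas = {
    "camera": np.array([0.9, 0.0, 0.0]),  # biased estimate in low light
    "lidar":  np.array([1.0, 0.0, 0.0]),
    "imu":    np.array([1.1, 0.0, 0.0]),
}
conf = {"camera": 0.1, "lidar": 0.8, "imu": 0.6}
fused = fuse_odometry(deltas, conf)  # dominated by LiDAR and IMU
```

A rule-based system would hard-code these weights per condition; the paper's motivation is that learning such weighting from data avoids the manual calibration and tuning this entails.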