Robust sensor fusion against on-vehicle sensor staleness

Fan, Meng, Zuo, Yifan, Blaes, Patrick, Montgomery, Harley, Das, Subhasis

arXiv.org Artificial Intelligence 

Sensor fusion is crucial for a performant and robust Perception system in autonomous vehicles, but sensor staleness--where data from different sensors arrives with varying delays--poses significant challenges. Temporal misalignment between sensor modalities leads to inconsistent object state estimates, severely degrading the quality of trajectory predictions that are critical for safety. We present a novel and model-agnostic approach to address this problem via (1) a per-point timestamp offset feature (for LiDAR and radar, both relative to camera) that enables fine-grained temporal awareness in sensor fusion, and (2) a data augmentation strategy that simulates realistic sensor staleness patterns observed in deployed vehicles. Our method is integrated into a perspective-view detection model that consumes sensor data from multiple LiDARs, radars, and cameras. We demonstrate that while a conventional model shows significant regressions when one sensor modality is stale, our approach reaches consistently good performance across both synchronized and stale conditions.
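The two ideas in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the function names, the `max_delay` parameter, and the uniform-delay model are all assumptions made for the sketch. It shows (1) appending a per-point timestamp offset (relative to the camera frame) as an extra feature channel, and (2) an augmentation that shifts one modality's timestamps backward to mimic staleness.

```python
import numpy as np

def add_timestamp_offset_feature(points, point_times, camera_time):
    """Append a per-point timestamp-offset feature (seconds relative to the
    camera frame) to a LiDAR/radar point array.

    points:      (N, D) array of point features.
    point_times: (N,) per-point capture timestamps in seconds.
    camera_time: scalar reference timestamp of the camera frame.
    Returns an (N, D+1) array; a negative offset means the point is stale.
    """
    offsets = point_times - camera_time
    return np.concatenate([points, offsets[:, None]], axis=1)

def simulate_staleness(point_times, max_delay=0.1, rng=None):
    """Augmentation sketch: delay one modality's timestamps by a random
    amount to emulate stale sensor data. `max_delay` (seconds) and the
    uniform distribution are hypothetical choices for illustration."""
    if rng is None:
        rng = np.random.default_rng()
    delay = rng.uniform(0.0, max_delay)
    return point_times - delay
```

Training with such augmented timestamps, together with the explicit offset feature, is what would let a model learn to discount or re-align stale points rather than treating all modalities as synchronized.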
