Robust sensor fusion against on-vehicle sensor staleness
Meng Fan, Yifan Zuo, Patrick Blaes, Harley Montgomery, Subhasis Das
arXiv.org Artificial Intelligence
Sensor fusion is crucial for a performant and robust perception system in autonomous vehicles, but sensor staleness, where data from different sensors arrives with varying delays, poses significant challenges. Temporal misalignment between sensor modalities leads to inconsistent object state estimates, severely degrading the quality of trajectory predictions that are critical for safety. We present a novel, model-agnostic approach that addresses this problem via (1) a per-point timestamp offset feature (for both LiDAR and radar, relative to camera) that enables fine-grained temporal awareness in sensor fusion, and (2) a data augmentation strategy that simulates realistic sensor staleness patterns observed in deployed vehicles. Our method is integrated into a perspective-view detection model that consumes sensor data from multiple LiDARs, radars, and cameras. We demonstrate that while a conventional model regresses significantly when one sensor modality is stale, our approach maintains consistently good performance across both synchronized and stale conditions.
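The two ideas in the abstract can be illustrated with a minimal sketch. The function names, feature layout, and delay distribution below are assumptions for illustration, not the paper's actual implementation: we append a per-point timestamp offset (sensor time minus the camera reference time) as an extra feature channel, and simulate staleness by shifting a whole sweep's timestamps back by a random delay.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_timestamp_offset_feature(points, point_ts, camera_ts):
    """Append a per-point timestamp offset (sensor time minus the camera
    reference time, in seconds) as an extra feature channel.
    `points` is an (N, F) array; returns an (N, F + 1) array."""
    offset = (point_ts - camera_ts).astype(np.float32)
    return np.concatenate([points, offset[:, None]], axis=1)

def simulate_staleness(point_ts, max_delay_s=0.5):
    """Staleness augmentation (hypothetical): delay an entire sweep by a
    random amount, mimicking a stale sensor feed at training time."""
    delay = rng.uniform(0.0, max_delay_s)
    return point_ts - delay

# Toy example: 4 LiDAR points with x, y, z features.
points = np.zeros((4, 3), dtype=np.float32)
point_ts = np.array([10.00, 10.01, 10.02, 10.03])  # per-point capture times
camera_ts = 10.02                                  # camera reference time

stale_ts = simulate_staleness(point_ts)
fused = add_timestamp_offset_feature(points, stale_ts, camera_ts)
# fused[:, -1] now carries the (stale) per-point offsets the model can
# learn from, so it sees the same offset distribution at inference time.
```

Because the offset channel is just an extra per-point feature, this scheme is model-agnostic in the sense the abstract describes: any point-consuming detector can ingest it without architectural changes.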
Jun-9-2025