General, Single-shot, Target-less, and Automatic LiDAR-Camera Extrinsic Calibration Toolbox
Koide, Kenji, Oishi, Shuji, Yokozuka, Masashi, Banno, Atsuhiko
–arXiv.org Artificial Intelligence
Abstract -- This paper presents an open-source LiDAR-camera calibration toolbox that is general to LiDAR and camera projection models, requires only one pairing of LiDAR and camera data without a calibration target, and is fully automatic. It employs the SuperGlue image matching pipeline to find 2D-3D correspondences between LiDAR and camera data and estimates the LiDAR-camera transformation via RANSAC. The experimental results show that the proposed toolbox enables calibration of any combination of spinning and non-repetitive scan LiDARs and pinhole and omnidirectional cameras, and achieves better calibration accuracy and robustness than the state-of-the-art edge-alignment-based calibration method.

[Figure 1: A complete LiDAR-camera calibration framework that handles various LiDAR and camera models and calibrates the transformation between them.]

LiDAR-camera extrinsic calibration is necessary for LiDAR-camera sensor fusion and is required for many applications, including autonomous vehicle localization, environmental mapping, and surrounding-object recognition. The pixel-level direct alignment algorithm enables high-quality LiDAR-camera data fusion.
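The abstract describes estimating the LiDAR-camera transformation from 2D-3D correspondences via RANSAC. Below is a minimal illustrative sketch of that idea, not the toolbox's actual implementation: a DLT-based projection-matrix solver wrapped in a RANSAC loop. All function names (`dlt_projection`, `ransac_pnp`) and parameter choices are hypothetical, chosen only to demonstrate the technique.

```python
import numpy as np

def dlt_projection(pts3d, pts2d):
    """Estimate a 3x4 projection matrix from >= 6 2D-3D pairs via the
    Direct Linear Transform (last right-singular vector of the system)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

def reproj_error(P, pts3d, pts2d):
    """Per-point pixel distance between projected 3D points and observations."""
    Xh = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    proj = (P @ Xh.T).T
    proj = proj[:, :2] / proj[:, 2:3]
    return np.linalg.norm(proj - pts2d, axis=1)

def ransac_pnp(pts3d, pts2d, iters=500, thresh=2.0, seed=0):
    """RANSAC over minimal 6-point DLT samples; refit on the best inlier set."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(pts3d), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(pts3d), 6, replace=False)
        P = dlt_projection(pts3d[idx], pts2d[idx])
        inliers = reproj_error(P, pts3d, pts2d) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    P = dlt_projection(pts3d[best_inliers], pts2d[best_inliers])
    return P, best_inliers
```

The RANSAC loop makes the estimate robust to the mismatches that cross-modal (image-to-point-cloud) feature matching inevitably produces: only the largest consensus set of correspondences contributes to the final fit.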
Feb-10-2023