CPI: LiDAR-Camera Extrinsic Calibration Based on Feature Points with Reflection Intensity

2021 
Autonomous navigation and unmanned systems are attracting increasing attention from the research community, especially for problems of multi-sensor data fusion for navigation and positioning in complex scenarios. In this paper, a novel calibration method and experimental setup are proposed to calibrate the extrinsic parameters between a LiDAR and a camera. The method extracts LiDAR feature points directly, taking reflection intensity into account, together with visual feature points derived from ArUco markers on a calibration board. Multiple corresponding calculation points between the camera and LiDAR frames are thereby retrieved, yielding an accurate extrinsic calibration. To demonstrate the quality of the proposed method, the setup, process, and experimental results are described and discussed in this paper. In our experiments we achieved a LiDAR calculation-point extraction error of less than 0.25 cm, with an indirect binocular-camera extrinsic calibration error of about 0.4 cm in translation and 0.7° in rotation. Compared with the other state-of-the-art method, accuracy improved by more than a factor of 3 and convergence by about a factor of 5. In addition, after transforming the LiDAR frame into the camera frame with our calibration results, the image and the LiDAR point cloud match and coincide precisely.
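The core step the abstract describes, recovering the rigid transform (rotation and translation) between the LiDAR and camera frames from matched 3D calculation points, can be sketched with the standard Kabsch/Umeyama least-squares alignment. This is a minimal illustration of that general technique, not the paper's exact pipeline; the function name and the use of NumPy are assumptions for the sketch.

```python
import numpy as np

def estimate_extrinsics(lidar_pts, cam_pts):
    """Estimate the rigid transform (R, t) mapping LiDAR-frame points
    onto camera-frame points from N matched 3D correspondences.
    lidar_pts, cam_pts: (N, 3) arrays of corresponding points.
    Illustrative Kabsch/Umeyama solver, not the paper's implementation."""
    # Center both point sets on their centroids.
    mu_l = lidar_pts.mean(axis=0)
    mu_c = cam_pts.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (lidar_pts - mu_l).T @ (cam_pts - mu_c)
    # SVD gives the optimal rotation in the least-squares sense.
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Translation aligns the rotated LiDAR centroid with the camera centroid.
    t = mu_c - R @ mu_l
    return R, t
```

With the estimated `R` and `t`, each LiDAR point `p` maps into the camera frame as `R @ p + t`, which is the transformation used to overlay the point cloud on the image.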