Using Line Feature Based on Reflection Intensity for Camera-LiDAR Extrinsic Calibration

2022 
With the extensive research on and application of unmanned systems such as unmanned aerial vehicles and self-driving cars, people are increasingly aware of the importance of multi-sensor data fusion. In this paper, a novel calibration method and a corresponding experimental setup are proposed to accurately estimate the extrinsic parameters between a LiDAR and a camera. The proposed method uses reflection intensity to extract LiDAR line features on a self-developed calibration board. Intersections in the LiDAR point cloud are computed from the extracted LiDAR line features, and intersections of visual lines are derived from ArUco markers. Multiple intersection correspondences between the LiDAR frame and the camera frame are thus obtained and used to compute the extrinsic parameters. We validated the method through experiments using a binocular camera with known intrinsic parameters. The results show that the extrinsic calibration errors are within 0.7\(^{\circ }\) in rotation and within 1 cm in translation. Compared with a state-of-the-art method, the accuracy and convergence improve by about a factor of three.
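Once intersection correspondences are available in both the LiDAR frame and the camera frame, the remaining step is to solve for the rigid transform between the two. The sketch below is not the authors' solver; it illustrates one common closed-form approach (the Kabsch/Umeyama SVD solution) for estimating a rotation and translation from matched 3D points, with purely illustrative names and synthetic data.

```python
# Hypothetical sketch: estimate the LiDAR-to-camera extrinsic transform from
# matched 3D intersection points via the Kabsch/Umeyama closed-form solution.
# Function names and the synthetic test data are illustrative, not from the paper.
import numpy as np


def estimate_extrinsics(pts_lidar: np.ndarray, pts_camera: np.ndarray):
    """Return R, t such that pts_camera ≈ R @ pts_lidar + t.

    pts_lidar, pts_camera: (N, 3) arrays of corresponding intersection points.
    """
    # Center both point sets on their centroids.
    c_l = pts_lidar.mean(axis=0)
    c_c = pts_camera.mean(axis=0)
    P = pts_lidar - c_l
    Q = pts_camera - c_c

    # Cross-covariance and SVD give the optimal rotation (Kabsch).
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_c - R @ c_l
    return R, t


if __name__ == "__main__":
    # Synthetic check: recover a known rotation/translation from noisy points.
    rng = np.random.default_rng(0)
    R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(R_true) < 0:
        R_true[:, 0] *= -1
    t_true = np.array([0.2, -0.1, 0.5])
    pts_l = rng.uniform(-2.0, 2.0, size=(12, 3))
    pts_c = (R_true @ pts_l.T).T + t_true + rng.normal(scale=1e-3, size=(12, 3))
    R_est, t_est = estimate_extrinsics(pts_l, pts_c)
    angle_err = np.degrees(
        np.arccos(np.clip((np.trace(R_est.T @ R_true) - 1.0) / 2.0, -1.0, 1.0)))
    print("rotation error (deg):", angle_err)
    print("translation error (m):", np.linalg.norm(t_est - t_true))
```

In practice the correspondence set would come from the board intersections described above, and a refinement step (e.g., nonlinear optimization over reprojection error) could follow this closed-form initialization.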