Learning to Match 2D Images and 3D LiDAR Point Clouds for Outdoor Augmented Reality

2020 
Large-scale Light Detection and Ranging (LiDAR) point clouds provide basic 3D information support for Augmented Reality (AR) in outdoor environments. In particular, matching 2D images to 3D LiDAR point clouds establishes the spatial relationship between 2D and 3D space, which provides a solution for the virtual-real registration required by AR. This paper first provides a precise 2D-3D patch-volume dataset, containing paired matching 2D image patches and 3D LiDAR point cloud volumes, built from Mobile Laser Scanning (MLS) data of an urban scene. Second, we propose an end-to-end network, Siam2D3D-Net, to jointly learn local feature representations for 2D image patches and 3D LiDAR point cloud volumes. Experimental results indicate that the proposed Siam2D3D-Net can match and establish 2D-3D correspondences from a query 2D image to the 3D LiDAR point cloud reference map. Finally, an application is used to evaluate the feasibility of the proposed virtual-real registration for AR in outdoor environments.
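Once a Siamese network such as Siam2D3D-Net embeds 2D image patches and 3D point cloud volumes into a shared descriptor space, correspondences can be established by nearest-neighbour search over those descriptors. The sketch below is illustrative only and is not the paper's implementation: the function name `match_2d3d`, the cosine-similarity metric, and the toy descriptor arrays are all assumptions for demonstration.

```python
import numpy as np

def match_2d3d(img_desc, pc_desc):
    """Nearest-neighbour matching in a shared descriptor space (illustrative).

    img_desc: (N, D) descriptors of 2D image patches
    pc_desc:  (M, D) descriptors of 3D point cloud volumes
    Returns an (N,) array: index of the best-matching 3D volume per 2D patch.
    """
    # L2-normalise each descriptor so the dot product equals cosine similarity.
    a = img_desc / np.linalg.norm(img_desc, axis=1, keepdims=True)
    b = pc_desc / np.linalg.norm(pc_desc, axis=1, keepdims=True)
    return np.argmax(a @ b.T, axis=1)

# Toy example with hypothetical 2-D descriptors (real ones are learned).
img = np.array([[1.0, 0.0], [0.0, 1.0]])
pc = np.array([[0.0, 2.0], [3.0, 0.1]])
matches = match_2d3d(img, pc)  # patch 0 -> volume 1, patch 1 -> volume 0
```

The resulting 2D-3D correspondences would then feed a pose-estimation step (e.g. PnP with RANSAC) to register virtual content against the real scene.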