New SLAM Fusion Algorithm Based on Lidar/IMU Sensors

2021 
As an important research field of mobile robotics, SLAM (Simultaneous Localization and Mapping) is the core technology for realizing intelligent autonomous mobile robots. To address the low positioning accuracy of Lidar (Light Detection and Ranging) SLAM under non-linear, non-Gaussian noise, this paper presents a mobile robot SLAM method that combines Lidar and an IMU (Inertial Measurement Unit) into a multi-sensor integrated system and uses Rank Kalman Filtering (RKF) to estimate the robot's motion trajectory from the IMU and Lidar observations. RKF is structurally similar to Gaussian deterministic point-sampling filtering algorithms, but it does not require the Gaussian-distribution assumption: it computes the sample points and their weights entirely from the correlation principle of rank statistics, which makes it suitable for non-linear, non-Gaussian systems. Multiple experiments in real scenes verify that the proposed algorithm is more accurate than Lidar-only SLAM: it reduces the mean error of the indoor mobile robot in the X direction from 0.0928 m to 0.0451 m (a 46.39% improvement in accuracy) and the mean error in the Y direction from 0.0772 m to 0.0405 m (a 48.40% improvement).
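The abstract notes that RKF shares its structure with Gaussian deterministic point-sampling filters (such as the unscented Kalman filter), differing in how the sample points and weights are derived. As a hedged illustration of that shared structure only, the sketch below generates deterministic sample points and propagates them through a non-linear function. It uses the standard unscented construction as a stand-in; the paper's rank-statistics-based point and weight computation is not reproduced here, and the function names (`sigma_points`, `unscented_transform`) are illustrative, not from the paper.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1.0, kappa=0.0):
    """Deterministic sample points around the state estimate.

    This is the standard unscented construction (2n+1 points). The paper's
    RKF instead derives its points and weights from rank statistics; that
    derivation is not shown here.
    """
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Matrix square root of the scaled covariance (columns give offsets).
    L = np.linalg.cholesky((n + lam) * cov)
    pts = np.vstack([mean, mean + L.T, mean - L.T])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w[0] = lam / (n + lam)
    return pts, w

def unscented_transform(pts, w, f):
    """Propagate each sample point through f, then re-estimate mean/covariance."""
    fp = np.array([f(p) for p in pts])
    m = w @ fp
    d = fp - m
    P = (w[:, None] * d).T @ d
    return m, P

# Example: push a 2-D state (position, velocity) through a simple motion map.
mean = np.array([1.0, 2.0])
cov = np.array([[2.0, 0.5], [0.5, 1.0]])
pts, w = sigma_points(mean, cov)
m, P = unscented_transform(pts, w, lambda x: np.array([x[0] + 0.1 * x[1], x[1]]))
```

Swapping the point/weight construction (here unscented, in the paper rank-statistics-based) while keeping the propagate-and-reestimate loop is exactly the structural similarity the abstract describes.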