SLAM: Depth image information for mapping and inertial navigation system for localization

2016 
Environment mapping is important for helping mobile robots accomplish tasks independently. In recent years, Simultaneous Localization and Mapping (SLAM), including visual SLAM (V-SLAM) and laser-based SLAM, has attracted increasing attention. However, high computational complexity and cost have hindered widespread application. This paper proposes a new method based on an inertial navigation system and depth image information to reduce computation time and total cost. First, this work presents the design of a strapdown inertial navigation system for a mobile robot, which acquires data from gyroscopes, accelerometers, and an electronic compass. Because accelerometers and gyroscopes suffer from random drift error, a Kalman filter data fusion algorithm is adopted to fuse the accelerometer, gyroscope, and electronic compass data, from which attitude angles and position information are calculated. Second, a low-cost non-laser sensor is used to capture depth, and the depth information is transformed into 3D point coordinates to build the feature map. Third, an overall design for simultaneous localization and mapping is described, implemented with relatively simple arithmetic based on the extended Kalman filter SLAM method (EKF-SLAM). Finally, the proposed method is evaluated and verified by experimental results.
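As an illustration of the attitude-fusion step described above, the following is a minimal sketch of a two-state (angle and gyro bias) Kalman filter in which the gyroscope drives the prediction and an accelerometer- or compass-derived angle provides the correction. The function name, noise parameters, and the single-axis simplification are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def kalman_attitude(gyro_rates, angle_meas, dt=0.01,
                    q_angle=1e-4, q_bias=1e-6, r_meas=1e-2):
    """Estimate one attitude angle and the gyro bias from noisy data.

    Illustrative sketch: gyro rates feed the prediction step;
    accelerometer/compass angle readings correct the drift.
    """
    x = np.zeros(2)                          # state: [angle, gyro_bias]
    P = np.eye(2)                            # state covariance
    F = np.array([[1.0, -dt], [0.0, 1.0]])   # state transition
    B = np.array([dt, 0.0])                  # control input (gyro rate)
    H = np.array([[1.0, 0.0]])               # only the angle is measured
    Q = np.diag([q_angle, q_bias])           # process noise (assumed values)
    R = np.array([[r_meas]])                 # measurement noise (assumed)
    estimates = []
    for omega, z in zip(gyro_rates, angle_meas):
        # Predict: integrate the bias-corrected gyro rate.
        x = F @ x + B * omega
        P = F @ P @ F.T + Q
        # Update: correct with the accelerometer/compass angle.
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)
```

Treating the gyroscope as the control input rather than as a measurement is a common design choice: it keeps the measurement model linear while letting the absolute sensors (accelerometer, compass) bound the integrated drift.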
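The transformation from depth readings to 3D point coordinates is, for a pinhole-style depth sensor, a standard back-projection through the camera intrinsics. The sketch below assumes hypothetical intrinsics fx, fy, cx, cy and a depth image in meters; the paper's actual sensor model may differ.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to camera-frame 3D points.

    fx, fy: focal lengths in pixels; cx, cy: principal point (assumed known).
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx        # inverse pinhole projection
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1)
    return pts[depth > 0]        # drop invalid (zero-depth) pixels
```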
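Finally, the "relatively simple arithmetic" of EKF-SLAM amounts to the standard predict/update cycle over a joint robot-pose-plus-landmark state. The sketch below shows one such cycle for a planar robot with range-bearing landmark observations and known data association; the state layout, motion model, and noise matrices are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def ekf_slam_step(x, P, u, z, i, dt, Q, R):
    """One EKF-SLAM predict/update cycle for a 2D robot (sketch).

    x : state [xr, yr, theta, l1x, l1y, ...]
    P : joint covariance
    u : control (v, w), linear and angular velocity
    z : range-bearing measurement (r, phi) of landmark index i
    Q : 3x3 process noise on the pose; R : 2x2 measurement noise
    """
    n = len(x)
    v, w = u
    th = x[2]
    # --- Predict: move the robot; landmarks stay put. ---
    x = x.copy()
    x[0] += v * dt * np.cos(th)
    x[1] += v * dt * np.sin(th)
    x[2] += w * dt
    F = np.eye(n)
    F[0, 2] = -v * dt * np.sin(th)
    F[1, 2] = v * dt * np.cos(th)
    P = F @ P @ F.T
    P[:3, :3] += Q                       # process noise on the pose only
    # --- Update: range-bearing to landmark i. ---
    j = 3 + 2 * i
    dx, dy = x[j] - x[0], x[j + 1] - x[1]
    q = dx**2 + dy**2
    sq = np.sqrt(q)
    z_hat = np.array([sq, np.arctan2(dy, dx) - x[2]])
    H = np.zeros((2, n))                 # sparse measurement Jacobian
    H[0, [0, 1, j, j + 1]] = [-dx / sq, -dy / sq, dx / sq, dy / sq]
    H[1, [0, 1, 2, j, j + 1]] = [dy / q, -dx / q, -1.0, -dy / q, dx / q]
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap bearing innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(n) - K @ H) @ P
    return x, P
```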