Drift-correcting self-calibration for visual-inertial SLAM

2017 
We present a solution for online self-calibration in simultaneous localization and mapping (SLAM) that handles drift in the calibration parameters, in order to support accurate long-term operation. Calibration parameters such as the camera focal length or the camera-to-IMU extrinsics frequently drift over long periods of operation, inducing cumulative error in the reconstruction. The key contribution is modeling calibration parameters as spatiotemporal quantities: sensor-to-sensor spatial calibration and sensor intrinsic parameters are treated as continuously time-varying, with statistical tests for change detection and regression. We also provide an analysis of the long-term effects of inappropriately modeling time-varying sensor calibration. Constant-time operation is achieved by selecting only a fixed number of informative trajectory segments for calibration parameter estimation, which has the added benefit of avoiding early linearization errors because past measurements are not rolled into a prior distribution. Our approach is validated on simulated and real-world data.
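The change-detection step described in the abstract lends itself to a small illustration. The sketch below is a hypothetical example, not the authors' implementation: the function name, significance level, and the chi-square test on the difference between two segment-wise calibration estimates are our assumptions about how such a test could be realized.

```python
import numpy as np
from scipy.stats import chi2

def has_drifted(theta_prev, cov_prev, theta_curr, cov_curr, alpha=0.01):
    """Decide whether a calibration parameter (e.g. focal length or
    camera-to-IMU extrinsics) has drifted between two trajectory segments.

    Uses a chi-square test on the Mahalanobis distance of the difference
    between the two segment-wise estimates (hypothetical sketch)."""
    delta = theta_curr - theta_prev
    S = cov_prev + cov_curr                         # covariance of the difference
    d2 = float(delta @ np.linalg.solve(S, delta))   # squared Mahalanobis distance
    threshold = chi2.ppf(1.0 - alpha, df=delta.size)
    return d2 > threshold

# Hypothetical usage: focal length estimated in two separate segments.
theta_a, cov_a = np.array([460.2]), np.array([[0.25]])
theta_b, cov_b = np.array([462.9]), np.array([[0.30]])
print(has_drifted(theta_a, cov_a, theta_b, cov_b))  # True if the change is significant
```

If the test fires, the parameter would be re-estimated from the retained informative segments rather than from a rolled-up prior, consistent with the constant-time strategy described above.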