Multi-Robot Joint Visual-Inertial Localization and 3-D Moving Object Tracking

2020 
In this paper, we present a novel distributed algorithm to track a moving object’s state using a heterogeneous mobile robot network in a three-dimensional (3-D) environment, wherein the robots’ poses (positions and orientations) are unknown. Each robot is equipped with a monocular camera and an inertial measurement unit (IMU), and can communicate with its neighbors. Rather than assuming a known common global frame for all the robots (as is often the case in the multi-robot literature), we allow each robot to perform motion estimation locally. For localization, we propose a multi-robot visual-inertial navigation system (VINS) in which one robot builds a prior map that is then used to bound the long-term drift of the visual-inertial odometry (VIO) running on the other robots. Moreover, a novel distributed Kalman filter is introduced and employed to cooperatively track the six-degree-of-freedom (6-DoF) motion of the object, which is represented as a point cloud. Further, the object can be entirely invisible to some robots during the tracking period. The proposed algorithm is extensively validated in Monte Carlo simulations.
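The abstract does not spell out the distributed filter, so the sketch below is only a rough illustration of the general idea it describes: each robot runs a local Kalman filter on the object's motion, updates with its own camera measurement when the object is visible, and fuses estimates with communication neighbors so that robots which never see the object still converge. The constant-velocity model, the position-only measurement, the covariance-intersection fusion rule, and all names and parameters are assumptions for illustration, not the authors' formulation.

```python
# Hypothetical sketch of a distributed, consensus-style Kalman filter for
# cooperative 3-D object tracking; NOT the paper's algorithm.
import numpy as np

DT = 0.1  # time step in seconds (assumed)

# Constant-velocity model in 3-D: state x = [p_x, p_y, p_z, v_x, v_y, v_z]
F = np.block([[np.eye(3), DT * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])
Q = 0.01 * np.eye(6)                          # process noise (assumed)
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # position-only measurement
R = 0.05 * np.eye(3)                          # measurement noise (assumed)


class RobotTracker:
    """Local filter held by one robot in the network."""

    def __init__(self):
        self.x = np.zeros(6)   # state estimate
        self.P = np.eye(6)     # covariance

    def predict(self):
        """Propagate the estimate with the constant-velocity model."""
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z):
        """Standard Kalman update with this robot's own 3-D position
        measurement z; skipped when the object is not visible to it."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P

    def fuse_neighbor(self, x_n, P_n, w=0.5):
        """Covariance-intersection fusion with a neighbor's estimate
        (x_n, P_n); remains consistent even though cross-correlations
        between robots are unknown."""
        P_inv = w * np.linalg.inv(self.P) + (1 - w) * np.linalg.inv(P_n)
        P_fused = np.linalg.inv(P_inv)
        self.x = P_fused @ (w * np.linalg.inv(self.P) @ self.x
                            + (1 - w) * np.linalg.inv(P_n) @ x_n)
        self.P = P_fused
```

In each filtering cycle every robot would call `predict()`, then `update()` only if its camera observes the object, and finally `fuse_neighbor()` once per communication neighbor; a robot to which the object stays invisible is carried along purely by the fusion step, which mirrors the abstract's claim that tracking survives total occlusion at some robots.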