Fast and Robust Bio-inspired Teach and Repeat Navigation

2020 
Fully autonomous mobile robots have a multitude of potential applications, but guaranteeing robust navigation performance remains an open research problem. For many tasks such as repeated infrastructure inspection, item delivery, or inventory transport, a route-repeating capability can be sufficient and offers potential practical advantages over a full navigation stack. Previous teach and repeat research has achieved high performance in difficult conditions predominantly by using sophisticated, expensive sensors, and has often had high computational requirements. Biological systems, such as small animals and insects like ants, offer a proof of concept that robust and generalisable navigation can be achieved with extremely limited visual systems and computing power. In this work we create a novel asynchronous formulation for teach and repeat navigation that fully utilises odometry information, paired with a correction signal driven by much more computationally lightweight visual processing than is typically required. This correction signal is also decoupled from the robot's motor control, allowing its rate to be modulated by the available computing capacity. We evaluate this approach with extensive experimentation on two different robotic platforms, the Consequential Robotics MiRo and the Clearpath Jackal, across navigation trials totalling more than 6000 metres in a range of challenging indoor and outdoor environments. Our approach continues to succeed when multiple state-of-the-art systems fail due to low-resolution images, unreliable odometry, or lighting change, while requiring significantly less compute. We also demonstrate, for the first time, versatile cross-platform teach and repeat without changing parameters, in which we learn to navigate a route with one robot and repeat that route using a completely different robot.
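The abstract's central architectural idea, a fast odometry-driven control loop decoupled from a slower visual correction loop whose rate can track available compute, can be illustrated with a minimal sketch. The code below is a hypothetical illustration only: the class and method names (AsyncTeachRepeat, odometry_step, visual_step, estimate_offset), the proportional gain, and the loop rates are all assumptions, not the authors' implementation.

```python
import math
import threading


class AsyncTeachRepeat:
    """Hypothetical sketch: a high-rate odometry loop steers the robot,
    while an asynchronous low-rate visual loop supplies a correction
    signal whose update rate can match the available compute."""

    def __init__(self):
        self.pose = [0.0, 0.0, 0.0]   # x, y, heading integrated from odometry
        self.correction = 0.0          # lateral offset estimated by vision
        self._lock = threading.Lock()  # only shared state between the loops

    def odometry_step(self, v, w, dt):
        """High-rate loop: integrate wheel odometry and return a steering
        command biased by the most recent visual correction."""
        x, y, th = self.pose
        th += w * dt
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        self.pose = [x, y, th]
        with self._lock:
            # simple proportional steer against the estimated lateral offset
            return -0.5 * self.correction

    def visual_step(self, image):
        """Low-rate loop: compare the current image against stored teach-run
        keyframes and refresh the correction. Runs whenever compute allows;
        motor control never waits on it."""
        offset = self.estimate_offset(image)
        with self._lock:
            self.correction = offset

    def estimate_offset(self, image):
        # stand-in for lightweight image matching against teach-run images
        return 0.0


if __name__ == "__main__":
    ctrl = AsyncTeachRepeat()
    steer = 0.0
    for i in range(100):                 # e.g. a 50 Hz motor loop
        steer = ctrl.odometry_step(v=0.3, w=steer, dt=0.02)
        if i % 10 == 0:                  # visual correction at a tenth of the rate
            ctrl.visual_step(image=None)
    print("final pose:", ctrl.pose)
```

In a real system the two loops would run in separate threads or processes; because the lock-guarded correction is the only shared state, the vision rate can be throttled to the available computing capacity without destabilising motor control, which is the decoupling the abstract describes.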