Where to Look? Predictive Perception With Applications to Planetary Exploration

2018 
Planetary rovers exploring the surface of Mars rely on vision-based localization and navigation algorithms to estimate their state and plan their motion during autonomous traverses. Accurate estimation of the rover's motion enables safe navigation across environments with potentially hazardous terrain that would otherwise require careful human intervention. The accuracy of these vision-based localization and navigation methods is directly related to the amount of visual texture observed by the rover's sensors. This poses a challenge for Mars navigation, where texture-limited surfaces such as smooth sand are prevalent. To overcome this issue, we propose making use of a rover's ability to actively steer its visual sensors with the goal of maximizing actionable visual information content. This letter answers the question of where and when to look by presenting a method to predict the sensor trajectory that maximizes rover localization performance. This is accomplished through an online search of possible trajectories using synthetic, future camera views created from observed data. Proposed trajectories are scored and selected based on expected localization performance. We validate our algorithm in a high-fidelity simulation of a Mars-analogue environment and show how intelligently choosing where to look during a traverse can increase navigation accuracy compared to traditional fixed-sensor configurations.
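
The abstract describes an online search over candidate sensor pointing trajectories, where each candidate is evaluated by rendering synthetic future camera views and scoring them by expected localization performance. The sketch below illustrates that loop under stated assumptions; it is not the authors' implementation, and the names `render_synthetic_view`, `predicted_rover_poses`, and the texture-based `localization_score` proxy are hypothetical placeholders (the paper's actual metric is expected localization accuracy, not image gradient energy).

```python
import numpy as np

def localization_score(image: np.ndarray) -> float:
    """Hypothetical proxy for expected localization performance:
    the amount of visual texture, measured as image gradient energy."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def choose_sensor_trajectory(candidate_trajectories, predicted_rover_poses,
                             render_synthetic_view):
    """Evaluate each candidate pan/tilt trajectory by rendering synthetic
    future camera views along the predicted rover path and summing a
    localization-quality score; return the best-scoring trajectory."""
    best_traj, best_score = None, -np.inf
    for traj in candidate_trajectories:        # e.g., sequences of pan/tilt angles
        score = 0.0
        for pose, pointing in zip(predicted_rover_poses, traj):
            # Synthetic future view built from previously observed data
            view = render_synthetic_view(pose, pointing)
            score += localization_score(view)
        if score > best_score:
            best_traj, best_score = traj, score
    return best_traj
```

In this sketch the view synthesizer is passed in as a function so the scoring loop stays agnostic to how future views are generated from observed data.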