Autonomous Navigation via Deep Imitation and Transfer Learning: A Comparative Study

2020 
End-to-end learning for autonomous navigation and driving has become a growing research trend in both industry and academia in recent years. Its promise lies in treating the whole driving pipeline as the training of a single deep neural network (DNN). Its Achilles' heel is the need for thousands of images to train the DNN. This paper comprehensively investigates the applicability of deep transfer learning to the specific task of end-to-end learning for autonomous navigation. Five state-of-the-art DNNs, including ResNet, AlexNet, and DenseNet, are applied to extract features from images taken by the front-facing camera of a mobile robot. The extracted features carry different information value, as the DNNs have different architectures and learning capabilities. These features are then processed by a multilayer fully connected neural network to estimate the robot's angular velocity. The results obtained for the different DNNs indicate that the transfer learning-based models show promising performance in accurately estimating the angular velocity purely from visual information. According to these results, the AlexNet-based model outperforms the others in both estimation accuracy and performance consistency.