Classifier Adaptive Fusion: Deep Learning for Robust Outdoor Vehicle Visual Tracking

2019 
Deep auto-encoder (DAE) models have been successfully used in object tracking due to their strong feature-representation capability. However, a single deep auto-encoder model is not robust enough to represent the appearance of an outdoor vehicle in its harsh working environment, which involves illumination variation, occlusion, cluttered background, and so on. In this paper, a novel multiple-DAE-based tracking approach, namely classifier adaptive fusion for robust outdoor vehicle visual tracking, is proposed under the particle filter framework. First, two deep auto-encoders are trained offline on the gray-scale images and gradient images of the raw training data, respectively, to obtain stronger feature representations of each modality. Second, two classifiers are constructed from the encoders of the two well-trained deep auto-encoders, and the output of each classifier is used to compute the confidence of the corresponding particles. Finally, the confidence outputs of the two classifiers are fused and applied in online tracking, where the fusion weight of each classifier is computed according to the distribution of the particles scored by that classifier. Extensive tracking experiments conducted on the visual tracking benchmark (VTB) show that the proposed algorithm outperforms 9 popular tracking algorithms in challenging outdoor vehicle tracking scenes such as illumination variation, occlusion, cluttered background, and scale variation.
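The fusion step described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the abstract does not give the exact formula for deriving the fusion weight from the particle distribution, so here each classifier's weight is assumed to grow with how sharply peaked its particle scores are (max-to-mean ratio), a hypothetical stand-in for the paper's distribution-based weighting. The particle states, score values, and function names are all invented for illustration.

```python
import numpy as np

def fusion_weights(conf_a, conf_b):
    """Compute adaptive fusion weights for two classifiers.

    Assumption (not from the paper): a classifier whose per-particle
    confidences are more sharply peaked is treated as more discriminative
    and receives a larger weight. Weights are normalized to sum to 1.
    """
    sharp_a = conf_a.max() / (conf_a.mean() + 1e-12)
    sharp_b = conf_b.max() / (conf_b.mean() + 1e-12)
    w_a = sharp_a / (sharp_a + sharp_b)
    return w_a, 1.0 - w_a

def fused_tracking_step(particles, conf_gray, conf_grad):
    """Fuse gray-scale and gradient classifier confidences per particle,
    then select the particle with the highest fused score as the state."""
    w_gray, w_grad = fusion_weights(conf_gray, conf_grad)
    fused = w_gray * conf_gray + w_grad * conf_grad
    best = int(np.argmax(fused))
    return particles[best], fused

# Toy example: 5 candidate particles (x, y, scale) with synthetic
# confidences that would come from the two encoder-based classifiers.
particles = np.array([[10, 10, 1.0], [12, 11, 1.0], [11, 10, 1.1],
                      [30, 40, 1.0], [11, 11, 1.0]])
conf_gray = np.array([0.2, 0.9, 0.6, 0.1, 0.5])   # gray-scale classifier
conf_grad = np.array([0.3, 0.7, 0.8, 0.1, 0.4])   # gradient classifier
state, fused = fused_tracking_step(particles, conf_gray, conf_grad)
```

In a full tracker this step would run once per frame inside the particle filter loop, after the particles are propagated and each encoder-based classifier has scored the image patch at every particle.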