Shift-Invariant Convolutional Network Search

2020
The development of Neural Architecture Search (NAS) has made Convolutional Neural Networks (CNNs) more diverse and effective. However, previous NAS approaches pay no attention to the shift-invariance of CNNs. Without shift-invariance, a convolutional network is not robust when its input data are disturbed or damaged. Moreover, taking accuracy as the sole optimization goal of NAS cannot meet increasingly diverse needs. In this paper, we propose Shift-Invariant Convolutional Network Search (SICNS). It uses one-shot NAS to search for shift-invariant convolutional networks by incorporating a low-pass filter into the one-shot model. Furthermore, SICNS optimizes multiple indicators simultaneously through a multi-objective evolutionary algorithm. By training the one-shot model and evolving the architecture, we obtain convolutional networks that are robust and powerful on the image classification task. Notably, our work achieves a 4.52% test error on CIFAR-10 with 0.7M parameters, and when the input data are disturbed, the accuracy of the searched network is 2.96% higher than that of the network without the low-pass filter.
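The abstract's core mechanism, applying a low-pass filter before downsampling so that the network's feature maps degrade gracefully under input shifts, follows the general anti-aliasing idea. The following is a minimal NumPy sketch of such a blur-then-subsample step; the 3x3 binomial kernel and function name are illustrative assumptions, not the paper's actual layer:

```python
import numpy as np

def blur_pool(x, stride=2):
    """Anti-aliased downsampling: apply a 3x3 binomial low-pass
    filter, then subsample by `stride`. Illustrative sketch only;
    not the exact filter used in SICNS."""
    k1 = np.array([1.0, 2.0, 1.0])
    kernel = np.outer(k1, k1)
    kernel /= kernel.sum()  # normalize so constant signals are preserved
    h, w = x.shape
    padded = np.pad(x, 1, mode="reflect")  # reflect-pad to keep spatial size
    out = np.empty_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return out[::stride, ::stride]  # subsample after low-pass filtering
```

Because the high frequencies are attenuated before subsampling, the downsampled output of a shifted input stays close to the shifted output, which is the robustness property the abstract reports.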