Scalable NAS with factorizable architectural parameters

2022 
Neural Architecture Search (NAS) replaces manually designed networks with automatically searched ones and has become a hot topic in machine learning and computer vision. One key factor of NAS, scaling up the search space by adding operators, can bring about more possibilities for effective architectures. However, existing works incur high search costs and are prone to confused selections caused by competition among similar operators. This paper presents a scalable NAS approach that utilizes a factorized method and improves existing art in the following aspects. (1) Our work explores a broader search space: we construct an ample search space through combinations of activation operators and regular operators. (2) Our work experiences a limited computation burden even though it searches in a more extensive space: we factorize the search space into two subspaces and adopt separate architectural parameters to control the corresponding subspaces. (3) Our work avoids competition among similarly combined operators, because the two sets of architectural parameters are optimized sequentially. Experimental results show that our approach achieves state-of-the-art test error on CIFAR10 and ImageNet. Furthermore, the excellent performance of the searched architectures on COCO further proves the superiority of our factorized method.
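To make the factorization concrete, below is a minimal sketch under DARTS-style assumptions: one architectural parameter vector relaxes the regular-operator subspace and another relaxes the activation-operator subspace, and the two are stepped in turn. The operator lists, the alternating update schedule, and all names (FactorizedMixedOp, alpha, beta) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the authors' code): a DARTS-style mixed
# operation whose search space is factorized into a regular-operator
# subspace and an activation-operator subspace, each controlled by its own
# architectural parameter vector (alpha, beta).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedMixedOp(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Regular-operator subspace (illustrative choices).
        self.regular_ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),
            nn.Identity(),
        ])
        # Activation-operator subspace (illustrative choices).
        self.act_ops = [F.relu, torch.tanh, F.silu]
        # Factorized parameters: |alpha| + |beta| values instead of
        # |alpha| * |beta| for the full set of combined operators.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.regular_ops)))
        self.beta = nn.Parameter(1e-3 * torch.randn(len(self.act_ops)))

    def forward(self, x):
        w_op = F.softmax(self.alpha, dim=0)
        w_act = F.softmax(self.beta, dim=0)
        out = 0
        # Every (regular op, activation) combination is covered, but its
        # weight is the product of two factorized softmax weights.
        for wi, op in zip(w_op, self.regular_ops):
            h = op(x)
            for wj, act in zip(w_act, self.act_ops):
                out = out + wi * wj * act(h)
        return out

# Sequential optimization of the two parameter sets: step alpha while beta
# stays fixed, then step beta while alpha stays fixed, so similarly
# combined operators are not selected against each other in a single step.
model = FactorizedMixedOp(channels=8)
opt_alpha = torch.optim.Adam([model.alpha], lr=3e-4)
opt_beta = torch.optim.Adam([model.beta], lr=3e-4)
x, target = torch.randn(2, 8, 16, 16), torch.randn(2, 8, 16, 16)

for step in range(2):
    opt_alpha.zero_grad()
    F.mse_loss(model(x), target).backward()   # placeholder objective
    opt_alpha.step()                          # update alpha only

    opt_beta.zero_grad()
    F.mse_loss(model(x), target).backward()
    opt_beta.step()                           # update beta only
```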