Fine-grained neural architecture search for image super-resolution

2022 
Designing efficient deep neural networks has attracted great interest in image super-resolution (SR). However, exploring diverse network structures is computationally expensive. Moreover, each layer in a network plays a distinct role, which calls for a layer-specific structure. In this work, we present a novel neural architecture search (NAS) algorithm that efficiently explores layer-wise structures. Specifically, we construct a supernet that allows the number of channels and the per-channel activation functions to be chosen flexibly according to the role of each layer. The search runs efficiently via channel pruning, with gradient descent jointly optimizing the Mult-Adds and the accuracy of the searched models. We estimate the model Mult-Adds in a differentiable manner by applying relaxations in the backward pass. The searched model, named FGNAS, outperforms state-of-the-art NAS-based SR methods by a large margin.
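To make the key mechanism concrete, below is a minimal sketch (not the authors' code) of differentiable channel pruning with a Mult-Adds penalty, under the assumptions described in the abstract: each output channel gets a learnable gate that is hard (0/1) in the forward pass but relaxed to a sigmoid in the backward pass, so the expected Mult-Adds stay differentiable. The class name `GatedConv`, the 0.5 gate threshold, and the trade-off weight `lambda_cost` are illustrative choices, not details from the paper; per-channel activation search is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedConv(nn.Module):
    """3x3 conv whose output channels are masked by learnable gates.

    Forward uses hard 0/1 gates; gradients flow through the sigmoid
    relaxation (straight-through), so the estimated Mult-Adds remain
    differentiable w.r.t. the gate logits.
    """

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.logits = nn.Parameter(torch.zeros(out_ch))  # one gate per channel

    def gates(self):
        soft = torch.sigmoid(self.logits)       # relaxed gates in [0, 1]
        hard = (soft > 0.5).float()             # hard gates used in the forward pass
        return hard + soft - soft.detach()      # hard value, soft gradient

    def forward(self, x):
        g = self.gates().view(1, -1, 1, 1)
        return self.conv(x) * g

    def expected_mult_adds(self, h, w):
        # 3x3 kernel: per-position cost = 9 * in_ch * (expected active out channels)
        return 9 * self.conv.in_channels * self.gates().sum() * h * w


# Joint objective: reconstruction loss + weighted differentiable Mult-Adds.
layer = GatedConv(16, 64)
x = torch.randn(2, 16, 32, 32)
target = torch.randn(2, 64, 32, 32)
lambda_cost = 1e-9  # illustrative accuracy/cost trade-off weight

y = layer(x)
loss = F.l1_loss(y, target) + lambda_cost * layer.expected_mult_adds(32, 32)
loss.backward()  # gradients reach both the conv weights and the gate logits
```

After training, channels whose gates settle at zero can be pruned outright, yielding a compact layer-wise architecture without retraining the search from scratch.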