WILDA: Wide Learning of Diverse Architectures for Classification of Large Datasets

2021 
To address the scalability issues that challenge deep learning methods, we propose Wide Learning of Diverse Architectures (WILDA), a model that scales horizontally rather than vertically, enabling distributed learning. We propose a distributed version of a quality-diversity evolutionary algorithm (MAP-Elites) to evolve an architecturally diverse ensemble of shallow networks, each of which extracts a feature vector from the data. These features then become the input to a single shallow network that is optimised using gradient descent to solve a classification task. The technique is shown to perform well on two benchmark classification problems (MNIST and CIFAR). Additional experiments provide insight into the role diversity plays in the performance of the repertoire.
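To make the MAP-Elites component concrete, here is a minimal, self-contained sketch of a quality-diversity loop of the kind the abstract describes. This is not the paper's implementation: the toy genome, fitness function, behaviour descriptor, and mutation operator below are all hypothetical stand-ins for the shallow-network architectures and diversity measures WILDA actually evolves.

```python
import random

random.seed(0)
N_NICHES = 10  # size of the 1-D archive (hypothetical choice)

def evaluate(genome):
    # Toy fitness: closeness of the genome's mean to 0.5.
    mean = sum(genome) / len(genome)
    return 1.0 - abs(mean - 0.5)

def descriptor(genome):
    # Toy behaviour descriptor: bin the genome by its variance,
    # standing in for an architectural-diversity measure.
    mean = sum(genome) / len(genome)
    var = sum((g - mean) ** 2 for g in genome) / len(genome)
    return min(int(var * 4 * N_NICHES), N_NICHES - 1)

def mutate(genome):
    # Gaussian perturbation, clipped to [0, 1].
    return [min(1.0, max(0.0, g + random.gauss(0, 0.1))) for g in genome]

archive = {}  # niche index -> (fitness, genome); each niche keeps its elite

# Random initialisation, then iterate: pick an elite, mutate, re-insert.
for step in range(2000):
    if step < 100 or len(archive) < 5:
        genome = [random.random() for _ in range(8)]
    else:
        _, parent = random.choice(list(archive.values()))
        genome = mutate(parent)
    niche = descriptor(genome)
    fit = evaluate(genome)
    if niche not in archive or fit > archive[niche][0]:
        archive[niche] = (fit, genome)

print(len(archive))  # number of filled niches in the repertoire
```

The key property illustrated here is that the archive retains the best solution per behavioural niche rather than a single global best, which is what yields the diverse repertoire of feature extractors the paper relies on; in WILDA this loop is additionally distributed across workers.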