Synaptic Strength For Convolutional Neural Network

2018 
Convolutional Neural Networks (CNNs) are both computation- and memory-intensive, which hinders their deployment on many resource-constrained devices. Inspired by neuroscience research, we propose synaptic pruning: a data-driven method to prune connections between convolution layers using a newly proposed class of parameters called Synaptic Strength. Synaptic Strength is designed to capture the importance of a synapse based on the amount of information it transports. Experimental results demonstrate the effectiveness of our approach. On CIFAR-10, we can prune various CNN models with up to 96% of connections removed, which yields significant size reduction and computation savings. Further evaluation on ImageNet demonstrates that synaptic pruning discovers efficient models competitive with state-of-the-art compact CNNs such as MobileNet-V2 and NASNet-Mobile. Our contributions are summarized as follows: (1) We introduce Synaptic Strength, a new class of parameters for convolution layers that indicates the importance of each connection. (2) Our approach can prune various CNN models at high compression rates without compromising accuracy. (3) Further investigation shows that the proposed Synaptic Strength is a better indicator for kernel pruning than previous approaches, both empirically and in theoretical analysis.
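The abstract does not spell out how Synaptic Strength is parameterized or trained, so the following is only a minimal sketch of the general idea: one learnable scalar per input-to-output-channel connection, driven toward zero by an L1 penalty during training and thresholded to prune connections afterward. The names `SynapticConv2d`, `l1_strength_penalty`, `prune_connections`, and the `lam`/`threshold` values are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SynapticConv2d(nn.Module):
    """Conv layer with one learnable "strength" scalar per
    (output channel, input channel) connection. Assumes groups=1."""

    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              stride=stride, padding=padding)
        # One scalar per connection between an input and an output channel.
        self.strength = nn.Parameter(torch.ones(out_ch, in_ch))

    def forward(self, x):
        # Scale each 2-D kernel by its connection's strength before convolving;
        # a zero strength removes that connection entirely.
        w = self.conv.weight * self.strength[:, :, None, None]
        return F.conv2d(x, w, self.conv.bias,
                        stride=self.conv.stride, padding=self.conv.padding)


def l1_strength_penalty(model, lam=1e-4):
    # L1 regularizer added to the task loss; pushes unimportant
    # strengths toward zero during training.
    return lam * sum(m.strength.abs().sum()
                     for m in model.modules()
                     if isinstance(m, SynapticConv2d))


def prune_connections(model, threshold=1e-2):
    # After training, zero out every connection whose strength
    # magnitude falls below the threshold.
    with torch.no_grad():
        for m in model.modules():
            if isinstance(m, SynapticConv2d):
                m.strength.mul_((m.strength.abs() >= threshold).float())
```

In this reading, the strength magnitudes play the role of the per-connection importance scores described in the abstract: training with the penalty sorts connections by how much they matter, and pruning simply discards those near zero.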