CSHE: network pruning by using cluster similarity and matrix eigenvalues

2021 
Although deep convolutional neural networks (CNNs) have achieved significant success in computer vision applications, the real-world deployment of CNNs is often limited by computing resources and memory constraints. As a mainstream deep model compression technology, neural network pruning offers a promising way to reduce a model's parameters and computation. In this paper, we propose a novel filter pruning method that combines information from convolution filters and feature maps for convolutional neural network compression, namely network pruning using cluster similarity and large eigenvalues (CSHE). First, based on the convolution operation, we explore the similarity relationship of the feature maps generated by the corresponding filters. Concretely, a clustering algorithm groups filters by similarity, which in turn guides the grouping of the corresponding feature maps. Second, the proposed method uses the large eigenvalues of the feature maps to rank the importance of filters. Finally, we prune the low-ranking filters and retain the high-ranking ones. The proposed method eliminates redundancy among convolution filters by applying the large eigenvalues of feature maps on top of filter similarity. In this way, most of the representative information in the network is retained and the pruned results can be easily reproduced. Experiments show that the accuracy of the pruned sparse deep network obtained by the CSHE method on the CIFAR-10 and ImageNet ILSVRC-12 classification tasks is almost the same as that of the reference network, without any additional constraints.
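The abstract describes a three-step procedure: cluster filters by similarity, score each filter by the large eigenvalue of its feature map, and prune the lowest-scoring filters within each group. The sketch below is a minimal illustration of that flow under stated assumptions, not the authors' reference implementation: it uses k-means on flattened filter weights as the clustering step, the largest singular value of a feature map as a stand-in for its large-eigenvalue score, and a hypothetical `prune_ratio` parameter to set how many filters each cluster drops.

```python
# Minimal sketch of a CSHE-style keep-mask for one convolutional layer.
# Assumptions (not from the paper): KMeans clustering, largest singular
# value as the eigenvalue score, and a fixed per-cluster prune_ratio.
import numpy as np
from sklearn.cluster import KMeans

def cshe_prune_mask(filters, feature_maps, num_clusters=4, prune_ratio=0.5):
    """Return a boolean keep-mask over the filters of one conv layer.

    filters:      array of shape (num_filters, in_channels, kH, kW)
    feature_maps: array of shape (num_filters, H, W), one map per filter
    """
    num_filters = filters.shape[0]

    # Step 1: cluster filters by the similarity of their flattened weights.
    flat = filters.reshape(num_filters, -1)
    labels = KMeans(n_clusters=num_clusters, n_init=10).fit_predict(flat)

    # Step 2: score each filter by the largest singular value of the
    # feature map it produces (proxy for the "large eigenvalue" score).
    scores = np.array(
        [np.linalg.svd(fm, compute_uv=False)[0] for fm in feature_maps]
    )

    # Step 3: within each cluster, drop the lowest-scoring fraction of
    # filters and keep the rest.
    keep = np.ones(num_filters, dtype=bool)
    for c in range(num_clusters):
        idx = np.where(labels == c)[0]
        n_drop = int(len(idx) * prune_ratio)
        if n_drop > 0:
            drop = idx[np.argsort(scores[idx])[:n_drop]]
            keep[drop] = False
    return keep

# Example with random data standing in for one layer's filters and maps.
rng = np.random.default_rng(0)
filters = rng.standard_normal((32, 16, 3, 3))
feature_maps = rng.standard_normal((32, 28, 28))
mask = cshe_prune_mask(filters, feature_maps)
print(f"Keeping {mask.sum()} of {mask.size} filters")
```

In practice the feature maps depend on the input, so a real pipeline would average the scores over a batch of images before pruning; the per-layer mask would then be used to remove the corresponding output channels and fine-tune the network.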