Joint sparse neural network compression via multi-application multi-objective optimization

2021 
Over the past decade, deep neural networks (DNNs) have been widely applied in various applications. To alleviate the storage and computation requirements of complex DNNs, network compression methods have been developed. Sparse structure learning methods based on multi-objective optimization have proven effective at balancing the sparsity of the network model against network performance. However, when multiple applications are deployed simultaneously on a single platform, these methods become inefficient because the network model for each application must be trained and optimized individually. In this article, a multi-objective, multi-application sparse learning model is proposed to jointly optimize multiple targets across a set of applications. The joint network structure is proposed first. After pre-training the network model, a joint multi-objective evolutionary algorithm is derived to solve the optimization problems; an improved initialization method for parent model generation is also developed. Finally, based on the joint loss over the objectives, fine-tuning is applied to obtain the final models with good performance. The proposed method is evaluated on different datasets and compared with state-of-the-art approaches. Experimental results demonstrate that the multi-application optimization model achieves much better performance than single-application optimization, especially when different datasets are involved simultaneously.
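The core mechanism the abstract describes, selecting network models that trade off sparsity against loss via multi-objective evolutionary search, typically rests on Pareto dominance over candidate pruning masks. The sketch below is not the paper's algorithm; it is a minimal, hypothetical illustration of non-dominated selection, where each candidate is a binary mask over weights, one objective is density (fraction of weights kept), and the other is a surrogate loss (total magnitude of pruned weights). All names and the toy weight vector are assumptions for illustration.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population, objectives):
    """Return the non-dominated candidates of the population."""
    return [cand for i, cand in enumerate(population)
            if not any(dominates(objectives[j], objectives[i])
                       for j in range(len(population)) if j != i)]

# Hypothetical toy setup: candidates are binary pruning masks over 8 weights.
random.seed(0)
weights = [0.9, -0.1, 0.4, 0.05, -0.7, 0.2, 0.0, 0.6]
population = [[random.randint(0, 1) for _ in weights] for _ in range(12)]

def objective(mask):
    density = sum(mask) / len(mask)          # objective 1: lower = sparser
    loss = sum(abs(w) for w, m in zip(weights, mask) if m == 0)  # surrogate loss
    return (density, loss)

objs = [objective(m) for m in population]
front = pareto_front(population, objs)       # sparsity/loss trade-off candidates
```

In a full evolutionary loop, the Pareto front would seed the next parent generation (the abstract's "improved initialization"), and the surviving models would then be fine-tuned on the joint loss.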