Efficient Technique to Accelerate Neural Network Training by Freezing Hidden Layers

2019 
Deep neural networks have the fewest parameters in their early layers, yet those layers still account for a large share of the long and complex training computation. This paper proposes a method in which a set number of training epochs is used to train the hidden layers of a neural network: the hidden layers are frozen one by one and excluded from the backward pass. Several experiments are carried out on CIFAR; we show empirically that the FreezeOut technique yields roughly a 3% loss in accuracy for DenseNets while saving up to 20% of wall-clock training time, no loss of accuracy for ResNets with a speedup of up to 20%, and no improvement for VGG networks.
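As a rough illustration of the freezing idea described above, the sketch below turns off gradients for hidden layers on a fixed schedule in PyTorch so that frozen layers drop out of the backward pass. The model, layer indices, and freezing schedule are placeholder assumptions, not the paper's actual configuration.

```python
# Hedged sketch (not the authors' code): progressively freezing hidden layers.
import torch
import torch.nn as nn

model = nn.Sequential(              # stand-in for a DenseNet/ResNet/VGG backbone
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Assumed schedule: freeze each hidden block after a fixed number of epochs,
# earliest layers first, so frozen blocks are excluded from the backward pass.
freeze_after_epoch = {0: 10, 2: 20}   # {module index: epoch at which it is frozen}

num_epochs = 30
for epoch in range(num_epochs):
    # Freeze any layer whose scheduled epoch has arrived.
    for idx, freeze_epoch in freeze_after_epoch.items():
        if epoch == freeze_epoch:
            for p in model[idx].parameters():
                p.requires_grad = False   # no gradients computed for this layer

    for x, y in [(torch.randn(8, 32), torch.randint(0, 10, (8,)))]:  # dummy batch
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()                   # backward pass skips frozen parameters
        optimizer.step()
```

A usage note under the same assumptions: once a layer is frozen, its parameters receive no gradients, so both the backward pass through that layer's weights and the optimizer update for them are saved, which is where the reported wall-clock savings come from.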