
Local Feature Normalization.

2021 
In deep learning, Batch Normalization (BN) is a fundamental and widely used technique in Convolutional Neural Networks (CNNs) that improves training speed and generalization thanks to its effectiveness and simplicity. However, BN focuses only on global statistics and normalizes the whole feature map, ignoring the importance of local features. In this paper, we propose a Local Feature Normalization (LFN) layer that addresses this problem by enhancing competition among features within local regions. CNNs equipped with LFN can exploit local regions rich in feature information: after normalization by LFN, a more expressive local feature takes a larger value, while a less expressive one takes a smaller value. We also discuss in detail how best to use LFN in CNNs and resolve some algorithmic conflicts: the LFN layer should be placed after the ReLU in the first few layers, and it should not be placed directly before a Max-pooling layer. Experimental results show that various CNNs with LFN achieve better accuracy on the CIFAR and ImageNet image classification tasks than the same networks equipped with other popular normalization methods.
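The abstract does not give the exact normalization formula, so the following PyTorch sketch only illustrates the general idea of local, window-wise divisive normalization, where each activation competes with its spatial neighborhood. The class name LocalFeatureNorm and the kernel_size and eps parameters are hypothetical choices for illustration, not the paper's definitions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalFeatureNorm(nn.Module):
    """Hypothetical sketch of local feature normalization.

    Each activation is divided by a statistic of its k x k spatial
    neighborhood, so a location that is strong relative to its local
    region keeps a large value while weaker neighbors are suppressed
    (local competition). The paper's actual formula may differ.
    """

    def __init__(self, kernel_size: int = 3, eps: float = 1e-5):
        super().__init__()
        self.kernel_size = kernel_size
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Local mean of squared activations over each k x k window;
        # same-padding keeps the spatial dimensions unchanged.
        local_energy = F.avg_pool2d(
            x * x,
            kernel_size=self.kernel_size,
            stride=1,
            padding=self.kernel_size // 2,
        )
        # Divisive normalization: responses that dominate their
        # neighborhood stay large, weak ones shrink.
        return x / torch.sqrt(local_energy + self.eps)

A usage sketch consistent with the placement advice above: LFN goes after the ReLU in an early layer, and no LFN is placed directly before the Max-pooling layer.

stem = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    LocalFeatureNorm(kernel_size=3),  # after ReLU, in an early layer
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),  # deliberately not preceded by LFN
)

x = torch.randn(8, 3, 32, 32)   # e.g., a CIFAR-sized batch
out = stem(x)                   # shape: (8, 64, 16, 16)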