FLOP-Reduction Through Memory Allocations Within CNN for Hyperspectral Image Classification

2020 
Convolutional neural networks (CNNs) have proven to be a powerful tool for the classification of hyperspectral images (HSIs). The CNN kernels are able to naturally include spatial information to smooth out the spectral variability and the noise present in HSI data. However, these kernels contain a large number of learnable parameters that must be correctly adjusted to achieve good performance. This forces the model to consume a large amount of training data, making it prone to overfitting when only limited labeled samples are available. In addition, the execution of the kernels is computationally very expensive, with a cost that grows quadratically with the size of the convolution filter, which significantly slows the model down. To overcome the aforementioned limitations, this work presents a new few-parameter CNN (based on shift operations) for HSI classification that dramatically reduces both the number of parameters and the computational complexity of the model in terms of floating-point operations (FLOPs). The operational module combines a shift kernel (which displaces the input data in particular directions without involving any parameters or FLOPs) with pointwise convolutions that perform the feature extraction stage. The newly developed shift-based CNN has been employed to conduct HSI classification over five widely used and challenging data sets, achieving very promising results in terms of computational performance and classification accuracy.
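The core idea of the operational module can be illustrated in a few lines of NumPy. The sketch below is an assumption-laden simplification, not the paper's exact implementation: each channel is displaced by a fixed (dy, dx) offset at zero parameter and FLOP cost, and all learned weights sit in a 1x1 (pointwise) convolution expressed as a channel-mixing matrix product.

```python
import numpy as np

def shift(x, offsets):
    """Parameter-free, zero-FLOP shift: displace each channel of x
    (shape C x H x W) by its (dy, dx) offset, zero-filling the
    positions the shift vacates. Illustrative, not the paper's code."""
    C, H, W = x.shape
    out = np.zeros_like(x)
    for c, (dy, dx) in enumerate(offsets):
        # destination and source windows for a (dy, dx) displacement
        dst_y = slice(max(dy, 0), H + min(dy, 0))
        dst_x = slice(max(dx, 0), W + min(dx, 0))
        src_y = slice(max(-dy, 0), H + min(-dy, 0))
        src_x = slice(max(-dx, 0), W + min(-dx, 0))
        out[c, dst_y, dst_x] = x[c, src_y, src_x]
    return out

def pointwise_conv(x, weight):
    """1x1 convolution as a matrix product over channels.
    weight has shape (C_out, C_in); cost is C_out*C_in*H*W MACs,
    versus k^2 * C_out*C_in*H*W for a k x k convolution."""
    return np.einsum('oc,chw->ohw', weight, x)

def shift_module(x, offsets, weight):
    """Shift + pointwise: spatial mixing is free; every learned
    parameter lives in the pointwise convolution."""
    return pointwise_conv(shift(x, offsets), weight)
```

Assigning the nine offsets of a 3x3 neighborhood (including the identity) to nine channel groups lets the pointwise convolution aggregate the same spatial context as a 3x3 kernel, while the parameter and FLOP counts drop by the factor k^2 noted in the comments.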