A Collaborative Correlation-Matching Network for Multimodality Remote Sensing Image Classification

2022 
Recently, with the increasing availability of high-quality panchromatic (PAN) and multispectral (MS) remote sensing (RS) images, the inherent complementarity between PAN and MS images offers broad prospects for the multimodality RS image classification task. However, how to relieve modal differences and effectively integrate single-modality PAN and MS features remains a challenge. In this article, we design a collaborative correlation-matching network (CCM-Net) for multimodality RS image classification. Concretely, we first propose bidirectional dominant feature supervision (Bi-DFS) learning, which uses single-modality dominant features as supplementary supervision to establish a joint optimization loss function, thereby adaptively narrowing the differences between modalities before feature extraction. In the feature extraction stage, interactive correlation feature matching (ICFM) learning, comprising the spatial feature matching (Spa-FM) and spectral feature matching (Spe-FM) strategies, is proposed to establish interactive matching and enhancement between strongly correlated multimodality features from the spatial and spectral perspectives, respectively, thereby effectively alleviating the semantic deviation of multimodality features. Finally, we aggregate finer multilevel multimodality features to obtain highly discriminative top-level features. The effectiveness of the proposed algorithm has been verified on multiple datasets. Our code is available at: https://github.com/Momuli/CCM-Net.git.
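As a rough illustration of the correlation-matching idea described in the abstract, the sketch below gates each modality's features by their channel-wise correlation with the other modality and lets the two streams mutually enhance one another. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the function name, the cosine-correlation measure, the sigmoid gating, and the residual fusion are all illustrative choices; the paper's actual Spa-FM/Spe-FM strategies are defined in the released code at the URL above.

```python
import numpy as np

def correlation_match(pan_feat, ms_feat, eps=1e-8):
    """Hypothetical sketch of interactive correlation-feature matching
    between single-modality feature maps of shape (C, H, W).
    Each channel is gated by its cosine correlation across modalities,
    so strongly correlated features are mutually enhanced."""
    C = pan_feat.shape[0]
    p = pan_feat.reshape(C, -1)
    m = ms_feat.reshape(C, -1)
    # per-channel cosine similarity between the PAN and MS streams
    corr = (p * m).sum(axis=1) / (
        np.linalg.norm(p, axis=1) * np.linalg.norm(m, axis=1) + eps)
    # squash to a (0, 1) gate with a sigmoid
    gate = 1.0 / (1.0 + np.exp(-corr))
    # residual cross-modal enhancement: each stream keeps its own
    # features and adds the correlation-weighted features of the other
    pan_out = pan_feat + gate[:, None, None] * ms_feat
    ms_out = ms_feat + gate[:, None, None] * pan_feat
    return pan_out, ms_out
```

The residual form keeps each stream's original information intact, so weakly correlated channels (gate near 0.5 or below) are only mildly mixed while strongly correlated channels receive a larger cross-modal contribution.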