A Mutual Information Domain Adaptation Network for Remotely Sensed Semantic Segmentation

2022 
Although deep learning has made semantic segmentation of very-high-resolution (VHR) remote sensing (RS) images practical and efficient, its large-scale application is still limited. Given the diversity of imaging sensors, acquisition conditions, and regional styles, a deep network well trained on one source-domain dataset often suffers a drastic performance drop when applied to other target-domain datasets. We therefore propose a novel end-to-end mutual information domain adaptation network (MIDANet) that transfers semantic segmentation across domains by integrating multitask learning into convolutional neural networks within an entropy adversarial learning (EAL) framework. Through the joint learning of semantic segmentation and elevation estimation, the features extracted by MIDANet concentrate on elevation cues while discarding domain-variant information (e.g., texture and spectral information). First, a single encoder extracts general semantic features, and two decoders sharing the same architecture perform pixel-level classification and digital surface model (DSM) regression, respectively. Second, feature interaction modules (FIMs) and a mutual information attention unit (MIAU) are designed to mine the latent relationships between the two tasks and enhance their feature representations. Finally, after adversarial learning on the classification entropy at the output level, MIDANet performs semantic segmentation without requiring any segmentation labels in the target domain. Extensive comparative experiments and ablation studies were conducted on the International Society for Photogrammetry and Remote Sensing (ISPRS) Potsdam and Vaihingen test datasets. The results show that MIDANet outperforms state-of-the-art domain adaptation (DA) methods in both quantitative metrics and visual assessment.
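The adversarial step operates on the per-pixel classification entropy of the segmentation output. As a minimal sketch of that quantity (not the paper's implementation; the function name and shapes are illustrative), the normalized Shannon entropy map that an EAL-style discriminator would receive can be computed as:

```python
import numpy as np

def entropy_map(probs, eps=1e-12):
    """Per-pixel normalized Shannon entropy of softmax outputs.

    probs: array of shape (H, W, C) holding class probabilities per pixel.
    Returns an (H, W) map in [0, 1]. EAL-style methods feed such maps to a
    discriminator so that target-domain entropy statistics are pushed to
    match the (typically low-entropy, confident) source-domain ones.
    """
    num_classes = probs.shape[-1]
    p = np.clip(probs, eps, 1.0)          # avoid log(0)
    ent = -(p * np.log(p)).sum(axis=-1)   # Shannon entropy per pixel
    return ent / np.log(num_classes)      # normalize to [0, 1]

# A uniform prediction is maximally uncertain (entropy 1),
# a one-hot prediction is confident (entropy near 0).
uniform = np.full((2, 2, 4), 0.25)
print(entropy_map(uniform))  # all values 1.0
```

Confident (low-entropy) maps resemble source-domain behavior, which is why minimizing target entropy adversarially at the output level needs no target-domain segmentation labels.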