Selective Adversarial Adaptation-Based Cross-Scene Change Detection Framework in Remote Sensing Images

2020 
Supervised change detection methods face a major challenge when the current scene (target domain) is fully unlabeled. In remote sensing, it is common to have sufficient labels in other scenes (source domains) with different but related data distributions. In this article, we detect changes in the target domain with the help of prior knowledge learned from multiple source domains. To this end, we propose a change detection framework based on selective adversarial adaptation, in which the adaptation between the multisource and target domains is performed by two domain discriminators. First, the first domain discriminator treats each scene as an individual domain and identifies the domain to which each input sample belongs. According to its output, a subset of important samples is selected from the multisource domains to train a deep neural network (DNN)-based change detection model; as a result, positive transfer is enhanced while negative transfer is alleviated. Second, the second domain discriminator treats all selected samples as coming from a single domain, and adversarial learning is introduced to align the distributions of the selected source samples and the target ones, further adapting the knowledge of change from the source domains to the target one. At the fine-tuning stage, target samples with reliable labels and the selected source samples are used to jointly fine-tune the change detection model. Because the target domain is fully unlabeled, homogeneity- and boundary-based strategies are exploited to make the pseudolabels derived from a preclassification map reliable. The proposed method is evaluated on three SAR and two optical data sets, and the experimental results demonstrate its effectiveness and superiority.
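The sample-selection step driven by the first domain discriminator can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the discriminator outputs a softmax over K source domains plus the target domain, and that a source sample is kept when its target-domain probability exceeds a threshold (the function name, threshold value, and toy data are all hypothetical).

```python
import numpy as np

def select_source_samples(disc_probs, target_idx, threshold=0.2):
    """Hypothetical selection rule based on discriminator 1.

    disc_probs: (n_samples, K + 1) softmax outputs over K source
    domains plus the target domain. Source samples whose probability
    of belonging to the target domain exceeds the threshold are kept,
    which favors positive transfer and suppresses negative transfer.
    """
    target_scores = disc_probs[:, target_idx]
    return np.where(target_scores > threshold)[0]

# Toy example: 4 source samples, 2 source domains, target domain in column 2.
probs = np.array([
    [0.7, 0.2, 0.1],   # confidently source-domain 0 -> rejected
    [0.1, 0.3, 0.6],   # target-like -> selected
    [0.3, 0.3, 0.4],   # target-like -> selected
    [0.5, 0.4, 0.1],   # rejected
])
selected = select_source_samples(probs, target_idx=2)
print(selected.tolist())  # -> [1, 2]
```

The selected subset would then feed both the DNN-based change detection model and the second, adversarial discriminator that aligns the selected source distribution with the target one.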