Unsupervised Change Detection From Heterogeneous Data Based on Image Translation

2021 
Change detection (CD) from heterogeneous remote sensing images is an important and challenging problem. Images obtained from different sensors (e.g., synthetic aperture radar (SAR) and optical cameras) characterize distinct properties of the observed objects, so changes cannot be detected by directly comparing heterogeneous images. In this article, a new unsupervised change detection (USCD) method based on image translation is proposed. A cycle-consistent adversarial network (CycleGAN) is employed to learn the subimage-to-subimage mapping between the given pair of heterogeneous images (i.e., acquired before and after the event) from which changes are to be detected. One image can then be translated from its original feature space (e.g., SAR) to the other space (e.g., optical), so that the pair of images is represented in a common feature space. Pixels with similar values in the before-event image may have quite different values in the after-event image if changes occur at those locations. A difference map is therefore generated between the translated before-event image and the original after-event image, and the difference map is divided into changed and unchanged parts. However, this preliminary result is not very reliable. Significantly changed and unchanged pixel pairs are then selected from the two parts using a clustering technique (i.e., K-means). The selected pixel pairs are used to train a binary classifier, which then classifies the remaining pixel pairs to obtain the final CD results. Experimental results on different real datasets demonstrate the effectiveness of the proposed USCD method compared with several related methods.
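
As a rough illustration of the pipeline after translation, the sketch below computes the difference map, splits it with two-class K-means, selects the most confident changed and unchanged samples, and trains a binary classifier to label the remaining pixels. The CycleGAN translation step is assumed to have been performed already; the per-pixel features, the `keep_fraction` selection rule, and the random-forest classifier are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier


def detect_changes(translated_before, after, keep_fraction=0.3, seed=0):
    """Binary change map from two co-registered images that already live in a
    common feature space; each input has shape (H, W) or (H, W, C)."""
    x = np.asarray(translated_before, dtype=np.float64)
    y = np.asarray(after, dtype=np.float64)

    # 1) Difference map: per-pixel magnitude of the difference, channels averaged.
    diff = np.abs(x - y)
    if diff.ndim == 3:
        diff = diff.mean(axis=2)
    h, w = diff.shape
    d = diff.reshape(-1, 1)

    # 2) Coarse split of the difference map into changed / unchanged parts with
    #    two-class K-means; the cluster with the larger center is "changed".
    km = KMeans(n_clusters=2, n_init=10, random_state=seed).fit(d)
    changed_cluster = int(np.argmax(km.cluster_centers_.ravel()))
    coarse_changed = km.labels_ == changed_cluster

    # 3) Keep only the most confident samples from each part: the largest
    #    differences among "changed" pixels, the smallest among "unchanged".
    dv = d.ravel()
    changed_idx = np.flatnonzero(coarse_changed)
    unchanged_idx = np.flatnonzero(~coarse_changed)
    n_keep = max(1, int(keep_fraction * min(changed_idx.size, unchanged_idx.size)))
    confident_changed = changed_idx[np.argsort(dv[changed_idx])[-n_keep:]]
    confident_unchanged = unchanged_idx[np.argsort(dv[unchanged_idx])[:n_keep]]
    train_idx = np.concatenate([confident_unchanged, confident_changed])
    train_lbl = np.concatenate([np.zeros(n_keep, int), np.ones(n_keep, int)])

    # 4) Train a binary classifier on the confident pixel pairs and let it label
    #    every pixel; the features (channel-averaged values of both images plus
    #    the difference value) are an illustrative choice.
    feats = np.column_stack([
        x.reshape(dv.size, -1).mean(axis=1),
        y.reshape(dv.size, -1).mean(axis=1),
        dv,
    ])
    clf = RandomForestClassifier(n_estimators=100, random_state=seed)
    clf.fit(feats[train_idx], train_lbl)
    return clf.predict(feats).reshape(h, w).astype(bool)
```

Given a translated before-event image and the original after-event image as NumPy arrays in the common (e.g., optical) space, `detect_changes(before_translated, after)` returns a Boolean change map of the same spatial size.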