Selective Kernel Res-Attention UNet: Deep Learning for Generating Decorrelation Mask With Applications to TanDEM-X Interferograms

2021 
Decorrelation is one of the main limitations of synthetic aperture radar (SAR) interferometry. Masking decorrelated pixels is crucial for retrieving information from SAR interferograms. However, for traditional masking methods, manually drawing masks is time-consuming and may be unfeasible when decorrelation areas have complicated and blurred boundaries. With a single coherence threshold, it is also difficult, if not impossible, to mask out all decorrelated pixels without losing valid phases. Here, we propose a deep-learning segmentation network (Mask Net), based on a Selective Kernel Res-Attention UNet, for generating decorrelation masks, with applications to TanDEM-X interferograms. We conduct several experiments to determine the training strategy and parameters, including sample size, batch size, loss function, and downsampling scheme, to optimize network performance. Afterward, we compare the performance of Mask Net with that of other classical segmentation networks. Our evaluation metrics show that Mask Net outperforms the best of the other segmentation networks by 6.32% in IoU and 3.97% in F1 score, respectively. It also has the fastest inference speed, 0.4505 s on a sample size of 1024-by-1024 pixels, which is at least ∼50% faster than the other segmentation networks. We applied Mask Net to three TanDEM-X interferograms: the Kīlauea crater in Hawaii, the metropolitan region of Wuhan, and the Muztagata Glacier in China. Our results show that, compared with the coherence-threshold method, Mask Net can cleanly mask out all decorrelation regions while rarely causing loss of valid phases. It also exhibits better segmentation performance than other deep-learning segmentation networks, especially along complex decorrelation boundaries, with less computational time.
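The IoU and F1 figures quoted above are the standard overlap metrics for binary segmentation masks. As a minimal sketch (not the authors' evaluation code), the two metrics can be computed for a predicted decorrelation mask against a reference mask as follows; the function name `mask_metrics` and the toy arrays are illustrative assumptions:

```python
import numpy as np

def mask_metrics(pred, truth):
    """Compute IoU and F1 for binary decorrelation masks.

    pred, truth: arrays of 0/1 (or bool), same shape; 1 marks a
    decorrelated pixel. Illustrative helper, not the paper's code.
    """
    pred = np.asarray(pred).astype(bool)
    truth = np.asarray(truth).astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    # IoU = |pred ∩ truth| / |pred ∪ truth|; define as 1.0 when both masks are empty
    iou = inter / union if union else 1.0
    # F1 = 2*TP / (2*TP + FP + FN) = 2*|pred ∩ truth| / (|pred| + |truth|)
    total = pred.sum() + truth.sum()
    f1 = 2 * inter / total if total else 1.0
    return float(iou), float(f1)

# Toy example: 2x2 masks overlapping in one pixel
iou, f1 = mask_metrics(np.array([[1, 1], [0, 0]]),
                       np.array([[1, 0], [0, 0]]))
print(iou, f1)  # 0.5 and 2/3
```

Note that F1 (equivalently the Dice coefficient) is always at least as large as IoU for the same pair of masks, which is consistent with the smaller F1 margin than IoU margin reported in the abstract.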