Hybrid cGAN: Coupling Global and Local Features for SAR-to-Optical Image Translation

2022 
Synthetic aperture radar (SAR) has the advantage of all-weather observation, but its imaging principle, based on the backscattering of electromagnetic waves, makes its information difficult to interpret. One feasible approach is to convert SAR images into optical images, which not only improves the interpretability of SAR images but also fills the gaps in information captured by optical sensors due to weather and lighting limitations. Since the conditional generative adversarial network (cGAN) has a powerful capacity for image generation, many studies have applied it to image translation tasks. For SAR-to-optical translation, several specialized cGAN models have been proposed, but most of them struggle to process SAR images with widely varying styles and often generate images of poor quality. To this end, we propose a hybrid cGAN that combines the advantages of the convolutional neural network (CNN) and the vision transformer (ViT). Exploiting the ability of the ViT to capture long-range feature dependencies, global features are extracted and then fused with the local features extracted by the CNN to improve the representational capability of our generator. Moreover, we expand the receptive field of the residual blocks in the CNN through hierarchical convolution. A perceptual loss and a classification loss are added during training to further improve the fidelity of the generated images. Finally, we introduce a multiscale strategy into the discriminator to balance its learning ability with that of the generator. Visual and quantitative experiments are conducted against other state-of-the-art methods. The results show that our method not only achieves the best results on all evaluation metrics but also generates images that are more consistent with the human visual system. In addition, the potential of our method to process multiple types of SAR images with significant style differences is also demonstrated experimentally.
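The abstract states that the generator is trained with adversarial, perceptual, and classification loss terms combined. As a minimal sketch of how such terms are typically combined, the following hypothetical helper weights and sums the three losses; the weight values `lambda_perc` and `lambda_cls` are illustrative assumptions, not the paper's actual hyperparameters.

```python
def total_generator_loss(adv_loss, perc_loss, cls_loss,
                         lambda_perc=10.0, lambda_cls=1.0):
    """Toy combination of the three generator loss terms named in the
    abstract. The weights are hypothetical placeholders: the paper's
    actual weighting scheme is not given in the abstract."""
    return adv_loss + lambda_perc * perc_loss + lambda_cls * cls_loss
```

In practice `adv_loss` would come from the (multiscale) discriminator, `perc_loss` from comparing deep features of the generated and reference optical images, and `cls_loss` from an auxiliary classifier; here each is simply a scalar.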