Mutual-Feed Learning for Super-Resolution and Object Detection in Degraded Aerial Imagery

2022 
Resolution degradation poses a major challenge for object detection (OD) in aerial imagery. Existing methods apply super-resolution (SR) based on generative adversarial networks (GANs) to restore texture details in degraded images. However, detection results remain limited because object features in restored images still differ from those in clear images. We therefore propose a simple yet effective learning method, called mutual-feed learning (MFL), to address this problem. A closed-loop structure is designed by building a feedback connection on top of the feedforward connection between the two tasks: it delivers object spatial and feature information from OD to SR and provides restoration-enhanced images from SR to OD. Specifically, a feedback of region of interest (FROI) module is introduced to perform region-level discrimination under the guidance of object information, steering the discrimination process of SR. Furthermore, a multiscale object information (MSOI) module is developed to implement feature-level restoration by narrowing the differences in object-related features, improving the generation process of SR. OD can then be performed on the restoration-enhanced images to obtain more accurate results. Extensive experiments on the Northwestern Polytechnical University (NWPU) VHR-10, Cars Overhead With Context (COWC), and Fine-grained Object Recognition (FAIR1M) datasets show that the method achieves state-of-the-art results.
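
To make the closed-loop idea concrete, below is a minimal, hypothetical PyTorch sketch of the generator-side losses implied by the abstract: an ROI-guided, region-level adversarial term (FROI-style) and a multiscale detector-feature alignment term (MSOI-style). All module names (TinySR, TinyBackbone, PatchDisc), the ROI crop size, and the loss weights are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the mutual-feed losses; placeholder networks only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.ops import roi_align

class TinySR(nn.Module):
    """Toy 2x SR generator (stand-in for the GAN generator in the paper)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1), nn.PixelShuffle(2))
    def forward(self, x):
        return self.body(x)

class TinyBackbone(nn.Module):
    """Toy detector backbone exposing multiscale features (stand-in for OD)."""
    def __init__(self):
        super().__init__()
        self.c1 = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU())
        self.c2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
    def forward(self, x):
        f1 = self.c1(x)
        f2 = self.c2(f1)
        return [f1, f2]

class PatchDisc(nn.Module):
    """Toy discriminator applied to fixed-size ROI crops (FROI-style)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1))
    def forward(self, x):
        return self.net(x)

sr, backbone, disc = TinySR(), TinyBackbone(), PatchDisc()

lr_img = torch.rand(1, 3, 64, 64)      # degraded (low-resolution) aerial image
hr_img = torch.rand(1, 3, 128, 128)    # clear (high-resolution) reference image
# Boxes fed back from the detection branch, format [batch_idx, x1, y1, x2, y2]
# in HR coordinates (assumed feedback path OD -> SR).
rois = torch.tensor([[0., 10., 10., 60., 60.], [0., 70., 30., 120., 90.]])

restored = sr(lr_img)                  # feedforward path: SR -> OD input

# FROI-style region-level discrimination: judge realism only inside object ROIs.
fake_crops = roi_align(restored, rois, output_size=(32, 32))
real_crops = roi_align(hr_img, rois, output_size=(32, 32))
adv_loss = F.binary_cross_entropy_with_logits(
    disc(fake_crops), torch.ones(len(rois), 1))

# MSOI-style feature-level restoration: narrow the gap between multiscale
# detector features of the restored image and those of the clear image.
feat_fake = backbone(restored)
feat_real = backbone(hr_img)
msoi_loss = sum(F.l1_loss(a, b.detach()) for a, b in zip(feat_fake, feat_real))

# Only the generator-side objective is shown; discriminator and detector
# updates are omitted. The 1.0 weight is an arbitrary choice for the sketch.
gen_loss = adv_loss + 1.0 * msoi_loss
gen_loss.backward()
```

In this sketch, the feedback connection is realized simply by passing detection ROIs into `roi_align`, so that both the adversarial and feature-alignment terms focus on object regions rather than the whole image; the real FROI and MSOI modules in the paper may differ in structure and loss design.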