Aligning Discriminative and Representative Features: An Unsupervised Domain Adaptation Method for Building Damage Assessment

2020 
Building assessment is highly prioritized during rescue operations and damage relief after hurricane disasters. Although machine learning has brought remarkable improvements to building damage classification, the task remains challenging because classifiers must be trained on a massive amount of labeled data. Furthermore, data labeling is labor intensive, costly, and unavailable in the immediate aftermath of a disaster. To address this issue, we propose an unsupervised domain adaptation method with aligned discriminative and representative features (ADRF), which leverages the substantial amount of labeled data from relevant disaster scenes for new classification tasks. Remote sensing imagery of different disasters is collected with different sensors, viewpoints, and acquisition times, and even at different places. Compared with the public datasets used in the domain adaptation community, remote sensing imagery is more complicated, exhibiting lower discrimination between categories and higher diversity within categories. As a result, pursuing domain invariance is a major challenge. To achieve this goal, we build a framework with ADRF that improves the discriminative and representative capability of the extracted features to facilitate the classification task. The ADRF framework consists of three pipelines: a classifier for the labeled data of the source domain and one autoencoder each for the source and target domains. The latent variables of the autoencoders are forced to follow unit Gaussian distributions by minimizing the maximum mean discrepancy (MMD), while the marginal distributions of the two domains are aligned via the MMD as well. As a case study, two challenging transfer tasks using the hurricane Sandy, Maria, and Irma datasets are investigated. Experimental results demonstrate that ADRF achieves overall accuracies of 71.6% and 84.1% in the transfer tasks from the Sandy dataset to the Maria and Irma datasets, respectively.
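
The abstract describes two uses of the MMD: pulling the autoencoder latent codes toward a unit Gaussian prior and aligning the marginal distributions of the source and target domains. The PyTorch sketch below shows how such MMD terms are typically computed with an RBF kernel; the variable names, kernel bandwidth, and loss weights are illustrative assumptions and do not reproduce the authors' implementation.

```python
import torch

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF kernel values between rows of x and y.
    sq_dists = torch.cdist(x, y, p=2.0) ** 2
    return torch.exp(-sq_dists / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Biased estimate of the squared MMD between two sample sets.
    k_xx = rbf_kernel(x, x, sigma)
    k_yy = rbf_kernel(y, y, sigma)
    k_xy = rbf_kernel(x, y, sigma)
    return k_xx.mean() + k_yy.mean() - 2.0 * k_xy.mean()

# Illustrative latent codes and prior samples (shapes are placeholders).
z_src = torch.randn(64, 128)   # latent codes from the source-domain autoencoder
z_tgt = torch.randn(64, 128)   # latent codes from the target-domain autoencoder
prior = torch.randn(64, 128)   # samples from the unit Gaussian prior

# Term 1: force both sets of latent codes toward the unit Gaussian prior.
loss_prior = mmd2(z_src, prior) + mmd2(z_tgt, prior)
# Term 2: align the marginal distributions of the two domains.
loss_align = mmd2(z_src, z_tgt)
# The weighting between the terms is an assumption for illustration.
total_mmd_loss = loss_prior + 0.1 * loss_align
```

In practice these MMD terms would be added to the classification and reconstruction losses of the three pipelines and backpropagated jointly; the code above only isolates the distribution-matching part.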