BDANet: Multiscale Convolutional Neural Network With Cross-Directional Attention for Building Damage Assessment From Satellite Images

2021 
Fast and effective responses are required when a natural disaster (e.g., an earthquake or hurricane) strikes. Building damage assessment from satellite imagery is critical before relief efforts are deployed. Given a pair of predisaster and postdisaster satellite images, building damage assessment aims to predict the extent of damage to buildings. Owing to their powerful feature-representation ability, deep neural networks have been successfully applied to building damage assessment. Most existing works simply concatenate predisaster and postdisaster images as the input to a deep neural network without considering their correlations. In this article, we propose a novel two-stage convolutional neural network for building damage assessment, called BDANet. In the first stage, a U-Net is used to extract the locations of buildings. The network weights from the first stage are then shared in the second stage for building damage assessment. In the second stage, a two-branch multiscale U-Net is employed as the backbone, where predisaster and postdisaster images are fed into the network separately. A cross-directional attention module is proposed to explore the correlations between predisaster and postdisaster images. Moreover, CutMix data augmentation is exploited to tackle the challenge of difficult classes. The proposed method achieves state-of-the-art performance on a large-scale dataset, xBD. The code is available at https://github.com/ShaneShen/BDANet-Building-Damage-Assessment.
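The CutMix augmentation mentioned in the abstract can be illustrated with a minimal NumPy sketch: a random box is cut from one image and pasted into another, and the label is mixed in proportion to the surviving area. This is a generic CutMix implementation under standard assumptions (the function name and signature are illustrative), not the authors' exact training recipe.

```python
import numpy as np

def cutmix(img_a, img_b, label_a, label_b, lam, rng):
    """Paste a random box from img_b into img_a; mix labels by area.

    img_*: arrays of shape (H, W, C); label_*: soft-label vectors;
    lam: target proportion of img_a to keep. Illustrative sketch only.
    """
    h, w = img_a.shape[:2]
    # Choose box side lengths so the cut area is roughly (1 - lam).
    cut_ratio = np.sqrt(1.0 - lam)
    ch, cw = int(h * cut_ratio), int(w * cut_ratio)
    # Random box center; clip the box to the image bounds.
    cy, cx = int(rng.integers(h)), int(rng.integers(w))
    y1, y2 = max(cy - ch // 2, 0), min(cy + ch // 2, h)
    x1, x2 = max(cx - cw // 2, 0), min(cx + cw // 2, w)
    mixed = img_a.copy()
    mixed[y1:y2, x1:x2] = img_b[y1:y2, x1:x2]
    # Recompute lambda from the actual (possibly clipped) box area.
    lam_adj = 1.0 - (y2 - y1) * (x2 - x1) / float(h * w)
    mixed_label = lam_adj * label_a + (1.0 - lam_adj) * label_b
    return mixed, mixed_label
```

In a damage-assessment setting, the same box would typically be cut from both the predisaster and postdisaster images of a sample so the two inputs stay aligned.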