Sea-land Segmentation Method for SAR Images Based on Improved BiSeNet

2020 
Sea–land segmentation is a basic step in coastline extraction and nearshore target detection. Because of poor segmentation accuracy and complicated parameter adjustment, traditional sea–land segmentation algorithms are difficult to adapt to practical applications. Convolutional neural networks, which can extract multiple hierarchical features of images, offer an alternative technical approach for sea–land segmentation tasks. Among them, BiSeNet performs well in the semantic segmentation of natural scene images and effectively balances segmentation accuracy and speed. However, for the sea–land segmentation of SAR images, BiSeNet cannot adequately extract contextual semantic and spatial information, so its segmentation results are poor. To address this problem, this study reduced the number of convolution layers in the spatial path to limit the loss of spatial information and selected the lightweight ResNet18 model as the backbone of the context path to reduce overfitting and provide a broad receptive field. In addition, edge-enhancement and loss-function strategies are proposed to improve the segmentation performance of the network in the land–sea boundary region. Experimental results on GF3 data show that the proposed method effectively improves the prediction accuracy and processing rate of the network. The segmentation accuracy and F1 score of the proposed method are 0.9889 and 0.9915, respectively, and the processing rate for SAR image slices with a resolution of 1024 × 1024 is 12.7 frames/s, both of which are better than those of other state-of-the-art approaches. Moreover, the network size is reduced by more than half compared with BiSeNet and is smaller than that of U-Net, and the network exhibits strong generalization performance.
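The abstract describes a BiSeNet-style two-path design with a shortened spatial path and a ResNet18 context path. The following is a minimal sketch of such a structure; the layer counts, channel widths, single-channel SAR input, and the simple concatenation fusion are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch of a modified BiSeNet-style network as described in the
# abstract: a reduced spatial path plus a lightweight ResNet18 context path.
# All hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18


class ConvBNReLU(nn.Module):
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))


class SpatialPath(nn.Module):
    """Reduced spatial path: two strided convolutions (output at 1/4 resolution)."""
    def __init__(self):
        super().__init__()
        self.conv1 = ConvBNReLU(1, 64)    # single-channel SAR input (assumed)
        self.conv2 = ConvBNReLU(64, 128)

    def forward(self, x):
        return self.conv2(self.conv1(x))  # B x 128 x H/4 x W/4


class ContextPath(nn.Module):
    """Context path built on a ResNet18 backbone for a broad receptive field."""
    def __init__(self):
        super().__init__()
        net = resnet18(weights=None)
        # Replace the stem convolution to accept single-channel SAR images.
        net.conv1 = nn.Conv2d(1, 64, 7, stride=2, padding=3, bias=False)
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)
        self.layers = nn.Sequential(net.layer1, net.layer2, net.layer3, net.layer4)

    def forward(self, x):
        return self.layers(self.stem(x))  # B x 512 x H/32 x W/32


class SeaLandSegNet(nn.Module):
    """Fuses the two paths and predicts a two-class (sea/land) map."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.spatial = SpatialPath()
        self.context = ContextPath()
        self.head = nn.Conv2d(128 + 512, num_classes, 1)

    def forward(self, x):
        sp = self.spatial(x)
        cx = F.interpolate(self.context(x), size=sp.shape[2:],
                           mode="bilinear", align_corners=False)
        logits = self.head(torch.cat([sp, cx], dim=1))
        return F.interpolate(logits, size=x.shape[2:],
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = SeaLandSegNet()
    out = model(torch.randn(1, 1, 1024, 1024))  # slice size quoted in the abstract
    print(out.shape)  # torch.Size([1, 2, 1024, 1024])
```

In a BiSeNet-like design, the shallow spatial path preserves fine boundary detail while the deep context path supplies semantic context; the edge-enhancement and loss-function strategies mentioned in the abstract would act on top of this structure and are not shown here.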