Attention-enabled 3D boosted convolutional neural networks for semantic CT segmentation using deep supervision

2019

A deeply supervised attention-enabled boosted convolutional neural network (DAB-CNN) is presented as a superior alternative to current state-of-the-art convolutional neural networks (CNNs) for semantic CT segmentation. Spatial attention gates (AGs) were incorporated into a novel 3D cascaded CNN framework to prioritize relevant anatomy and suppress redundancies within the network. Due to the complexity and size of the network, incremental channel boosting was used to decrease memory usage and facilitate model convergence. Deep supervision was used to encourage semantically meaningful deep features and to mitigate local-minima traps during training. The accuracy of DAB-CNN was compared to that of seven architectures: a variation of U-Net (UNet), attention-enabled U-Net (A-UNet), boosted U-Net (B-UNet), deeply supervised U-Net (D-UNet), U-Net with ResNeXt blocks (ResNeXt), life-long learning segmentation CNN (LL-CNN), and deeply supervised attention-enabled U-Net (DA-UNet). The accuracy of each method was assessed by Dice score against manually delineated contours as the gold standard. One hundred and twenty patients who had received definitive prostate radiotherapy were used in this study. Training, validation, and testing followed Kaggle competition rules, with 80 patients used for training, 20 patients for internal validation, and 20 test patients used to report final accuracies. Comparator p-values indicate that DAB-CNN achieved significantly superior Dice scores to all alternative algorithms for the prostate, rectum, and penile bulb. This study demonstrated that attention-enabled boosted CNNs using deep supervision are capable of achieving superior prediction accuracy compared to current state-of-the-art automatic segmentation methods.
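The spatial attention gating summarized above can be sketched in a minimal additive form, similar to the gates used in attention-enabled U-Nets: skip-connection features are re-weighted per voxel by a coefficient computed from the features themselves and a coarser gating signal. This is an illustrative sketch, not the paper's implementation; all weight names and shapes here are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(x, g, W_x, W_g, psi):
    """Minimal additive spatial attention gate (hypothetical sketch).

    x   : skip-connection features, shape (C_x, N) for N voxels
    g   : gating features from a coarser decoder level, shape (C_g, N)
    W_x : (C_int, C_x) linear projection of x
    W_g : (C_int, C_g) linear projection of g
    psi : (1, C_int) projection down to one attention coefficient per voxel
    """
    q = np.maximum(W_x @ x + W_g @ g, 0.0)  # ReLU(W_x x + W_g g), shape (C_int, N)
    alpha = sigmoid(psi @ q)                # per-voxel coefficient in (0, 1), shape (1, N)
    return x * alpha                        # attenuate irrelevant regions, keep shape of x

# Toy check: gated features are never amplified, only suppressed,
# because every attention coefficient lies in (0, 1).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 10))
g = rng.standard_normal((8, 10))
out = attention_gate(
    x, g,
    rng.standard_normal((6, 4)),
    rng.standard_normal((6, 8)),
    rng.standard_normal((1, 6)),
)
```

In a full 3D network the projections would be 1×1×1 convolutions and the gating signal would be upsampled to match the skip connection, but the per-voxel sigmoid re-weighting shown here is the core mechanism by which attention gates prioritize relevant anatomy.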