A deep learning-based auto-segmentation system for organs-at-risk on whole-body computed tomography images for radiation therapy.

2021 
Abstract

Background and purpose: Delineating organs at risk (OARs) on computed tomography (CT) images is an essential step in radiation therapy; however, it is notoriously time-consuming and prone to inter-observer variation. Herein, we report a deep learning-based automatic segmentation (AS) algorithm (WBNet) that can accurately and efficiently delineate all major OARs in the entire body directly on CT scans.

Materials and methods: We collected 755 CT scans of the head and neck, thorax, abdomen, and pelvis and manually delineated 50 OARs on the CT images. The CT images with contours were split into training and test sets of 505 and 250 cases, respectively, to develop and validate WBNet. The volumetric Dice similarity coefficient (DSC) and 95th-percentile Hausdorff distance (95% HD) were calculated to evaluate delineation quality for each OAR. We compared the performance of WBNet with three AS algorithms: a commercial multi-atlas-based automatic segmentation (ABAS) software and two deep learning-based AS algorithms, AnatomyNet and nnU-Net. We also evaluated the time saving and dose accuracy of WBNet.

Results: WBNet achieved average DSCs of 0.84 and 0.81 on in-house and public datasets, respectively, outperforming ABAS, AnatomyNet, and nnU-Net. WBNet reduced delineation time significantly and performed well in treatment planning, with clinically acceptable dose differences compared with manual delineation.

Conclusion: This study shows the feasibility and benefits of using WBNet in clinical practice.
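
The abstract evaluates delineation quality with the volumetric DSC and the 95th-percentile Hausdorff distance. The sketch below is a minimal, illustrative Python implementation of these two metrics for binary 3D masks; it is not taken from the paper, and the surface extraction and voxel-spacing handling reflect one common convention rather than WBNet's exact evaluation code.

```python
# Illustrative sketch (assumption, not the paper's code): volumetric DSC and
# symmetric 95th-percentile Hausdorff distance for binary 3D OAR masks.
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice_coefficient(pred: np.ndarray, gt: np.ndarray) -> float:
    """Volumetric DSC = 2|A∩B| / (|A| + |B|) for boolean masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    return 2.0 * np.logical_and(pred, gt).sum() / denom if denom > 0 else 1.0

def hd95(pred: np.ndarray, gt: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """Symmetric 95% HD between mask surfaces, in the units of `spacing` (e.g. mm)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    # Surface voxels = foreground voxels removed by one erosion step.
    surf_pred = pred & ~binary_erosion(pred)
    surf_gt = gt & ~binary_erosion(gt)
    if not surf_pred.any() or not surf_gt.any():
        return float("nan")
    # Distance of every voxel to the nearest surface voxel of the other mask.
    dist_to_gt = distance_transform_edt(~surf_gt, sampling=spacing)
    dist_to_pred = distance_transform_edt(~surf_pred, sampling=spacing)
    d_pred_to_gt = dist_to_gt[surf_pred]
    d_gt_to_pred = dist_to_pred[surf_gt]
    return float(np.percentile(np.hstack([d_pred_to_gt, d_gt_to_pred]), 95))

# Tiny usage example with synthetic masks (illustrative only).
gt = np.zeros((32, 32, 32), dtype=bool)
gt[8:24, 8:24, 8:24] = True
pred = np.zeros_like(gt)
pred[9:25, 8:24, 8:24] = True
print(f"DSC = {dice_coefficient(pred, gt):.3f}, "
      f"95% HD = {hd95(pred, gt, spacing=(1.0, 1.0, 1.0)):.1f} mm")
```

Passing the CT voxel spacing keeps the 95% HD in millimetres, which is how the metric is conventionally reported for OAR delineation.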