Automated contour propagation of the prostate from pCT to CBCT images via deep unsupervised learning.

2021 
PURPOSE: To develop and evaluate a deep unsupervised learning (DUL) framework based on a regional deformable model for automated prostate contour propagation from planning computed tomography (pCT) to cone-beam CT (CBCT).

METHODS: We introduce a DUL model that maps the prostate contour from pCT to on-treatment CBCT. The DUL framework uses a regional deformable model via narrow-band mapping to augment the conventional strategy. Two hundred fifty-one anonymized CBCT images from prostate cancer patients were retrospectively selected and divided into three sets: 180 for training, 12 for validation, and 59 for testing. The test set was divided into two groups. Group 1 contained 50 CBCT volumes, each with one physician-generated prostate contour. Group 2 contained nine CBCT images, each with prostate contours delineated by four independent physicians and a consensus contour generated with the STAPLE method. DUL-generated and physician-generated contours were compared using Dice similarity coefficients (DSCs), Hausdorff distances, and center-of-mass distances.

RESULTS: The average DSCs between DUL-based prostate contours and the reference contours were 0.83 ± 0.04 for group 1 and 0.85 ± 0.04 for the group 2 consensus. The corresponding mean center-of-mass distances were 3.52 ± 1.15 mm and 2.98 ± 1.42 mm.

CONCLUSIONS: This DUL technique can automatically propagate the prostate contour from pCT to CBCT. The results show that highly accurate contour propagation for CBCT-guided adaptive radiotherapy is achievable with deep learning.
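The abstract does not include the authors' implementation, but the three reported metrics are standard. Below is a minimal sketch, assuming binary NumPy masks on a common voxel grid and a known voxel spacing in mm, of how DSC, the Hausdorff distance, and the center-of-mass distance could be computed; the function names and SciPy-based approach are illustrative choices, not the paper's code.

```python
# Sketch of the three evaluation metrics reported in the abstract:
# Dice similarity coefficient, Hausdorff distance, and center-of-mass distance.
# Assumes 3D binary masks (propagated vs. reference contour) and voxel spacing in mm.
import numpy as np
from scipy import ndimage


def dice(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(pred, ref).sum()
    return 2.0 * inter / (pred.sum() + ref.sum())


def hausdorff(pred: np.ndarray, ref: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """Symmetric Hausdorff distance (mm) using distance transforms of the mask surfaces."""
    def surface(mask):
        # Surface voxels = mask minus its erosion.
        return np.logical_and(mask, ~ndimage.binary_erosion(mask))

    sp, sr = surface(pred), surface(ref)
    # Distance from every voxel to the nearest surface voxel of the other contour.
    dt_ref = ndimage.distance_transform_edt(~sr, sampling=spacing)
    dt_pred = ndimage.distance_transform_edt(~sp, sampling=spacing)
    return float(max(dt_ref[sp].max(), dt_pred[sr].max()))


def com_distance(pred: np.ndarray, ref: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """Euclidean distance (mm) between the centers of mass of the two masks."""
    cp = np.array(ndimage.center_of_mass(pred)) * np.array(spacing)
    cr = np.array(ndimage.center_of_mass(ref)) * np.array(spacing)
    return float(np.linalg.norm(cp - cr))
```

Under these assumptions, comparing a propagated prostate mask against a physician or STAPLE consensus mask would reproduce the kinds of values reported above (e.g., DSC near 0.85 and center-of-mass distances of a few millimeters).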