|
|
Application of deep learning in automatic segmentation of clinical target volume in brachytherapy after surgery for endometrial carcinoma |
XUE Xian1, WANG Kaiyue2, LIANG Dazhu3, DING Jingjing4, JIANG Ping2, SUN Quanfu1, CHENG Jinsheng1, DAI Xiangkun4, FU Xiaosha5, ZHU Jingyang6, ZHOU Fugen7 |
1. National Institute for Radiological Protection, Chinese Center for Disease Control and Prevention (CDC), Beijing 100088 China; 2. Department of Radiotherapy, Peking University Third Hospital, Beijing 100089 China; 3. Northeastern University, Shenyang 110819 China; 4. Department of Radiotherapy, Chinese People’s Liberation Army (PLA) General Hospital, Beijing 100039 China; 5. Biomedical Research Centre, Sheffield Hallam University, Sheffield S1 1WB UK; 6. Department of Radiation Oncology, Zhongcheng Cancer Center, Beijing 100160 China; 7. Beihang University, Beijing 100083 China
|
|
Abstract Objective To evaluate the application of three deep learning algorithms in automatic segmentation of clinical target volumes (CTVs) in high-dose-rate brachytherapy after surgery for endometrial carcinoma. Methods A dataset comprising computed tomography scans from 306 post-surgery patients with endometrial carcinoma was divided into three subsets: 246 cases for training, 30 for validation, and 30 for testing. Three deep convolutional neural network models, 3D U-Net, 3D Res U-Net, and V-Net, were compared for CTV segmentation. Four commonly used quantitative metrics were employed: the Dice similarity coefficient (DSC), Hausdorff distance (HD), 95th percentile of Hausdorff distance (HD95), and Intersection over Union (IoU). Results On the test set, 3D U-Net, 3D Res U-Net, and V-Net achieved mean DSCs of 0.90 ± 0.07, 0.95 ± 0.06, and 0.95 ± 0.06; mean HDs of 2.51 ± 1.70, 0.96 ± 1.01, and 0.98 ± 0.95 mm; mean HD95s of 1.33 ± 1.02, 0.65 ± 0.91, and 0.40 ± 0.72 mm; and mean IoUs of 0.85 ± 0.11, 0.91 ± 0.09, and 0.92 ± 0.09, respectively. Segmentation by V-Net was comparable to that performed by experienced radiation oncologists, and CTV segmentation took less than 3.2 s per case, substantially reducing clinicians' working time. Conclusion V-Net outperformed the other models in CTV segmentation as indicated by both quantitative metrics and clinician assessment. The method is also highly consistent with the ground truth, reducing inter-observer variability and treatment time.
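The overlap metrics used in the abstract (DSC and IoU) can be sketched as follows. This is a minimal illustration, not the authors' evaluation code: it assumes the predicted and ground-truth CTVs are available as binary NumPy masks of identical shape; the surface-distance metrics (HD and HD95) additionally require boundary extraction and are typically computed with dedicated medical-imaging libraries.

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return 2.0 * intersection / (pred.sum() + gt.sum())

def intersection_over_union(pred, gt):
    """Intersection over Union (Jaccard index): |A∩B| / |A∪B| for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return intersection / union
```

Both metrics range from 0 (no overlap) to 1 (perfect overlap); DSC is always at least as large as IoU for the same pair of masks, which is consistent with the reported values (e.g., V-Net: DSC 0.95 vs. IoU 0.92).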
|
Received: 12 March 2024
|
|
|
|
|
|
|
|
|