Room: Stars at Night Ballroom 2-3
Purpose: MR-guided radiotherapy provides images not only for patient positioning but also for online adaptive radiotherapy. Accurate delineation of organs-at-risk (OARs) on head-and-neck (H&N) MR images is valuable for both initial and adaptive treatment planning, but manual contouring is laborious and inconsistent. A novel method based on a generative adversarial network with shape constraint (SC-GAN) is developed for fully automated segmentation of H&N OARs on low-field MRI.
Methods: A deeply supervised, fully convolutional DenseNet is employed as the segmentation network for voxel-wise prediction. A CNN-based discriminator network is then utilized to correct prediction errors and image-level inconsistency between the prediction and the ground truth. An additional shape representation loss between the prediction and the ground truth, computed by a shape representation model (SRM), is integrated into the objective function to reduce false positives and constrain the predicted shapes. The proposed segmentation method was evaluated on 25 0.35T MR image sets acquired on an MR-guided radiotherapy system. The performance of the proposed SC-GAN was compared with GAN alone and with GAN plus the SRM but without the DenseNet (GAN-SRM) to quantify the respective contributions of the shape constraint and the DenseNet.
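The abstract does not give the exact form of the objective, but the described combination of a voxel-wise segmentation loss, an adversarial term from the discriminator, and a shape representation loss can be sketched as a weighted sum. The following NumPy sketch is illustrative only: the weighting factors, the binary (two-class) formulation, and the latent shape codes from a pretrained shape encoder are all assumptions, not details from the paper.

```python
import numpy as np

def segmentation_loss(pred, target, eps=1e-7):
    # Voxel-wise binary cross-entropy between predicted probabilities
    # and the ground-truth mask (binary formulation is an assumption).
    pred = np.clip(pred, eps, 1 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def adversarial_loss(disc_score_on_pred, eps=1e-7):
    # Generator-side GAN term: push the discriminator's score on the
    # predicted mask toward "real" (1).
    return -np.log(np.clip(disc_score_on_pred, eps, 1.0))

def shape_representation_loss(pred_code, gt_code):
    # L2 distance between latent shape codes of the prediction and the
    # ground truth, e.g. from a pretrained shape encoder (assumption).
    return np.mean((pred_code - gt_code) ** 2)

def sc_gan_objective(pred, target, disc_score, pred_code, gt_code,
                     lambda_adv=0.1, lambda_shape=0.5):
    # Weighted sum of the three terms; the weights are hypothetical.
    return (segmentation_loss(pred, target)
            + lambda_adv * adversarial_loss(disc_score)
            + lambda_shape * shape_representation_loss(pred_code, gt_code))
```

In this sketch the shape term penalizes predictions whose encoded shape deviates from the ground-truth shape, which is one common way a shape constraint suppresses implausible false-positive regions.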
Results: The following average Dice indices were obtained using SC-GAN: 0.916 (brainstem), 0.589 (optic chiasm), 0.816 (mandible), 0.703 (optic nerves), 0.799 (larynx), 0.706 (pharynx), and 0.845 (parotid glands). The average surface distances ranged from 0.68 mm (brainstem) to 1.70 mm (larynx). SC-GAN outperformed GAN-SRM, which in turn was more accurate than GAN alone. The segmentation time for one patient was 14 seconds on a single GPU.
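The two evaluation metrics reported above are standard and can be sketched directly. This minimal NumPy version assumes binary masks for the Dice index and surface point sets (N x 3 arrays of coordinates in mm) for the average surface distance; the exact surface extraction used in the study is not specified in the abstract.

```python
import numpy as np

def dice_index(pred, gt):
    # Dice similarity coefficient between two binary masks:
    # 2|A ∩ B| / (|A| + |B|).
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * inter / denom if denom else 1.0

def average_surface_distance(surf_a, surf_b):
    # Symmetric mean of nearest-neighbor distances between two surface
    # point sets (brute-force; fine for illustration, not for large meshes).
    d = np.linalg.norm(surf_a[:, None, :] - surf_b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())
```

For example, two masks that overlap on one of two labeled voxels each yield a Dice index of 0.5, and identical masks yield 1.0.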
Conclusion: With the novel segmentation method, we showed that low-field MR images acquired on an MR-guided radiotherapy system can support accurate, fully automated segmentation of both bony and soft-tissue OARs for adaptive radiotherapy.
Funding Support, Disclosures, and Conflict of Interest: This work was supported by the CSC Chinese Government Scholarship and NIH grant R01CA188300.