Purpose: MRI allows accurate and reliable delineation of targets and soft-tissue organs for many disease sites in radiation therapy owing to its superior soft-tissue contrast. Organ-at-risk (OAR) delineation, however, is labor-intensive, time-consuming, and subjective. This study aims to develop a deep learning-based multi-organ MRI segmentation method to address this issue during the treatment planning process for head-and-neck (HN) cancer radiotherapy.
Methods: We propose a novel region-based convolutional neural network (R-CNN) that is supervised by a mask derived from pyramid features and the estimated volumes-of-interest (VOIs), using a deep attention feature pyramid network (DAFPN) as the backbone. The location and shape of each VOI are predicted by the DAFPN combined with a region proposal network (RPN), while features are extracted by the DAFPN followed by a pyramid feature collection module. The final delineation of OARs is obtained from the VOI information (location and shape) and the feature maps. MR images of 45 HN cancer patients, with manual contours drawn by experienced physicians serving as ground truth, were used to train and test the proposed model, which was evaluated with leave-one-out cross-validation.
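The DAFPN and pyramid feature collection module are not reproduced here; as a rough illustration of the overall pipeline the Methods describe (a pyramid backbone feeding a region proposal network and a per-VOI mask head), the sketch below uses torchvision's off-the-shelf Mask R-CNN with a ResNet-50 FPN backbone as a stand-in for the authors' attention-enhanced network. The slice size, class count (background plus 8 OARs), label mapping, and target formatting are assumptions for illustration only.

# Stand-in for the mask-supervised R-CNN pipeline described above: a standard
# Mask R-CNN with a ResNet-50 FPN backbone (NOT the authors' DAFPN model).
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

NUM_CLASSES = 1 + 8  # background + 8 HN OARs (assumed label mapping)
model = maskrcnn_resnet50_fpn(num_classes=NUM_CLASSES)

# Training input: a list of 3-channel images (an MR slice replicated across
# channels here) and, per image, the VOI boxes, organ labels, and binary masks.
images = [torch.rand(3, 256, 256)]
masks = torch.zeros(1, 256, 256, dtype=torch.uint8)
masks[0, 80:150, 60:120] = 1                              # toy organ mask
targets = [{
    "boxes": torch.tensor([[60.0, 80.0, 120.0, 150.0]]),  # VOI as [x1, y1, x2, y2]
    "labels": torch.tensor([1]),                           # e.g. 1 = left parotid (assumed)
    "masks": masks,
}]

model.train()
losses = model(images, targets)       # dict of RPN, box, and mask losses
sum(losses.values()).backward()

# Inference: per-image dicts of predicted boxes, labels, scores, and soft masks.
model.eval()
with torch.no_grad():
    predictions = model(images)
print(predictions[0]["masks"].shape)  # (N, 1, 256, 256)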
Results: The mean Dice similarity coefficients for the left parotid, right parotid, mandible, oral cavity, spinal cord, esophagus, larynx, and pharynx were 0.82±0.04, 0.81±0.04, 0.85±0.04, 0.89±0.03, 0.89±0.02, 0.84±0.04, 0.79±0.06, and 0.85±0.06, respectively. Once trained, the model segments all OARs within 10 seconds.
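The Dice similarity coefficient (DSC) reported above measures the overlap between a predicted mask A and the manual contour B as 2|A∩B|/(|A|+|B|). The sketch below shows one plausible way to compute it on binary masks; the array sizes and example contours are purely illustrative.

# Illustrative DSC computation for a single organ on binary 2-D masks.
import numpy as np

def dice_coefficient(pred, truth, eps=1e-7):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Toy example: predicted vs. manually contoured left parotid on one slice.
pred_mask = np.zeros((256, 256), dtype=np.uint8)
manual_mask = np.zeros((256, 256), dtype=np.uint8)
pred_mask[80:150, 60:120] = 1
manual_mask[85:150, 60:125] = 1
print(f"DSC = {dice_coefficient(pred_mask, manual_mask):.3f}")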
Conclusion: We have proposed and investigated a novel, fully automatic deep learning-based multi-organ segmentation algorithm for MRI of HN cancer patients. Accurate HN OAR delineation supports further development of an MRI-only radiotherapy workflow for HN cancer.