Purpose: To develop and improve fully automated multimodality registration in the head and neck using deep-learning-derived images and contours.
Methods: Paired CT-MR images with ground-truth contours on each modality were acquired for 25 head-and-neck patients. Nineteen of these patients were used to train one neural network to synthesize CTs from MR images, and another network to generate automatic contours on both the MR and CT images. The contours were brainstem, chiasm, cord, larynx, mandible, optic nerve, parotids, and pharynx. The remaining six patients were used for testing. For each of the six testing MR images, we selected a separate CT image with a different head pose (CT-unaligned). A distance-transform map was calculated from the binarized contours, which were automatically segmented by the deep-learning neural network. Registrations were performed in both directions between the MR and CT-unaligned images, as well as between the synthetic CT and CT-unaligned images, simultaneously registering the images and their contour-derived distance transforms. Deformable registrations used a multi-resolution B-spline, multi-metric approach, with mutual information for the image volumes and mean squared error for the contours. A landmark analysis was performed using the Euclidean error between target and deformed-source landmarks. The 11 landmarks were the seven cervical vertebrae, the dens of C2, the eyes, the chin, and the nose.
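The distance-transform step above can be sketched as follows. This is a minimal illustration, not the study's implementation: it assumes a binarized contour mask as a NumPy array and uses SciPy's Euclidean distance transform, where the `spacing` argument stands in for voxel size in mm.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def contour_distance_map(mask, spacing=(1.0, 1.0, 1.0)):
    # For every voxel, the Euclidean distance (in mm, via `spacing`)
    # to the nearest contour voxel; zero on the contour itself.
    # distance_transform_edt measures distance to the nearest zero,
    # so we invert the mask to make contour voxels the zeros.
    return distance_transform_edt(~mask.astype(bool), sampling=spacing)

# Toy 3-D binary mask with a single contour voxel at the center
# (hypothetical data, for illustration only).
mask = np.zeros((5, 5, 5), dtype=np.uint8)
mask[2, 2, 2] = 1
dmap = contour_distance_map(mask)
```

A registration metric such as mean squared error can then be evaluated directly on these smooth distance maps, which gives a gradient even where the binary masks do not overlap.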
Results: The average landmark error between the rigidly aligned MR and CT-unaligned images was 13 mm. Deformable registration reduced this error to 11 mm (MR-to-CT direction) and 8 mm (CT-to-MR direction). Replacing the MR with the synthetic CT and adding the contours improved the error by 5 mm in the MR-to-CT direction (11 mm vs. 6 mm) and by 2 mm in the CT-to-MR direction (8 mm vs. 6 mm). The nose had the largest landmark error.
Conclusion: We demonstrated a fully automated deep-learning approach that significantly improves multimodality deformable registration accuracy in the head and neck. Future work will aim to register the contours as separate volumes.
Funding Support, Disclosures, and Conflict of Interest: NIH R44CA183390, NIH R01CA188300