Room: AAPM ePoster Library
Purpose: Deformable Image Registration (DIR) suffers from three main challenges: (1) low registration accuracy, (2) dependence on tedious manual parameter tuning, and (3) lack of robustness to data variation. This work proposes and validates a novel DIR method that uses neural networks to combine segmentation and deformation into a seamless pipeline.
Methods: Head-and-neck CT images were acquired from the TCIA database (409 for training and 64 for testing). A previously trained segmentation network was used to generate 18 different contours for each volume. The network-derived body contour was used to automatically remove background equipment. Images were resized to 128x128x128 and normalized to [0, 1]. For each training iteration, randomly selected source and target images (along with their contours) were concatenated as input to the network, and the output was a deformation vector field. The network architecture was a 3D U-Net based on prior work with VoxelMorph. A custom Dice loss was incorporated to leverage the automated segmentations and to give the user a way to weight different organs by segmentation confidence and clinical priority. Total loss was a combination of weighted Dice, flow-field smoothness (L2), and deformed-source-to-target similarity (normalized cross-correlation).
Results: Training for 200 epochs took only 3 hours 10 minutes on a single GPU. Registration with the trained network took 3 seconds. Upon visualization, the network-based registrations qualitatively matched better in the neck, shoulder, and mouth when compared to a traditional B-spline registration (which took 1 minute 57 seconds per registration). The average Dice and 95% Hausdorff distance were 0.70 and 7 mm, respectively, for the larger organs. Smaller organs had poorer results, which could be due to the downsampling of the volumes from 512x512 to 128x128.
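For context, the two reported metrics can be computed as follows. This is a brute-force voxel-based sketch with hypothetical helper names, suitable for small masks; it is not the evaluation code used in the work, and production tools typically compute the 95% Hausdorff distance over surface voxels only.

```python
import numpy as np

def dice(a, b):
    # Dice similarity coefficient between two binary masks.
    a, b = a.astype(bool), b.astype(bool)
    return 2 * (a & b).sum() / (a.sum() + b.sum())

def hausdorff95(a, b, spacing=(1.0, 1.0, 1.0)):
    # 95th percentile of symmetric nearest-voxel distances (in mm if
    # `spacing` gives voxel size). Brute force: O(|a| * |b|) pairs.
    pa = np.argwhere(a) * np.asarray(spacing)
    pb = np.argwhere(b) * np.asarray(spacing)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(axis=-1))
    return float(np.percentile(np.concatenate([d.min(axis=1), d.min(axis=0)]), 95))
```

For example, two identical cubes give Dice 1.0 and HD95 0.0, while shifting one cube by a single voxel drops Dice to 2/3 and raises HD95 to the voxel spacing.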
Conclusion: We successfully created an end-to-end pipeline for segmentation and deformable registration. Future work will extend this research to multi-modality registration.
Funding Support, Disclosures, and Conflict of Interest: R44CA183390, R43CA183390, R01CA188300, R01CA230278