Multimodal Image Fusion for Prostate Biopsy and Focal Brachytherapy with Convolutional Neural Network Autosegmentation and Anatomical Landmark-Based Registration

S Sultana, A Robinson, D Song, J Lee*, Johns Hopkins University, Baltimore, MD

Presentations

(Thursday, 7/18/2019) 7:30 AM - 9:30 AM

Room: 225BCD

Purpose: To develop a deformable registration algorithm for fusing multimodal images (MRI or PET/CT) with intraoperative transrectal ultrasound (TRUS) for prostate biopsy and brachytherapy. Accurate registration of these images to intraoperative TRUS allows physicians to biopsy the target precisely and to optimize focal treatment planning.

Methods: The proposed approach was applied to PET/CT-TRUS registration. The prostate is automatically segmented on CT by a convolutional neural network based on a 3D U-Net augmented with a generative adversarial network (GAN), and on TRUS it is contoured by the physician during the routine clinical procedure. The segmented prostate masks on CT and TRUS are rigidly registered by maximizing their volume overlap, and anatomical landmarks are then extracted from the gland boundary as well as from within the gland. Deformable registration is computed by a thin-plate spline using the extracted landmarks as control points. Finally, the PET/CT images are deformed to TRUS space using the computed transformation.
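The landmark-driven deformable step can be illustrated with a small, self-contained sketch. The following is a minimal 3D thin-plate spline warp fitted to matched control points; it is not the authors' implementation, and the landmark arrays, kernel choice (U(r) = r in 3D), and regularization parameter are assumptions made for illustration only.

import numpy as np

def tps_fit(src, dst, reg=0.0):
    # Fit a 3D thin-plate spline mapping src landmarks (N,3) onto dst landmarks (N,3).
    # reg > 0 relaxes exact interpolation into approximation (assumed option).
    n = src.shape[0]
    k = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)  # 3D kernel U(r) = r
    k += reg * np.eye(n)
    p = np.hstack([np.ones((n, 1)), src])          # affine part [1, x, y, z]
    a = np.zeros((n + 4, n + 4))
    a[:n, :n] = k
    a[:n, n:] = p
    a[n:, :n] = p.T
    b = np.zeros((n + 4, 3))
    b[:n] = dst
    sol = np.linalg.solve(a, b)
    return sol[:n], sol[n:]                        # non-affine weights, affine coefficients

def tps_warp(pts, src, weights, affine):
    # Apply the fitted spline to arbitrary points (M,3), e.g. a PET/CT voxel grid or seed set.
    u = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=-1)  # (M,N) kernel matrix
    p = np.hstack([np.ones((pts.shape[0], 1)), pts])
    return u @ weights + p @ affine

# Illustrative usage with placeholder landmark coordinates (mm); in practice these
# would be the boundary and intra-gland landmarks extracted from the CT and TRUS masks.
ct_landmarks = np.random.rand(30, 3) * 50.0
trus_landmarks = ct_landmarks + np.random.randn(30, 3)
w, a = tps_fit(ct_landmarks, trus_landmarks)
warped = tps_warp(ct_landmarks, ct_landmarks, w, a)  # with reg=0 this reproduces trus_landmarks

With reg=0 the spline interpolates the landmarks exactly; a small positive reg trades exactness for smoothness, which may be preferable when landmark localization is noisy.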

Results: The proposed algorithm was evaluated on 7 prostate cancer patients treated with low-dose-rate brachytherapy. Each patient had an intraoperative TRUS and 3-4 X-ray images from which the implanted seed locations were computed. A post-implant CT was acquired one day after implantation, on which the implanted seeds were segmented semi-automatically. The post-implant CT was registered to TRUS using the proposed algorithm, and target registration errors (TREs) were computed from corresponding seed locations in the TRUS and registered CT. The mean±SD TRE was 2.01±1.43 mm, which outperforms state-of-the-art methods and is comparable to the registration accuracy obtained with manual segmentations (1.98±1.34 mm). The registration, including the autosegmentation, takes less than one minute.
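As a hedged illustration of how the seed-based TRE might be tallied (the seed coordinate arrays and their one-to-one correspondence are assumed here; the abstract does not provide them):

import numpy as np

def target_registration_error(trus_seeds, registered_ct_seeds):
    # Per-seed TRE (mm): Euclidean distance between matched seed coordinates
    # identified on TRUS and on the CT deformed onto TRUS.
    return np.linalg.norm(trus_seeds - registered_ct_seeds, axis=1)

# tre = target_registration_error(trus_seeds, registered_ct_seeds)   # (N,) array in mm
# print(f"TRE = {tre.mean():.2f} +/- {tre.std():.2f} mm")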

Conclusion: We proposed a deformable registration approach for fusing multimodal images with intraoperative TRUS. The proposed algorithm is computationally efficient and produces high-quality registration of both the prostate boundary and the internal gland, making it suitable for intraoperative use in prostate biopsy and brachytherapy.

Funding Support, Disclosures, and Conflict of Interest: This work was supported by the NIH/NCI under grant R01CA151395.

Keywords

Registration, Brachytherapy, Image Fusion

Taxonomy

IM/TH- Image registration : Multi-modality registration
