Purpose:
Ultrasound image noise and artifacts present unique challenges for traditional image analysis. Deep learning represents a new paradigm in computer vision and provides an exciting opportunity for advancements in ultrasound-guided brachytherapy procedures. We developed the first iteration of a deep-learning segmentation model to localize implanted catheters in ultrasound-based high-dose-rate prostate brachytherapy.
Methods:
The U-Net deep-learning architecture for biomedical image segmentation was used to localize implanted catheters on transrectal ultrasound images. Over 23,000 transverse images from 181 retrospective prostate brachytherapy patients were available from a single institution. Corresponding catheter labels were generated from the manually localized catheters in each treatment plan.
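For illustration only (this is not the authors' implementation), a minimal U-Net-style encoder-decoder for single-channel transverse ultrasound slices might be sketched as follows in PyTorch; the layer widths, input resolution, and single-channel catheter-mask output head are assumptions.

```python
# Minimal U-Net-style sketch (illustrative only; layer widths and input size are assumptions).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 32)        # single-channel ultrasound input
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, kernel_size=1)  # per-pixel catheter logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)

# Example: one 256x256 transverse slice -> per-pixel catheter probability map.
model = SmallUNet()
probs = torch.sigmoid(model(torch.randn(1, 1, 256, 256)))
```

The feature relied on here is the skip connections that concatenate encoder features into the decoder, which is what the original U-Net contributes for segmentation tasks with limited training data.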
Eighty percent of the dataset was used for training (11 epochs), which required 6 hours on an Intel Xeon E3-1245 3.7 GHz CPU; no GPU was used. Evaluation results were reported for the test set (the remaining 20% of the dataset). Because catheter visibility is inconsistent in the transverse view, the model prediction for each slice incorporated results from nearby correlated slices, as sketched below.
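The abstract does not specify how the split was performed or how nearby slices were combined. As one hedged sketch, the data could be split at the patient level and each slice's probability map averaged with its immediate neighbours; the patient-level split, the neighbour window size, and simple averaging are all assumptions here.

```python
# Illustrative sketch only: patient-level 80/20 split and neighbour-slice averaging.
# Whether the split was done per patient and how nearby slices were combined are assumptions.
import random
import numpy as np

def split_patients(patient_ids, train_fraction=0.8, seed=0):
    # Shuffle and split patient IDs so no patient appears in both sets.
    ids = list(patient_ids)
    random.Random(seed).shuffle(ids)
    cut = int(train_fraction * len(ids))
    return ids[:cut], ids[cut:]

def smooth_with_neighbours(slice_probs, window=1):
    # slice_probs: list of per-slice probability maps (H x W arrays) for one patient,
    # ordered along the catheter insertion direction. Averaging each slice with up to
    # `window` neighbours on either side compensates for inconsistent catheter
    # visibility on individual transverse slices.
    smoothed = []
    for i in range(len(slice_probs)):
        lo, hi = max(0, i - window), min(len(slice_probs), i + window + 1)
        smoothed.append(np.mean(slice_probs[lo:hi], axis=0))
    return smoothed
```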
Results:
Averaged across all patients, 73% of catheter predictions were within a 0.8 mm centroid-to-centroid distance of the manually localized catheter. Prediction accuracy for individual patients ranged from 58% to 88% (95% confidence interval), likely due to inconsistent ultrasound scan quality and patient-specific factors. Prediction time was 150 ms for an individual slice and less than one minute for a full patient.
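As a rough sketch of how the 0.8 mm centroid-to-centroid criterion could be evaluated; the per-catheter 2D centroid representation in mm and a prior matching of predicted to manual catheters are assumptions, not details from the abstract.

```python
# Illustrative sketch: fraction of predicted catheters whose centroid lies within
# 0.8 mm of the matched manually localized catheter centroid. The (x, y) centroid
# representation in mm and the matching step are assumptions.
import numpy as np

def within_tolerance(predicted_centroids, manual_centroids, tol_mm=0.8):
    # predicted_centroids, manual_centroids: (N, 2) arrays of matched centroids in mm.
    distances = np.linalg.norm(
        np.asarray(predicted_centroids) - np.asarray(manual_centroids), axis=1
    )
    return float(np.mean(distances <= tol_mm))

# Example: 3 of 4 catheters fall within the 0.8 mm tolerance -> 0.75
pred = [[10.0, 12.0], [20.5, 8.1], [15.2, 30.0], [40.0, 40.0]]
manual = [[10.3, 12.2], [20.4, 8.0], [15.0, 30.5], [42.0, 41.0]]
print(within_tolerance(pred, manual))
```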
Obvious erroneous results were observed, with predicted catheters located far from the prostate, suggesting that performance could improve if prior clinical information were incorporated into the model prediction.
Conclusion:
We applied an existing deep-learning segmentation architecture to ultrasound-based HDR prostate brachytherapy. The catheter localization accuracy is respectable given that no major adjustments were made to the original U-Net model. Further refinements to the model have the potential to improve performance.
Funding Support, Disclosures, and Conflict of Interest: This project was funded by the Saskatchewan Cancer Agency Operating Grant #421376.