Purpose: To improve the accuracy and efficiency of ultrasound (US)-based high-dose-rate (HDR) brachytherapy treatment for prostate cancer, we developed a deep learning-based method for automatic segmentation of treatment needles.
Methods: With institutional review board approval, US images and HDR treatment plan files containing needle positions were collected from the clinical database to construct training, validation, and testing datasets. Data processing steps, including transforming the images to polar coordinates, batching, cropping, and data augmentation, were performed for efficient training and to avoid overfitting. The needle-track segmentation network was based on U-Net, with an encoder-decoder architecture and shortcut (skip) connections; rectified linear unit (ReLU) activations, a softmax output layer, and a categorical cross-entropy loss function were used for training. A second network, based on the VGG-16 architecture, was developed to detect needle tips; mean-squared error was used as the cost function, and the network structure and parameters were optimized.
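The abstract does not include implementation details, so the following is a minimal tf.keras sketch of the two networks described above. The input shapes, network depth, filter counts, and regression-head size are illustrative placeholders rather than the authors' actual configuration, and the polar-coordinate preprocessing is assumed to have been applied already.

import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU activations, as in standard U-Net.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_track_segmenter(input_shape=(256, 256, 1), num_classes=2):
    # U-Net-style encoder-decoder with shortcut (skip) connections and a
    # per-pixel softmax over needle/background classes.
    inputs = layers.Input(shape=input_shape)
    skips, x = [], inputs
    for filters in (32, 64, 128):          # encoder
        x = conv_block(x, filters)
        skips.append(x)
        x = layers.MaxPooling2D(2)(x)
    x = conv_block(x, 256)                 # bottleneck
    for filters, skip in zip((128, 64, 32), reversed(skips)):  # decoder
        x = layers.Conv2DTranspose(filters, 2, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = conv_block(x, filters)
    outputs = layers.Conv2D(num_classes, 1, activation="softmax")(x)
    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    return model

def build_tip_detector(input_shape=(224, 224, 3)):
    # VGG-16 backbone with a small regression head that predicts the
    # needle-tip coordinates, trained with mean-squared error; grayscale
    # US frames would need to be replicated to three channels.
    backbone = tf.keras.applications.VGG16(
        include_top=False, weights=None, input_shape=input_shape)
    x = layers.Flatten()(backbone.output)
    x = layers.Dense(256, activation="relu")(x)
    coords = layers.Dense(2)(x)            # predicted (x, y) tip position
    model = Model(backbone.input, coords)
    model.compile(optimizer="adam", loss="mse")
    return model

The two tasks are kept as separate models here, mirroring the two networks described in the abstract.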
Results: Clinical data from 840 prostate HDR treatments, comprising over 80,000 US images, were utilized. Needle trajectories were successfully auto-segmented, with mean errors of 0.668 mm in the left-right (X) direction and 0.319 mm in the anterior-posterior (Y) direction compared with manual digitization. Histogram analysis showed that, along the needle trajectories, 95.4% and 99.2% of the auto-digitized points agreed with the manual points to within 2 mm in the X and Y directions, respectively. The needle-tip detection network was successfully trained, with positioning accuracies of 0.810 mm in the axial direction and 1.877 mm in the superior-inferior direction.
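As an illustration of how the agreement statistics above can be computed, the following sketch assumes paired (N, 2) arrays of auto- and manually-digitized trajectory points in millimeters; the function name, array names, and synthetic example data are hypothetical, not taken from the study.

import numpy as np

def trajectory_agreement(auto_xy, manual_xy, tol_mm=2.0):
    # auto_xy, manual_xy: (N, 2) arrays of (X, Y) points in mm.
    err = np.abs(auto_xy - manual_xy)          # per-point |error| in X and Y
    mean_err = err.mean(axis=0)                # mean error per direction
    within_tol = (err <= tol_mm).mean(axis=0)  # fraction within tolerance
    return mean_err, within_tol * 100.0        # mm, percent

# Usage with synthetic data standing in for real digitization results:
rng = np.random.default_rng(0)
manual = rng.uniform(0.0, 50.0, size=(1000, 2))
auto = manual + rng.normal(0.0, 0.5, size=(1000, 2))
(mx, my), (px, py) = trajectory_agreement(auto, manual)
print(f"Mean error: X={mx:.3f} mm, Y={my:.3f} mm; "
      f"within 2 mm: X={px:.1f}%, Y={py:.1f}%")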
Conclusion: The proposed deep learning-based approach successfully performed automatic segmentation of HDR brachytherapy needles in US images. The method can be applied in the clinical workflow as a valuable tool to potentially improve the efficiency and quality of treatment.
Funding Support, Disclosures, and Conflict of Interest: This research was partially supported by a Varian Research Grant.