Room: AAPM ePoster Library
Purpose: To accurately estimate the proton stopping-power-ratio (SPR) with dual-energy CT imaging by employing a convolutional neural network (CNN) approach.
Methods: Data were prepared using 120 kVp CT images of prostate and head-and-neck (HN) patients as templates. Reference material and density maps were created from each template using a custom-defined density and material translation curve. 80 kVp and 150 kVp/Sn DECT image pairs were then obtained by ray-tracing simulation, and the corresponding SPR maps were calculated from the reference. Images of computational phantoms were also generated in this way and used as training data for the CNN method. A U-net model was trained with the DECT image pair as input and the SPR map as output, using either 1) 3440 patient images (ideal scenario) or 2) 1200 phantom images (realistic scenario). Both scenarios were tested on 400 prostate and 400 HN images (20 images per patient). SPR maps generated with the U-net methods were quantitatively compared with a widely used conventional parametric model.
Results: Compared with the conventional parametric model, the U-net trained with computational phantom images (realistic scenario) reduced the SPR estimation uncertainty from 1.10% to 0.71% for prostate patients and from 2.11% to 1.20% for HN patients. With the U-net trained with patient images (ideal scenario), the uncertainty was 0.32% and 0.42% for prostate and HN patients, respectively. Improvements were most pronounced in bone tissue and in the HN data, which are more prone to beam hardening because of their higher bone content.
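The percentage uncertainties above are consistent with a voxel-wise summary metric such as the one sketched below. This is an assumed root-mean-square percent-error definition for illustration only; the abstract does not specify the exact metric, and the voxel values are invented.

```python
import numpy as np

def spr_rms_percent_error(spr_est, spr_ref):
    """RMS of the per-voxel relative SPR error, in percent (assumed metric)."""
    rel = (spr_est - spr_ref) / spr_ref
    return 100.0 * np.sqrt(np.mean(rel ** 2))

# Toy voxels ranging from water-like soft tissue to bone (hypothetical values).
spr_ref = np.array([1.00, 1.04, 1.10, 1.60])
spr_est = np.array([1.01, 1.03, 1.12, 1.57])
err = spr_rms_percent_error(spr_est, spr_ref)
```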
Conclusion: Imaging artifacts arising from beam hardening are considered among the most problematic sources of error in DECT-based SPR estimation. In this study, the influence of these artifacts on range estimation was successfully mitigated in the image domain with the use of a CNN. The results suggest that CNNs have great potential to improve the accuracy of SPR estimation in proton therapy.
Funding Support, Disclosures, and Conflict of Interest: This work was financially supported by the Cancer Prevention and Research Institute of Texas grant (RP160661).