Purpose: The purpose of this study was to develop a machine-learning event classification model to exclude false coincidence prompt gamma (PG) data, partial energy deposition events, and/or scattered PG data. Compton cameras (CC) used for proton range verification face significant challenges at clinically realistic dose rates, primarily because many detected interactions, such as false double- or triple-scatter PG interactions, contribute only noise to the data and are thus not useful for reconstruction. Additionally, true coincidence PG data can be degraded by partial energy deposition in the detector or by the PG scattering within the patient (phantom) before interacting with the detector.
Methods: Using a Monte Carlo simulation of our preclinical CC and a 150 MeV clinical proton pencil beam irradiating a tissue-equivalent phantom, we generated realistic CC data that included true/false double- and triple-scatter PG interactions, full/partial energy deposition, and scattered/unscattered PGs. We then trained a fully connected deep neural network (NN) to detect and individually classify each of these three types of events within the data. The fully trained NN was then tested on PG datasets containing 150,000 events to determine its ability to correctly identify true coincidence PG events that can be used for CC image reconstruction.
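To illustrate the classification approach described above, the following is a minimal sketch of one fully connected binary event classifier written in PyTorch. The feature layout (interaction positions and deposited energies for up to three scatters, 12 inputs) and the layer sizes are illustrative assumptions, not the exact architecture used in this study.

import torch
import torch.nn as nn

# Assumed feature vector: (x, y, z, E) for up to three interactions -> 12 inputs.
class EventClassifier(nn.Module):
    def __init__(self, n_features=12, n_hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 1),  # single logit: e.g., true vs. false coincidence
        )

    def forward(self, x):
        return self.net(x)

model = EventClassifier()
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a batch of simulated CC event features with binary labels (float tensor).
def train_step(features, labels):
    optimizer.zero_grad()
    logits = model(features).squeeze(1)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch, each of the three classification tasks (true/false coincidence, full/partial energy deposition, scattered/unscattered PG) would be handled by a separate binary classifier of this form.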
Results: For identifying true/false double and triple scatter events, our NN achieved 89% accuracy. For full/partial energy deposition events, the NN also achieved 89% accuracy. For unscattered/scattered PGs, it achieved 80% accuracy.
Conclusion: Neural networks offer an encouraging method for improving the quality and usability of CC data used for image reconstruction. However, further refinement and more sophisticated machine-learning techniques, such as convolutional neural networks, could likely improve these results. We are also investigating the reconstruction improvements that better event classification can provide.
Funding Support, Disclosures, and Conflict of Interest: Funded by National Institutes of Health National Cancer Institute Award R01CA187416.
IM/TH- Image Analysis Skills (broad expertise across imaging modalities): Machine Learning