Purpose: To apply an image-similarity metric, the structural similarity index (SSIM), to electronic portal imaging device (EPID) based quality assurance (QA) images for intensity modulated radiotherapy (IMRT), and to develop a method to detect and classify delivery errors using radiomic features extracted from SSIM maps of the measured portal dose (PD) images.
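For reference, the luminance, contrast and structure sub-indices used in the analysis below follow the standard SSIM decomposition; C1, C2 and C3 are the usual small stabilizing constants:

$$
l(x,y)=\frac{2\mu_x\mu_y+C_1}{\mu_x^2+\mu_y^2+C_1},\qquad
c(x,y)=\frac{2\sigma_x\sigma_y+C_2}{\sigma_x^2+\sigma_y^2+C_2},\qquad
s(x,y)=\frac{\sigma_{xy}+C_3}{\sigma_x\sigma_y+C_3},
$$

with $\mathrm{SSIM}(x,y)=l(x,y)\,c(x,y)\,s(x,y)$ when $C_3=C_2/2$.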
Methods: Sixty sliding-window IMRT beams from three head-and-neck and four abdominal patient treatments were measured with an EPID. Four image channels, the three SSIM sub-indices (luminance, contrast and structure) and a Gaussian-transformed difference map, were calculated for each beam from the predicted and measured PD images under several scenarios: an error-free delivery, and deliveries with simulated systematic or random multileaf collimator (MLC) mispositioning or machine output (MU) errors. Radiomic features were extracted from the four channels for each case, and a logistic regression model was trained to predict the error type from these features. The model's performance was evaluated and compared with conventional gamma analysis using a 3%/3 mm criterion.
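A minimal sketch of this pipeline is given below (not the authors' code): per-pixel SSIM sub-index maps and a Gaussian-smoothed difference map are computed for one beam with a Gaussian local window, simple first-order radiomic features are pooled from the four channels, and a logistic regression classifier is fit on the per-beam feature vectors. Array names (predicted_pd, measured_pd), the window width and the feature set are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.linear_model import LogisticRegression

def ssim_channels(pred, meas, sigma=1.5, data_range=None):
    """Return luminance, contrast, structure maps and a Gaussian-filtered difference map."""
    pred = pred.astype(np.float64)
    meas = meas.astype(np.float64)
    if data_range is None:
        data_range = max(pred.max(), meas.max()) - min(pred.min(), meas.min())
    c1 = (0.01 * data_range) ** 2          # standard SSIM stabilizing constants
    c2 = (0.03 * data_range) ** 2
    c3 = c2 / 2.0

    mu_p = gaussian_filter(pred, sigma)    # local means via Gaussian window
    mu_m = gaussian_filter(meas, sigma)
    var_p = gaussian_filter(pred * pred, sigma) - mu_p ** 2
    var_m = gaussian_filter(meas * meas, sigma) - mu_m ** 2
    cov_pm = gaussian_filter(pred * meas, sigma) - mu_p * mu_m
    sd_p = np.sqrt(np.clip(var_p, 0, None))
    sd_m = np.sqrt(np.clip(var_m, 0, None))

    luminance = (2 * mu_p * mu_m + c1) / (mu_p ** 2 + mu_m ** 2 + c1)
    contrast = (2 * sd_p * sd_m + c2) / (var_p + var_m + c2)
    structure = (cov_pm + c3) / (sd_p * sd_m + c3)
    diff_map = gaussian_filter(meas - pred, sigma)   # Gaussian-transformed difference
    return luminance, contrast, structure, diff_map

def first_order_features(channel):
    """A few illustrative first-order radiomic features for one channel."""
    flat = channel.ravel()
    return [flat.mean(), flat.std(),
            np.percentile(flat, 10), np.percentile(flat, 90),
            ((flat - flat.mean()) ** 3).mean()]   # third central moment (skew-like)

def beam_feature_vector(predicted_pd, measured_pd):
    """Concatenate features from all four channels into one vector per beam."""
    channels = ssim_channels(predicted_pd, measured_pd)
    return np.concatenate([first_order_features(ch) for ch in channels])

# X: one feature vector per measured beam/scenario; y: labels such as
# "error-free", "systematic MLC", "random MLC", "MU".
# clf = LogisticRegression(max_iter=1000).fit(X, y)
```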
Results: 82% of the simulated errors in the whole test set were correctly classified. The model performed well in classifying MU, random and systematic errors, with sensitivities of 0.92, 1.0 and 0.94, respectively. However, only 8% of the error-free cases were correctly identified; 92% were misclassified as MU errors. With gamma analysis, 35.6%, 78.3% and 81.7% of the systematic, MU and random error cases, respectively, were misidentified as error-free.
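For context, a 3%/3 mm gamma passing rate of the kind used for comparison here could be reproduced, for example, with the open-source pymedphys package; this is an illustrative sketch, not necessarily the tool or settings used in this work, and the placeholder dose arrays, 1 mm pixel pitch and 10% low-dose cutoff are assumptions.

```python
import numpy as np
import pymedphys

rng = np.random.default_rng(0)
predicted_pd = rng.random((256, 256)) * 100.0                    # placeholder predicted PD image
measured_pd = predicted_pd + rng.normal(0.0, 1.0, (256, 256))    # placeholder measured PD image

y_mm = np.arange(predicted_pd.shape[0], dtype=float)             # assumed 1 mm pixel pitch
x_mm = np.arange(predicted_pd.shape[1], dtype=float)

gamma = pymedphys.gamma(
    (y_mm, x_mm), predicted_pd,
    (y_mm, x_mm), measured_pd,
    dose_percent_threshold=3,
    distance_mm_threshold=3,
    lower_percent_dose_cutoff=10,   # ignore very low-dose pixels
)
valid = ~np.isnan(gamma)
pass_rate = 100.0 * np.mean(gamma[valid] <= 1)                   # % of evaluated pixels passing
print(f"3%/3 mm gamma pass rate: {pass_rate:.1f}%")
```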
Conclusion: The feasibility of error classification in EPID-based IMRT QA using radiomic analysis of the SSIM sub-indices has been demonstrated. Compared with gamma analysis, the model performs better in both error detection and error classification. Further work will aim to improve the identification of error-free cases.