Room: AAPM ePoster Library
The RBE of x-rays and gamma radiation increases substantially with decreasing beam energy. This trend affects the efficacy of medical applications of such radiation. We developed a model, based on a survey of experimental data, that reliably predicts this trend.
In our model, the parameters α and β of a cell survival curve are simple functions of the frequency-average LET (LF) of delta-electrons. The choice of these functions was guided by a microdosimetry-based model, which justifies the use of frequency-averaging instead of the more common dose-averaging. To calculate LF, we used an innovative algorithm in which LF accounts only for those electrons that actually reach a radiation-sensitive volume (SV) within the cell. We determined the model parameters by fitting the model to 139 measured (α, β) pairs.
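The difference between frequency-averaging and dose-averaging of LET can be illustrated with a short sketch. The electron LET spectrum below is hypothetical, chosen only to show that dose-averaging weights high-LET tracks more heavily than frequency-averaging:

```python
# Hypothetical electron LET spectrum: L_i in keV/µm, n_i track counts.
let_values = [0.5, 2.0, 5.0, 10.0]   # L_i, LET of each electron group
counts = [100, 50, 20, 5]            # n_i, electrons per group

# Frequency-average LET: L_F = sum(n_i * L_i) / sum(n_i)
l_f = sum(n * L for n, L in zip(counts, let_values)) / sum(counts)

# Dose-average LET: L_D = sum(n_i * L_i**2) / sum(n_i * L_i)
l_d = (sum(n * L**2 for n, L in zip(counts, let_values))
       / sum(n * L for n, L in zip(counts, let_values)))

print(l_f, l_d)  # L_D exceeds L_F whenever the spectrum is not monoenergetic
```

For this spectrum L_F ≈ 1.7 keV/µm while L_D ≈ 4.1 keV/µm, so the two averages can differ severalfold for broad delta-electron spectra.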
The microdosimetric model has a discrete parameter that determines how α and β depend on LF. We found that the best agreement was achieved when √α and β were linear functions of LF. Because we account only for electrons that enter the SV, LF depends on the SV size. This dependence is very weak; therefore, we could only estimate that the SV diameter is on the order of 0.1-1 µm. We also found that α, β, and the α/β ratio increased with increasing LF. Explaining the high RBEs of ultra-soft x-rays (E < 10 keV) has been a challenge for many models. Our data offer a simple answer: they are caused by a high ionization density in the SV, which is represented by LF.
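The resulting survival curve can be sketched as a standard linear-quadratic model in which √α and β vary linearly with LF. The coefficients below are hypothetical placeholders, not the values fitted in this study:

```python
import math

def survival_fraction(dose, l_f, a0=0.1, a1=0.05, b0=0.01, b1=0.002):
    """Linear-quadratic survival fraction with LET-dependent parameters.

    dose : absorbed dose in Gy
    l_f  : frequency-average LET of delta-electrons, keV/µm
    a0..b1 : illustrative coefficients (not the study's fitted values)
    """
    alpha = (a0 + a1 * l_f) ** 2   # Gy^-1: sqrt(alpha) linear in L_F
    beta = b0 + b1 * l_f           # Gy^-2: beta linear in L_F
    return math.exp(-(alpha * dose + beta * dose ** 2))
```

With these placeholder coefficients, survival at a fixed dose decreases as LF increases, which is the qualitative behavior behind the higher RBE of low-energy photons.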
By combining a microdosimetric model of cell survival with an innovative method for calculating the frequency-average LET (LF), we developed a model that is consistent with extensive experimental data spanning photon energies from 0.27 keV to 1.25 MeV.
Funding Support, Disclosures, and Conflict of Interest: This study was supported by NIH/NCI grants: R01 CA225961, P30 CA016672