Purpose:
To investigate the feasibility of tracking targets in 2D fluoro projection images using a novel deep learning network.
Methods:
Our model is designed to capture the consistent respiratory motion of tumors. It is trained adversarially and consists of a generator and a discriminator; the generator has a coarse-to-fine architecture, and convolutional LSTM modules are incorporated to account for the temporal correlation between successive fluoro frames. The model was trained and tested on digital X-CAT phantoms to demonstrate feasibility. In the first dataset, 150 phantoms with different body scales, tumor positions, tumor sizes, and respiration amplitudes were generated in X-CAT, and the model was trained and tested on 110 and 30 phantoms, respectively. A second dataset of 100 phantoms with fixed body and tumor sizes but different respiration amplitudes was generated to investigate the effect of motion amplitude on tracking accuracy; for this dataset, the model was trained and tested on 80 and 20 phantoms, respectively. Tracking accuracy was quantitatively evaluated using intersection over union (IOU) and centroid-of-mass difference (COMD).
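As a rough illustration of the architecture described above, the sketch below shows a coarse-to-fine generator with a convolutional LSTM applied over successive fluoro frames, written in PyTorch. This is a minimal sketch, not the authors' implementation: the class names, channel counts, kernel sizes, and input resolution are illustrative assumptions, and the discriminator and adversarial training loop are omitted.

    import torch
    import torch.nn as nn


    class ConvLSTMCell(nn.Module):
        """A single convolutional LSTM cell operating on 2D feature maps."""

        def __init__(self, in_ch, hid_ch, kernel_size=3):
            super().__init__()
            self.hid_ch = hid_ch
            # One convolution produces the input, forget, cell, and output gates.
            self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch,
                                   kernel_size, padding=kernel_size // 2)

        def forward(self, x, state):
            h, c = state
            i, f, g, o = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
            c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
            return h, c

        def init_state(self, batch, height, width, device):
            zeros = torch.zeros(batch, self.hid_ch, height, width, device=device)
            return zeros, zeros.clone()


    class CoarseToFineGenerator(nn.Module):
        """Coarse encoder -> ConvLSTM over time -> fine decoder to a target mask."""

        def __init__(self, hid_ch=32):
            super().__init__()
            # Coarse stage: downsample each fluoro frame to a compact feature map.
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, hid_ch, 3, stride=2, padding=1), nn.ReLU())
            self.convlstm = ConvLSTMCell(hid_ch, hid_ch)
            # Fine stage: upsample back to frame resolution and predict a mask.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(hid_ch, 16, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid())

        def forward(self, frames):  # frames: (batch, time, 1, H, W)
            b, t = frames.shape[:2]
            feats = self.encoder(frames[:, 0])
            state = self.convlstm.init_state(b, feats.shape[2], feats.shape[3],
                                             frames.device)
            state = self.convlstm(feats, state)
            for step in range(1, t):
                state = self.convlstm(self.encoder(frames[:, step]), state)
            return self.decoder(state[0])  # predicted target mask for latest frame


    if __name__ == "__main__":
        clip = torch.rand(2, 8, 1, 128, 128)          # two 8-frame fluoro clips
        print(CoarseToFineGenerator()(clip).shape)    # torch.Size([2, 1, 128, 128])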
Results:
On the first dataset, the mean IOU reached 0.92, and the mean COMD was only 0.16 cm and 0.07 cm in the vertical and horizontal directions, respectively, demonstrating the generalization ability of the model. On the second dataset, with fixed body and tumor sizes, the IOU was even higher at 0.98, with mean COMDs of 0.03 cm and 0.01 cm in the vertical and horizontal directions. These results indicate that the proposed tracking is robust across respiration amplitudes and support the feasibility of the approach.
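For clarity, the two metrics quoted above can be computed as in the short NumPy sketch below. This is an illustrative example only: the binary masks, the 0.1 cm pixel spacing, and the function names are assumptions made for the example and are not taken from the study.

    import numpy as np


    def intersection_over_union(pred_mask, true_mask):
        """IOU between two binary target masks."""
        pred, true = pred_mask.astype(bool), true_mask.astype(bool)
        union = np.logical_or(pred, true).sum()
        return np.logical_and(pred, true).sum() / union if union else 1.0


    def centroid_difference_cm(pred_mask, true_mask, pixel_spacing_cm=0.1):
        """Absolute centroid difference per axis (vertical, horizontal), in cm."""
        def centroid(mask):
            rows, cols = np.nonzero(mask)
            return np.array([rows.mean(), cols.mean()])
        return np.abs(centroid(pred_mask) - centroid(true_mask)) * pixel_spacing_cm


    if __name__ == "__main__":
        truth = np.zeros((128, 128), dtype=int)
        truth[40:60, 50:70] = 1                      # 20 x 20 pixel target
        pred = np.roll(truth, shift=2, axis=0)       # 2-pixel vertical offset
        print(intersection_over_union(pred, truth))  # ~0.82
        print(centroid_difference_cm(pred, truth))   # [0.2 0. ] cm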
Conclusion:
Our study demonstrates the feasibility of using deep learning to track targets in X-ray fluoro projection images without the aid of markers. The technique can be valuable for both pre-treatment and intra-treatment real-time target verification using fluoro imaging in lung SBRT treatments.
Funding Support, Disclosures, and Conflict of Interest: Supported by NIH grants R01-CA184173 and R01-EB028324.