Room: AAPM ePoster Library
Purpose: To validate the performance of a pupil-tracking algorithm designed to monitor eye position and orientation during radiotherapy of ocular tumors.
Methods: A custom frame was developed, integrating a 2-megapixel infrared digital camera installed above the supine patient. While the patient fixated on a fixed target, the system captured eye motion and computed the deviation between the actual pupil position and its reference position established during treatment planning. The tracking images and metrics are displayed in real time on a user-friendly interface. Any deviation exceeding a pre-set tolerance is flagged, allowing the operating staff to interrupt treatment. The Python-based software was implemented with a template-matching technique from the Open Computer Vision (OpenCV) library. The algorithm was validated on 30-second video acquisitions from a volunteer cohort (n=6) spanning variations in eye color, morphology and blinking pattern. We evaluated (1) the accuracy of pupil motion tracking, (2) the optimal pupil detection threshold for best template-matching performance, and (3) the overall robustness across patients and luminosity levels.
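The core tracking step can be sketched as follows. This is an illustrative NumPy implementation of normalized cross-correlation template matching (the behavior of OpenCV's cv2.matchTemplate with the TM_CCOEFF_NORMED method); the abstract does not specify which OpenCV matching method was used, and the toy frame/template below are assumptions for demonstration only.

```python
import numpy as np

def match_template(frame, template):
    """Slide `template` over `frame` and return the best-match location
    (x, y) together with its normalized cross-correlation score, where
    1.0 means a perfect match. Brute-force sketch for clarity, not speed."""
    fh, fw = frame.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_loc = -1.0, (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_loc = score, (x, y)
    return best_loc, best_score

# Toy example: a dark "pupil" blob in a bright frame; the template is a
# crop containing the pupil edge so it has usable contrast structure.
frame = np.full((40, 40), 200.0)
frame[18:26, 10:18] = 30.0            # dark pupil region
template = frame[16:26, 8:18].copy()  # crop around the pupil boundary
loc, score = match_template(frame, template)
print(loc, round(score, 2))           # best-match corner and its score
```

In the actual system, the matched pupil location would be compared against the planning-time reference position to obtain the displacement that is checked against the tolerance.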
Results: (1) The average difference between the tracked and actual pupil positions was (0.3±0.2) mm over a wide range of pupil displacements, ensuring submillimetric positional accuracy of the algorithm. (2) The average pupil detection level reached (93±5)% when the eye was both visible and open, and (66±7)% when it was closed or hidden. A detection threshold of 75% was found to be optimal for managing eye blinking while maintaining good sensitivity. Additionally, (3) tracking performance was shown to be invariant to eye color and ambient luminosity, thanks to the near-infrared spectral range of the camera.
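The per-frame decision logic implied by these results can be sketched as below. The 75% detection threshold comes from the abstract; the 1 mm tolerance, the function name, and the three-way classification are illustrative assumptions, since the actual pre-set tolerance is treatment-specific and not stated.

```python
DETECTION_THRESHOLD = 0.75   # optimal threshold reported in the abstract
TOLERANCE_MM = 1.0           # hypothetical tolerance; real value is treatment-specific

def classify_frame(match_score, deviation_mm,
                   threshold=DETECTION_THRESHOLD, tolerance=TOLERANCE_MM):
    """Classify one video frame as 'blink', 'ok', or 'interrupt'.
    A match score below the detection threshold indicates a closed or
    hidden eye; a tracked pupil deviating beyond the positional tolerance
    is flagged so the operating staff can interrupt treatment."""
    if match_score < threshold:
        return "blink"        # eye closed or pupil hidden: no position reading
    if deviation_mm > tolerance:
        return "interrupt"    # deviation exceeds pre-set tolerance: flag it
    return "ok"

print(classify_frame(0.93, 0.3))   # open eye, small deviation -> ok
print(classify_frame(0.66, 0.0))   # score below threshold -> blink
print(classify_frame(0.95, 1.8))   # visible eye, large deviation -> interrupt
```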
Conclusion: The overall performance of our tool allows us to proceed with clinical tests of live position monitoring and to develop new tracking applications on the CyberKnife.