Room: AAPM ePoster Library
To build an automation platform for multiple steps of the radiotherapy workflow by creating a data-driven pipeline into a data lake and by centralizing workflow monitoring across different vendor systems.
The radiotherapy (RT) workflow involves multistage processes including patient consultation, simulation, segmentation, treatment planning, plan QA, treatment, and follow-up. Each stage consumes and produces a variety of data from different systems. We use a data lake approach to integrate RT and non-RT EMR/EHR data and to facilitate data flow across process stages. We leverage Big Data technologies by streaming data with Apache NiFi and by constructing an ETL (extract, transform, and load) pipeline on the in-memory computation framework Apache Spark using the FHIR protocol. The resulting curated data can be interactively analyzed and retrieved with Druid, a high-performance real-time database. On the front end, we built a workflow monitor dashboard. The end-to-end pipeline is triggered by various workflow events.
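The ETL stage described above can be illustrated with a minimal transform that flattens a FHIR resource into a curated row ready for loading into a store such as Druid. This is a sketch only: in the platform this logic would run as a Spark job over a NiFi-fed stream, and the field names and the use of an Observation resource here are illustrative assumptions, not the production schema.

```python
import json

def transform_fhir_observation(raw_json: str) -> dict:
    """Flatten a FHIR Observation resource into one curated row.

    Assumes a simplified subset of FHIR fields (subject reference,
    first coding, optional valueQuantity); column names are
    illustrative, not the actual data lake schema.
    """
    resource = json.loads(raw_json)
    if resource.get("resourceType") != "Observation":
        raise ValueError("expected a FHIR Observation resource")
    return {
        # "Patient/1234" -> "1234"
        "patient_id": resource["subject"]["reference"].split("/")[-1],
        "code": resource["code"]["coding"][0]["code"],
        "value": resource.get("valueQuantity", {}).get("value"),
        "unit": resource.get("valueQuantity", {}).get("unit"),
        # Druid ingests this as the row timestamp
        "timestamp": resource["effectiveDateTime"],
    }

# Synthetic example record (not real patient data)
raw = json.dumps({
    "resourceType": "Observation",
    "subject": {"reference": "Patient/1234"},
    "code": {"coding": [{"code": "plan-qa-status"}]},
    "valueQuantity": {"value": 1.0, "unit": "score"},
    "effectiveDateTime": "2021-07-25T12:00:00Z",
})
row = transform_fhir_observation(raw)
```

In a Spark pipeline this function would be applied per record (e.g. via a mapped transformation over a stream of FHIR payloads) before the curated rows are handed to Druid for ingestion.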
We built a vendor-neutral RT automation platform that integrates data from five sources: RT objects, EHR records, clinical notes, treatment scheduling, and task assignments. We automated workflows for segmentation QA, plan check, and plan QA. The standardized longitudinal patient records in the data lake give oncologists and physicists a complete view of the patient journey through radiotherapy treatment.
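The event-triggered automation can be sketched as a simple event router that maps workflow events to automation handlers such as segmentation QA or plan check. The event names, payload fields, and class name below are hypothetical placeholders for illustration; the actual platform's event model is not specified in this abstract.

```python
from typing import Callable, Dict, List

class WorkflowMonitor:
    """Toy event router: maps workflow event names to automation
    handlers. Event names ('contour_approved', 'plan_approved') are
    illustrative assumptions, not the platform's real event model."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        """Register a handler to run when `event` fires."""
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, payload: dict) -> int:
        """Fire an event; returns how many handlers ran."""
        handlers = self._handlers.get(event, [])
        for handler in handlers:
            handler(payload)
        return len(handlers)

# Wire two hypothetical automation steps to workflow events.
monitor = WorkflowMonitor()
audit_log = []
monitor.on("contour_approved",
           lambda p: audit_log.append(("segmentation_qa", p["patient_id"])))
monitor.on("plan_approved",
           lambda p: audit_log.append(("plan_check", p["patient_id"])))

n_fired = monitor.emit("contour_approved", {"patient_id": "1234"})
```

A dashboard front end would subscribe to the same events to display per-patient workflow status across the vendor systems.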
A data lake approach that pulls in all relevant RT information and monitors workflow events provides a uniform platform for building applications that automate RT workflows, thereby improving efficiency.
Data Acquisition, Computer Software