Purpose: We propose a deep-learning method based on a modified U-Net (a convolutional neural network for biomedical image segmentation) to improve the image quality and CT number accuracy of fast-scan low-dose cone-beam computed tomography (CBCT) for head-and-neck (HN) cancer patients.
Methods: CBCT and planning CT (pCT) pairs from 55 patients were selected. Among them, 15 pairs were acquired on the same day from HN patients who had undergone a manual adaptive re-planning process, while 40 pairs were from post-operative HN patients with minimal anatomy changes (acquired 1-2 weeks apart). CBCT and pCT were co-registered and confirmed to have <5 mm external/anatomy deviation for all 55 pairs. A 2D U-Net with 19 layers at 5 depth levels was used. A total of 2080 slice pairs from the 40 post-operative patients were used as the training dataset, 260 slice pairs from five same-day image pairs were used for validation, and 520 slice pairs from ten same-day image pairs were used for testing. Additional networks were trained using 30, 40, and 50 patient datasets, respectively, to verify the equivalent effectiveness of post-operative and same-day data pairs and the sufficiency of 40 training datasets. The enhanced CBCT (eCBCT) images were validated against pCT and quantified by the mean absolute error (MAE) of Hounsfield units (HU), signal-to-noise ratio (SNR), and structural similarity index (SSIM).
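The three evaluation metrics can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it computes global versions of MAE, SNR, and SSIM over flattened voxel lists, and the exact SNR definition, the SSIM windowing scheme, and the HU data range are assumptions.

```python
import math

def mae(img, ref):
    """Mean absolute error in HU between eCBCT and pCT voxel values."""
    return sum(abs(a - b) for a, b in zip(img, ref)) / len(img)

def snr_db(img, ref):
    """SNR in dB, treating the difference from the reference as noise
    (one common convention; the study's definition may differ)."""
    noise_power = sum((a - b) ** 2 for a, b in zip(img, ref))
    signal_power = sum(b ** 2 for b in ref)
    return 10.0 * math.log10(signal_power / noise_power)

def ssim_global(img, ref, data_range=2000.0):
    """Single-window (global) SSIM; published results typically use
    local sliding windows. data_range=2000 HU is an assumed span."""
    n = len(img)
    mu_x = sum(img) / n
    mu_y = sum(ref) / n
    var_x = sum((a - mu_x) ** 2 for a in img) / n
    var_y = sum((b - mu_y) ** 2 for b in ref) / n
    cov = sum((a - mu_x) * (b - mu_y) for a, b in zip(img, ref)) / n
    c1 = (0.01 * data_range) ** 2  # standard SSIM stabilizing constants
    c2 = (0.03 * data_range) ** 2
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

Identical images yield SSIM of 1.0 and an infinite SNR, so in practice these metrics are reported per slice or per scan and averaged over the test set.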
Results: Comparing eCBCT with the original CBCT for the 10 testing datasets, the average MAE improved from 172.73 to 49.28 HU, SNR from 8.27 to 14.25 dB, and SSIM from 0.42 to 0.85. Soft-tissue contrast was improved, with restoration of anatomical details and removal of CBCT artifacts. The image enhancement process takes 2 seconds of GPU time per scan.
Conclusion: This deep-learning method proved to be fast and effective for enhancing low-dose fast-scan CBCT, and the results demonstrate its potential utility for real-time online adaptive re-planning for HN cancer patients.
Funding Support, Disclosures, and Conflict of Interest: This work is supported in part by a scholarship (N. Yuan) from the China Scholarship Council (CSC) under Grant No. 201706080096.