
Deep Neural Network Image Fusion Without Using Training Data

L Zhu*, P Baturin, Varian Medical Systems, Palo Alto, CA

Presentations

(Tuesday, 7/16/2019) 3:45 PM - 4:15 PM

Room: Exhibit Hall | Forum 5

Purpose: To investigate the feasibility of using a Deep Neural Network (DNN) framework to fuse several digital images with inherently different spatial resolution and signal-to-noise ratio (SNR).

Methods: A Monte Carlo simulation of 6 MV X-ray beam propagation through a line-pair resolution phantom was performed, with modeled frequencies ranging from zero to ten line pairs per centimeter. The model included two image receptors producing two independent synthetic images with different spatial and noise characteristics, such as might be found in a dual-energy X-ray flat panel. The images were fused using a DNN framework with a residual encoder-decoder architecture, as sketched below.
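As a rough illustration of a training-free fusion of this kind, the PyTorch sketch below optimizes a small residual encoder-decoder directly on the two detector images, in the spirit of the deep-image-prior idea that the network structure and early stopping act as the regularizer. The layer sizes, loss weights, and optimizer settings are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of training-free DNN image fusion: a small residual
# encoder-decoder is optimized on the two input images alone, with no
# external training set. All hyperparameters below are assumptions.
import torch
import torch.nn as nn

class ResidualEncoderDecoder(nn.Module):
    def __init__(self, ch=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(2, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(ch, ch, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 1, 3, padding=1),
        )

    def forward(self, x):
        # Residual connection: the network learns a correction to the
        # plain average of the two inputs rather than the full image.
        mean_input = x.mean(dim=1, keepdim=True)
        return mean_input + self.decoder(self.encoder(x))

def fuse(img_a, img_b, iters=2000, w_a=0.5, w_b=0.5):
    """Optimize the network on the two images; the fused image is the
    network output after a fixed number of iterations (early stopping)."""
    x = torch.stack([img_a, img_b]).unsqueeze(0)  # shape (1, 2, H, W)
    net = ResidualEncoderDecoder()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(iters):
        opt.zero_grad()
        fused = net(x)
        # Data-fidelity terms pull the output toward both inputs; the
        # fixed iteration budget provides the self-regularization.
        loss = (w_a * (fused - img_a).pow(2).mean()
                + w_b * (fused - img_b).pow(2).mean())
        loss.backward()
        opt.step()
    return net(x).detach().squeeze()
```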

Results: Two synthetic images generated with the Monte Carlo simulation were combined into one using a DNN architecture that exploited the noise and spatial-resolution characteristics of the inputs. The proposed DNN-based algorithm was compared with (i) the traditional weighted-averaging approach used with multi-energy detectors and (ii) a state-of-the-art non-DNN denoising image fusion technique. The background SNR of the image produced by the DNN was 14.8, while the SNR of the same region with the averaging and denoising approaches was 12.3 and 11.1, respectively. The spatial resolution of the DNN-based image was comparable to that of the weighted-average technique but lower than in the denoised image, whose higher resolution came at the cost of SNR.
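For context, the weighted-averaging baseline and the background-SNR figure of merit quoted above can be expressed as in the sketch below; the equal weights and the ROI location are assumptions for illustration, not values from the study.

```python
# Minimal sketch (assumed, not from the paper) of the weighted-averaging
# baseline and a background-SNR metric: mean over standard deviation
# inside a uniform region of interest.
import numpy as np

def weighted_average(img_a, img_b, w=0.5):
    """Traditional multi-energy fusion: pixel-wise weighted average."""
    return w * img_a + (1.0 - w) * img_b

def background_snr(img, roi):
    """SNR of a uniform background patch selected by a tuple of slices."""
    patch = img[roi]
    return patch.mean() / patch.std()

# Example usage with a hypothetical 50x50-pixel background ROI:
# fused = weighted_average(img_a, img_b)
# snr = background_snr(fused, (slice(0, 50), slice(0, 50)))
```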

Conclusion: This study demonstrated the feasibility of using a training-free DNN architecture with X-ray input images to produce a fused image with optimal SNR and spatial-resolution characteristics. Knowledge of the physics of the imaging scenario allowed the DNN to be built without the large training set typically required. Future work will investigate other loss functions and DNN architectures with similar self-regularization properties.

Keywords

Not Applicable / None Entered.

Taxonomy

Not Applicable / None Entered.
