Room: 225BCD
Purpose: The combination of accelerated MRI scanning and deep learning has the potential to increase the value of MRI examinations through faster acquisition and faster reconstruction. Several deep networks have been used to reconstruct parallel MRI, but there is still room to improve training for better image quality. This work proposed a cascade of a Multilayer Perceptron (MLP) and a Convolutional Neural Network (CNN) to improve image quality with a reasonable amount of computational resources.
Methods: The two compartments of the cascade MLP-CNN are trained separately and then fused for reconstruction. The input layer of the MLP consists of N_y×N_coil×2 perceptrons to account for the complex data of each coil along the phase-encoding direction, and N_y outputs to match the image size. The output image of the MLP, along with the undersampled coil images, is subsequently refined by the CNN, which consists of five hidden layers. The fully sampled data serve as the reference standard for training both networks until convergence. T2W images of 90 healthy volunteers (training/validation = 50/40) were acquired on a 3.0T scanner with TR=9.3s, TE=92ms, FOV=23cm, and forty 4mm axial slices. Three variable-density undersampling schemes were tested, with uniform accelerations of 2, 3, and 4 along the phase-encoding direction plus 16 fully sampled phase-encoding lines in the k-space center.
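The cascade described above can be illustrated with a minimal toy sketch. All layer widths, activations, and the single-convolution stand-in for the five-layer CNN below are assumptions for illustration; the abstract does not specify the MLP depth or the CNN kernel sizes. The input flattens the complex multi-coil data into N_y×N_coil×2 real values, and the MLP produces N_y outputs matching one image line, which the CNN stage then refines.

```python
import numpy as np

# Toy sizes; the actual matrix sizes in the work are larger (hypothetical here).
N_y, N_coil = 8, 4

rng = np.random.default_rng(0)

# MLP stage: one hidden layer for illustration (depth is an assumption).
# Input width N_y*N_coil*2 = real+imaginary parts of each coil sample.
W1 = rng.standard_normal((N_y * N_coil * 2, 64)) * 0.1
b1 = np.zeros(64)
W2 = rng.standard_normal((64, N_y)) * 0.1
b2 = np.zeros(N_y)

def mlp(x):
    h = np.maximum(0, x @ W1 + b1)  # ReLU hidden layer (assumed activation)
    return h @ W2 + b2              # N_y outputs, matching the image line

# CNN stage stand-in: a single 1-D smoothing convolution "polishing" the
# MLP output (the real network has five hidden layers).
kernel = np.array([0.25, 0.5, 0.25])

def cnn_refine(line):
    return np.convolve(line, kernel, mode="same")

# Forward pass of the fused cascade on one flattened multi-coil input.
coil_data = rng.standard_normal(N_y * N_coil * 2)
image_line = cnn_refine(mlp(coil_data))
print(image_line.shape)
```

In training, each stage would be fit separately against the fully sampled reference before fusing, mirroring the two-step procedure described above.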
Results: The reconstructions of the proposed method were compared with those of four other deep networks. The images from the present work, using the MLP output together with the additional coil data, achieved the highest peak SNR among all settings (pSNR = 43.6dB (R=2), 40.4dB (R=3), and 40.4dB (R=4)).
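The abstract does not state its exact pSNR definition; a minimal sketch assuming the standard form, pSNR = 20·log10(max(reference)/RMSE), is:

```python
import numpy as np

def psnr(reference, reconstruction):
    """Peak SNR in dB between a reference image and a reconstruction.

    Uses the common definition 20*log10(peak / RMSE); the peak is taken
    from the reference image (an assumption, conventions vary).
    """
    rmse = np.sqrt(np.mean((reference - reconstruction) ** 2))
    return 20.0 * np.log10(reference.max() / rmse)

# Example: a uniform 1% error on a unit-intensity image gives 40 dB.
ref = np.ones((4, 4))
rec = ref + 0.01
print(round(psnr(ref, rec), 1))  # 40.0
```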
Conclusion: The present work proposed a cascade deep network combining an MLP and a CNN. This deeper network draws on the capabilities of the two individual networks, accelerating training convergence while improving image quality.
Funding Support, Disclosures, and Conflict of Interest: Tzu-Cheng Chao receives research funding from Philips Healthcare