Room: Track 1
Purpose: train a deep neural network to auto-segment lung tumors on T2-weighted MRI without expert MRI segmentations by leveraging unrelated, expert-segmented CT datasets.
Methods: developed a new deep neural network-based method for lung tumor segmentation from T2-weighted (T2w) MRI volumes. Our approach jointly trains a generative adversarial network and a segmentation network to simultaneously transform CT into T2w MRI image sets and learn a segmentation model from the transformed T2w MR images. Our main contribution is a joint density structure discriminator that combines the translated images with the computed segmentation probability maps to constrain CT-to-MRI translation. This discriminator penalizes translations that fail to preserve the structures of interest for segmentation on the transformed MRI. As a result, our approach reduces negative modality transfer, which can adversely impact segmentation performance on the target modality when training uses only the transformed MR image sets. We trained our model using 9,696 unlabeled 2D MR image patches of size 256 × 256 extracted from 35 T2w MRI scans, combined with 32,000 CT slices obtained from 377 scans of patients with non-small cell lung cancer available from The Cancer Imaging Archive. Independent testing was performed on 40 T2w MRI scans obtained from 22 patients. Performance comparisons were made against multiple state-of-the-art methods using the Dice similarity coefficient (DSC) and Hausdorff distance at the 95th percentile (HD95).
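The core idea of the joint density structure discriminator is that the discriminator sees the translated image and its segmentation probability map together, so a translation that degrades tumor structure is scored as unrealistic. A minimal PyTorch sketch of this idea follows; the class name, channel counts, and PatchGAN-style layer layout are illustrative assumptions, not the authors' actual architecture.

```python
import torch
import torch.nn as nn

class JointDensityDiscriminator(nn.Module):
    """Sketch of a discriminator over the joint distribution of a
    (translated) image and its segmentation probability map, so that
    translations failing to preserve segmentation-relevant structures
    are penalized. Hypothetical architecture, not the authors' code."""

    def __init__(self, image_channels: int = 1, prob_channels: int = 1):
        super().__init__()
        # Joint input: image channels concatenated with probability-map channels
        self.net = nn.Sequential(
            nn.Conv2d(image_channels + prob_channels, 64, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, stride=2, padding=1),  # PatchGAN-style real/fake map
        )

    def forward(self, image: torch.Tensor, prob_map: torch.Tensor) -> torch.Tensor:
        # Concatenating along the channel axis forms the joint density input
        return self.net(torch.cat([image, prob_map], dim=1))
```

In training, this discriminator's adversarial loss would be combined with the usual translation and segmentation losses, pushing the generator toward translations whose probability maps remain plausible.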
Results: our approach produced the best 3D segmentation accuracy on the testing set, with a DSC of 0.76 ± 0.13 and an HD95 of 8.64 ± 4.58 mm. The next best state-of-the-art method, which uses synergistic image and feature adaptation, achieved a significantly lower (p = 0.00025) accuracy of DSC 0.72 ± 0.15 and HD95 11.97 ± 6.12 mm.
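For reference, the two evaluation metrics reported above can be computed from binary masks as sketched below in NumPy. This is a simplified illustration: it measures distances between all foreground voxels rather than extracted surface points, and uses brute-force pairwise distances, which is only practical for small masks.

```python
import numpy as np

def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def hd95(pred: np.ndarray, gt: np.ndarray, spacing: float = 1.0) -> float:
    """Hausdorff distance at the 95th percentile (simplified: uses all
    foreground points instead of surface points, brute-force distances)."""
    p = np.argwhere(pred) * spacing   # foreground coordinates, scaled to mm
    g = np.argwhere(gt) * spacing
    d = np.linalg.norm(p[:, None, :] - g[None, :, :], axis=-1)
    d_pg = d.min(axis=1)              # each pred point to its nearest gt point
    d_gp = d.min(axis=0)              # each gt point to its nearest pred point
    return max(np.percentile(d_pg, 95), np.percentile(d_gp, 95))
```

Production evaluations typically use surface-based HD95 (e.g. via distance transforms) with anisotropic voxel spacing, but the quantities measured are the same.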
Conclusion: found that adversarial training of a joint density discriminator, which combines translated images with task-relevant segmentation probability maps to constrain unsupervised translation, resulted in the most accurate segmentation.