Paper Title

Unsupervised Domain Adaptation via CycleGAN for White Matter Hyperintensity Segmentation in Multicenter MR Images

Paper Authors

Julian Alberto Palladino, Diego Fernandez Slezak, Enzo Ferrante

Paper Abstract

Automatic segmentation of white matter hyperintensities in magnetic resonance images is of paramount clinical and research importance. Quantification of these lesions serves as a predictor of risk of stroke, dementia and mortality. In recent years, convolutional neural networks (CNNs) specifically tailored for biomedical image segmentation have outperformed all previous techniques in this task. However, they are extremely data-dependent, and maintain good performance only when the data distribution between training and test datasets remains unchanged. When that distribution changes but we still aim to perform the same task, we face a domain adaptation problem (e.g. using a different MR machine or different acquisition parameters for training and test data). In this work, we explore the use of cycle-consistent adversarial networks (CycleGAN) to perform unsupervised domain adaptation on multicenter MR images with brain lesions. We aim to learn a mapping function to transform volumetric MR images between domains, which are characterized by different medical centers and MR machines with varying brand, model and configuration parameters. Our experiments show that CycleGAN allows us to reduce the Jensen-Shannon divergence between MR domains, enabling automatic segmentation with CNN models on domains where no labeled data was available.
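
As an illustration rather than code from the paper, the reduction in Jensen-Shannon divergence mentioned in the abstract can be estimated from the voxel-intensity histograms of two MR domains, computed before and after CycleGAN translation. The sketch below assumes NumPy and SciPy; the arrays domain_a, domain_b and translated_a are hypothetical placeholders for voxel intensities from a source domain, a target domain, and the CycleGAN-translated source domain.

```python
# Minimal sketch (not from the paper): Jensen-Shannon divergence between
# the voxel-intensity distributions of two MR domains.
import numpy as np
from scipy.spatial.distance import jensenshannon


def js_divergence(volume_a: np.ndarray, volume_b: np.ndarray, bins: int = 256) -> float:
    """JS divergence between intensity histograms of two MR volumes on a shared bin grid."""
    lo = min(volume_a.min(), volume_b.min())
    hi = max(volume_a.max(), volume_b.max())
    hist_a, _ = np.histogram(volume_a, bins=bins, range=(lo, hi), density=True)
    hist_b, _ = np.histogram(volume_b, bins=bins, range=(lo, hi), density=True)
    # scipy returns the JS *distance* (square root of the divergence), so square it.
    return jensenshannon(hist_a, hist_b) ** 2


# Hypothetical usage with numpy arrays of voxel intensities:
# print(js_divergence(domain_a, domain_b))      # divergence before adaptation
# print(js_divergence(translated_a, domain_b))  # expected to be lower after CycleGAN translation
```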
