Paper Title

Accelerated MRI with Un-trained Neural Networks

Paper Authors

Mohammad Zalbagi Darestani, Reinhard Heckel

Paper Abstract

Convolutional Neural Networks (CNNs) are highly effective for image reconstruction problems. Typically, CNNs are trained on large amounts of training images. Recently, however, un-trained CNNs such as the Deep Image Prior and Deep Decoder have achieved excellent performance for image reconstruction problems such as denoising and inpainting, without using any training data. Motivated by this development, we address the reconstruction problem arising in accelerated MRI with un-trained neural networks. We propose a highly optimized un-trained recovery approach based on a variation of the Deep Decoder and show that it significantly outperforms other un-trained methods, in particular sparsity-based classical compressed sensing methods and naive applications of un-trained neural networks. We also compare performance (both in terms of reconstruction accuracy and computational cost) in an ideal setup for trained methods, specifically on the fastMRI dataset, where the training and test data come from the same distribution. We find that our un-trained algorithm achieves similar performance to a baseline trained neural network, but a state-of-the-art trained network outperforms the un-trained one. Finally, we perform a comparison on a non-ideal setup where the train and test distributions are slightly different, and find that our un-trained method achieves similar performance to a state-of-the-art accelerated MRI reconstruction method.
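To make the un-trained idea concrete, below is a minimal PyTorch sketch of fitting an under-parameterized decoder network to a single undersampled k-space measurement, in the spirit of the Deep Decoder family the abstract refers to. The image size, random undersampling mask, synthetic data, and the TinyDecoder architecture are illustrative assumptions for this sketch, not the paper's exact architecture, masking scheme, or hyperparameters.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- Illustrative problem setup (sizes, mask, and data are assumptions) ---
H = W = 64
mask = (torch.rand(H, W) < 0.25).float()  # random undersampling mask
x_true = torch.randn(H, W)                # stand-in for a ground-truth image
y = torch.fft.fft2(x_true) * mask         # undersampled k-space measurements

# A tiny under-parameterized decoder in the Deep Decoder style:
# a fixed random input followed by upsampling and 1x1 convolutions.
class TinyDecoder(nn.Module):
    def __init__(self, channels=64, in_size=8, out_size=64):
        super().__init__()
        self.register_buffer("seed", torch.randn(1, channels, in_size, in_size))
        layers, size = [], in_size
        while size < out_size:
            layers += [
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                nn.Conv2d(channels, channels, kernel_size=1),
                nn.ReLU(),
                nn.BatchNorm2d(channels),
            ]
            size *= 2
        layers.append(nn.Conv2d(channels, 1, kernel_size=1))
        self.net = nn.Sequential(*layers)

    def forward(self):
        return self.net(self.seed).squeeze()

net = TinyDecoder(out_size=W)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

# Fit the network weights to this single measurement -- no training data.
for step in range(500):
    opt.zero_grad()
    x_hat = net()                                # candidate image
    residual = torch.fft.fft2(x_hat) * mask - y  # k-space data mismatch
    loss = (torch.abs(residual) ** 2).mean()
    loss.backward()
    opt.step()

print(f"final measurement loss: {loss.item():.4e}")
```

The key design choice this sketch highlights is that no training set is ever touched: only the per-image measurement-consistency loss is minimized, and the network architecture itself acts as the image prior.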
