Paper Title

Convex Regularization Behind Neural Reconstruction

Paper Authors

Arda Sahiner, Morteza Mardani, Batu Ozturkler, Mert Pilanci, John Pauly

Paper Abstract

Neural networks have shown tremendous potential for reconstructing high-resolution images in inverse problems. The non-convex and opaque nature of neural networks, however, hinders their utility in sensitive applications such as medical imaging. To cope with this challenge, this paper advocates a convex duality framework that makes a two-layer fully-convolutional ReLU denoising network amenable to convex optimization. The convex dual network not only offers optimal training via convex solvers, but also facilitates interpreting training and prediction. In particular, it implies that training neural networks with weight-decay regularization induces path sparsity, while prediction is piecewise linear filtering. A range of experiments with the MNIST and fastMRI datasets confirms the efficacy of the dual network optimization problem.
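For concreteness, the sketch below shows the kind of model the abstract refers to: a two-layer fully-convolutional ReLU denoiser trained with weight decay. This is a minimal PyTorch illustration, not the authors' code; the layer widths, kernel size, learning rate, and weight-decay strength are assumed placeholder values, and it uses the conventional non-convex training loop rather than the paper's equivalent convex dual formulation.

```python
import torch
import torch.nn as nn

class TwoLayerConvDenoiser(nn.Module):
    """Two-layer fully-convolutional ReLU denoiser (illustrative widths)."""

    def __init__(self, channels: int = 1, hidden: int = 64, kernel: int = 3):
        super().__init__()
        pad = kernel // 2
        # First layer: convolution followed by a ReLU nonlinearity.
        self.conv1 = nn.Conv2d(channels, hidden, kernel, padding=pad, bias=False)
        # Second layer: linear convolutional read-out back to image space.
        self.conv2 = nn.Conv2d(hidden, channels, kernel, padding=pad, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv2(torch.relu(self.conv1(x)))

model = TwoLayerConvDenoiser()
# weight_decay adds the squared-norm penalty that, per the abstract, induces
# path sparsity in the equivalent convex formulation.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.MSELoss()

# Stand-in data: MNIST-sized images; real experiments would load MNIST/fastMRI.
noisy = torch.randn(8, 1, 28, 28)
clean = torch.randn(8, 1, 28, 28)

optimizer.zero_grad()
loss = loss_fn(model(noisy), clean)
loss.backward()
optimizer.step()
```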
