Title
Joint learning of variational representations and solvers for inverse problems with partially-observed data

Authors

Ronan Fablet, Lucas Drumetz, Francois Rousseau

Abstract
Designing appropriate variational regularization schemes is a crucial part of solving inverse problems: it makes them better-posed and guarantees that the solution of the associated optimization problem satisfies desirable properties. Recently, learning-based strategies have proven very efficient for solving inverse problems, by learning direct inversion schemes or plug-and-play regularizers from available pairs of true states and observations. In this paper, we go a step further and design an end-to-end framework that allows learning actual variational frameworks for inverse problems in such a supervised setting. The variational cost and the gradient-based solver are both stated as neural networks, using automatic differentiation for the latter. We can jointly learn both components to minimize the data reconstruction error on the true states. This leads to a data-driven discovery of variational models. We consider an application to inverse problems with incomplete datasets (image inpainting and multivariate time series interpolation). We experimentally illustrate that this framework can lead to a significant gain in terms of reconstruction performance, including w.r.t. the direct minimization of the variational formulation derived from the known generative model.
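The scheme the abstract describes (a learnable variational cost whose gradient-based solver is unrolled with automatic differentiation, trained end-to-end against true states) can be illustrated with a minimal PyTorch sketch. This is not the authors' code: the names (`PriorNet`, `variational_cost`, `unrolled_solver`), the toy data, and all hyperparameters are illustrative assumptions; the prior term here is a simple `||x - phi(x)||^2` auto-encoder-style regularizer standing in for the learned cost.

```python
# Hedged sketch, NOT the paper's implementation: jointly learn a variational
# cost (data fidelity + learned prior phi) and an unrolled gradient solver
# for a partially-observed (inpainting-like) inverse problem.
import torch
import torch.nn as nn

class PriorNet(nn.Module):
    """Learnable regularizer phi; the prior term is ||x - phi(x)||^2 (assumption)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, dim))
    def forward(self, x):
        return self.net(x)

def variational_cost(x, y, mask, phi, lam):
    # Data fidelity on the observed entries + learned prior term.
    data = ((mask * (x - y)) ** 2).sum(dim=-1)
    prior = ((x - phi(x)) ** 2).sum(dim=-1)
    return data + lam * prior

def unrolled_solver(y, mask, phi, lam, steps=10, lr=0.5):
    # Gradient-based solver stated via automatic differentiation: each
    # iteration is a gradient step on the variational cost, kept in the
    # graph (create_graph=True) so the outer loss can backpropagate
    # through the whole solver into phi and lam.
    x = (mask * y).clone().requires_grad_(True)
    for _ in range(steps):
        cost = variational_cost(x, y, mask, phi, lam).sum()
        (grad,) = torch.autograd.grad(cost, x, create_graph=True)
        x = x - lr * grad
    return x

# End-to-end training: minimize reconstruction error on the true states.
torch.manual_seed(0)
dim, n = 8, 256
x_true = torch.sin(torch.linspace(0.0, 6.28, dim)) + 0.1 * torch.randn(n, dim)
mask = (torch.rand(n, dim) > 0.5).float()        # partial observations
y = mask * x_true                                # observed data

phi = PriorNet(dim)
lam = torch.tensor(0.1, requires_grad=True)      # trainable trade-off weight
opt = torch.optim.Adam(list(phi.parameters()) + [lam], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    x_hat = unrolled_solver(y, mask, phi, lam)
    loss = ((x_hat - x_true) ** 2).mean()        # supervised reconstruction loss
    loss.backward()
    opt.step()
```

Because the solver is itself a differentiable computation graph, the same reconstruction loss trains both the prior network and the solver's trade-off weight jointly, which is the "joint learning of variational representations and solvers" the title refers to.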