Paper Title

Hybrid ISTA: Unfolding ISTA With Convergence Guarantees Using Free-Form Deep Neural Networks

Paper Authors

Ziyang Zheng, Wenrui Dai, Duoduo Xue, Chenglin Li, Junni Zou, Hongkai Xiong

Paper Abstract

It is promising to solve linear inverse problems by unfolding iterative algorithms (e.g., iterative shrinkage thresholding algorithm (ISTA)) as deep neural networks (DNNs) with learnable parameters. However, existing ISTA-based unfolded algorithms restrict the network architectures for iterative updates with the partial weight coupling structure to guarantee convergence. In this paper, we propose hybrid ISTA to unfold ISTA with both pre-computed and learned parameters by incorporating free-form DNNs (i.e., DNNs with arbitrary feasible and reasonable network architectures), while ensuring theoretical convergence. We first develop HCISTA to improve the efficiency and flexibility of classical ISTA (with pre-computed parameters) without compromising the convergence rate in theory. Furthermore, the DNN-based hybrid algorithm is generalized to popular variants of learned ISTA, dubbed HLISTA, to enable a free architecture of learned parameters with a guarantee of linear convergence. To our best knowledge, this paper is the first to provide a convergence-provable framework that enables free-form DNNs in ISTA-based unfolded algorithms. This framework is general to endow arbitrary DNNs for solving linear inverse problems with convergence guarantees. Extensive experiments demonstrate that hybrid ISTA can reduce the reconstruction error with an improved convergence rate in the tasks of sparse recovery and compressive sensing.
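For context, the iterative shrinkage thresholding algorithm (ISTA) that the paper unfolds solves the l1-regularized least-squares problem min_x 0.5*||Ax - b||_2^2 + lam*||x||_1 by alternating a gradient step on the data-fidelity term with soft-thresholding. The sketch below is a minimal NumPy illustration of this classical iteration with a pre-computed step size; it is not the paper's hybrid method, and the function names (`ista`, `soft_threshold`) and the fixed step-size choice are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Element-wise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, num_iters=100):
    """Classical ISTA for min_x 0.5*||A x - b||_2^2 + lam*||x||_1,
    using the pre-computed step size 1/L (L = largest eigenvalue of A^T A)."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient of the smooth term
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)       # gradient step on the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Unfolded variants such as learned ISTA keep this update structure but treat the step sizes, thresholds, and linear operators as per-iteration learnable parameters trained end to end; the hybrid ISTA framework described in the abstract further allows free-form DNN modules within the iterations while retaining convergence guarantees.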
