Paper Title
Learning Fast Approximations of Sparse Nonlinear Regression
Paper Authors
Paper Abstract
The idea of unfolding iterative algorithms as deep neural networks has been widely applied to sparse coding problems, providing both solid theoretical analysis of convergence rates and superior empirical performance. However, for sparse nonlinear regression problems, this idea has rarely been exploited due to the complexity introduced by the nonlinearity. In this work, we bridge this gap by introducing the Nonlinear Learned Iterative Shrinkage Thresholding Algorithm (NLISTA), which attains linear convergence under suitable conditions. Experiments on synthetic data corroborate our theoretical results and show that our method outperforms state-of-the-art methods.
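The abstract does not give NLISTA's update rule, but it builds on the unfolding idea from sparse coding. As background, a minimal sketch of a classic LISTA-style unrolled network (the linear case that this line of work extends) might look as follows; the matrices `W`, `S` and the per-layer thresholds `thetas` are the learnable parameters, and all sizes here are hypothetical, not from the paper:

```python
import numpy as np

def soft_threshold(v, theta):
    # Elementwise shrinkage: sign(v) * max(|v| - theta, 0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lista_forward(y, W, S, thetas):
    """One forward pass of a LISTA-style unrolled network.

    y      : observation vector
    W, S   : learned matrices (at the ISTA initialization,
             W = A^T / L and S = I - A^T A / L for dictionary A
             and step size 1/L)
    thetas : per-layer thresholds, one per unrolled iteration
    """
    x = soft_threshold(W @ y, thetas[0])
    for theta in thetas[1:]:
        x = soft_threshold(W @ y + S @ x, theta)
    return x

# Tiny synthetic demo with the ISTA initialization (no learning shown)
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [1.0, -0.5]
y = A @ x_true
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
W, S = A.T / L, np.eye(20) - A.T @ A / L
x_hat = lista_forward(y, W, S, thetas=[0.01] * 8)
```

In training, `W`, `S`, and `thetas` would be fit end to end on (y, x) pairs; the paper's contribution is extending this scheme to the nonlinear regression setting with a convergence guarantee, whose exact parameterization is not specified in the abstract.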