Paper Title


Sparse recovery by reduced variance stochastic approximation

Authors

Anatoli Juditsky, Andrei Kulunchakov, Hlib Tsyntseus

Abstract


In this paper, we discuss the application of iterative Stochastic Optimization routines to the problem of sparse signal recovery from noisy observations. Using the Stochastic Mirror Descent algorithm as a building block, we develop a multistage procedure for recovery of sparse solutions to a Stochastic Optimization problem under assumptions of smoothness and quadratic minoration of the expected objective. An interesting feature of the proposed algorithm is linear convergence of the approximate solution during the preliminary phase of the routine, when the component of stochastic error in the gradient observation due to a bad initial approximation of the optimal solution is larger than the "ideal" asymptotic error component owing to observation noise "at the optimal solution." We also show how one can straightforwardly enhance the reliability of the corresponding solution by using Median-of-Means-like techniques. We illustrate the performance of the proposed algorithms in application to the classical problems of recovery of sparse and low-rank signals in the generalized linear regression framework. We show, under rather weak assumptions on the regressor and noise distributions, how they lead to parameter estimates which obey (up to factors logarithmic in the problem dimension and confidence level) the best accuracy bounds known to us.
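The multistage idea in the abstract can be illustrated with a minimal sketch, not the paper's exact algorithm: stochastic mirror descent with the ℓp mirror map ψ(x) = ½‖x‖p², p = 1 + 1/ln d (a standard choice for sparsity-friendly, ℓ1-like geometry), restarted in stages with a shrinking step size, applied to a synthetic sparse linear regression. All concrete parameters (dimension, sparsity, noise level, step schedule) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic problem: sparse linear regression y = <phi, x*> + noise.
d, s, sigma = 200, 5, 0.1
x_star = np.zeros(d)
x_star[rng.choice(d, size=s, replace=False)] = 1.0

def sample():
    phi = rng.standard_normal(d)
    return phi, phi @ x_star + sigma * rng.standard_normal()

# Mirror map psi(x) = 0.5 * ||x||_p^2 with p = 1 + 1/ln(d): a standard choice
# for l1-geometry mirror descent; the paper's precise setup may differ.
p = 1.0 + 1.0 / np.log(d)
q = p / (p - 1.0)  # conjugate exponent, 1/p + 1/q = 1

def grad_psi(x):
    """nabla psi: maps a primal point to its dual point."""
    nx = np.linalg.norm(x, ord=p)
    if nx == 0:
        return np.zeros_like(x)
    return nx ** (2 - p) * np.sign(x) * np.abs(x) ** (p - 1)

def grad_psi_conj(z):
    """nabla psi* (inverse of grad_psi): maps a dual point back to primal."""
    nz = np.linalg.norm(z, ord=q)
    if nz == 0:
        return np.zeros_like(z)
    return nz ** (2 - q) * np.sign(z) * np.abs(z) ** (q - 1)

def smd_stage(x0, n_steps, step):
    """One stage of stochastic mirror descent; returns the averaged iterate."""
    z = grad_psi(x0)
    avg = np.zeros_like(x0)
    for _ in range(n_steps):
        x = grad_psi_conj(z)
        phi, y = sample()
        g = (phi @ x - y) * phi  # stochastic gradient of 0.5 * (y - <phi, x>)^2
        z -= step * g            # descent step in the dual (mirror) space
        avg += x
    return avg / n_steps

# Multistage restarts with a halving step size (a sketch of the restart idea,
# not the paper's exact schedule).
x, step = np.zeros(d), 0.1
for stage in range(5):
    x = smd_stage(x, n_steps=500, step=step)
    step /= 2

err = np.linalg.norm(x - x_star)  # estimation error of the final iterate
```

The ℓp geometry keeps off-support coordinates close to zero while the restarts shrink the stochastic error stage by stage, mirroring the preliminary/asymptotic phase distinction described in the abstract.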
