Paper Title

Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

Paper Authors

Kyriakos Axiotis, Maxim Sviridenko

Paper Abstract

The goal of Sparse Convex Optimization is to optimize a convex function $f$ under a sparsity constraint $s \leq s^*\gamma$, where $s^*$ is the target number of non-zero entries in a feasible solution (sparsity) and $\gamma \geq 1$ is an approximation factor. There has been a lot of work to analyze the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the Restricted Condition Number $\kappa$. The best known algorithms guarantee to find an approximate solution of value $f(x^*)+\epsilon$ with the sparsity bound of $\gamma = O\left(\kappa \min\left\{\log \frac{f(x^0)-f(x^*)}{\epsilon}, \kappa\right\}\right)$, where $x^*$ is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma = O(\kappa)$, which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT. This is achieved without significant sacrifice in the runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general $f$, under the condition $s > s^* \frac{\kappa^2}{4}$, which yields Compressed Sensing bounds under the Restricted Isometry Property (RIP). When compared to other Compressed Sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, while working for any general function $f$ that meets the RIP condition.
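To make the sparsity-constrained setup concrete, below is a minimal sketch of plain Iterative Hard Thresholding (IHT), one of the baseline algorithms named in the abstract, instantiated for the least-squares objective $f(x) = \frac{1}{2}\|Ax - b\|^2$. This is only an illustration of the hard-thresholding projection onto $\|x\|_0 \leq s$; it is not the paper's ARHT algorithm, and the function names, step-size choice, and synthetic example are assumptions made for the sketch.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x; zero out the rest."""
    out = np.zeros_like(x)
    if s > 0:
        idx = np.argpartition(np.abs(x), -s)[-s:]
        out[idx] = x[idx]
    return out

def iht_least_squares(A, b, s, step=None, iters=200):
    """Basic IHT for f(x) = 0.5 * ||Ax - b||^2 under the constraint ||x||_0 <= s.
    Each iteration takes a gradient step and projects back onto s-sparse vectors."""
    n = A.shape[1]
    x = np.zeros(n)
    if step is None:
        # conservative step size: 1 / (largest eigenvalue of A^T A)
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the least-squares objective
        x = hard_threshold(x - step * grad, s)  # hard-thresholding projection
    return x

# small synthetic example (hypothetical data, for illustration only)
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 300))
x_true = np.zeros(300)
x_true[rng.choice(300, size=5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = iht_least_squares(A, b, s=5)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The paper's contribution can be read against this template: instead of a fixed hard-thresholding step, ARHT adaptively regularizes the objective so that the sparsity blow-up needed to reach $f(x^*)+\epsilon$ drops to $\gamma = O(\kappa)$.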
