Paper Title

Exterior-point Optimization for Sparse and Low-rank Optimization

Paper Authors

Shuvomoy Das Gupta, Bartolomeo Stellato, Bart P. G. Van Parys

Paper Abstract

Many problems of substantial current interest in machine learning, statistics, and data science can be formulated as sparse and low-rank optimization problems. In this paper, we present the nonconvex exterior-point optimization solver NExOS -- a first-order algorithm tailored to sparse and low-rank optimization problems. We consider the problem of minimizing a convex function over a nonconvex constraint set, where the set can be decomposed as the intersection of a compact convex set and a nonconvex set involving sparse or low-rank constraints. Unlike the convex relaxation approaches, NExOS finds a locally optimal point of the original problem by solving a sequence of penalized problems with strictly decreasing penalty parameters by exploiting the nonconvex geometry. NExOS solves each penalized problem by applying a first-order algorithm, which converges linearly to a local minimum of the corresponding penalized formulation under regularity conditions. Furthermore, the local minima of the penalized problems converge to a local minimum of the original problem as the penalty parameter goes to zero. We then implement and test NExOS on many instances from a wide variety of sparse and low-rank optimization problems, empirically demonstrating that our algorithm outperforms specialized methods.
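
To make the penalty-continuation idea in the abstract concrete, below is a minimal, illustrative Python sketch for a sparse least-squares instance (minimize ||Ax - b||^2 subject to a cardinality bound and a box constraint). It is not the NExOS implementation from the paper: the problem data, function names, and parameter choices (mu0, mu_min, rho, inner_iters) are hypothetical, and the inner solver is a plain projected-gradient loop standing in for the first-order method the authors describe.

```python
import numpy as np

def project_sparse(x, k):
    """Keep the k largest-magnitude entries of x and zero the rest
    (Euclidean projection onto the nonconvex set {x : ||x||_0 <= k})."""
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    z[keep] = x[keep]
    return z

def project_box(x, M):
    """Euclidean projection onto the compact convex set {x : ||x||_inf <= M}."""
    return np.clip(x, -M, M)

def penalty_continuation(A, b, k, M, mu0=1.0, mu_min=1e-6, rho=0.5, inner_iters=500):
    """Illustrative exterior-point-style loop for
        minimize ||Ax - b||^2  subject to  ||x||_0 <= k,  ||x||_inf <= M.
    Each stage minimizes the objective plus (1/(2*mu)) * dist^2(x, sparse set)
    over the box via projected gradient; mu is then strictly decreased and the
    next stage is warm-started from the current iterate."""
    x = np.zeros(A.shape[1])
    lip_f = 2.0 * np.linalg.norm(A, 2) ** 2   # curvature bound for ||Ax - b||^2
    mu = mu0
    while mu >= mu_min:
        step = 1.0 / (lip_f + 1.0 / mu)       # conservative step size for this stage
        for _ in range(inner_iters):
            # gradient of the data-fit term plus the quadratic distance penalty;
            # grad of (1/(2*mu)) * dist^2(x, N) is (x - proj_N(x)) / mu where
            # the projection is single-valued
            grad = 2.0 * A.T @ (A @ x - b) + (x - project_sparse(x, k)) / mu
            x = project_box(x - step * grad, M)
        mu *= rho                             # strictly decrease the penalty parameter
    return x

if __name__ == "__main__":
    # small synthetic usage example with hypothetical data
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x_hat = penalty_continuation(A, b, k=5, M=10.0)
    print("distance to sparsity set:", np.linalg.norm(x_hat - project_sparse(x_hat, 5)))
```

Warm-starting each stage from the previous iterate mirrors the strictly decreasing penalty schedule described in the abstract: as mu shrinks, the penalty pushes the iterates from the exterior of the sparse set toward a feasible, locally optimal point.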
