Paper Title

Fast adaptive by constants of strong-convexity and Lipschitz for gradient first order methods

Authors

Pletnev, Nikita

Abstract

This work is devoted to the construction of efficient first-order convex optimization methods applicable to real tasks, that is, methods using only values of the target function and its derivatives. The construction builds on OGM-G, a fast gradient method that is optimal in complexity but requires knowledge of the Lipschitz constant of the gradient and the strong convexity constant to determine the number of steps and the step length. This requirement makes practical usage impossible. Two algorithms are proposed: ACGM, adaptive in the strong convexity constant, based on restarts of OGM-G with updates of the strong convexity constant estimate; and ALGM, adaptive in the Lipschitz constant of the gradient, in which OGM-G restarts are supplemented by selection of the Lipschitz constant with verification of the convexity conditions used in the universal gradient descent method. This eliminates the drawback of the original method, namely the need to know these constants, and makes practical usage possible. The optimality of the complexity estimates for the constructed algorithms is proved. To verify the results obtained, experiments are carried out on model functions and on real machine learning tasks.
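The Lipschitz-constant selection mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's ALGM: OGM-G is replaced by plain gradient steps, and the initial estimate `L0` and the doubling/halving factors are illustrative assumptions. The step is accepted only when the smoothness inequality f(x+) ≤ f(x) + ⟨∇f(x), x+ − x⟩ + (L/2)‖x+ − x‖² holds, as in universal gradient methods; otherwise the estimate L is increased.

```python
import numpy as np

def gradient_descent_backtracking(f, grad, x0, L0=1.0, tol=1e-8, max_iter=1000):
    """Gradient descent with an adaptive Lipschitz-constant estimate (sketch).

    The trial step x_new = x - grad(x)/L is accepted only when
        f(x_new) <= f(x) + <grad(x), x_new - x> + (L/2) * ||x_new - x||^2,
    otherwise L is doubled. Between iterations L is halved so the
    estimate can also decrease. All constants here are illustrative.
    """
    x, L = np.asarray(x0, dtype=float), float(L0)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:       # stationarity reached
            break
        while True:
            x_new = x - g / L             # gradient step with step size 1/L
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d):
                break                     # smoothness condition satisfied
            L *= 2.0                      # estimate too small: backtrack
        x = x_new
        L = max(L / 2.0, 1e-12)           # let the estimate shrink again
    return x, L
```

On a smooth convex quadratic, L settles near the largest eigenvalue of the Hessian, so no global Lipschitz constant needs to be supplied in advance; this is the practical benefit the abstract attributes to the adaptive schemes.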
