Title
Online Convex Optimization Using Coordinate Descent Algorithms
Author
Abstract
This paper considers online optimization problems in which the objective function is time-varying. In particular, we extend coordinate descent-type algorithms to the online case, where the objective function changes after a finite number of iterations of the algorithm. Instead of solving the problem exactly at each time step, we apply only a finite number of iterations per time step. Commonly used notions of regret measure the performance of the online algorithm. Moreover, we consider coordinate descent algorithms with different updating rules, including both deterministic and stochastic rules developed in the classical offline optimization literature, and give a thorough regret analysis for each case. Finally, numerical simulations illustrate the theoretical results.
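The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the quadratic objective family, the constant step size, the per-step-minimizer comparator used for regret, and all function and parameter names are assumptions made here for concreteness. It shows the key structural idea: at each time step the objective changes, and the learner applies only a finite number `K` of coordinate updates, under either a deterministic (cyclic) or a stochastic (uniform random) coordinate-selection rule.

```python
import numpy as np

def online_coordinate_descent(A_seq, b_seq, K=8, rule="cyclic", step=0.1, seed=0):
    """Run K coordinate-descent iterations per time step on the time-varying
    quadratic f_t(x) = 0.5 * x^T A_t x - b_t^T x (an illustrative choice).

    rule: "cyclic" (deterministic) or "random" (uniform stochastic selection).
    Returns the final iterate and the cumulative regret against the
    per-step minimizer x_t* = A_t^{-1} b_t.
    """
    rng = np.random.default_rng(seed)
    n = b_seq[0].size
    x = np.zeros(n)
    gaps = []
    for A, b in zip(A_seq, b_seq):
        # Loss suffered at the current iterate, before seeing the new objective
        # is fully optimized; compare against this step's exact minimizer.
        f = lambda z: 0.5 * z @ A @ z - b @ z
        x_star = np.linalg.solve(A, b)
        gaps.append(f(x) - f(x_star))  # nonnegative for convex quadratics
        # Only finitely many coordinate updates per time step.
        for k in range(K):
            i = k % n if rule == "cyclic" else int(rng.integers(n))
            grad_i = A[i] @ x - b[i]   # i-th partial derivative of f
            x[i] -= step * grad_i      # single-coordinate gradient step
    regret = np.cumsum(gaps)
    return x, regret
```

Because each per-step gap is nonnegative, the cumulative regret is nondecreasing; sublinear growth of this curve is what the regret analysis in the paper aims to establish for the various updating rules.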