Paper Title

Dynamic Regret of Online Mirror Descent for Relatively Smooth Convex Cost Functions

Paper Authors

Nima Eshraghi and Ben Liang

Paper Abstract

The performance of online convex optimization algorithms in a dynamic environment is often expressed in terms of the dynamic regret, which measures the decision maker's performance against a sequence of time-varying comparators. In the analysis of the dynamic regret, prior works often assume Lipschitz continuity or uniform smoothness of the cost functions. However, there are many important cost functions in practice that do not satisfy these conditions. In such cases, prior analyses are not applicable and fail to guarantee the optimization performance. In this letter, we show that it is possible to bound the dynamic regret, even when neither Lipschitz continuity nor uniform smoothness is present. We adopt the notion of relative smoothness with respect to some user-defined regularization function, which is a much milder requirement on the cost functions. We first show that under relative smoothness, the dynamic regret has an upper bound based on the path length and functional variation. We then show that with an additional condition of relatively strong convexity, the dynamic regret can be bounded by the path length and gradient variation. These regret bounds provide performance guarantees to a wide variety of online optimization problems that arise in different application domains. Finally, we present numerical experiments that demonstrate the advantage of adopting a regularization function under which the cost functions are relatively smooth.
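
To make the abstract's terms concrete, the block below collects the standard definitions this line of work builds on: dynamic regret, path length, relative smoothness, and the online mirror descent update. The notation is the generic one from the relative-smoothness literature and may differ from the paper's own.

```latex
% Generic definitions (notation may differ from the paper).
% Dynamic regret against a comparator sequence u_1, ..., u_T,
% and the path length P_T that the bounds depend on:
\[
  \mathrm{Reg}^{d}_{T} = \sum_{t=1}^{T} f_t(x_t) - \sum_{t=1}^{T} f_t(u_t),
  \qquad
  P_T = \sum_{t=2}^{T} \lVert u_t - u_{t-1} \rVert .
\]
% f_t is L-smooth relative to a regularizer h when
\[
  f_t(y) \le f_t(x) + \langle \nabla f_t(x),\, y - x \rangle + L\, D_h(y, x),
  \quad\text{where}\quad
  D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
% Online mirror descent replaces the Euclidean distance of projected
% gradient descent with the Bregman divergence of that same h:
\[
  x_{t+1} = \operatorname*{arg\,min}_{x \in \mathcal{X}}
    \; \eta\, \langle \nabla f_t(x_t), x \rangle + D_h(x, x_t).
\]
```

And here is a minimal runnable sketch of the online mirror descent update itself, assuming the probability simplex as the feasible set and negative entropy as the user-defined regularizer h. The names `omd_step` and `eta`, and the random linear toy costs, are illustrative only; they are not the paper's parameters or experimental setup.

```python
import numpy as np

def omd_step(x, grad, eta):
    """One online mirror descent step over the probability simplex with
    the negative-entropy regularizer h(x) = sum_i x_i * log(x_i).

    The mirror update
        x_{t+1} = argmin_x  eta * <grad, x> + D_h(x, x_t)
    then has the closed-form exponentiated-gradient solution below.
    """
    y = x * np.exp(-eta * grad)  # multiplicative update in the dual space
    return y / y.sum()           # normalizing = Bregman projection onto the simplex

# Toy run: time-varying linear costs f_t(x) = <c_t, x> on the 3-simplex,
# with dynamic regret measured against each round's minimizer u_t.
rng = np.random.default_rng(0)
x = np.full(3, 1.0 / 3.0)  # start at the uniform distribution
dynamic_regret = 0.0
for t in range(100):
    c = rng.random(3)                  # cost vector revealed at round t
    dynamic_regret += c @ x - c.min()  # f_t(x_t) - f_t(u_t); u_t is a vertex
    x = omd_step(x, c, eta=0.1)
print(f"dynamic regret after 100 rounds: {dynamic_regret:.3f}")
```

Choosing h to match the geometry of the costs is exactly the degree of freedom the paper exploits: relative smoothness only requires the costs to be smooth with respect to the chosen regularization function, not uniformly smooth in the Euclidean sense.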
