Title
Competitive Mirror Descent
Authors
Abstract
Constrained competitive optimization involves multiple agents trying to minimize conflicting objectives, subject to constraints. This is a highly expressive modeling language that subsumes most of modern machine learning. In this work we propose competitive mirror descent (CMD): a general method for solving such problems based on first-order information that can be obtained by automatic differentiation. First, by adding Lagrange multipliers, we obtain a simplified constraint set with an associated Bregman potential. At each iteration, we then solve for the Nash equilibrium of a regularized bilinear approximation of the full problem to obtain a direction of movement for the agents. Finally, we obtain the next iterate by following this direction according to the dual geometry induced by the Bregman potential. By using the dual geometry we obtain feasible iterates despite only solving a linear system at each iteration, eliminating the need for projection steps while still accounting for the global nonlinear structure of the constraint set. As a special case we obtain a novel competitive multiplicative weights algorithm for problems on the positive cone.
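The abstract's three steps (a Nash equilibrium of a regularized bilinear approximation, then a step in the dual geometry of the Bregman potential) can be illustrated with a minimal sketch for a two-player zero-sum game on the positive cone. This is not the paper's reference implementation: the function name `cmd_step`, the zero-sum sign conventions, and the choice of the negative-entropy potential (whose mirror map is `log`/`exp`, yielding the multiplicative update mentioned as a special case) are assumptions made for illustration.

```python
import numpy as np

def cmd_step(x, y, gx, gy, Dxy, Dyx, eta=0.1):
    """One sketched CMD step for min_x max_y f(x, y) on the positive cone.

    gx, gy  -- gradients of f w.r.t. x and y at the current point
    Dxy, Dyx -- mixed second derivatives D^2_xy f and D^2_yx f
    eta     -- step size of the regularized bilinear approximation
    """
    n, m = len(x), len(y)
    # Step 1: Nash equilibrium of the regularized bilinear approximation.
    # Each direction only requires solving a linear system (assumed
    # zero-sum sign convention, following competitive gradient descent).
    dx = np.linalg.solve(np.eye(n) + eta**2 * Dxy @ Dyx,
                         gx - eta * Dxy @ gy)
    dy = np.linalg.solve(np.eye(m) + eta**2 * Dyx @ Dxy,
                         gy + eta * Dyx @ gx)
    # Step 2: follow the direction in the dual geometry of the
    # negative-entropy Bregman potential: map to the dual (log), move,
    # map back (exp). The update is multiplicative, so iterates stay
    # in the positive cone without any projection step.
    x_new = x * np.exp(-eta * dx)   # x minimizes f
    y_new = y * np.exp(+eta * dy)   # y maximizes f
    return x_new, y_new
```

Because the mirror map `exp` always returns positive entries, feasibility on the positive cone is automatic, which is the point of working in the dual geometry rather than projecting after a Euclidean step.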