Paper Title

DC-DistADMM: ADMM Algorithm for Constrained Distributed Optimization over Directed Graphs

Paper Authors

Vivek Khatana, Murti V. Salapaka

Paper Abstract

This article reports an algorithm for multi-agent distributed optimization problems with a common decision variable, local linear equality and inequality constraints, and set constraints, with convergence rate guarantees. The algorithm accrues all the benefits of the Alternating Direction Method of Multipliers (ADMM) approach. It also overcomes the limitations of existing methods on convex optimization problems with linear inequality, equality, and set constraints by allowing directed communication topologies. Moreover, the algorithm can be synthesized distributively. The developed algorithm has: (i) an $O(1/k)$ rate of convergence, where $k$ is the iteration counter, when the individual functions are convex but not necessarily differentiable, and (ii) a geometric rate of convergence to any arbitrarily small neighborhood of the optimal solution, when the objective functions are smooth and restricted strongly convex at the optimal solution. The efficacy of the algorithm is evaluated through a comparison with state-of-the-art constrained optimization algorithms in solving a constrained distributed $\ell_1$-regularized logistic regression problem, and with unconstrained optimization algorithms in solving an $\ell_1$-regularized Huber loss minimization problem. Additionally, a comparison of the algorithm's performance with other algorithms in the literature that utilize multiple communication steps is provided.
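
For concreteness, the problem class described in the abstract can be sketched in the following generic form (a sketch only; the symbols $f_i$, $A_i$, $b_i$, $C_i$, $d_i$, and $\mathcal{X}_i$ are illustrative placeholders, not the paper's notation):

$$
\min_{x \in \mathbb{R}^n} \ \sum_{i=1}^{N} f_i(x)
\qquad \text{subject to} \qquad
A_i x = b_i, \quad C_i x \leq d_i, \quad x \in \mathcal{X}_i, \quad i = 1, \dots, N,
$$

where $x$ is the common decision variable, agent $i$ has private access to its convex objective $f_i$ and its local constraint data $(A_i, b_i, C_i, d_i, \mathcal{X}_i)$, and information is exchanged only along the edges of a directed communication graph.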
