Paper Title

A Communication Efficient Quasi-Newton Method for Large-scale Distributed Multi-agent Optimization

Authors

Yichuan Li, Petros G. Voulgaris, Nikolaos M. Freris

Abstract

We propose a communication efficient quasi-Newton method for large-scale multi-agent convex composite optimization. We assume the setting of a network of agents that cooperatively solve a global minimization problem with strongly convex local cost functions augmented with a non-smooth convex regularizer. By introducing consensus variables, we obtain a block-diagonal Hessian and thus eliminate the need for additional communication when approximating the objective curvature information. Moreover, we reduce computational costs of existing primal-dual quasi-Newton methods from $\mathcal{O}(d^3)$ to $\mathcal{O}(cd)$ by storing $c$ pairs of vectors of dimension $d$. An asynchronous implementation is presented that removes the need for coordination. Global linear convergence rate in expectation is established, and we demonstrate the merit of our algorithm numerically with real datasets.
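The $\mathcal{O}(cd)$ cost quoted in the abstract comes from a limited-memory curvature approximation: instead of forming and inverting a $d \times d$ Hessian approximation, only $c$ pairs of $d$-dimensional vectors are stored. As a rough illustration only (a generic L-BFGS-style two-loop recursion, not the paper's specific primal-dual update; the function and variable names below are illustrative assumptions), the sketch shows how $c$ stored pairs $(s_k, y_k)$ give an approximate inverse-Hessian-vector product in $\mathcal{O}(cd)$ time and memory.

```python
# Minimal sketch of a limited-memory quasi-Newton step (L-BFGS-style
# two-loop recursion). Storing c pairs (s, y) of dimension d keeps the
# cost of an approximate inverse-Hessian-vector product at O(c*d),
# versus O(d^3) for forming/inverting a dense Hessian approximation.
import numpy as np

def two_loop_recursion(grad, s_list, y_list):
    """Approximate H^{-1} @ grad from c stored curvature pairs.

    grad   : current gradient, shape (d,)
    s_list : last c iterate differences  s_k = x_{k+1} - x_k
    y_list : last c gradient differences y_k = g_{k+1} - g_k
    Assumes the lists are non-empty and each y_k^T s_k > 0.
    """
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest to oldest pair.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        alphas.append(alpha)
        q -= alpha * y
    # Scale by a standard initial-Hessian heuristic.
    s_last, y_last = s_list[-1], y_list[-1]
    r = ((s_last @ y_last) / (y_last @ y_last)) * q
    # Second loop: oldest to newest pair.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * (y @ r)
        r += (alpha - beta) * s
    return r  # approximate Newton direction is then -r
```

Each loop touches every stored pair once with only vector dot products and axpy updates, which is where the $\mathcal{O}(cd)$ per-iteration cost in the abstract comes from.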
