Paper Title
Variance reduced stochastic optimization over directed graphs with row and column stochastic weights
Paper Authors
Paper Abstract
This paper proposes AB-SAGA, a first-order distributed stochastic optimization method to minimize a finite-sum of smooth and strongly convex functions distributed over an arbitrary directed graph. AB-SAGA removes the uncertainty caused by the stochastic gradients using node-level variance reduction and subsequently employs network-level gradient tracking to address the data dissimilarity across the nodes. Unlike existing methods that use the nonlinear push-sum correction to cancel the imbalance caused by the directed communication, the consensus updates in AB-SAGA are linear and use both row and column stochastic weights. We show that for a constant step-size, AB-SAGA converges linearly to the global optimum. We quantify the directed nature of the underlying graph using an explicit directivity constant and characterize the regimes in which AB-SAGA achieves a linear speed-up over its centralized counterpart. Numerical experiments illustrate the convergence of AB-SAGA for strongly convex and nonconvex problems.
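The abstract describes two combined mechanisms: a per-node SAGA estimator (node-level variance reduction) and an AB-style gradient tracker that mixes with a row stochastic matrix on the state and a column stochastic matrix on the tracker. The following is a minimal numerical sketch of that structure on a directed ring with least-squares local objectives; the graph, data, step size, and all variable names are illustrative assumptions, not the paper's implementation or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, d = 4, 5, 3          # nodes, samples per node, dimension (assumed)
alpha, T = 0.01, 5000      # constant step size, iterations (assumed)

# Local data: node i holds m least-squares samples (A_data[i], b_data[i]).
A_data = rng.normal(size=(n, m, d))
x_true = rng.normal(size=d)
b_data = A_data @ x_true + 0.1 * rng.normal(size=(n, m))

def grad_sample(i, s, x):
    """Gradient of the s-th component function at node i."""
    a = A_data[i, s]
    return a * (a @ x - b_data[i, s])

# Directed ring: node i receives from itself and node i-1.
A_row = np.zeros((n, n))   # row stochastic weights (rows sum to 1)
B_col = np.zeros((n, n))   # column stochastic weights (columns sum to 1)
for i in range(n):
    A_row[i, i] = A_row[i, (i - 1) % n] = 0.5
    B_col[i, i] = B_col[(i + 1) % n, i] = 0.5

# SAGA gradient tables, one per node, initialized at x = 0.
table = np.stack([[grad_sample(i, s, np.zeros(d)) for s in range(m)]
                  for i in range(n)])          # shape (n, m, d)
v = table.mean(axis=1)                         # initial SAGA estimates
x = np.zeros((n, d))                           # one state per node
y = v.copy()                                   # gradient trackers

for _ in range(T):
    x = A_row @ x - alpha * y                  # linear row-stochastic mixing
    v_new = np.empty_like(v)
    for i in range(n):
        s = rng.integers(m)                    # sample one local gradient
        g = grad_sample(i, s, x[i])
        v_new[i] = g - table[i, s] + table[i].mean(axis=0)  # SAGA estimator
        table[i, s] = g
    y = B_col @ y + v_new - v                  # column-stochastic tracking
    v = v_new

# Exact minimizer of the global finite-sum, for comparison.
H = sum(A_data[i].T @ A_data[i] for i in range(n))
g = sum(A_data[i].T @ b_data[i] for i in range(n))
x_star = np.linalg.solve(H, g)
err = np.linalg.norm(x.mean(axis=0) - x_star)
```

Because the SAGA estimator is variance reduced (its noise vanishes at the optimum) and the tracker aggregates gradients across the directed graph, the node states converge to the exact global minimizer with a constant step size, without any push-sum eigenvector correction.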