Title
Stochastic Mirror Descent for Convex Optimization with Consensus Constraints
Authors
Abstract
The mirror descent algorithm is known to be effective in situations where it is beneficial to adapt the mirror map to the underlying geometry of the optimization model. However, the effect of mirror maps on the geometry of distributed optimization problems has not been previously addressed. In this paper, we study an exact distributed mirror descent algorithm in continuous time under additive noise. We establish a linear convergence rate for the proposed dynamics in the convex optimization setting. Our analysis draws motivation from the augmented Lagrangian and its relation to gradient tracking. To further explore the benefits of mirror maps in a distributed setting, we present a preconditioned variant of our algorithm with an additional mirror map over the Lagrangian dual variables. This allows our method to adapt both to the geometry of the primal variables and to the geometry of the consensus constraint. We also propose a Gauss-Seidel type discretization scheme for the proposed method and establish its linear convergence rate. For certain classes of problems, we identify mirror maps that mitigate the effect of the graph's spectral properties on the convergence rate of the algorithm. Using numerical experiments, we demonstrate the efficiency of the methodology on convex models, both with and without constraints. Our findings show that the proposed method outperforms other methods, especially in scenarios where the model's geometry is not captured by the standard Euclidean norm.
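To illustrate the kind of geometry adaptation the abstract refers to, the following is a minimal, single-agent sketch (not the paper's distributed or stochastic method) of mirror descent with the negative-entropy mirror map on the probability simplex, also known as exponentiated gradient. The objective, target point, step size, and iteration count below are illustrative assumptions, not taken from the paper.

```python
import math

def mirror_descent_simplex(grad, x0, step, iters):
    """Mirror descent with the negative-entropy mirror map on the
    probability simplex: a multiplicative (exponentiated-gradient)
    update followed by renormalization keeps iterates feasible."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        # Mirror step: update in dual (log) space, map back, renormalize
        w = [xi * math.exp(-step * gi) for xi, gi in zip(x, g)]
        s = sum(w)
        x = [wi / s for wi in w]
    return x

# Hypothetical objective: squared distance to a target c in the simplex,
# whose minimizer over the simplex is c itself.
c = [0.5, 0.3, 0.2]
grad = lambda x: [2 * (xi - ci) for xi, ci in zip(x, c)]
x = mirror_descent_simplex(grad, [1 / 3, 1 / 3, 1 / 3], step=0.2, iters=2000)
```

With the entropy mirror map, the update is multiplicative rather than an additive Euclidean projection, which is exactly the sense in which the mirror map can be matched to the geometry of the feasible set.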