Paper Title
Acceleration of Cooperative Least Mean Square via Chebyshev Periodical Successive Over-Relaxation
Paper Authors
Paper Abstract
A distributed least mean square (LMS) algorithm can be used for distributed signal estimation and for distributed training of multivariate regression models. The convergence speed of such an algorithm is a critical factor because a faster algorithm requires less communication overhead and can operate over networks with narrower bandwidth. The goal of this paper is to show that Chebyshev periodical successive over-relaxation (PSOR) can naturally accelerate distributed LMS algorithms. The basic idea of Chebyshev PSOR is to introduce index-dependent PSOR factors that control the spectral radius of the matrix governing the convergence behavior of the modified fixed-point iteration. The acceleration of convergence is empirically confirmed for a wide range of networks, including well-known small graphs (e.g., the Karate graph) and random graphs such as Erdős-Rényi (ER) and Barabási-Albert random graphs.
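To make the core idea concrete, the sketch below applies periodic, index-dependent over-relaxation factors (Chebyshev roots mapped onto the spectrum of I - A) to a toy linear fixed-point iteration x ← Ax + b. This is only an illustration of the acceleration mechanism, not the paper's distributed LMS system: the matrix A, the eigenvalue bounds, the period T, and the iteration count are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear fixed-point iteration x <- A x + b (illustrative only;
# NOT the paper's distributed LMS recursion).
n = 20
S = rng.standard_normal((n, n))
S = S + S.T                                   # symmetric, so eigenvalues are real
A = 0.95 * S / np.linalg.norm(S, 2)           # spectral radius rho(A) = 0.95
b = rng.standard_normal(n)
x_star = np.linalg.solve(np.eye(n) - A, b)    # the unique fixed point

# Error dynamics are governed by W = I - A; Chebyshev PSOR needs
# (bounds on) its extreme eigenvalues.
eigs = np.linalg.eigvalsh(np.eye(n) - A)
lmin, lmax = eigs.min(), eigs.max()

T = 8  # period of the PSOR factors (assumed value)

def omega(k):
    # Index-dependent PSOR factor: reciprocal of the k-th Chebyshev root
    # mapped onto the interval [lmin, lmax].
    return 1.0 / ((lmax + lmin) / 2
                  + (lmax - lmin) / 2 * np.cos((2 * (k % T) + 1) * np.pi / (2 * T)))

x_plain = np.zeros(n)   # vanilla fixed-point iteration
x_psor = np.zeros(n)    # Chebyshev PSOR iteration
for k in range(2 * T):
    x_plain = A @ x_plain + b
    f = A @ x_psor + b
    x_psor = x_psor + omega(k) * (f - x_psor)   # PSOR update

err_plain = np.linalg.norm(x_plain - x_star)
err_psor = np.linalg.norm(x_psor - x_star)
print(f"plain: {err_plain:.3e}  Chebyshev PSOR: {err_psor:.3e}")
```

After the same number of iterations, the PSOR error is substantially smaller because, over one period, the product of the modified iteration matrices realizes the Chebyshev minimax polynomial on the spectrum of I - A, which shrinks faster than rho(A)^T.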