Paper Title
Distributed Personalized Gradient Tracking with Convex Parametric Models
Paper Authors
Paper Abstract
We present a distributed optimization algorithm for solving online personalized optimization problems over a network of computing and communicating nodes, each of which is linked to a specific user. The local objective functions are assumed to have a composite structure, consisting of a known time-varying (engineering) part and an unknown (user-specific) part. The unknown part is assumed to have an a priori known parametric (e.g., quadratic) structure, whose parameters are learned along with the evolution of the algorithm. The algorithm is composed of two intertwined components: (i) a dynamic gradient tracking scheme for finding local solution estimates and (ii) a recursive least squares scheme for estimating the unknown parameters from the user's noisy feedback on the local solution estimates. The algorithm is shown to exhibit bounded regret under suitable assumptions. Finally, a numerical example corroborates the theoretical analysis.
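To give a rough sense of how the two components described in the abstract interact, the sketch below pairs a gradient-tracking update with a recursive least squares (RLS) update driven by noisy user feedback, for a single node. The quadratic user cost, the feature map `phi`, the step size `alpha`, and the toy time-varying engineering cost are all illustrative assumptions and are not taken from the paper; in particular, the paper's algorithm runs over a network of nodes, whereas here the mixing with neighbors is omitted.

```python
# Minimal single-node sketch (not the paper's exact algorithm) of the two
# intertwined components:
#   (i)  a gradient-tracking update of the local solution estimate, and
#   (ii) an RLS update of the unknown user parameters from noisy feedback.
# All modeling choices below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 2                       # decision-variable dimension (assumed)
alpha = 0.05                # gradient-tracking step size (assumed)

# Unknown user-specific part: assumed quadratic, u(x) = theta' * phi(x).
theta_true = np.array([1.0, 0.5, 0.2])          # unknown to the algorithm

def phi(x):
    """Feature map of the assumed quadratic parametric model."""
    return np.array([x[0] ** 2, x[1] ** 2, x[0] * x[1]])

def grad_phi(x):
    """Jacobian of phi, used to form the gradient of the learned user cost."""
    return np.array([[2 * x[0], 0.0],
                     [0.0, 2 * x[1]],
                     [x[1], x[0]]])

def grad_engineering(x, t):
    """Gradient of a toy known time-varying (engineering) part."""
    target = np.array([np.cos(0.01 * t), np.sin(0.01 * t)])
    return x - target

# Local state of the node (neighbors would enter through a mixing step).
x_i = np.zeros(n)                     # local solution estimate
theta_hat = np.zeros(3)               # RLS estimate of the user parameters
P = 1e3 * np.eye(3)                   # RLS covariance (large initial uncertainty)
y_i = grad_engineering(x_i, 0) + grad_phi(x_i).T @ theta_hat   # gradient tracker

for t in range(1, 200):
    # (i) Gradient-tracking step. In the networked setting, x_i and y_i would
    # first be averaged with neighbors via doubly stochastic weights; for a
    # single node this reduces to a descent step along the tracked gradient.
    grad_old = grad_engineering(x_i, t - 1) + grad_phi(x_i).T @ theta_hat
    x_i = x_i - alpha * y_i
    grad_new = grad_engineering(x_i, t) + grad_phi(x_i).T @ theta_hat
    y_i = y_i + grad_new - grad_old

    # (ii) RLS step: the user returns a noisy evaluation of the unknown cost
    # at the current local solution estimate.
    feedback = theta_true @ phi(x_i) + 0.01 * rng.standard_normal()
    z = phi(x_i)
    gain = P @ z / (1.0 + z @ P @ z)
    theta_hat = theta_hat + gain * (feedback - z @ theta_hat)
    P = P - np.outer(gain, z @ P)

print("estimated parameters:", theta_hat)
print("true parameters:     ", theta_true)
```

As the local solution estimate moves with the time-varying engineering cost, the regressors `phi(x_i)` vary, which is what lets the RLS recursion refine the parameter estimate while the gradient-tracking step simultaneously uses that estimate in its gradients.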