Paper Title
Decentralized Gossip-Based Stochastic Bilevel Optimization over Communication Networks
Paper Authors
Paper Abstract
Bilevel optimization has gained growing interest, with numerous applications in meta-learning, minimax games, reinforcement learning, and nested composition optimization. This paper studies distributed bilevel optimization over a network where agents can communicate only with their neighbors, covering examples from multi-task learning, multi-agent learning, and federated learning. We propose a gossip-based distributed bilevel learning algorithm that allows networked agents to solve both the inner and outer optimization problems in a single timescale and to share information via network propagation. We show that our algorithm enjoys an $\mathcal{O}(\frac{1}{K\epsilon^2})$ per-agent sample complexity for general nonconvex bilevel optimization and $\mathcal{O}(\frac{1}{K\epsilon})$ for strongly convex objectives, achieving a speedup that scales linearly with the network size $K$. These sample complexities are optimal in both $\epsilon$ and $K$. We test our algorithm on examples of hyperparameter tuning and decentralized reinforcement learning. Simulation experiments confirm that our algorithm achieves state-of-the-art training efficiency and test accuracy.
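To make the single-timescale gossip structure concrete, the following is a minimal NumPy sketch of one such update loop on a toy problem; it is an illustration under simplifying assumptions, not the paper's algorithm. The ring topology, the quadratic inner objective $g_i(x,y)=\frac{1}{2}\|y-x\|^2$, the outer objective $f_i(y)=\frac{1}{2}\|y-b_i\|^2$, the surrogate hypergradient, and all step sizes are hypothetical choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3                    # number of agents (network size), variable dimension
alpha, beta = 0.05, 0.2        # outer/inner step sizes (illustrative values)
sigma = 0.01                   # noise level simulating stochastic gradients

# Doubly stochastic gossip matrix for a ring topology: each agent
# averages only with its two immediate neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

b = rng.standard_normal((n, d))   # agent i's private target b_i
x = np.zeros((n, d))              # outer variables, one row per agent
y = np.zeros((n, d))              # inner variables, one row per agent

for k in range(1000):
    # Gossip step: agents mix their local variables with neighbors only.
    x_mix, y_mix = W @ x, W @ y
    # Single-timescale step: inner and outer variables are updated in the
    # same round, using noisy (simulated stochastic) gradients.
    g_in = (y_mix - x_mix) + sigma * rng.standard_normal((n, d))   # grad_y g_i
    y = y_mix - beta * g_in
    g_out = (y - b) + sigma * rng.standard_normal((n, d))          # surrogate hypergradient
    x = x_mix - alpha * g_out

# Gossip drives consensus while the local updates drive optimization, so all
# outer variables should agree and approach the global optimum mean(b_i).
print("max distance to optimum:", np.linalg.norm(x - b.mean(axis=0), axis=1).max())
```

In this toy instance the inner solution is $y^*(x)=x$, so the hypergradient of agent $i$'s outer loss reduces to $x-b_i$ and the network-wide optimum is the mean of the $b_i$; the sketch only illustrates how gossip averaging and simultaneous inner/outer updates interleave in a single timescale.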