Paper Title

Variational Auto-Regressive Gaussian Processes for Continual Learning

Authors

Sanyam Kapoor, Theofanis Karaletsos, Thang D. Bui

Abstract

Through sequential construction of posteriors on observing data online, Bayes' theorem provides a natural framework for continual learning. We develop Variational Auto-Regressive Gaussian Processes (VAR-GPs), a principled posterior updating mechanism to solve sequential tasks in continual learning. By relying on sparse inducing point approximations for scalable posteriors, we propose a novel auto-regressive variational distribution which reveals two fruitful connections to existing results in Bayesian inference, expectation propagation and orthogonal inducing points. Mean predictive entropy estimates show VAR-GPs prevent catastrophic forgetting, which is empirically supported by strong performance on modern continual learning benchmarks against competitive baselines. A thorough ablation study demonstrates the efficacy of our modeling choices.
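The abstract's core ingredient is the sparse inducing-point approximation, where a small set of inducing inputs summarizes the posterior and predictions take the form K<sub>*z</sub>K<sub>zz</sub><sup>-1</sup>m. The snippet below is a minimal illustrative sketch of that predictive-mean computation with a toy RBF kernel; it is not the paper's VAR-GP implementation, and the inducing inputs `Z`, variational mean `m`, and all hyperparameter values are hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Toy stand-in for a learned variational posterior q(u) over
# inducing outputs at three inducing inputs Z (values are made up).
Z = np.array([-1.0, 0.0, 1.0])
m = np.array([0.5, 1.0, 0.5])

def predictive_mean(x_star, Z, m, lengthscale=1.0, jitter=1e-6):
    # Sparse-GP predictive mean at test inputs: K_{*z} K_{zz}^{-1} m.
    K_zz = rbf_kernel(Z, Z, lengthscale) + jitter * np.eye(len(Z))
    K_sz = rbf_kernel(x_star, Z, lengthscale)
    return K_sz @ np.linalg.solve(K_zz, m)

# At an inducing input itself, the prediction recovers the
# corresponding variational mean (up to jitter).
mu = predictive_mean(np.array([0.0]), Z, m)
```

In a continual-learning setting like VAR-GPs, such inducing-point summaries are what carries information from one task's posterior into the next, which is what makes the auto-regressive variational construction scalable.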
