Paper Title
Scalable Variational Gaussian Process Regression Networks
Paper Authors
Paper Abstract
Gaussian process regression networks (GPRNs) are powerful Bayesian models for multi-output regression, but their inference is intractable. To address this issue, existing methods use a fully factorized structure (or a mixture of such structures) over all the outputs and latent functions for posterior approximation, which, however, can miss strong posterior dependencies among the latent variables and hurt the inference quality. In addition, the updates of the variational parameters are inefficient and can be prohibitively expensive when the number of outputs is large. To overcome these limitations, we propose a scalable variational inference algorithm for GPRNs that not only captures rich posterior dependencies but is also far more efficient for massive outputs. We tensorize the output space and introduce tensor/matrix-normal variational posteriors to capture the posterior correlations and to reduce the number of parameters. We jointly optimize all the parameters and exploit the inherent Kronecker product structure in the variational evidence lower bound to accelerate the computation. We demonstrate the advantages of our method in several real-world applications.
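The abstract's claimed speedup rests on a standard Kronecker-product identity: for a matrix-normal posterior whose covariance factorizes as A ⊗ B, one never needs to materialize the large Kronecker matrix, since (A ⊗ B) vec(X) = vec(B X Aᵀ). The sketch below (illustrative only, not the authors' code; the matrix sizes and variable names are assumptions) shows the identity in NumPy and why it reduces both memory and compute:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # small covariance factor over one output mode
B = rng.standard_normal((3, 3))   # small covariance factor over the other mode
X = rng.standard_normal((3, 4))   # matrix-variate latent variable

# Naive: form the 12x12 Kronecker product explicitly -- O((mn)^2) memory.
naive = np.kron(A, B) @ X.flatten(order="F")   # column-major vec(X)

# Structured: two small matrix products, never materializing A ⊗ B.
fast = (B @ X @ A.T).flatten(order="F")

assert np.allclose(naive, fast)
```

A full (mn)×(mn) posterior covariance over mn outputs needs (mn)² parameters, while the Kronecker factorization needs only m² + n², which is the parameter reduction the matrix-normal posterior provides.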