Paper Title


Longitudinal Deep Kernel Gaussian Process Regression

Paper Authors

Junjie Liang, Yanting Wu, Dongkuan Xu, Vasant Honavar

Abstract


Gaussian processes offer an attractive framework for predictive modeling from longitudinal data, i.e., irregularly sampled, sparse observations from a set of individuals over time. However, such methods have two key shortcomings: (i) they rely on ad hoc heuristics or expensive trial and error to choose effective kernels, and (ii) they fail to handle multilevel correlation structure in the data. We introduce Longitudinal Deep Kernel Gaussian Process Regression (L-DKGPR), which, to the best of our knowledge, is the only method to overcome these limitations by fully automating the discovery of complex multilevel correlation structure from longitudinal data. Specifically, L-DKGPR eliminates the need for ad hoc heuristics or trial and error using a novel adaptation of deep kernel learning that combines the expressive power of deep neural networks with the flexibility of non-parametric kernel methods. L-DKGPR effectively learns the multilevel correlation with a novel additive kernel that simultaneously accommodates both time-varying and time-invariant effects. We derive an efficient algorithm to train L-DKGPR using latent space inducing points and variational inference. Results of extensive experiments on several benchmark data sets demonstrate that L-DKGPR significantly outperforms the state-of-the-art longitudinal data analysis (LDA) methods.
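To make the additive-kernel idea concrete, the sketch below combines a deep-kernel component over neural-network embeddings of the covariates (time-varying effects) with a kernel over per-individual embeddings (time-invariant effects), and uses the sum in a standard GP regression posterior. This is a minimal NumPy illustration of the general technique only; the `mlp_features` and `additive_deep_kernel` helpers, the random weights, and all shapes are illustrative assumptions, not the authors' implementation (which also uses inducing points and variational inference for scalability).

```python
import numpy as np

def mlp_features(x, W1, b1, W2, b2):
    """Tiny feed-forward feature extractor g(x); weights are placeholders."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def rbf(A, B, lengthscale=1.0):
    """RBF kernel between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def additive_deep_kernel(X, ids, params):
    """k = k_time(g(x), g(x')) + k_id(e_i, e_j): a time-varying component over
    deep embeddings of covariates plus a time-invariant component over learned
    per-individual embeddings, in the spirit of L-DKGPR's additive kernel."""
    Z = mlp_features(X, *params["mlp"])   # deep embedding of covariates
    K_time = rbf(Z, Z)                    # time-varying effects
    E = params["embed"][ids]              # one embedding per individual
    K_id = rbf(E, E)                      # time-invariant effects
    return K_time + K_id

# Toy longitudinal data: 8 observations from 3 individuals.
rng = np.random.default_rng(0)
n, d, h = 8, 3, 4
X = rng.normal(size=(n, d))
ids = rng.integers(0, 3, size=n)
y = rng.normal(size=n)
params = {
    "mlp": (rng.normal(size=(d, h)), np.zeros(h),
            rng.normal(size=(h, 2)), np.zeros(2)),
    "embed": rng.normal(size=(3, 2)),
}

# Exact GP posterior mean at the training inputs with noise variance sigma2.
K = additive_deep_kernel(X, ids, params)
sigma2 = 0.1
mean = K @ np.linalg.solve(K + sigma2 * np.eye(n), y)
```

In a full training loop, the network weights and individual embeddings would be fit jointly by maximizing the (variational) marginal likelihood rather than being drawn at random as here.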
