Paper Title
Learning Multi-Task Gaussian Process Over Heterogeneous Input Domains
Paper Authors
Paper Abstract
Multi-task Gaussian process (MTGP) is a well-known non-parametric Bayesian model for learning correlated tasks effectively by transferring knowledge across tasks. However, current MTGPs are usually limited to multi-task scenarios defined over the same input domain, leaving them unable to tackle the heterogeneous case, i.e., where the features of the input domains vary across tasks. To this end, this paper presents a novel heterogeneous stochastic variational linear model of coregionalization (HSVLMC) for simultaneously learning tasks with varied input domains. In particular, we develop a stochastic variational framework with Bayesian calibration that (i) takes into account the effect of the dimensionality reduction induced by domain mappings in order to achieve effective input alignment; and (ii) employs a residual modeling strategy to leverage the inductive bias brought by prior domain mappings for better model inference. Finally, the superiority of the proposed model over existing LMC models is extensively verified on diverse heterogeneous multi-task cases and a practical multi-fidelity steam turbine exhaust problem.
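To make the input-alignment idea in the abstract concrete, below is a minimal NumPy sketch of a linear model of coregionalization (LMC) prior in which each task's heterogeneous inputs are first projected into a shared latent domain by a per-task mapping and then coupled through shared latent GPs. The mapping matrices `A_t`, the mixing weights `W`, and all function names are illustrative assumptions; this is not the authors' HSVLMC implementation, which additionally uses stochastic variational inference, Bayesian calibration, and residual modeling.

```python
import numpy as np

# Sketch: LMC prior over T tasks whose inputs live in different domains.
# Each task t supplies a (hypothetical) linear mapping A_t projecting its
# d_t-dimensional inputs into a shared d-dimensional latent domain, where
# Q shared latent GPs are mixed through task-specific weights W (T x Q).

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel evaluated on the shared latent domain."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def lmc_cov(X_list, A_list, W, lengthscale=1.0):
    """Joint cross-task covariance of the LMC prior after input alignment.

    X_list : list of (n_t, d_t) arrays, one per task (heterogeneous domains)
    A_list : list of (d_t, d) domain mappings into the shared latent space
    W      : (T, Q) mixing weights, one column per shared latent GP
    """
    Z = [X @ A for X, A in zip(X_list, A_list)]          # align inputs
    blocks = [[None] * len(Z) for _ in Z]
    for t, Zt in enumerate(Z):
        for s, Zs in enumerate(Z):
            K_latent = rbf(Zt, Zs, lengthscale)
            # LMC: Cov[f_t(z), f_s(z')] = (sum_q W[t,q] W[s,q]) * k(z, z')
            blocks[t][s] = (W[t] @ W[s]) * K_latent
    return np.block(blocks)

# Toy usage: two tasks with 3-D and 5-D inputs mapped to a 2-D shared domain.
rng = np.random.default_rng(0)
X_list = [rng.normal(size=(4, 3)), rng.normal(size=(6, 5))]
A_list = [rng.normal(size=(3, 2)), rng.normal(size=(5, 2))]
W = rng.normal(size=(2, 2))          # T = 2 tasks, Q = 2 shared latent GPs
K = lmc_cov(X_list, A_list, W)
print(K.shape)                       # (10, 10) joint prior covariance
```

The block structure of `K` is what allows knowledge transfer: off-diagonal blocks couple observations from different tasks once their inputs have been mapped into the common latent domain.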