Paper Title

Bootstrapping Neural Processes

Authors

Juho Lee, Yoonho Lee, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, Yee Whye Teh

Abstract

Unlike traditional statistical modeling, for which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven able to handle various types of data, NPs still rely on the assumption that uncertainty in the stochastic process is modeled by a single latent variable, which potentially limits flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
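The abstract refers to the bootstrap as a classical data-driven technique for estimating uncertainty. The sketch below is not the paper's BNP model; it is only a minimal illustration of the classical idea: resample the data with replacement many times, recompute a statistic on each resample, and read uncertainty off the spread of those estimates. The function name and dataset are hypothetical.

```python
import random
import statistics

def bootstrap_std_error(data, statistic, n_resamples=1000, seed=0):
    """Estimate the uncertainty (standard error) of `statistic` by
    resampling `data` with replacement (classical bootstrap).
    Hypothetical helper for illustration only."""
    rng = random.Random(seed)
    estimates = [
        statistic([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_resamples)
    ]
    # The spread of the resampled estimates approximates the
    # sampling uncertainty of the statistic.
    return statistics.stdev(estimates)

# Toy data: uncertainty of the sample mean.
data = [2.1, 2.5, 1.9, 2.8, 2.3, 2.6, 2.0, 2.4]
se = bootstrap_std_error(data, statistics.mean)
```

No parametric form is assumed for the data distribution here, which is the property the paper exploits: BNP replaces the single latent variable with bootstrap-style resampling to capture stochasticity.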
