Paper Title


BR-SNIS: Bias Reduced Self-Normalized Importance Sampling

Paper Authors

Gabriel Cardoso, Sergey Samsonov, Achille Thin, Eric Moulines, Jimmy Olsson

Paper Abstract


Importance Sampling (IS) is a method for approximating expectations under a target distribution using independent samples from a proposal distribution and the associated importance weights. In many applications, the target distribution is known only up to a normalization constant, in which case self-normalized IS (SNIS) can be used. While the use of self-normalization can have a positive effect on the dispersion of the estimator, it introduces bias. In this work, we propose a new method, BR-SNIS, whose complexity is essentially the same as that of SNIS and which significantly reduces bias without increasing the variance. This method is a wrapper in the sense that it uses the same proposal samples and importance weights as SNIS, but makes clever use of iterated sampling--importance resampling (ISIR) to form a bias-reduced version of the estimator. We furnish the proposed algorithm with rigorous theoretical results, including new bias, variance and high-probability bounds, and these are illustrated by numerical examples.
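
For intuition, the sketch below illustrates the two ingredients named in the abstract: a plain SNIS estimator and a single generic iterated sampling-importance resampling (i-SIR) transition. The bimodal unnormalized target, the N(0, 3^2) proposal, and the test function x^2 are hypothetical choices made only for illustration; this is not the paper's BR-SNIS wrapper itself, whose construction and analysis are given in the paper.

```python
import numpy as np

# Illustrative sketch only: plain SNIS and one generic i-SIR transition.
# The target, proposal, and test function below are assumptions for the
# example, not taken from the paper.

rng = np.random.default_rng(0)

def log_target_unnorm(x):
    # Unnormalized log-density of an example bimodal target.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def log_proposal(x, scale=3.0):
    # Log-density of the proposal q = N(0, scale^2).
    return -0.5 * (x / scale) ** 2 - np.log(scale * np.sqrt(2.0 * np.pi))

def snis(f, n):
    # Self-normalized IS: normalizing the weights by their sum cancels the
    # target's unknown normalizing constant, at the cost of a bias.
    x = rng.normal(0.0, 3.0, size=n)
    logw = log_target_unnorm(x) - log_proposal(x)
    w = np.exp(logw - logw.max())
    return np.sum(w * f(x)) / np.sum(w)

def isir_step(y, n):
    # One generic i-SIR transition: keep the current state y, draw n - 1
    # fresh proposals, then resample the new state among all n candidates
    # with probability proportional to their importance weights.
    cand = np.concatenate(([y], rng.normal(0.0, 3.0, size=n - 1)))
    logw = log_target_unnorm(cand) - log_proposal(cand)
    w = np.exp(logw - logw.max())
    return cand[rng.choice(n, p=w / w.sum())]

print("SNIS estimate of E[x^2]:", snis(lambda x: x ** 2, 10_000))
y = 0.0
for _ in range(100):
    y = isir_step(y, 32)
print("state after 100 i-SIR steps:", y)
```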
