Paper Title
On the convergence complexity of Gibbs samplers for a family of simple Bayesian random effects models
Paper Authors
Paper Abstract
The emergence of big data has led to so-called convergence complexity analysis, which is the study of how Markov chain Monte Carlo (MCMC) algorithms behave as the sample size, $n$, and/or the number of parameters, $p$, in the underlying data set increase. This type of analysis is often quite challenging, in part because existing results for fixed $n$ and $p$ are simply not sharp enough to yield good asymptotic results. One of the first convergence complexity results for an MCMC algorithm on a continuous state space is due to Yang and Rosenthal (2019), who established a mixing time result for a Gibbs sampler (for a simple Bayesian random effects model) that was introduced and studied by Rosenthal (1996). The asymptotic behavior of the spectral gap of this Gibbs sampler is, however, still unknown. We use a recently developed simulation technique (Qin et al., 2019) to provide substantial numerical evidence that the gap is bounded away from 0 as $n \rightarrow \infty$. We also establish a pair of rigorous convergence complexity results for two different Gibbs samplers associated with a generalization of the random effects model considered by Rosenthal (1996). Our results show that, under strong regularity conditions, the spectral gaps of these Gibbs samplers converge to 1 as the sample size increases.
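To make the object of study more concrete, the following is a minimal Python sketch of a two-block Gibbs sampler for a simple Bayesian random effects model of the general type described above, assuming $Y_i \mid \theta_i \sim N(\theta_i, V)$, $\theta_i \mid \mu \sim N(\mu, A)$ with known $V$ and $A$ and a flat prior on $\mu$. The function name `gibbs_sampler` and the parameter values are illustrative assumptions; this is a sketch of a sampler of this family, not necessarily the exact model or update scheme analyzed in the paper.

```python
import numpy as np

def gibbs_sampler(Y, V, A, n_iter=5000, rng=None):
    """Two-block Gibbs sampler for the hierarchical model
        Y_i | theta_i ~ N(theta_i, V),   theta_i | mu ~ N(mu, A),
    with known V and A and a flat prior on mu.
    (Illustrative variant; assumptions noted in the text above.)"""
    rng = np.random.default_rng() if rng is None else rng
    K = len(Y)
    mu = Y.mean()  # initialize at the data mean
    mu_draws = np.empty(n_iter)
    theta_draws = np.empty((n_iter, K))
    for t in range(n_iter):
        # Draw theta_i | mu, Y_i ~ N((A*Y_i + V*mu)/(A+V), A*V/(A+V))
        post_var = A * V / (A + V)
        post_mean = (A * Y + V * mu) / (A + V)
        theta = rng.normal(post_mean, np.sqrt(post_var))
        # Draw mu | theta ~ N(mean(theta), A/K) under the flat prior
        mu = rng.normal(theta.mean(), np.sqrt(A / K))
        mu_draws[t] = mu
        theta_draws[t] = theta
    return mu_draws, theta_draws

# Example usage with simulated data (hypothetical values)
rng = np.random.default_rng(0)
Y = rng.normal(loc=2.0, scale=1.0, size=50)
mu_draws, theta_draws = gibbs_sampler(Y, V=1.0, A=1.0, n_iter=5000, rng=rng)
print(mu_draws.mean(), mu_draws.std())
```

In simulations of this kind, the autocorrelation of the $\mu$-chain as the number of groups grows gives a rough empirical sense of mixing speed, which is the behavior that the spectral-gap and mixing-time analyses referenced in the abstract quantify rigorously.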