Paper Title

Convergence of score-based generative modeling for general data distributions

Authors

Holden Lee, Jianfeng Lu, Yixin Tan

Abstract

Score-based generative modeling (SGM) has grown to be a hugely successful method for learning to generate samples from complex data distributions such as that of images and audio. It is based on evolving an SDE that transforms white noise into a sample from the learned distribution, using estimates of the score function, or gradient log-pdf. Previous convergence analyses for these methods have suffered either from strong assumptions on the data distribution or exponential dependencies, and hence fail to give efficient guarantees for the multimodal and non-smooth distributions that arise in practice and for which good empirical performance is observed. We consider a popular kind of SGM -- denoising diffusion models -- and give polynomial convergence guarantees for general data distributions, with no assumptions related to functional inequalities or smoothness. Assuming $L^2$-accurate score estimates, we obtain Wasserstein distance guarantees for any distribution of bounded support or sufficiently decaying tails, as well as TV guarantees for distributions with further smoothness assumptions.
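The sampling procedure the abstract describes can be sketched concretely: run the reverse of an Ornstein-Uhlenbeck noising SDE, replacing the unknown score with an estimate. Below is a minimal illustrative sketch (not the paper's method or proofs) for a 1-D two-mode Gaussian mixture, a multimodal target of the kind the abstract highlights; the closed-form score stands in for a learned $L^2$-accurate score estimate, and all parameter values (mixture means, horizon, step count) are arbitrary choices for the demo.

```python
import numpy as np

# Illustrative sketch of denoising-diffusion sampling via a discretized
# reverse SDE. The forward process is the OU SDE dX = -X dt + sqrt(2) dW,
# whose stationary law is N(0, 1); the target is an equal-weight mixture
# of N(-3, 0.25) and N(3, 0.25). All parameters are hypothetical.

rng = np.random.default_rng(0)
means = np.array([-3.0, 3.0])  # multimodal target (two well-separated modes)
sigma2 = 0.25                  # per-mode variance

def score_t(x, t):
    """Exact score grad log p_t(x) of the OU-noised mixture at time t.
    Under the OU flow each mode N(m, sigma2) becomes
    N(m * e^{-t}, sigma2 * e^{-2t} + 1 - e^{-2t})."""
    m = means * np.exp(-t)
    v = sigma2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    logw = -0.5 * (x[:, None] - m) ** 2 / v          # unnormalized log weights
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                # mode responsibilities
    return (w * (m - x[:, None]) / v).sum(axis=1)

T, n_steps, n_samples = 5.0, 500, 2000
dt = T / n_steps
x = rng.standard_normal(n_samples)   # start from white noise, approx. p_T
for k in range(n_steps):
    t = T - k * dt                   # integrate the reverse SDE backwards
    # Euler-Maruyama step of dX = (X + 2 * score) dtau + sqrt(2) dW
    x = (x + (x + 2 * score_t(x, t)) * dt
           + np.sqrt(2 * dt) * rng.standard_normal(n_samples))

# The samples should recover both modes of the mixture.
print(np.mean(x > 0))  # roughly half the mass in each mode
```

In a real SGM, `score_t` would be a neural network trained by denoising score matching; the point of the sketch is only the mechanism the abstract refers to: white noise is transported to approximate samples using score evaluations along a time-reversed SDE.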
