Paper Title
Constraining Variational Inference with Geometric Jensen-Shannon Divergence
Paper Authors
Paper Abstract
We examine the problem of controlling divergences for latent space regularisation in variational autoencoders. Specifically, we aim to reconstruct an example $x\in\mathbb{R}^{m}$ via a latent space $z\in\mathbb{R}^{n}$ ($n\leq m$), while balancing this against the need for generalisable latent representations. We present a regularisation mechanism based on the skew-geometric Jensen-Shannon divergence $\left(\textrm{JS}^{\textrm{G}_\alpha}\right)$. We find a variation of $\textrm{JS}^{\textrm{G}_\alpha}$, motivated by limiting cases, which leads to an intuitive interpolation between forward and reverse KL in the space of both distributions and divergences. We motivate its potential benefits for VAEs through low-dimensional examples, before presenting quantitative and qualitative results. Our experiments demonstrate that skewing our variant of $\textrm{JS}^{\textrm{G}_\alpha}$, in the context of $\textrm{JS}^{\textrm{G}_\alpha}$-VAEs, leads to better reconstruction and generation than several baseline VAEs. Our approach is entirely unsupervised and utilises only one hyperparameter, which can be easily interpreted in latent space.
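To make the interpolation property concrete: following Nielsen's abstract-mean construction, the skew-geometric Jensen-Shannon divergence replaces the arithmetic mixture of the standard JS divergence with a weighted geometric mean $G_\alpha(p,q) \propto p^{1-\alpha} q^{\alpha}$, giving

$$\textrm{JS}^{\textrm{G}_\alpha}(p \,\|\, q) = (1-\alpha)\,\textrm{KL}\left(p \,\|\, G_\alpha(p,q)\right) + \alpha\,\textrm{KL}\left(q \,\|\, G_\alpha(p,q)\right).$$

The dual variant, obtained by swapping the geometric-mean weights to $G_{1-\alpha}$, recovers the forward KL as $\alpha \to 0$ and the reverse KL as $\alpha \to 1$. Since the weighted geometric mean of two Gaussians is itself Gaussian, the divergence has a closed form for the diagonal Gaussians used in VAEs. The sketch below illustrates this limiting behaviour numerically; it is a minimal illustration under these stated definitions, not the authors' implementation, and the helper names (kl_diag_gauss, geometric_mean_gauss, js_g_dual) are ours.

import numpy as np

def kl_diag_gauss(mu0, var0, mu1, var1):
    # KL(N(mu0, var0) || N(mu1, var1)) for diagonal Gaussians.
    return 0.5 * np.sum(np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def geometric_mean_gauss(mu0, var0, mu1, var1, w):
    # Normalised geometric mean N0^(1-w) * N1^w, which is itself Gaussian.
    var = 1.0 / ((1.0 - w) / var0 + w / var1)
    mu = var * ((1.0 - w) * mu0 / var0 + w * mu1 / var1)
    return mu, var

def js_g_dual(mu0, var0, mu1, var1, alpha):
    # Dual skew-geometric JS: geometric-mean weights swapped (w = 1 - alpha),
    # so alpha -> 0 recovers KL(p || q) and alpha -> 1 recovers KL(q || p).
    mu_a, var_a = geometric_mean_gauss(mu0, var0, mu1, var1, 1.0 - alpha)
    return ((1.0 - alpha) * kl_diag_gauss(mu0, var0, mu_a, var_a)
            + alpha * kl_diag_gauss(mu1, var1, mu_a, var_a))

# Limiting behaviour on a 2-D example: p could play the role of a VAE
# encoder posterior and q the N(0, I) prior (an illustrative pairing).
mu_p, var_p = np.array([1.0, -0.5]), np.array([0.5, 2.0])
mu_q, var_q = np.zeros(2), np.ones(2)
print(js_g_dual(mu_p, var_p, mu_q, var_q, 1e-6))      # ~= forward KL(p || q)
print(kl_diag_gauss(mu_p, var_p, mu_q, var_q))
print(js_g_dual(mu_p, var_p, mu_q, var_q, 1 - 1e-6))  # ~= reverse KL(q || p)
print(kl_diag_gauss(mu_q, var_q, mu_p, var_p))

In a $\textrm{JS}^{\textrm{G}_\alpha}$-VAE, a closed-form term of this kind would presumably stand in for the usual KL term of the evidence lower bound, with $\alpha$ the single latent-space hyperparameter the abstract refers to.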