Paper Title

Infinite Divisibility of Information

Authors

Li, Cheuk Ting

Abstract

We study an information analogue of infinitely divisible probability distributions, where the i.i.d. sum is replaced by the joint distribution of an i.i.d. sequence. A random variable $X$ is called informationally infinitely divisible if, for any $n\ge1$, there exists an i.i.d. sequence of random variables $Z_{1},\ldots,Z_{n}$ that contains the same information as $X$, i.e., there exists an injective function $f$ such that $X=f(Z_{1},\ldots,Z_{n})$. While no discrete random variable is informationally infinitely divisible, we show that any discrete random variable $X$ has a bounded multiplicative gap to infinite divisibility; that is, if we remove the injectivity requirement on $f$, then there exist i.i.d. $Z_{1},\ldots,Z_{n}$ and $f$ satisfying $X=f(Z_{1},\ldots,Z_{n})$, and the entropy satisfies $H(X)/n\le H(Z_{1})\le1.59H(X)/n+2.43$. We also study a new class of discrete probability distributions, called spectral infinitely divisible distributions, for which the multiplicative gap $1.59$ can be removed. Furthermore, we study the case where $X=(Y_{1},\ldots,Y_{m})$ is itself an i.i.d. sequence, $m\ge2$, for which the multiplicative gap $1.59$ can be replaced by $1+5\sqrt{(\log m)/m}$. This means that as $m$ increases, $(Y_{1},\ldots,Y_{m})$ becomes closer to being spectral infinitely divisible in a uniform manner. This can be regarded as an information analogue of Kolmogorov's uniform theorem. Applications of our results include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
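To make the definition concrete, here is a minimal sketch (not taken from the paper, parameters $n$ and $b$ chosen for illustration): a random variable $X$ uniform over $2^{nb}$ values is informationally infinitely divisible for that $n$, since its bits split into $n$ i.i.d. blocks $Z_{1},\ldots,Z_{n}$ of $b$ bits each, recombined by the injective function $f$ (bit concatenation), with $H(Z_{1})=H(X)/n$ meeting the entropy lower bound exactly.

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n, b = 3, 4          # n i.i.d. pieces, b bits each (illustrative values)
k = n * b            # X is uniform over 2^k values

H_X = entropy([1 / 2**k] * 2**k)   # H(X) = k bits
H_Z = entropy([1 / 2**b] * 2**b)   # H(Z_1) = b bits

def f(*zs):
    """Injective f: concatenate the b-bit blocks Z_1, ..., Z_n into X."""
    x = 0
    for z in zs:
        x = (x << b) | z
    return x

# f is injective: distinct tuples (z_1, z_2, z_3) map to distinct x.
images = {f(z1, z2, z3)
          for z1 in range(2**b) for z2 in range(2**b) for z3 in range(2**b)}
assert len(images) == 2**k

# The entropy lower bound H(X)/n <= H(Z_1) holds with equality here.
assert math.isclose(H_Z, H_X / n)
print(H_X, H_Z)   # 12.0 4.0
```

For a non-uniform discrete $X$, no such exact i.i.d. split with injective $f$ exists; the paper's result bounds how far the non-injective construction must overshoot, via the factor $1.59$ above.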
