Paper Title

On the Maximum Entropy of a Sum of Independent Discrete Random Variables

Paper Authors

Kovačević, Mladen

Paper Abstract

Let $ X_1, \ldots, X_n $ be independent random variables taking values in the alphabet $ \{0, 1, \ldots, r\} $, and $ S_n = \sum_{i = 1}^n X_i $. The Shepp--Olkin theorem states that, in the binary case ($ r = 1 $), the Shannon entropy of $ S_n $ is maximized when all the $ X_i $'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $ S_n $ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $ n, r $, i.e., that $ H(S_n) $ is maximized when $ X_1, \ldots, X_{n-1} $ are uniformly distributed over $ \{0, r\} $, while the probability mass function of $ X_n $ is a mixture (with explicitly defined non-zero weights) of the uniform distributions over $ \{0, r\} $ and $ \{1, \ldots, r-1\} $.
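
The conjectured maximizer can be checked numerically for small $ n, r $. Below is a minimal Python sketch (not from the paper, and assuming NumPy): it builds the PMF of $ S_n $ by convolution with $ X_1, \ldots, X_{n-1} $ uniform on $ \{0, r\} $ and $ X_n $ a mixture of the two uniform distributions. Since the abstract does not reproduce the paper's explicit mixture weights, the sketch simply sweeps the weight `lam` over $ [0, 1] $ and reports the entropy-maximizing value; it assumes $ r \geq 2 $ so that $ \{1, \ldots, r-1\} $ is non-empty.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def pmf_sum(n, r, lam):
    """PMF of S_n = X_1 + ... + X_n when X_1..X_{n-1} are uniform on {0, r}
    and X_n is a lam-mixture of uniform{0, r} and uniform{1, ..., r-1}.
    Requires r >= 2. The weight lam is a free parameter here, not the
    paper's explicit value."""
    u0r = np.zeros(r + 1)          # uniform distribution on {0, r}
    u0r[0] = u0r[r] = 0.5
    umid = np.zeros(r + 1)         # uniform distribution on {1, ..., r-1}
    umid[1:r] = 1.0 / (r - 1)
    p = lam * u0r + (1 - lam) * umid   # PMF of X_n
    for _ in range(n - 1):
        p = np.convolve(p, u0r)        # add one independent uniform{0, r} variable
    return p

# Sweep the mixture weight and locate the entropy-maximizing lam numerically.
n, r = 3, 4
lams = np.linspace(0.0, 1.0, 1001)
hs = [entropy(pmf_sum(n, r, lam)) for lam in lams]
best = lams[int(np.argmax(hs))]
print(f"n={n}, r={r}: max H(S_n) over this family = {max(hs):.4f} bits at lam = {best:.3f}")
```

Replacing `pmf_sum` with other candidate distributions (e.g., all $ X_i $ uniform over the full alphabet) lets one compare the resulting values of $ H(S_n) $ against this lower bound for small instances.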
