Paper Title

MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution

Paper Authors

Michael Fauß, Alex Dytso, H. Vincent Poor

Paper Abstract

This paper proposes a new family of lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize/maximize the MMSE subject to the constraint that the joint distribution of the input-output statistics lies in a Kullback-Leibler divergence ball centered at some Gaussian reference distribution. Both bounds are tight and are attained by Gaussian distributions whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter that can be obtained by finding the root of a monotonic function. The upper bound corresponds to a minimax optimal estimator and provides performance guarantees under distributional uncertainty. The lower bound provides an alternative to well-known inequalities in estimation theory, such as the Cramér-Rao bound, that is potentially tighter and defined for a larger class of distributions. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
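The abstract states that the bound-attaining covariance is determined by a scalar parameter found as the root of a monotonic function of the KL radius. The sketch below illustrates that root-finding mechanic in the simplest scalar case, under an illustrative assumption that the candidate distributions are variance-scaled copies of the Gaussian reference (for which the KL divergence has a standard closed form). The names `kl_variance_scale`, `variance_scale_roots`, `eps`, `s2_x`, and `rho` are hypothetical, and the final scaling of the reference MMSE is a demonstration device, not the paper's exact characterization of the bounds.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative sketch, not the paper's exact construction: for a scalar
# Gaussian reference N(m, s2), the KL divergence of N(m, t * s2) from the
# reference has the standard closed form below. It is strictly decreasing
# in t on (0, 1] and strictly increasing on [1, inf), with a zero at t = 1.
def kl_variance_scale(t):
    return 0.5 * (t - np.log(t) - 1.0)

def variance_scale_roots(eps):
    """Return the two variance scales t at KL distance eps from the
    reference, found by bracketed root-finding on each monotone branch."""
    t_lower = brentq(lambda t: kl_variance_scale(t) - eps, 1e-12, 1.0)
    t_upper = brentq(lambda t: kl_variance_scale(t) - eps, 1.0, 1e12)
    return t_lower, t_upper

if __name__ == "__main__":
    eps = 0.1                    # radius of the KL divergence ball (hypothetical)
    s2_x, rho = 1.0, 0.8         # hypothetical reference parameters
    # For a jointly Gaussian reference, MMSE = s2_x * (1 - rho**2).
    mmse_ref = s2_x * (1.0 - rho**2)
    t_lo, t_hi = variance_scale_roots(eps)
    print(f"reference MMSE: {mmse_ref:.4f}")
    # Scaling by the two roots is purely illustrative of how a scalar
    # parameter obtained by root-finding would modulate the reference MMSE.
    print(f"scaled candidates: [{t_lo * mmse_ref:.4f}, {t_hi * mmse_ref:.4f}]")
```

Bracketed root-finding (here via `brentq`) is a natural fit because each monotone branch admits a guaranteed sign-changing bracket, so the root is located reliably without derivatives.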
