Paper Title
Improved nearly minimax prediction for independent Poisson processes under Kullback-Leibler loss
Paper Authors
Paper Abstract
The problem of predicting independent Poisson random variables is commonly encountered in real-life practice. Simultaneous predictive distributions for independent Poisson observables are investigated, and the performance of predictive distributions is evaluated using the Kullback-Leibler (K-L) loss. This study introduces intuitive sufficient conditions, based on superharmonicity of priors, to improve the Bayesian predictive distribution based on the Jeffreys prior. The sufficient conditions exhibit a certain analogy with those known for the multivariate normal distribution. Additionally, this study examines the case where the observed data and target variables to be predicted are independent Poisson processes with different durations. Examples that satisfy the sufficient conditions are provided, including point and subspace shrinkage priors. The K-L risk of the improved predictions is demonstrated to be less than 1.04 times a minimax lower bound.
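As context for the abstract, the following is a minimal sketch of the standard prediction setup it refers to; the notation ($r$ and $s$ for the observation and target durations, $\lambda_i$ for the intensities, $d$ for the dimension) is introduced here for illustration and is not taken from the paper.

% One observes x_i ~ Po(r * lambda_i) and predicts y_i ~ Po(s * lambda_i),
% i = 1, ..., d, using a predictive distribution \hat{p}(. | x). Its K-L loss is
\[
  D\bigl(p_\lambda \,\|\, \hat{p}(\cdot \mid x)\bigr)
  = \sum_{y} p_\lambda(y)\,\log \frac{p_\lambda(y)}{\hat{p}(y \mid x)},
  \qquad
  p_\lambda(y) = \prod_{i=1}^{d} \frac{e^{-s\lambda_i}(s\lambda_i)^{y_i}}{y_i!},
\]
% and the K-L risk is its expectation over x. Under the Jeffreys prior
% \pi_J(\lambda) \propto \prod_i \lambda_i^{-1/2}, the posterior of each
% lambda_i is Gamma(x_i + 1/2, r), so the baseline Bayesian predictive
% distribution of y_i is negative binomial with size x_i + 1/2 and success
% probability r / (r + s); the superharmonic shrinkage priors discussed in
% the abstract modify this baseline.

This is only a sketch of the conventional setup under these assumed conventions; the paper's precise conditions on the priors are given in the body of the work.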