Paper Title
Inferring Epidemics from Multiple Dependent Data via Pseudo-Marginal Methods
Paper Authors
Paper Abstract
Health-policy planning requires evidence on the burden that epidemics place on healthcare systems. Multiple, often dependent, datasets provide a noisy and fragmented signal from the unobserved epidemic process, including transmission and severity dynamics. This paper explores important challenges to the use of state-space models for epidemic inference when multiple dependent datasets are analysed. We propose a new semi-stochastic model that exploits deterministic approximations for large-scale transmission dynamics while retaining stochasticity in the occurrence and reporting of relatively rare severe events. This model is suitable for many real-time situations, including large seasonal epidemics and pandemics. Within this context, we develop algorithms to provide exact parameter inference and test them via simulation. Finally, we apply our joint model and the proposed algorithms to several sources of surveillance data on the 2017-18 influenza epidemic in England to reconstruct transmission dynamics and estimate daily new influenza infections, as well as severity indicators such as the case-hospitalisation risk and the hospital-intensive-care risk.
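To make the pseudo-marginal idea referenced in the title concrete, below is a minimal, self-contained sketch, not the authors' model or code: a Metropolis-Hastings sampler whose intractable likelihood is replaced by a Monte Carlo estimate, applied to a toy setting that echoes the semi-stochastic structure described in the abstract (a deterministic SIR incidence curve for transmission, with noisy reporting of relatively rare hospitalisations). The function names (`sir_incidence`, `loglik_hat`, `pseudo_marginal_mh`), the Poisson observation model, the log-normal reporting noise, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_incidence(beta, gamma=0.4, n_days=60, n_pop=1e6, i0=10.0):
    """Deterministic SIR model, Euler-discretised by day; returns daily new
    infections. Stands in for a large-scale transmission approximation."""
    s, i = n_pop - i0, i0
    new_inf = np.empty(n_days)
    for t in range(n_days):
        inf_t = beta * s * i / n_pop
        new_inf[t] = inf_t
        s, i = s - inf_t, i + inf_t - gamma * i
    return new_inf

# Toy "observed" hospitalisations: a small fraction of infections is
# hospitalised, counted with Poisson noise.
true_new_inf = sir_incidence(beta=0.55)
y = rng.poisson(0.002 * true_new_inf)

def loglik_hat(theta, n_particles=200):
    """Monte Carlo estimate of the log-likelihood for theta = (log beta,
    logit case-hospitalisation risk), averaging over latent log-normal
    reporting noise. The estimator is unbiased for the likelihood up to a
    constant factor (the dropped log y! terms), which cancels in the
    Metropolis-Hastings acceptance ratio."""
    beta = np.exp(theta[0])
    chr_ = 1.0 / (1.0 + np.exp(-theta[1]))
    mu = chr_ * sir_incidence(beta)                # expected hospitalisations
    noise = rng.lognormal(0.0, 0.2, size=(n_particles, 1))
    rates = np.clip(noise * mu[None, :], 1e-12, None)
    logp = (y[None, :] * np.log(rates) - rates).sum(axis=1)  # Poisson kernel
    m = logp.max()
    return m + np.log(np.exp(logp - m).mean())     # stable log of the average

def pseudo_marginal_mh(n_iter=2000, step=0.05):
    """Random-walk Metropolis-Hastings using the estimated likelihood."""
    theta = np.array([np.log(0.5), -6.0])
    ll = loglik_hat(theta)
    chain = np.empty((n_iter, 2))
    for it in range(n_iter):
        prop = theta + step * rng.normal(size=2)
        ll_prop = loglik_hat(prop)
        # Flat priors assumed; on rejection, the current state's likelihood
        # estimate is reused rather than recomputed, which is what keeps the
        # sampler exact for the target posterior.
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain[it] = theta
    return chain

chain = pseudo_marginal_mh()
print("posterior mean beta:", np.exp(chain[1000:, 0]).mean())
print("posterior mean case-hospitalisation risk:",
      (1.0 / (1.0 + np.exp(-chain[1000:, 1]))).mean())
```

In a full analysis the paper describes, the estimated likelihood would come from the joint model for several dependent surveillance streams rather than a single toy Poisson series; the sketch only shows why reusing the noisy likelihood estimate of the current state leaves the exact posterior invariant.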