Paper Title

A Dynamical Systems Approach for Convergence of the Bayesian EM Algorithm

Authors

Orlando Romero, Subhro Das, Pin-Yu Chen, Sérgio Pequito

Abstract

Out of the recent advances in systems and control (S\&C)-based analysis of optimization algorithms, not enough work has been specifically dedicated to machine learning (ML) algorithms and their applications. This paper addresses this gap by illustrating how (discrete-time) Lyapunov stability theory can serve as a powerful tool to aid, or even lead, the analysis (and potential design) of optimization algorithms that are not necessarily gradient-based. The particular ML problem that this paper focuses on is that of parameter estimation in an incomplete-data Bayesian framework via the popular optimization algorithm known as maximum a posteriori expectation-maximization (MAP-EM). Following first principles from dynamical systems stability theory, conditions for convergence of MAP-EM are developed. Furthermore, if additional assumptions are met, we show that fast convergence (linear or quadratic) is achieved, which could have been difficult to unveil without our adopted S\&C approach. The convergence guarantees in this paper effectively expand the set of sufficient conditions for EM applications, thereby demonstrating the potential of similar S\&C-based convergence analysis of other ML algorithms.
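To make the MAP-EM setting concrete, here is a minimal, illustrative sketch (not the paper's algorithm or model): MAP-EM on a two-component 1-D Gaussian mixture with unit variances, where a Beta(a, b) prior on the mixing weight yields a MAP update in the M-step while the means use the usual maximum-likelihood updates. All model choices (unit variances, the Beta prior, the initialization) are assumptions made for this toy example.

```python
import math
import random

def map_em_gmm(xs, a=2.0, b=2.0, iters=100):
    """Toy MAP-EM for a two-component 1-D Gaussian mixture (unit variances).

    A Beta(a, b) prior on the mixing weight pi gives a closed-form MAP
    update; the component means use plain ML updates. Illustrative only.
    """
    pi, mu1, mu2 = 0.5, min(xs), max(xs)  # crude but deterministic init
    n = len(xs)
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        r = []
        for x in xs:
            p1 = pi * math.exp(-0.5 * (x - mu1) ** 2)
            p2 = (1.0 - pi) * math.exp(-0.5 * (x - mu2) ** 2)
            r.append(p1 / (p1 + p2))
        s = sum(r)
        # M-step: MAP update for pi (Beta prior), ML updates for the means
        pi = (s + a - 1.0) / (n + a + b - 2.0)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / s
        mu2 = sum((1.0 - ri) * x for ri, x in zip(r, xs)) / (n - s)
    return pi, mu1, mu2
```

Each MAP-EM iteration monotonically increases the log-posterior objective, and (roughly) the paper's Lyapunov-based view treats a function of this objective as a candidate Lyapunov function for the iteration map, from which convergence conditions follow.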
