Paper Title
The Information Bottleneck Problem and Its Applications in Machine Learning
Paper Authors
Abstract
Inference capabilities of machine learning (ML) systems have skyrocketed in recent years, and they now play a pivotal role in various aspects of society. The goal in statistical learning is to use data to obtain simple algorithms for predicting a random variable $Y$ from a correlated observation $X$. Since the dimension of $X$ is typically huge, computationally feasible solutions should summarize it into a lower-dimensional feature vector $T$, from which $Y$ is predicted. The algorithm will successfully make the prediction if $T$ is a good proxy of $Y$, despite the dimensionality reduction. A myriad of ML algorithms (mostly employing deep learning (DL)) for finding such representations $T$ based on real-world data are now available. While these methods are often effective in practice, their success is hindered by the lack of a comprehensive theory to explain it. The information bottleneck (IB) theory recently emerged as a bold information-theoretic paradigm for analyzing DL systems. Adopting mutual information as the figure of merit, it suggests that the best representation $T$ should be maximally informative about $Y$ while minimizing the mutual information with $X$. In this tutorial, we survey the information-theoretic origins of this abstract principle and its recent impact on DL. For the latter, we cover the implications of the IB problem for DL theory, as well as practical algorithms inspired by it. Our goal is to provide a unified and cohesive description. A clear view of current knowledge is particularly important for further leveraging IB and other information-theoretic ideas to study DL models.
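The trade-off stated in the abstract — maximal informativeness about $Y$ at minimal mutual information with $X$ — is commonly formalized as the IB Lagrangian (the standard formulation due to Tishby, Pereira, and Bialek; the specific notation below is not taken from the abstract itself):

```latex
% IB Lagrangian: optimize over stochastic encoders p(t|x),
% where Y -- X -- T forms a Markov chain.
\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
```

Here $\beta > 0$ is a trade-off parameter: small $\beta$ favors compressing $X$, while large $\beta$ favors preserving information about $Y$ in the representation $T$.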