Title

The Relativity of Induction

Authors

Muhlstein, Larry

Abstract

Lately there has been a lot of discussion about why deep learning algorithms perform better than we would theoretically suspect. To get insight into this question, it helps to improve our understanding of how learning works. We explore the core problem of generalization and show that long-accepted Occam's razor and parsimony principles are insufficient to ground learning. Instead, we derive and demonstrate a set of relativistic principles that yield clearer insight into the nature and dynamics of learning. We show that concepts of simplicity are fundamentally contingent, that all learning operates relative to an initial guess, and that generalization cannot be measured or strongly inferred, but that it can be expected given enough observation. Using these principles, we reconstruct our understanding in terms of distributed learning systems whose components inherit beliefs and update them. We then apply this perspective to elucidate the nature of some real world inductive processes including deep learning.
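
The abstract's claims that all learning operates relative to an initial guess, and that generalization can be expected given enough observation, have a familiar concrete instance in Bayesian updating. The sketch below is illustrative only and is not the paper's formalism: a conjugate Beta-Bernoulli learner whose prior plays the role of the initial guess. The helper names, the priors, and the data stream are hypothetical choices for this example.

```python
# Illustrative sketch (not from the paper): sequential Bayesian updating
# of a Bernoulli parameter. The Beta prior is the "initial guess"; every
# posterior is defined relative to it, yet with enough observations the
# posterior mean converges regardless of which guess a learner started from.

def update(alpha, beta, observation):
    """Conjugate Beta-Bernoulli update: one 0/1 observation shifts the belief."""
    return (alpha + observation, beta + (1 - observation))

def posterior_mean(alpha, beta):
    """Expected success probability under the current Beta(alpha, beta) belief."""
    return alpha / (alpha + beta)

# Two learners start from sharply different initial guesses (priors).
optimist = (8.0, 2.0)   # initially believes successes are likely
skeptic = (2.0, 8.0)    # initially believes successes are unlikely

data = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1] * 20  # 200 observations, 70% successes

for x in data:
    optimist = update(*optimist, x)
    skeptic = update(*skeptic, x)

print(posterior_mean(*optimist))  # ~0.705
print(posterior_mean(*skeptic))   # ~0.676
```

Early on the two learners disagree sharply, but after 200 observations both posterior means sit near the empirical rate of 0.7: a small instance of the abstract's point that generalization cannot be read off any single measurement, yet can be expected as observations accumulate.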
