Paper Title

Relative Deviation Margin Bounds

Paper Authors

Corinna Cortes, Mehryar Mohri, Ananda Theertha Suresh

Paper Abstract

We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, both distribution-dependent and valid for general families, in terms of the Rademacher complexity or the empirical $\ell_\infty$ covering number of the hypothesis set used. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
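For context, a rough sketch not taken from the paper: the classical additive margin bound that results of this kind refine states that, for a hypothesis set $H$ with Rademacher complexity $\mathfrak{R}_m(H)$ and margin parameter $\rho > 0$, with probability at least $1 - \delta$ over a sample of size $m$, every $h \in H$ satisfies

$$R(h) \;\le\; \widehat{R}_{\rho}(h) + \frac{2}{\rho}\,\mathfrak{R}_m(H) + \sqrt{\frac{\log(1/\delta)}{2m}},$$

where $R(h)$ is the generalization error and $\widehat{R}_{\rho}(h)$ is the empirical margin loss. Relative deviation bounds, roughly speaking, replace the additive complexity term with one scaled by a factor involving $\sqrt{\widehat{R}_{\rho}(h)}$, which is more favorable when the empirical margin loss is small.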
