Paper Title

A Generalized Stacking for Implementing Ensembles of Gradient Boosting Machines

Paper Authors

Konstantinov, Andrei V., Utkin, Lev V.

Paper Abstract

The gradient boosting machine is one of the powerful tools for solving regression problems. To cope with its shortcomings, an approach for constructing ensembles of gradient boosting models is proposed. The main idea behind the approach is to use the stacking algorithm to learn a second-level meta-model, which can be regarded as a model implementing various ensembles of gradient boosting models. First, a linear regression of the gradient boosting models is considered as the simplest realization of the meta-model, under the condition that the linear model is differentiable with respect to its coefficients (weights). It is then shown that the proposed approach can easily be extended to arbitrary differentiable combination models, for example, to neural networks, which are differentiable and can implement arbitrary functions of gradient boosting models. Various numerical examples illustrate the proposed approach.
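
The abstract outlines a two-level scheme: several gradient boosting models form the first level, and a differentiable meta-model (a linear combination in the simplest case) is trained on their predictions. The snippet below is a minimal sketch of that idea under stated assumptions, not the authors' implementation: scikit-learn's GradientBoostingRegressor and LinearRegression, the synthetic dataset, the holdout scheme, and all hyperparameters are illustrative choices.

```python
# Minimal sketch of two-level stacking of gradient boosting machines.
# All model choices and hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data, split into first-level training,
# meta-model training, and test parts.
X, y = make_regression(n_samples=3000, n_features=20, noise=10.0, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_meta, X_test, y_meta, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# First level: an ensemble of gradient boosting machines with different settings.
base_models = [
    GradientBoostingRegressor(n_estimators=100, max_depth=d, random_state=d)
    for d in (1, 2, 3, 4, 5)
]
for model in base_models:
    model.fit(X_train, y_train)

def meta_features(data):
    # Predictions of the base models become the inputs of the meta-model.
    return np.column_stack([model.predict(data) for model in base_models])

# Second level: a linear (hence differentiable) meta-model learns how to
# combine the base predictions, using data not seen by the first level.
meta_model = LinearRegression()
meta_model.fit(meta_features(X_meta), y_meta)

# Compare the learned combination with a plain average of the base models.
stacked_mse = mean_squared_error(y_test, meta_model.predict(meta_features(X_test)))
average_mse = mean_squared_error(y_test, meta_features(X_test).mean(axis=1))
print("combination weights:", np.round(meta_model.coef_, 3))
print("stacked MSE:", stacked_mse, "  simple-average MSE:", average_mse)
```

Replacing the linear meta-model with a small neural network trained on the same meta-features would correspond to the extension to arbitrary differentiable combination models mentioned in the abstract.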
