Paper Title
Bayesian Additive Regression Trees with Model Trees
Paper Authors
Paper Abstract
Bayesian Additive Regression Trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART assumes regularisation priors on a set of trees that work as weak learners and is very flexible for predicting in the presence of non-linearity and high-order interactions. In this paper, we introduce an extension of BART, called Model Trees BART (MOTR-BART), that considers piecewise linear functions at node levels instead of piecewise constants. In MOTR-BART, rather than having a unique value at node level for the prediction, a linear predictor is estimated considering the covariates that have been used as the split variables in the corresponding tree. In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code for MOTR-BART implementation is available at https://github.com/ebprado/MOTR-BART.
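As a rough illustration of the idea described in the abstract, the R sketch below contrasts the piecewise-constant prediction of a single terminal node (as in BART) with a node-level linear predictor built from the covariates used as split variables in that tree (as in MOTR-BART). This is not the authors' implementation (see the repository above): the simulated data, the single split, and the use of ordinary least squares are assumptions for illustration only; in MOTR-BART the node-level coefficients receive priors and are sampled within the BART MCMC scheme.

# Illustrative sketch only, not the MOTR-BART implementation.
set.seed(1)
n  <- 200
x1 <- runif(n)
x2 <- runif(n)
y  <- 2 + 3 * x1 + rnorm(n, sd = 0.1)

# Suppose a tree has split on x1 at 0.5; take one terminal node:
node <- x1 > 0.5

# BART-style node prediction: a single constant per terminal node
mu_const <- mean(y[node])

# MOTR-BART-style node prediction: a linear predictor in the covariates
# used as split variables in this tree (here, only x1)
fit_lin <- lm(y[node] ~ x1[node])

# Compare the two kinds of fitted values for observations in this node
head(cbind(constant = mu_const, linear = fitted(fit_lin)))

The constant column repeats a single node-level value, while the linear column varies with x1 inside the node, which is the sense in which the abstract says local linearities are captured with fewer trees.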