Paper Title

Better Model Selection with a new Definition of Feature Importance

Authors

Fan Fang, Carmine Ventre, Lingbo Li, Leslie Kanthan, Fan Wu, Michail Basios

Abstract

Feature importance aims at measuring how crucial each input feature is for model prediction. It is widely used in feature engineering, model selection and explainable artificial intelligence (XAI). In this paper, we propose a new tree-model explanation approach for model selection. Our novel concept leverages the Coefficient of Variation of a feature weight (measured in terms of the contribution of the feature to the prediction) to capture the dispersion of importance over samples. Extensive experimental results show that our novel feature explanation performs better than the general cross-validation method in model selection, in terms of both time efficiency and accuracy.
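The core quantity described in the abstract, the Coefficient of Variation (CV = standard deviation / mean) of a feature's per-sample contributions, can be sketched as below. This is a minimal illustration, not the paper's implementation: it assumes you already have a matrix of per-sample feature attributions (e.g., from a tree-model explainer such as SHAP), and the helper name and toy data are hypothetical.

```python
import numpy as np

def coefficient_of_variation(contributions):
    """Per-feature CV (std / mean) of per-sample contribution magnitudes.

    contributions: array of shape (n_samples, n_features), where entry
    [i, j] is feature j's attributed contribution to the prediction on
    sample i (hypothetical input; the paper's exact weight definition
    may differ).
    """
    contributions = np.abs(np.asarray(contributions, dtype=float))
    mean = contributions.mean(axis=0)
    std = contributions.std(axis=0)
    # Guard against features with zero mean contribution.
    return np.divide(std, mean, out=np.zeros_like(std), where=mean > 0)

# Toy data: 4 samples, 2 features. Feature 0 contributes uniformly
# (low dispersion); feature 1's contribution varies across samples.
contrib = [[0.5, 0.1],
           [0.5, 0.9],
           [0.5, 0.0],
           [0.5, 0.4]]
cv = coefficient_of_variation(contrib)  # cv[0] = 0.0, cv[1] = 1.0
```

A low CV indicates a feature whose importance is stable across samples; under the abstract's framing, such dispersion information is what distinguishes this measure from a single aggregate importance score.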
