Title
AdaCC: Cumulative Cost-Sensitive Boosting for Imbalanced Classification
Authors
Abstract
Class imbalance poses a major challenge for machine learning, as most supervised learning models may exhibit bias towards the majority class and under-perform on the minority class. Cost-sensitive learning tackles this problem by treating the classes differently, typically via a user-defined fixed misclassification cost matrix provided as input to the learner. Such parameter tuning is a challenging task that requires domain knowledge; moreover, wrong adjustments might deteriorate overall predictive performance. In this work, we propose a novel cost-sensitive boosting approach for imbalanced data that dynamically adjusts the misclassification costs over the boosting rounds in response to the model's performance, instead of using a fixed misclassification cost matrix. Our method, called AdaCC, is parameter-free, as it relies on the cumulative behavior of the boosting model to adjust the misclassification costs for the next boosting round, and it comes with theoretical guarantees regarding the training error. Experiments on 27 real-world datasets from different domains with high class imbalance demonstrate the superiority of our method over 12 state-of-the-art cost-sensitive boosting approaches, exhibiting consistent improvements in different measures, for instance, in the range of [0.3%-28.56%] for AUC, [3.4%-21.4%] for balanced accuracy, [4.8%-45%] for G-mean and [7.4%-85.5%] for recall.
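The core idea described in the abstract is to replace a fixed misclassification cost matrix with per-round costs derived from the boosting ensemble's cumulative behavior. The sketch below illustrates that idea in a minimal AdaBoost-style loop, assuming binary labels with 1 as the minority class; the cost rule (`1 + cumulative false-negative rate`) and function name `adacc_sketch` are hypothetical simplifications, not the authors' exact AdaCC update.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adacc_sketch(X, y, n_rounds=20):
    """Sketch of cumulative cost-sensitive boosting (NOT the exact AdaCC
    algorithm): the misclassification cost for minority instances is
    re-derived each round from the cumulative ensemble's error, instead
    of coming from a fixed, user-supplied cost matrix."""
    n = len(X)
    w = np.full(n, 1.0 / n)          # instance weights
    learners, alphas = [], []
    minority = (y == 1)
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])
        if err == 0 or err >= 0.5:   # standard AdaBoost stopping rule
            break
        alpha = 0.5 * np.log((1 - err) / err)
        learners.append(stump)
        alphas.append(alpha)
        # Cumulative ensemble margin over all rounds so far.
        F = sum(a * np.where(l.predict(X) == 1, 1.0, -1.0)
                for a, l in zip(alphas, learners))
        ens_pred = (F > 0).astype(int)
        # The cumulative false-negative rate drives next round's cost
        # (hypothetical rule standing in for AdaCC's update).
        fnr = np.mean(ens_pred[minority] != 1) if minority.any() else 0.0
        cost = 1.0 + fnr
        # Cost-sensitive weight update: misclassified minority instances
        # receive an extra up-weighting proportional to the current cost.
        boost = np.where(pred != y, np.exp(alpha), np.exp(-alpha))
        boost[(pred != y) & minority] *= cost
        w *= boost
        w /= w.sum()
    return learners, alphas
```

Because the cost is recomputed from the ensemble's own cumulative error, no misclassification cost matrix has to be specified up front, which is the parameter-free property the abstract emphasizes.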