Title
Weighting and Pruning based Ensemble Deep Random Vector Functional Link Network for Tabular Data Classification
Authors
Abstract
In this paper, we first introduce batch normalization to the edRVFL network. This re-normalization method helps the network avoid divergence of the hidden features. We then propose novel variants of the Ensemble Deep Random Vector Functional Link (edRVFL) network. Weighted edRVFL (WedRVFL) uses a weighting method that assigns training samples different weights in different layers according to how confidently they were classified in the previous layer, thereby increasing the ensemble's diversity and accuracy. Furthermore, a pruning-based edRVFL (PedRVFL) is also proposed. Before generating the next hidden layer, we prune inferior neurons based on their importance for classification. Through this method, we ensure that randomly generated inferior features do not propagate to deeper layers. Subsequently, the combination of weighting and pruning, called the Weighting and Pruning based Ensemble Deep Random Vector Functional Link Network (WPedRVFL), is proposed. We compare their performance with other state-of-the-art deep feedforward neural networks (FNNs) on 24 tabular UCI classification datasets. The experimental results demonstrate the superior performance of our proposed methods.
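To make the abstract's ingredients concrete, the following is a minimal NumPy sketch of the WPedRVFL idea, not the authors' implementation. It uses random hidden weights with a direct link to the raw input, per-layer batch normalization, a ridge-regression classifier per layer, and score averaging across layers as the ensemble. Two simplifications are assumed: pruning ranks hidden neurons by the norm of their output weights, and weighting simply upweights samples the current layer misclassifies (the paper uses a confidence-based rule). All function names and hyperparameters here are illustrative.

```python
import numpy as np

def wpedrvfl_fit(X, y, n_layers=3, n_hidden=64, prune_frac=0.25,
                 lam=1e-2, boost=2.0, seed=0):
    """Simplified sketch of Weighting- and Pruning-based edRVFL."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    Y = (y[:, None] == classes).astype(float)          # one-hot targets
    n = X.shape[0]
    sample_w = np.ones(n)                              # per-sample weights
    model = {"classes": classes, "layers": []}
    H_prev = np.empty((n, 0))                          # no hidden features yet
    for _ in range(n_layers):
        Z = np.hstack([H_prev, X])                     # direct link to raw input
        W = rng.uniform(-1, 1, (Z.shape[1], n_hidden))
        b = rng.uniform(-1, 1, n_hidden)
        H = np.maximum(0, Z @ W + b)                   # ReLU hidden features
        mu, sd = H.mean(0), H.std(0) + 1e-8
        H = (H - mu) / sd                              # batch normalization
        D = np.hstack([H, X])                          # layer classifier sees H and X
        # weighted ridge regression for the output weights beta
        Dw = D * sample_w[:, None]
        beta = np.linalg.solve(D.T @ Dw + lam * np.eye(D.shape[1]),
                               D.T @ (sample_w[:, None] * Y))
        # pruning: drop hidden neurons with the smallest output-weight norm
        imp = np.linalg.norm(beta[:n_hidden], axis=1)
        keep = np.argsort(imp)[int(prune_frac * n_hidden):]
        # weighting: upweight samples this layer misclassifies
        wrong = classes[np.argmax(D @ beta, axis=1)] != y
        sample_w = np.where(wrong, boost, 1.0)
        model["layers"].append((W, b, mu, sd, keep, beta))
        H_prev = H[:, keep]                            # only kept neurons feed deeper layers
    return model

def wpedrvfl_predict(model, X):
    H_prev, scores = np.empty((len(X), 0)), 0.0
    for W, b, mu, sd, keep, beta in model["layers"]:
        Z = np.hstack([H_prev, X])
        H = (np.maximum(0, Z @ W + b) - mu) / sd
        scores = scores + np.hstack([H, X]) @ beta     # ensemble by score summation
        H_prev = H[:, keep]
    return model["classes"][np.argmax(scores, axis=1)]
```

Note the closed-form per-layer classifier: because the hidden weights are random and fixed, only the ridge solve is trained, which is what makes (ed)RVFL-style networks fast compared with backpropagation-trained FNNs.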