Paper Title
Deep N-ary Error Correcting Output Codes
Paper Authors
Paper Abstract
Ensemble learning consistently improves the performance of multi-class classification by aggregating a series of base classifiers. To this end, data-independent ensemble methods like Error Correcting Output Codes (ECOC) have attracted increasing attention due to their ease of implementation and parallelization. Specifically, traditional ECOC and its general extension, N-ary ECOC, decompose the original multi-class classification problem into a series of independent, simpler classification subproblems. Unfortunately, integrating ECOC, especially N-ary ECOC, with deep neural networks, termed deep N-ary ECOC, is not straightforward and has not yet been fully exploited in the literature, due to the high expense of training the base learners. To facilitate the training of N-ary ECOC with deep learning base learners, we further propose three different variants of parameter-sharing architectures for deep N-ary ECOC. To verify the generalization ability of deep N-ary ECOC, we conduct experiments by varying the backbone across different deep neural network architectures for both image and text classification tasks. Furthermore, extensive ablation studies on deep N-ary ECOC show its superior performance over other deep data-independent ensemble methods.
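To make the decomposition concrete, the following is a minimal illustrative sketch (not the authors' implementation; the function names `make_nary_code_matrix` and `decode` are hypothetical) of the N-ary ECOC idea the abstract describes: a random coding matrix relabels the original classes into a set of independent N-class subproblems, and a test example is decoded to the class whose codeword is closest to the base learners' predictions.

```python
import numpy as np

def make_nary_code_matrix(num_classes, num_columns, n, seed=0):
    """Randomly generate an N-ary coding matrix (illustrative sketch).

    Each row is the codeword of one original class; each column defines
    an independent n-class subproblem by relabeling the classes.
    """
    rng = np.random.default_rng(seed)
    while True:
        M = rng.integers(0, n, size=(num_classes, num_columns))
        # Require distinct rows (unique codewords per class) and at
        # least two distinct labels in every column (non-trivial subproblem).
        if (len({tuple(row) for row in M}) == num_classes
                and all(len(set(col)) > 1 for col in M.T)):
            return M

def decode(M, predictions):
    """Assign the class whose codeword has minimum Hamming distance
    to the vector of base-learner predictions."""
    predictions = np.asarray(predictions)
    dists = (M != predictions).sum(axis=1)
    return int(np.argmin(dists))

# Usage: 5 classes, 8 base learners, ternary (N=3) subproblems.
M = make_nary_code_matrix(num_classes=5, num_columns=8, n=3)
# Perfect base predictions for class 2 decode back to class 2.
print(decode(M, M[2]))
```

Each column of `M` would be solved by one deep base learner; the parameter-sharing variants proposed in the paper address the cost of training these learners jointly.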