Paper Title

Amended Cross Entropy Cost: Framework For Explicit Diversity Encouragement

Paper Authors

Ron Shoham, Haim Permuter

Abstract

Cross Entropy (CE) has an important role in machine learning and, in particular, in neural networks. It is commonly used in neural networks as the cost between the known distribution of the label and the Softmax/Sigmoid output. In this paper we present a new cost function called the Amended Cross Entropy (ACE). Its novelty lies in affording the capability to train multiple classifiers while explicitly controlling the diversity between them. We derived the new cost by mathematical analysis and "reverse engineering" of the way we wish the gradients to behave, and produced a tailor-made, elegant, and intuitive cost function that achieves the desired result. This process is similar to the way the CE cost is chosen as the cost function for Softmax/Sigmoid classifiers in order to obtain linear derivatives. By choosing the optimal diversity factor we produce an ensemble which yields better results than the vanilla one. We demonstrate two potential usages of this outcome, and present empirical results. Our method works for classification problems analogously to Negative Correlation Learning (NCL) for regression problems.
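The abstract describes a per-classifier cost that combines the usual cross entropy against the label with an explicitly weighted diversity term against the other ensemble members. A minimal sketch of such a cost is below; the exact functional form (a diversity factor `gamma` scaling the average cross entropy between classifier outputs, normalized by `K - 1`) is an assumption modeled on the NCL analogy, not a verbatim reproduction of the paper's equation.

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_k p_k * log(q_k), with eps for numerical stability."""
    return -np.sum(p * np.log(q + eps))

def amended_cross_entropy(y, outputs, i, gamma):
    """ACE-style cost for classifier i (sketch, assumed form).

    y       : true label distribution (e.g. one-hot vector)
    outputs : list of K Softmax output vectors, one per classifier
    i       : index of the classifier being trained
    gamma   : diversity factor; gamma = 0 recovers the vanilla CE cost
    """
    K = len(outputs)
    ce_label = cross_entropy(y, outputs[i])
    # Average cross entropy of the other members' outputs against classifier i;
    # subtracting it pushes classifier i away from its peers (more diversity).
    diversity = sum(
        cross_entropy(outputs[j], outputs[i]) for j in range(K) if j != i
    ) / (K - 1)
    return ce_label - gamma * diversity
```

With `gamma = 0` the cost reduces to the standard cross entropy, and larger `gamma` trades individual accuracy for ensemble diversity, mirroring how NCL's penalty coefficient works in the regression setting.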
