Paper Title

One Positive Label is Sufficient: Single-Positive Multi-Label Learning with Label Enhancement

Paper Authors

Ning Xu, Congyu Qiao, Jiaqi Lv, Xin Geng, Min-Ling Zhang

Paper Abstract

Multi-label learning (MLL) learns from examples each associated with multiple labels simultaneously, where the high cost of annotating all relevant labels for each training example is challenging for real-world applications. To cope with the challenge, we investigate single-positive multi-label learning (SPMLL), where each example is annotated with only one relevant label, and show that one can successfully learn a theoretically grounded multi-label classifier for the problem. In this paper, a novel SPMLL method named SMILE, i.e., Single-positive MultI-label learning with Label Enhancement, is proposed. Specifically, an unbiased risk estimator is derived, which is guaranteed to approximately converge to the optimal risk minimizer of fully supervised learning and shows that one positive label of each instance is sufficient to train the predictive model. Then, the corresponding empirical risk estimator is established via recovering the latent soft labels as a label enhancement process, where the posterior density of the latent soft labels is approximated by a variational Beta density parameterized by an inference model. Experiments on benchmark datasets validate the effectiveness of the proposed method.
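
To make the label-enhancement idea in the abstract more concrete, below is a minimal, hypothetical PyTorch sketch of single-positive multi-label training with a variational Beta posterior over latent soft labels. The network names (Predictor, InferenceNet, smile_style_step), the surrogate losses, and the KL weight are illustrative assumptions only; they are not the paper's exact unbiased risk estimator or implementation.

```python
# Minimal, hypothetical sketch of single-positive multi-label training with
# variational label enhancement. All names (Predictor, InferenceNet,
# smile_style_step), the surrogate losses, and the KL weight are illustrative
# assumptions, not the paper's exact risk estimator or implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_FEATURES, NUM_CLASSES = 32, 10

class Predictor(nn.Module):
    """Multi-label classifier f(x): one logit per class."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(NUM_FEATURES, 64), nn.ReLU(),
                                 nn.Linear(64, NUM_CLASSES))

    def forward(self, x):
        return self.net(x)

class InferenceNet(nn.Module):
    """Inference model q(d | x): Beta(alpha, beta) parameters for the latent
    soft label of every class (the label-enhancement step)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(NUM_FEATURES, 64), nn.ReLU())
        self.alpha_head = nn.Linear(64, NUM_CLASSES)
        self.beta_head = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):
        h = self.body(x)
        # softplus keeps both Beta parameters strictly positive
        return (F.softplus(self.alpha_head(h)) + 1e-3,
                F.softplus(self.beta_head(h)) + 1e-3)

def smile_style_step(x, pos_idx, predictor, inference, prior, kl_weight=0.1):
    """One step on a batch where each example carries a single observed
    positive label (pos_idx). A generic variational label-enhancement
    surrogate, not the paper's exact objective."""
    alpha, beta = inference(x)
    q = torch.distributions.Beta(alpha, beta)
    soft = q.rsample()                           # recovered soft labels in (0, 1)
    probs = torch.sigmoid(predictor(x))

    # Fit the classifier to the recovered soft labels (binary cross-entropy).
    recon = -(soft * torch.log(probs + 1e-8)
              + (1.0 - soft) * torch.log(1.0 - probs + 1e-8)).mean()
    # Keep the single observed positive label confidently positive.
    pos_prob = probs.gather(1, pos_idx.view(-1, 1))
    positive = -torch.log(pos_prob + 1e-8).mean()
    # Regularize the variational posterior toward a Beta prior.
    kl = torch.distributions.kl_divergence(q, prior).mean()
    return recon + positive + kl_weight * kl

# Toy usage with random data: one positive class index per example.
torch.manual_seed(0)
x = torch.randn(16, NUM_FEATURES)
pos_idx = torch.randint(0, NUM_CLASSES, (16,))
predictor, inference = Predictor(), InferenceNet()
prior = torch.distributions.Beta(torch.ones(NUM_CLASSES), torch.ones(NUM_CLASSES))
optimizer = torch.optim.Adam(
    list(predictor.parameters()) + list(inference.parameters()), lr=1e-3)

loss = smile_style_step(x, pos_idx, predictor, inference, prior)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.4f}")
```

The Beta posterior here only mirrors the abstract's description of the variational density; the exact unbiased risk estimator and its convergence guarantee are given in the paper itself.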
