Paper Title

Conditional entropy minimization principle for learning domain invariant representation features

Authors

Thuan Nguyen, Boyang Lyu, Prakash Ishwar, Matthias Scheutz, Shuchin Aeron

Abstract

Invariance-principle-based methods, such as Invariant Risk Minimization (IRM), have recently emerged as promising approaches for Domain Generalization (DG). Despite their promising theory, such approaches fail in common classification tasks due to the mixing of true invariant features and spurious invariant features. To address this, we propose a framework based on the conditional entropy minimization (CEM) principle to filter out the spurious invariant features, leading to a new algorithm with better generalization capability. We show that our proposed approach is closely related to the well-known Information Bottleneck (IB) framework and prove that, under certain assumptions, entropy minimization can exactly recover the true invariant features. Our approach provides competitive classification accuracy compared to recent theoretically principled state-of-the-art alternatives across several DG datasets.
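To make the CEM principle concrete, below is a minimal PyTorch sketch of adding a conditional-entropy penalty H(Z | Y) on the learned representation Z to a standard classification loss. This is an illustration of the general idea, not the authors' released implementation: the Gaussian log-det proxy for conditional entropy, the `cem_weight` trade-off hyperparameter, and the toy encoder/classifier are all assumptions made here for the example.

```python
# Sketch: penalize a Gaussian proxy for H(Z | Y) so that the encoder keeps
# features that are compact within each class, filtering out variability
# that the label does not explain. All architecture choices and the
# `cem_weight` value are illustrative assumptions.
import torch
import torch.nn as nn

def gaussian_conditional_entropy(z, y, num_classes, eps=1e-4):
    """Proxy for H(Z | Y): average per-class Gaussian entropy of features.

    For each class c, H(Z | Y=c) is approximated (up to an additive
    constant) by 0.5 * log det Cov(Z | Y=c); the shrinkage term eps * I
    keeps the covariance well conditioned on small batches.
    """
    d = z.shape[1]
    total, counted = z.new_zeros(()), 0
    for c in range(num_classes):
        zc = z[y == c]
        if zc.shape[0] < 2:  # need at least 2 samples for a covariance
            continue
        zc = zc - zc.mean(dim=0, keepdim=True)
        cov = zc.t() @ zc / (zc.shape[0] - 1) + eps * torch.eye(d, device=z.device)
        total = total + 0.5 * torch.logdet(cov)
        counted += 1
    return total / max(counted, 1)

# Toy encoder/classifier; all dimensions are arbitrary placeholders.
encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
classifier = nn.Linear(16, 2)
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3
)
cem_weight = 0.1  # assumed trade-off between accuracy and compactness

# One training step on a stand-in batch.
x, y = torch.randn(128, 32), torch.randint(0, 2, (128,))
z = encoder(x)
loss = nn.functional.cross_entropy(classifier(z), y) \
       + cem_weight * gaussian_conditional_entropy(z, y, num_classes=2)
opt.zero_grad()
loss.backward()
opt.step()
```

In a DG setting this penalty would be applied alongside an invariance objective such as IRM over multiple training domains; the sketch shows only the single-domain regularized step for clarity.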
