Paper Title


Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation

Paper Authors

Dongliang Chang, Aneeshan Sain, Zhanyu Ma, Yi-Zhe Song, Jun Guo

Abstract


Unsupervised domain adaptation aims to leverage labeled data from a source domain to learn a classifier for an unlabeled target domain. Among its many variants, open set domain adaptation (OSDA) is perhaps the most challenging, as it further assumes the presence of unknown classes in the target domain. In this paper, we study OSDA with a particular focus on enriching its ability to traverse larger domain gaps. Firstly, we show that existing state-of-the-art methods suffer a considerable performance drop in the presence of larger domain gaps, especially on a new dataset (PACS) that we re-purposed for OSDA. We then propose a novel framework to specifically address larger domain gaps. The key insight lies in how we exploit the mutually beneficial information between two networks: (a) separating samples of known and unknown classes, and (b) maximizing the domain confusion between the source and target domains without the influence of unknown samples. It follows that (a) and (b) mutually supervise each other and alternate until convergence. Extensive experiments on the Office-31, Office-Home, and PACS datasets demonstrate the superiority of our method over other state-of-the-art approaches. Code is available at https://github.com/dongliangchang/Mutual-to-Separate/
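The core of step (b) — aligning source and target "without the influence of unknown samples" — can be illustrated with a minimal sketch. This is not the paper's actual implementation (see the repository above for that); it is a hedged, assumption-laden illustration in which each target sample's contribution to a domain-discriminator loss is re-weighted by a hypothetical known-class score produced by the separation network of step (a), so samples judged unknown barely affect the alignment:

```python
import math

def sigmoid(x):
    """Logistic function, mapping a discriminator logit to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def weighted_domain_confusion_loss(domain_logits, is_source, known_weight):
    """Binary cross-entropy for a domain discriminator (source label = 1,
    target label = 0), where each target sample is down-weighted by its
    estimated probability of belonging to a known class.

    domain_logits : list of discriminator outputs, one per sample
    is_source     : list of bools; True for source samples
    known_weight  : list in [0, 1]; known-class score for target samples
                    (ignored for source samples, which always count fully)
    """
    total, weight_sum = 0.0, 0.0
    for logit, src, w_known in zip(domain_logits, is_source, known_weight):
        p = sigmoid(logit)
        # standard BCE term for this sample's domain label
        bce = -math.log(p + 1e-8) if src else -math.log(1.0 - p + 1e-8)
        w = 1.0 if src else w_known
        total += w * bce
        weight_sum += w
    return total / weight_sum

# A target sample the separation network flags as unknown (weight 0.0)
# contributes almost nothing, so it cannot drag alignment toward it.
logits = [2.0, -2.0, 2.0]          # third sample: target, misclassified
src    = [True, False, False]
loss_all    = weighted_domain_confusion_loss(logits, src, [1.0, 1.0, 1.0])
loss_masked = weighted_domain_confusion_loss(logits, src, [1.0, 1.0, 0.0])
```

In a full adversarial setup this loss would be minimized by the discriminator and maximized (e.g. via a gradient reversal layer) by the feature extractor; the alternation between (a) and (b) comes from refreshing `known_weight` with the separation network's current predictions each round.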
