Paper Title

Ego2Hands: A Dataset for Egocentric Two-hand Segmentation and Detection

Authors

Fanqing Lin, Brian Price, Tony Martinez

Abstract

Hand segmentation and detection in truly unconstrained RGB-based settings is important for many applications. However, existing datasets are far from sufficient in terms of size and variety due to the infeasibility of manual annotation of large amounts of segmentation and detection data. As a result, current methods are limited by many underlying assumptions such as constrained environment, consistent skin color and lighting. In this work, we present Ego2Hands, a large-scale RGB-based egocentric hand segmentation/detection dataset that is semi-automatically annotated and a color-invariant compositing-based data generation technique capable of creating training data with large quantity and variety. For quantitative analysis, we manually annotated an evaluation set that significantly exceeds existing benchmarks in quantity, diversity and annotation accuracy. We provide cross-dataset evaluation as well as thorough analysis on the performance of state-of-the-art models on Ego2Hands to show that our dataset and data generation technique can produce models that generalize to unseen environments without domain adaptation.
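The compositing-based data generation mentioned in the abstract can be illustrated with a minimal sketch: segmented hand layers are alpha-blended onto arbitrary backgrounds, which yields both a training image and pixel-accurate labels for free. This is a hypothetical illustration of the general idea, not the authors' actual pipeline; the function name and the 3-class label convention (0 = background, 1 = left hand, 2 = right hand) are assumptions for the example.

```python
import numpy as np

def composite_hands(background, left_hand, left_mask, right_hand, right_mask):
    """Composite two segmented hand layers onto a background image.

    background, left_hand, right_hand: (H, W, 3) float arrays in [0, 1]
    left_mask, right_mask: (H, W) float alpha masks in [0, 1]
    Returns the composited image and a per-pixel segmentation map
    (0 = background, 1 = left hand, 2 = right hand).
    """
    out = background.copy()
    # Blend the left hand first, then the right hand on top, so the
    # right hand occludes the left wherever the two masks overlap.
    out = left_mask[..., None] * left_hand + (1 - left_mask[..., None]) * out
    out = right_mask[..., None] * right_hand + (1 - right_mask[..., None]) * out

    # The segmentation labels come directly from the alpha masks,
    # so no manual annotation of the composited image is needed.
    seg = np.zeros(background.shape[:2], dtype=np.uint8)
    seg[left_mask > 0.5] = 1
    seg[right_mask > 0.5] = 2
    return out, seg

# Tiny synthetic example: a 4x4 black background with one-pixel "hands".
bg = np.zeros((4, 4, 3))
lh = np.full((4, 4, 3), 0.5)
rh = np.full((4, 4, 3), 0.9)
lm = np.zeros((4, 4)); lm[1, 1] = 1.0
rm = np.zeros((4, 4)); rm[2, 2] = 1.0
img, seg = composite_hands(bg, lh, lm, rh, rm)
```

Because the backgrounds and hand layers can be recombined freely, this scheme can generate training data "with large quantity and variety" from a comparatively small pool of segmented hand images.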
