Paper Title

Transferring Dense Pose to Proximal Animal Classes

Authors

Artsiom Sanakoyeu, Vasil Khalidov, Maureen S. McCarthy, Andrea Vedaldi, Natalia Neverova

Abstract

Recent contributions have demonstrated that it is possible to recognize the pose of humans densely and accurately given a large dataset of poses annotated in detail. In principle, the same approach could be extended to any animal class, but the effort required for collecting new annotations for each case makes this strategy impractical, despite important applications in natural conservation, science and business. We show that, at least for proximal animal classes such as chimpanzees, it is possible to transfer the knowledge existing in dense pose recognition for humans, as well as in more general object detectors and segmenters, to the problem of dense pose recognition in other classes. We do this by (1) establishing a DensePose model for the new animal which is also geometrically aligned to humans, (2) introducing a multi-head R-CNN architecture that facilitates transfer of multiple recognition tasks between classes, (3) finding which combination of known classes can be transferred most effectively to the new animal and (4) using self-calibrated uncertainty heads to generate pseudo-labels graded by quality for training a model for this class. We also introduce two benchmark datasets labelled in the manner of DensePose for the class chimpanzee and use them to evaluate our approach, showing excellent transfer learning performance.
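The quality-graded pseudo-labelling in step (4) can be illustrated with a minimal selection routine: predictions from the teacher model come with per-sample uncertainty estimates from an uncertainty head, and only the most confident fraction is kept for self-training. This is a hedged sketch of the general technique, not the paper's implementation; the function name, inputs, and `keep_fraction` threshold are illustrative assumptions.

```python
import numpy as np

def grade_pseudo_labels(predictions, sigmas, keep_fraction=0.5):
    """Keep the most confident fraction of teacher predictions as pseudo-labels.

    predictions : array of per-sample teacher outputs (placeholder for any
                  prediction type; here scalars for simplicity)
    sigmas      : per-sample uncertainty estimates from an uncertainty head
                  (lower sigma = more confident)
    keep_fraction : fraction of samples to retain (illustrative threshold)
    """
    # Rank samples from most to least confident.
    order = np.argsort(sigmas)
    n_keep = max(1, int(len(order) * keep_fraction))
    selected = order[:n_keep]
    # Return the retained (prediction, uncertainty) pairs for self-training.
    return [(predictions[i], sigmas[i]) for i in selected]
```

In a self-training loop, the student model would then be trained only on the retained pairs, optionally weighting each sample by its confidence.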
