Paper Title


GraphMLP: A Graph MLP-Like Architecture for 3D Human Pose Estimation

Paper Authors

Wenhao Li, Mengyuan Liu, Hong Liu, Tianyu Guo, Ti Wang, Hao Tang, Nicu Sebe

Abstract


Modern multi-layer perceptron (MLP) models have shown competitive results in learning visual representations without self-attention. However, existing MLP models are not good at capturing local details and lack prior knowledge of human body configurations, which limits their modeling power for skeletal representation learning. To address these issues, we propose a simple yet effective graph-reinforced MLP-like architecture, named GraphMLP, that combines MLPs and graph convolutional networks (GCNs) in a global-local-graphical unified architecture for 3D human pose estimation. GraphMLP incorporates the graph structure of human bodies into an MLP model to meet the domain-specific demands of 3D human pose, while allowing for both local and global spatial interactions. Furthermore, we propose to flexibly and efficiently extend GraphMLP to the video domain, and show that complex temporal dynamics can be effectively modeled in a simple way, with negligible growth in computational cost as sequence length increases. To the best of our knowledge, this is the first MLP-like architecture for 3D human pose estimation in both a single frame and a video sequence. Extensive experiments show that the proposed GraphMLP achieves state-of-the-art performance on two datasets, i.e., Human3.6M and MPI-INF-3DHP. Code and models are available at https://github.com/Vegetebird/GraphMLP.
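The abstract describes fusing a GCN branch (local, skeleton-aware) with MLP token mixing (global, across all joints). The following minimal NumPy sketch illustrates that fusion idea only; all function names, weight shapes, and the residual combination are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def graph_mlp_block(X, A_norm, W_gcn, W_mix, W_ffn1, W_ffn2):
    # Local branch: one-hop graph convolution injects the skeleton's
    # connectivity prior -> ReLU(A_hat X W)
    local = np.maximum(A_norm @ X @ W_gcn, 0)
    # Global branch: MLP mixing along the joint axis lets every joint
    # attend to every other joint without self-attention
    global_ = np.maximum(W_mix @ X, 0)
    h = X + local + global_            # residual fusion of both branches
    # Per-joint channel MLP (feed-forward), also with a residual
    return h + np.maximum(h @ W_ffn1, 0) @ W_ffn2

rng = np.random.default_rng(0)
J, C = 17, 32                          # 17 body joints, 32-dim features
A = np.zeros((J, J))
for i, j in [(0, 1), (1, 2), (2, 3)]:  # a few illustrative bone edges
    A[i, j] = A[j, i] = 1.0
X = rng.standard_normal((J, C))
out = graph_mlp_block(
    X, normalized_adjacency(A),
    rng.standard_normal((C, C)) * 0.1,      # GCN channel weights
    rng.standard_normal((J, J)) * 0.1,      # joint-mixing weights
    rng.standard_normal((C, 2 * C)) * 0.1,  # FFN expansion
    rng.standard_normal((2 * C, C)) * 0.1,  # FFN projection
)
print(out.shape)  # (17, 32): one refined feature vector per joint
```

Because both branches produce per-joint features of the same shape, such blocks can be stacked, which is what allows the combination of local graph structure and global joint interactions within one unified architecture.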
