Paper Title
DSNet: Dynamic Skin Deformation Prediction by Recurrent Neural Network
Paper Authors
Paper Abstract
Skin dynamics contributes to the enriched realism of human body models in rendered scenes. Traditional methods rely on physics-based simulation to accurately reproduce the dynamic behavior of soft tissue. However, due to model complexity and the resulting heavy computation, they do not offer practical solutions for domains where real-time performance is required. Nor are the quality shapes obtained by physics-based simulation fully exploited by example-based or more recent data-driven methods, most of which focus on modeling static skin shapes by leveraging quality data. To address these limitations, we present a learning-based method for dynamic skin deformation. At the core of our work is a recurrent neural network that learns to predict nonlinear, dynamics-dependent shape changes over time from pre-existing mesh deformation sequence data. Our network also learns to predict how skin dynamics vary across individuals with different body shapes. After training, the network delivers realistic, high-quality, person-specific skin dynamics at real-time rates. Our results significantly reduce computation time while maintaining prediction quality comparable to the state of the art.
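To make the core idea concrete, the sketch below shows one way a recurrent network can map a stream of per-frame pose features to per-vertex dynamic displacements layered on top of a static skinned mesh. This is a minimal, untrained GRU cell in NumPy with illustrative dimensions; the class name `GRUDeformer`, the feature sizes, and the readout layer are assumptions for exposition, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUDeformer:
    """Hypothetical sketch: a single GRU cell whose hidden state is decoded
    into one 3D displacement per mesh vertex at every animation frame."""

    def __init__(self, in_dim, hid_dim, n_verts):
        d = in_dim + hid_dim
        self.Wz = rng.standard_normal((d, hid_dim)) * 0.1   # update-gate weights
        self.Wr = rng.standard_normal((d, hid_dim)) * 0.1   # reset-gate weights
        self.Wh = rng.standard_normal((d, hid_dim)) * 0.1   # candidate-state weights
        self.Wo = rng.standard_normal((hid_dim, n_verts * 3)) * 0.1  # readout
        self.n_verts = n_verts

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(xh @ self.Wz)                 # update gate
        r = sigmoid(xh @ self.Wr)                 # reset gate
        h_tilde = np.tanh(np.concatenate([x, r * h]) @ self.Wh)  # candidate
        h_new = (1.0 - z) * h + z * h_tilde       # blended new hidden state
        offsets = (h_new @ self.Wo).reshape(self.n_verts, 3)
        return offsets, h_new

# Drive the cell over a short motion sequence. Because the hidden state
# carries history, the predicted offsets depend on past frames as well as
# the current pose -- this is what makes the deformation "dynamic" rather
# than a static function of the current pose alone.
model = GRUDeformer(in_dim=8, hid_dim=16, n_verts=100)
h = np.zeros(16)
for t in range(5):
    pose_features = rng.standard_normal(8)  # stand-in for joint angles etc.
    offsets, h = model.step(pose_features, h)
print(offsets.shape)  # one 3D displacement per vertex
```

In a trained system, the readout would be regressed against physics-simulated displacement sequences, and a body-shape descriptor could be appended to `pose_features` so one network covers individuals of varying shape.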