Paper Title
BareSkinNet: De-makeup and De-lighting via 3D Face Reconstruction
Paper Authors
Paper Abstract
We propose BareSkinNet, a novel method that simultaneously removes makeup and lighting influences from a face image. Our method leverages a 3D morphable model and does not require a reference clean face image or a specified light condition. By incorporating 3D face reconstruction into the process, we can easily obtain 3D geometry and coarse 3D textures. Using this information, we can infer normalized 3D face texture maps (diffuse, normal, roughness, and specular) with an image-translation network. Consequently, the reconstructed 3D face textures, free of this undesirable information, significantly benefit subsequent processes such as re-lighting or re-makeup. In experiments, we show that BareSkinNet outperforms state-of-the-art makeup removal methods. In addition, our method is remarkably helpful for removing makeup to generate consistent high-fidelity texture maps, which makes it extendable to many realistic face generation applications. It can also automatically build graphics assets of before-and-after face makeup images with the corresponding 3D data. This will assist artists in accelerating their work, such as 3D makeup avatar creation.
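As a rough illustration of the texture-normalization step described in the abstract, the sketch below assumes a PyTorch-style image-translation network that maps a coarse UV texture (such as one produced by 3DMM fitting) to the four normalized texture maps. The module, layer choices, and tensor shapes are hypothetical placeholders, not the authors' actual architecture.

```python
# Minimal sketch (hypothetical names; not the authors' implementation):
# a 3DMM-based reconstruction step yields geometry and a coarse UV texture,
# which an image-translation network maps to normalized texture maps.
import torch
import torch.nn as nn

class TextureTranslator(nn.Module):
    """Toy image-translation network: coarse UV texture -> 10-channel output
    (3 diffuse + 3 normal + 1 roughness + 3 specular)."""
    def __init__(self, in_ch=3, out_ch=10, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, out_ch, 3, padding=1),
        )

    def forward(self, coarse_uv):
        out = self.net(coarse_uv)
        # Split the output channels into the four texture maps named in the abstract.
        diffuse, normal, roughness, specular = torch.split(out, [3, 3, 1, 3], dim=1)
        return diffuse, normal, roughness, specular

# Usage with a dummy coarse texture standing in for the 3D reconstruction output.
coarse_uv = torch.rand(1, 3, 256, 256)   # coarse UV texture from 3DMM fitting (placeholder)
translator = TextureTranslator()
diffuse, normal, roughness, specular = translator(coarse_uv)
print(diffuse.shape, normal.shape, roughness.shape, specular.shape)
```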