Paper title
Photo style transfer with consistency losses
Paper authors
Paper abstract
We address the problem of style transfer between two photos and propose a new way to preserve photorealism. Using the single pair of photos available as input, we train a pair of deep convolutional networks (convnets), each of which transfers the style of one photo onto the other. To enforce photorealism, we introduce a content-preserving mechanism that combines a cycle-consistency loss with a self-consistency loss. Experimental results show that this method does not suffer from the typical artifacts observed in methods working in the same setting. We then further analyze some properties of these trained convnets. First, we note that they can be used to stylize other, unseen images with the same known style. Second, we show that retraining only a small subset of the network parameters can be sufficient to adapt these convnets to new styles.
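As a rough illustration only (not stated in the abstract), the content-preserving objective could be written as a weighted combination of a cycle-consistency term and a self-consistency term. The generators G_{A->B} and G_{B->A}, the input photos x_A and x_B, and the weights lambda_cyc and lambda_self are notation introduced here for the sketch; writing the self-consistency term as an identity-mapping constraint on each network is likewise an assumption, not the paper's definition.

\[
\mathcal{L} \;=\; \mathcal{L}_{\mathrm{style}}
\;+\; \lambda_{\mathrm{cyc}}\Big( \big\lVert G_{B\to A}\!\big(G_{A\to B}(x_A)\big) - x_A \big\rVert_1 + \big\lVert G_{A\to B}\!\big(G_{B\to A}(x_B)\big) - x_B \big\rVert_1 \Big)
\;+\; \lambda_{\mathrm{self}}\Big( \big\lVert G_{A\to B}(x_B) - x_B \big\rVert_1 + \big\lVert G_{B\to A}(x_A) - x_A \big\rVert_1 \Big)
\]

Under these assumptions, the cycle term penalizes content lost when mapping a photo to the other style and back, while the self term penalizes changes made to a photo that is already in the target style.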