Paper Title

A U-Net Based Discriminator for Generative Adversarial Networks

Paper Authors

Edgar Schönfeld, Bernt Schiele, Anna Khoreva

Paper Abstract

Among the major remaining challenges for generative adversarial networks (GANs) is the capacity to synthesize globally and locally coherent images with object shapes and textures indistinguishable from real images. To target this issue we propose an alternative U-Net based discriminator architecture, borrowing insights from the segmentation literature. The proposed U-Net based architecture provides detailed per-pixel feedback to the generator while maintaining the global coherence of synthesized images, by providing global image feedback as well. Empowered by the per-pixel response of the discriminator, we further propose a per-pixel consistency regularization technique based on the CutMix data augmentation, encouraging the U-Net discriminator to focus more on semantic and structural changes between real and fake images. This improves U-Net discriminator training, further enhancing the quality of generated samples. The novel discriminator improves over the state of the art in terms of standard distribution and image quality metrics, enabling the generator to synthesize images with varying structure, appearance and levels of detail while maintaining global and local realism. Compared to the BigGAN baseline, we achieve an average improvement of 2.7 FID points across FFHQ, CelebA, and the newly introduced COCO-Animals dataset. The code is available at https://github.com/boschresearch/unetgan.
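The CutMix-based consistency regularization described in the abstract can be sketched in a few lines: a patch from a fake image is pasted into a real one, and the discriminator's per-pixel output on the mixed image is penalized for deviating from the correspondingly mixed per-pixel outputs on the unmixed images. The sketch below is a minimal NumPy illustration, not the paper's implementation; `d_pix` is a hypothetical stand-in for the U-Net discriminator's per-pixel decoder head.

```python
import numpy as np

def cutmix(real, fake, rng=None):
    """CutMix: paste a rectangular fake patch into a real image.
    real, fake: arrays of shape (H, W, C).
    Returns (mixed image, binary mask M) with M = 1 at real pixels."""
    rng = rng or np.random.default_rng(0)
    h, w = real.shape[:2]
    lam = rng.uniform(0, 1)  # target fraction of real pixels
    cut_h, cut_w = int(h * np.sqrt(1 - lam)), int(w * np.sqrt(1 - lam))
    cy, cx = int(rng.integers(0, h)), int(rng.integers(0, w))
    y0, y1 = np.clip([cy - cut_h // 2, cy + cut_h // 2], 0, h)
    x0, x1 = np.clip([cx - cut_w // 2, cx + cut_w // 2], 0, w)
    mask = np.ones((h, w, 1), dtype=real.dtype)
    mask[y0:y1, x0:x1] = 0  # region taken from the fake image
    mixed = mask * real + (1 - mask) * fake
    return mixed, mask

def consistency_loss(d_pix, real, fake, rng=None):
    """Per-pixel consistency regularizer (hypothetical d_pix: (H,W,C) -> (H,W)).
    Penalizes || d_pix(mix(x, x_fake, M)) - mix(d_pix(x), d_pix(x_fake), M) ||^2."""
    mixed, mask = cutmix(real, fake, rng)
    m = mask[..., 0]
    target = m * d_pix(real) + (1 - m) * d_pix(fake)
    return float(np.mean((d_pix(mixed) - target) ** 2))
```

For any per-pixel map that is linear in the input (e.g. a channel mean), the loss is exactly zero, which matches the intuition that the regularizer only penalizes discriminators whose per-pixel decisions are inconsistent under patch mixing.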
