Paper Title

Learning Texture Transformer Network for Image Super-Resolution

Authors

Fuzhi Yang, Huan Yang, Jianlong Fu, Hongtao Lu, Baining Guo

Abstract

We study image super-resolution (SR), which aims to recover realistic textures from a low-resolution (LR) image. Recent progress has been made by taking high-resolution images as references (Ref), so that relevant textures can be transferred to LR images. However, existing SR approaches neglect to use attention mechanisms to transfer high-resolution (HR) textures from Ref images, which limits these approaches in challenging cases. In this paper, we propose a novel Texture Transformer Network for Image Super-Resolution (TTSR), in which the LR and Ref images are formulated as queries and keys in a transformer, respectively. TTSR consists of four closely related modules optimized for image generation tasks, including a learnable texture extractor by DNN, a relevance embedding module, a hard-attention module for texture transfer, and a soft-attention module for texture synthesis. Such a design encourages joint feature learning across LR and Ref images, in which deep feature correspondences can be discovered by attention, and thus accurate texture features can be transferred. The proposed texture transformer can be further stacked in a cross-scale way, which enables texture recovery from different levels (e.g., from 1x to 4x magnification). Extensive experiments show that TTSR achieves significant improvements over state-of-the-art approaches on both quantitative and qualitative evaluations.
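The abstract's relevance embedding, hard attention, and soft attention can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the authors' implementation: `Q`, `K`, `V` stand in for the learnable-texture-extractor features of the (upsampled) LR image and the Ref image, and the function name and shapes are hypothetical.

```python
import numpy as np

def texture_transformer_sketch(Q, K, V):
    """Illustrative sketch of TTSR-style hard/soft attention.

    Q: (n, d) query features from the (upsampled) LR image
    K: (m, d) key features from the Ref image
    V: (m, d) value (texture) features from the Ref image
    """
    # Relevance embedding: cosine similarity between each query and key.
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)
    Kn = K / np.linalg.norm(K, axis=1, keepdims=True)
    R = Qn @ Kn.T                      # (n, m) relevance matrix

    # Hard attention: for each query, transfer the single most relevant
    # Ref texture feature (an index map rather than a weighted average,
    # which keeps transferred textures sharp).
    hard_idx = R.argmax(axis=1)        # (n,) index map
    T = V[hard_idx]                    # transferred texture features

    # Soft attention: the best relevance score acts as a confidence map
    # weighting how strongly each transferred texture is fused back
    # into the LR features during synthesis.
    S = R.max(axis=1, keepdims=True)   # (n, 1) confidence map
    return T, S
```

Keeping the hard-attention step as an argmax index lookup (instead of a softmax-weighted sum) mirrors the motivation stated in the abstract: accurate textures are copied from the single best-matching Ref position, while the soft-attention confidence decides how much of them to trust.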
