Paper Title

RZSR: Reference-based Zero-Shot Super-Resolution with Depth Guided Self-Exemplars

Authors

Jun-Sang Yoo, Dong-Wook Kim, Yucheng Lu, Seung-Won Jung

Abstract

Recent methods for single image super-resolution (SISR) have demonstrated outstanding performance in generating high-resolution (HR) images from low-resolution (LR) images. However, most of these methods show their superiority using synthetically generated LR images, and their generalizability to real-world images is often not satisfactory. In this paper, we pay attention to two well-known strategies developed for robust super-resolution (SR), i.e., reference-based SR (RefSR) and zero-shot SR (ZSSR), and propose an integrated solution, called reference-based zero-shot SR (RZSR). Following the principle of ZSSR, we train an image-specific SR network at test time using training samples extracted only from the input image itself. To advance ZSSR, we obtain reference image patches with rich textures and high-frequency details, which are also extracted only from the input image using cross-scale matching. To this end, we construct an internal reference dataset and retrieve reference image patches from the dataset using depth information. Using LR patches and their corresponding HR reference patches, we train a RefSR network that is embodied with a non-local attention module. Experimental results demonstrate the superiority of the proposed RZSR compared to previous ZSSR methods, as well as its robustness to unseen images compared to other fully supervised SISR methods.
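The self-supervised setup the abstract describes, downscaling the input to form LR/HR training pairs and retrieving cross-scale reference patches from the image itself, can be sketched in broad strokes. This is a minimal illustrative assumption, not the paper's implementation: the function names are hypothetical, a simple average-pooling downscaler stands in for the paper's degradation model, and plain L2 patch matching stands in for the depth-guided retrieval and the learned non-local attention module.

```python
import numpy as np

def downscale(img, s=2):
    """Average-pool downscaling by factor s (a stand-in for a bicubic kernel)."""
    h, w = img.shape[0] // s * s, img.shape[1] // s * s
    return img[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def extract_patches(img, k=8, stride=4):
    """All k x k patches of img at the given stride, stacked into one array."""
    patches = []
    for y in range(0, img.shape[0] - k + 1, stride):
        for x in range(0, img.shape[1] - k + 1, stride):
            patches.append(img[y:y + k, x:x + k])
    return np.stack(patches)

def best_reference(query, candidates):
    """Index of the candidate patch closest to query (L2 distance);
    a crude proxy for the paper's depth-guided cross-scale matching."""
    d = ((candidates - query) ** 2).sum(axis=(1, 2))
    return int(np.argmin(d))
```

In a ZSSR-style pipeline, `downscale(img)` yields the LR "son" of the input so that patches of the original image serve as HR targets, while `best_reference` pairs each query patch with its most similar patch from the internal reference set built at the original scale.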
