Paper Title

REPNP: Plug-and-Play with Deep Reinforcement Learning Prior for Robust Image Restoration

Paper Authors

Chong Wang, Rongkai Zhang, Saiprasad Ravishankar, Bihan Wen

Paper Abstract

Image restoration schemes based on pre-trained deep models have received great attention due to their unique flexibility for solving various inverse problems. In particular, the Plug-and-Play (PnP) framework is a popular and powerful tool that can integrate an off-the-shelf deep denoiser for different image restoration tasks with known observation models. However, obtaining an observation model that exactly matches the actual one can be challenging in practice. Thus, PnP schemes with conventional deep denoisers may fail to generate satisfactory results in some real-world image restoration tasks. We argue that the robustness of the PnP framework is largely limited by using off-the-shelf deep denoisers that are trained by deterministic optimization. To this end, we propose a novel deep reinforcement learning (DRL) based PnP framework, dubbed RePNP, by leveraging a light-weight DRL-based denoiser for robust image restoration tasks. Experimental results demonstrate that the proposed RePNP is robust to deviations of the observation model used in the PnP scheme from the actual one. Thus, RePNP can generate more reliable restoration results for image deblurring and super-resolution tasks. Compared with several state-of-the-art deep image restoration baselines, RePNP achieves better results subject to model deviation with fewer model parameters.
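The PnP mechanism the abstract relies on can be illustrated with a minimal sketch: alternate a data-fidelity update for the observation model `y = A x` with a call to a plugged-in denoiser acting as the image prior. This is a toy, half-quadratic-splitting-style illustration, not the paper's method; the moving-average denoiser below merely stands in for the learned (e.g., DRL-based) denoiser, and all function names are illustrative.

```python
import numpy as np

def pnp_restore(y, A, denoise, num_iters=50, step=1.0):
    """Toy Plug-and-Play loop: alternate a gradient step on the
    data term ||A x - y||^2 with a plugged-in denoiser as the prior.

    y       : observed (degraded) signal, 1-D numpy array
    A       : observation (forward) model as a matrix
    denoise : any off-the-shelf denoiser, mapping x -> cleaner x
    """
    x = A.T @ y  # simple back-projection initialization
    for _ in range(num_iters):
        # data-fidelity step: gradient descent on ||A x - y||^2
        x = x - step * (A.T @ (A @ x - y))
        # prior step: the denoiser replaces a proximal operator
        x = denoise(x)
    return x

def box_denoiser(x):
    """Stand-in denoiser: circular 3-tap moving average. A real PnP
    scheme would plug in a learned deep denoiser here instead."""
    return (np.roll(x, 1) + x + np.roll(x, -1)) / 3.0
```

Swapping `denoise` for a stronger learned model changes the implicit prior without touching the data-fidelity step, which is exactly the modularity (and, per the paper, the robustness bottleneck) of the PnP framework.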
