Paper Title

BiSTNet: Semantic Image Prior Guided Bidirectional Temporal Feature Fusion for Deep Exemplar-based Video Colorization

Paper Authors

Yixin Yang, Zhongzheng Peng, Xiaoyu Du, Zhulin Tao, Jinhui Tang, Jinshan Pan

Paper Abstract

How to effectively explore the colors of reference exemplars and propagate them to colorize each frame is vital for exemplar-based video colorization. In this paper, we present an effective BiSTNet to explore colors of reference exemplars and utilize them to help video colorization via bidirectional temporal feature fusion under the guidance of a semantic image prior. We first establish the semantic correspondence between each frame and the reference exemplars in deep feature space to explore color information from the reference exemplars. Then, to better propagate the colors of reference exemplars into each frame and to avoid inaccurately matched colors from the exemplars, we develop a simple yet effective bidirectional temporal feature fusion module to better colorize each frame. We note that there usually exist color-bleeding artifacts around the boundaries of important objects in videos. To overcome this problem, we further develop a mixed expert block to extract semantic information for modeling the object boundaries of frames, so that the semantic image prior can better guide the colorization process for better performance. In addition, we develop a multi-scale recurrent block to progressively colorize frames in a coarse-to-fine manner. Extensive experimental results demonstrate that the proposed BiSTNet performs favorably against state-of-the-art methods on benchmark datasets. Our code will be made available at \url{https://yyang181.github.io/BiSTNet/}.
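The abstract describes the pipeline at a high level. As a concrete, heavily simplified illustration, the PyTorch sketch below shows two of the core ideas in generic form: matching a frame to a reference exemplar in deep feature space so the exemplar's ab color channels can be warped to the frame, and fusing per-frame features with states propagated in both temporal directions. All function and module names (`warp_exemplar_colors`, `BidirectionalFusion`) are hypothetical and are not taken from the authors' code; this is a minimal sketch of the general techniques, not the BiSTNet implementation.

```python
# Minimal sketch (assumed PyTorch design, not the authors' code) of:
# (1) exemplar-to-frame correspondence in deep feature space, used to warp
#     the exemplar's chrominance (ab) channels onto a grayscale frame, and
# (2) bidirectional temporal feature fusion across a frame sequence.
import torch
import torch.nn as nn
import torch.nn.functional as F

def warp_exemplar_colors(frame_feat, ref_feat, ref_ab, temperature=0.01):
    """Warp exemplar ab channels to a frame via dense feature correspondence.

    frame_feat, ref_feat: (B, C, H, W) deep features of the grayscale frame
    and the reference exemplar; ref_ab: (B, 2, H, W) exemplar chrominance.
    """
    B, C, H, W = frame_feat.shape
    f = F.normalize(frame_feat.flatten(2), dim=1)   # (B, C, HW), unit-norm per position
    r = F.normalize(ref_feat.flatten(2), dim=1)     # (B, C, HW)
    # Cosine-similarity correspondence matrix, softmax over exemplar positions.
    attn = torch.softmax(f.transpose(1, 2) @ r / temperature, dim=-1)  # (B, HW, HW)
    ab = attn @ ref_ab.flatten(2).transpose(1, 2)   # (B, HW, 2) warped colors
    return ab.transpose(1, 2).view(B, 2, H, W)

class BidirectionalFusion(nn.Module):
    """Fuse each frame's features with forward- and backward-propagated states."""
    def __init__(self, channels):
        super().__init__()
        self.fwd = nn.Conv2d(2 * channels, channels, 3, padding=1)
        self.bwd = nn.Conv2d(2 * channels, channels, 3, padding=1)
        self.fuse = nn.Conv2d(3 * channels, channels, 3, padding=1)

    def forward(self, feats):            # feats: list of T tensors, each (B, C, H, W)
        T = len(feats)
        h = torch.zeros_like(feats[0])
        fwd_states = []
        for t in range(T):               # forward sweep over time
            h = F.relu(self.fwd(torch.cat([feats[t], h], dim=1)))
            fwd_states.append(h)
        h = torch.zeros_like(feats[0])
        out = [None] * T
        for t in reversed(range(T)):     # backward sweep, then fuse both directions
            h = F.relu(self.bwd(torch.cat([feats[t], h], dim=1)))
            out[t] = self.fuse(torch.cat([feats[t], fwd_states[t], h], dim=1))
        return out
```

In the actual model these operations would run on multi-scale encoder features and be combined with the semantic prior from the mixed expert block; the dense (HW x HW) attention shown here is only practical at coarse feature resolutions.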
