Paper Reading Notes: 《Image Super-Resolution Using Very Deep Residual Channel Attention Networks》 (RCAN)

Paper link: http://openaccess.thecvf.com/content_ECCV_2018/papers/Yulun_Zhang_Image_Super-Resolution_Using_ECCV_2018_paper.pdf

Official code: https://github.com/yulunzhang/RCAN

My reproduction code is in the blog post 《实验笔记之——Channel Attention(RCAN的复现)》.

The depth of a convolutional network is a key factor for super-resolution performance, but deeper networks are harder to train. Moreover, the low-resolution inputs and features contain abundant low-frequency information, which is treated equally across channels and thus hinders the representational ability of the CNN. To address this, the authors propose the residual channel attention network (RCAN). Specifically, they design a residual in residual (RIR) structure, built from several residual groups plus a long skip connection, to construct a very deep network; each residual group contains a number of residual blocks with a short skip connection. The RIR structure allows the abundant low-frequency information to be bypassed through these multiple skip connections, so the main network can focus on learning high-frequency information. Furthermore, the authors propose a channel attention (CA) mechanism that adaptively rescales channel-wise features by considering the interdependencies among channels.

Simply stacking residual blocks to construct deeper networks can hardly obtain better improvements.

Image SR can be viewed as a process where we try to recover as much high-frequency information as possible.

There is abundant information in the LR inputs and features, and the goal of the SR network is to recover the more useful information.

Since the general background on super-resolution has been covered in earlier posts, we go straight to the network structure.

Network Architecture

[Figure: the overall network architecture of RCAN]

The network consists of four parts: shallow feature extraction, residual in residual (RIR) deep feature extraction, an upscale module, and a reconstruction part.
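To make the data flow concrete, here is a minimal PyTorch sketch of this four-stage pipeline. The class and parameter names (RCANSkeleton, n_feats, etc.) are mine, not the official implementation's; the RIR body is stubbed out with plain convolutions here and detailed in the sections below.

```python
import torch
import torch.nn as nn

class RCANSkeleton(nn.Module):
    """Schematic of RCAN's four-stage pipeline (illustrative names)."""
    def __init__(self, n_colors=3, n_feats=64, scale=2):
        super().__init__()
        # 1) shallow feature extraction: a single conv layer
        self.head = nn.Conv2d(n_colors, n_feats, 3, padding=1)
        # 2) RIR deep feature extraction (placeholder body; see below)
        self.body = nn.Sequential(
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
        )
        # 3) upscale module: sub-pixel convolution
        self.upscale = nn.Sequential(
            nn.Conv2d(n_feats, n_feats * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),
        )
        # 4) reconstruction: map upscaled features back to an RGB image
        self.tail = nn.Conv2d(n_feats, n_colors, 3, padding=1)

    def forward(self, x):
        shallow = self.head(x)
        deep = self.body(shallow) + shallow  # long skip connection (LSC)
        return self.tail(self.upscale(deep))

# shape check: (1, 3, 48, 48) -> (1, 3, 96, 96) for scale=2
sr = RCANSkeleton(scale=2)(torch.randn(1, 3, 48, 48))
```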

Residual in Residual (RIR)

The RIR structure contains G residual groups (RG) and a long skip connection (LSC). Each RG further contains B residual channel attention blocks (RCAB) with a short skip connection (SSC).

The abundant low-frequency information can be bypassed through these identity-based skip connections.
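The paper's defaults are G = 10 residual groups with B = 20 blocks each. A minimal sketch of this nested layout (my class names; a plain residual block stands in for the RCAB defined later):

```python
import torch.nn as nn

class PlainResBlock(nn.Module):
    """Stand-in residual block; RCAN actually uses RCABs (see below)."""
    def __init__(self, n_feats=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
        )
    def forward(self, x):
        return x + self.body(x)

class ResidualGroup(nn.Module):
    """B blocks plus a trailing conv, wrapped by a short skip connection."""
    def __init__(self, n_feats=64, n_blocks=20):
        super().__init__()
        self.body = nn.Sequential(
            *[PlainResBlock(n_feats) for _ in range(n_blocks)],
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
        )
    def forward(self, x):
        return x + self.body(x)  # short skip connection (SSC)

class RIR(nn.Module):
    """G residual groups plus a trailing conv, wrapped by the long skip."""
    def __init__(self, n_feats=64, n_groups=10, n_blocks=20):
        super().__init__()
        self.body = nn.Sequential(
            *[ResidualGroup(n_feats, n_blocks) for _ in range(n_groups)],
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
        )
    def forward(self, x):
        return x + self.body(x)  # long skip connection (LSC)
```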

Channel Attention (CA)

To make the network focus on more informative features, the interdependencies among feature channels are exploited; the key step is how to generate a different attention weight for each channel-wise feature. There are two concerns here. First, the information in the LR space contains abundant low-frequency components and valuable high-frequency components: the low-frequency parts seem to be more complanate (flat), while the high-frequency components are usually regional, full of edges, textures, and other details. Second, each filter in a convolutional layer has a local receptive field, so the output of a convolution cannot exploit contextual information outside its local region. This is illustrated in the figure below.

[Figure: the channel attention (CA) mechanism]
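Concretely, CA first squeezes each channel's H×W feature map into a single statistic via global average pooling, then passes it through a channel-downscaling/upscaling bottleneck with a sigmoid gate, and finally rescales the input channel-wise. A minimal PyTorch sketch, using the paper's reduction ratio r = 16 (class name is mine):

```python
import torch.nn as nn

class CALayer(nn.Module):
    """Channel attention: global-average-pool squeeze + bottleneck gate."""
    def __init__(self, channels=64, reduction=16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)  # (N, C, H, W) -> (N, C, 1, 1)
        self.gate = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),  # channel-downscaling
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),  # channel-upscaling
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )
    def forward(self, x):
        return x * self.gate(self.squeeze(x))  # rescale channel-wise features
```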

Residual Channel Attention Block

Residual groups and the long skip connection allow the main parts of the network to focus on the more informative components of the LR features.

[Figure: the residual channel attention block (RCAB)]
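In code, an RCAB is just a conv-ReLU-conv residual block whose residual branch is rescaled by CA before the identity shortcut is added. A sketch that reuses the CALayer from the previous snippet (it must be in scope):

```python
import torch.nn as nn

class RCAB(nn.Module):
    """Residual channel attention block: conv-ReLU-conv + CA + shortcut."""
    def __init__(self, n_feats=64, reduction=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(n_feats, n_feats, 3, padding=1),
            CALayer(n_feats, reduction),  # assumes CALayer defined above
        )
    def forward(self, x):
        return x + self.body(x)  # identity shortcut carries low frequencies
```

Replacing PlainResBlock with RCAB in the ResidualGroup sketch above yields the full RIR body used in RCAN.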
