[深度学习从入门到女装]Recurrent Residual Convolutional Neural Network based on U-Net (R2U-Net)

Paper: Recurrent Residual Convolutional Neural Network based on U-Net (R2U-Net) for Medical Image Segmentation

 

This paper improves U-Net by replacing its plain convolution blocks with Recurrent Residual conv blocks.

[Figure 1: the four convolutional block variants (a)–(d)]

(a) is an ordinary stack of two conv units, (b) is the block using recurrent conv, (c) is the block using residual conv, and (d) is the block that uses both residual and recurrent conv; a small comparison sketch of (a) and (c) follows below.
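For reference, here is a minimal PyTorch sketch of variants (a) and (c). The class names and the exact layer composition (3x3 conv + BN + ReLU, with a 1x1 conv on the shortcut to match channel counts) are my own assumptions based on the figure, not code from the paper or the repo:

import torch.nn as nn

class Plain_block(nn.Module):
    # variant (a): two stacked conv + BN + ReLU units
    def __init__(self, ch_in, ch_out):
        super(Plain_block, self).__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch_in, ch_out, kernel_size=3, padding=1),
            nn.BatchNorm2d(ch_out),
            nn.ReLU(inplace=True),
            nn.Conv2d(ch_out, ch_out, kernel_size=3, padding=1),
            nn.BatchNorm2d(ch_out),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)

class Residual_block(nn.Module):
    # variant (c): the same two conv units plus a shortcut around them
    def __init__(self, ch_in, ch_out):
        super(Residual_block, self).__init__()
        self.body = Plain_block(ch_in, ch_out)
        self.shortcut = nn.Conv2d(ch_in, ch_out, kernel_size=1)  # 1x1 conv so the channel counts match

    def forward(self, x):
        return self.body(x) + self.shortcut(x)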

[Figure 2: the recurrent conv block]

Each recurrent conv block is structured as shown in the figure above. I originally assumed it was an RNN-style or LSTM-style conv module, but after reading the source code I found that this is not the case.

The source code is as follows:

import torch.nn as nn

class RRCNN_block(nn.Module):
    # Recurrent Residual conv block: a 1x1 conv to match channels, two Recurrent_blocks,
    # and a residual connection around them
    def __init__(self, ch_in, ch_out, t=2):
        super(RRCNN_block, self).__init__()
        self.RCNN = nn.Sequential(
            Recurrent_block(ch_out, t=t),
            Recurrent_block(ch_out, t=t)
        )
        self.Conv_1x1 = nn.Conv2d(ch_in, ch_out, kernel_size=1, stride=1, padding=0)

    def forward(self, x):
        x = self.Conv_1x1(x)        # bring the input to ch_out channels first
        x1 = self.RCNN(x)
        return x + x1               # residual connection around the recurrent convs

class Recurrent_block(nn.Module):
    # one recurrent conv unit: the same conv + BN + ReLU applied repeatedly
    def __init__(self, ch_out, t=2):
        super(Recurrent_block, self).__init__()
        self.t = t
        self.ch_out = ch_out
        self.conv = nn.Sequential(
            nn.Conv2d(ch_out, ch_out, kernel_size=3, stride=1, padding=1, bias=True),
            nn.BatchNorm2d(ch_out),
            nn.ReLU(inplace=True)
        )

    def forward(self, x):
        for i in range(self.t):
            if i == 0:
                x1 = self.conv(x)       # first pass: input is x alone
            x1 = self.conv(x + x1)      # later passes: input is x plus the previous output
        return x1

As you can see, a Recurrent_block is essentially a single plain conv unit (conv + BN + ReLU) applied repeatedly with shared weights, rather than any RNN cell: the first pass takes x alone as input, and every later pass takes x plus the previous pass's output. (Note that because the i = 0 branch runs the shared conv and the loop body then runs it again, t = 2 actually applies the conv three times.)
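A quick shape check (my own usage sketch, not code from the repo) shows that the block keeps the spatial size and only changes the channel count:

import torch

block = RRCNN_block(ch_in=3, ch_out=64, t=2)
x = torch.randn(1, 3, 128, 128)   # e.g. one RGB patch
y = block(x)
print(y.shape)                    # torch.Size([1, 64, 128, 128])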

 

[Figure 3: the R2U-Net architecture]

The improved network architecture is shown in the figure above.
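Roughly speaking, R2U-Net keeps the usual U-Net encoder/decoder layout and only swaps each plain double-conv block for an RRCNN_block. Here is a minimal sketch of one encoder step and one decoder step using the RRCNN_block defined above; the channel sizes and the up-sampling helper are my assumptions for illustration, not the repo's exact R2U_Net class:

import torch
import torch.nn as nn

enc1 = RRCNN_block(ch_in=1, ch_out=64, t=2)      # first encoder stage
pool = nn.MaxPool2d(kernel_size=2, stride=2)
enc2 = RRCNN_block(ch_in=64, ch_out=128, t=2)    # second encoder stage
up = nn.Sequential(                              # hypothetical up-sampling step
    nn.Upsample(scale_factor=2),
    nn.Conv2d(128, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
)
dec1 = RRCNN_block(ch_in=128, ch_out=64, t=2)    # decoder stage after the skip connection

x = torch.randn(1, 1, 128, 128)                  # e.g. one grayscale slice
x1 = enc1(x)                                     # (1, 64, 128, 128)
x2 = enc2(pool(x1))                              # (1, 128, 64, 64)
d1 = dec1(torch.cat([x1, up(x2)], dim=1))        # concat the skip connection -> (1, 64, 128, 128)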

[Figure 4]

https://github.com/LeeJunHyun/Image_Segmentation

This is the source code of the implementation; the repo also implements Attention U-Net and a combination of Attention U-Net and R2U-Net.
