[Channel Attention] ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks

[Figure 1]

Experimental results:

[Figure 2]

Key innovation:

Replaces SENet's two convolutions, which first reduce and then restore the channel dimension, with a more efficient connection (a 1D convolution across channels), improving accuracy while also reducing the parameter count.
[Figure 3]
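The paper also chooses the 1D kernel size k adaptively from the channel count C, via k = ψ(C) = |(log₂C + b)/γ|_odd with γ = 2 and b = 1. Below is a minimal sketch of that mapping (my own, following the paper's formula; the module code later in this post simply hard-codes k_size=3):

import math

def adaptive_kernel_size(channel, gamma=2, b=1):
    # k = |(log2(C) + b) / gamma|, rounded up to the nearest odd integer,
    # following the channel-to-kernel-size relation in the ECA-Net paper
    t = int(abs((math.log(channel, 2) + b) / gamma))
    return t if t % 2 else t + 1  # the 1D conv kernel size must be odd

# e.g. adaptive_kernel_size(64) -> 3, adaptive_kernel_size(512) -> 5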

Code:

I tried this code in the super-resolution network RCAN, and the loss immediately blew up past 10,000. I haven't found the cause yet; my initial suspicion is that it is due to the missing BN layers.

import torch 
from torch import nn 
from torch.nn.parameter import Parameter 

class eca_layer(nn.Module): 
    """Constructs a ECA module. 

    Args: 
        channel: Number of channels of the input feature map 
        k_size: Adaptive selection of kernel size 
    """ 
    def __init__(self, channel, k_size=3):
        super(eca_layer, self).__init__() 
        self.avg_pool = nn.AdaptiveAvgPool2d(1) 
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size, padding=(k_size - 1) // 2, bias=False)  
        self.sigmoid = nn.Sigmoid() 

    def forward(self, x):
        # x: input features with shape [b, c, h, w] 
        b, c, h, w = x.size() 

        # feature descriptor on the global spatial information 
        y = self.avg_pool(x) 

        # squeeze the spatial dims, run a 1D conv across the channel
        # dimension, then restore the [b, c, 1, 1] shape
        y = self.conv(y.squeeze(-1).transpose(-1, -2)).transpose(-1, -2).unsqueeze(-1)

        # per-channel gate in (0, 1)
        y = self.sigmoid(y)

        return x * y.expand_as(x) 
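
A quick smoke test (my own sketch, not from the paper's repo) to verify that the module preserves the input shape:

if __name__ == "__main__":
    eca = eca_layer(channel=64, k_size=3)
    x = torch.randn(2, 64, 32, 32)
    out = eca(x)
    print(out.shape)  # torch.Size([2, 64, 32, 32])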
