Global Attention Mechanism: Retain Information to Enhance Channel-Spatial Interactions (GAM)
PyTorch code:

import torch.nn as nn
import torch


class GAM_Attention(nn.Module):
    def __init__(self, in_channels, out_channels, rate=4):
        super(GAM_Attention, self).__init__()

        # Channel attention sub-module
        self.channel_attention = nn.Sequential(
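            # NOTE: the listing in the source is truncated at this point. The layers
            # and forward pass below are a reconstruction sketch following the GAM
            # paper's description (channel attention as a two-layer MLP with
            # reduction ratio `rate`; spatial attention as two 7x7 convolutions),
            # not a verbatim quote of the original code.
            nn.Linear(in_channels, int(in_channels / rate)),
            nn.ReLU(inplace=True),
            nn.Linear(int(in_channels / rate), in_channels)
        )

        # Spatial attention sub-module: 7x7 convolutions with the same reduction ratio
        self.spatial_attention = nn.Sequential(
            nn.Conv2d(in_channels, int(in_channels / rate), kernel_size=7, padding=3),
            nn.BatchNorm2d(int(in_channels / rate)),
            nn.ReLU(inplace=True),
            nn.Conv2d(int(in_channels / rate), out_channels, kernel_size=7, padding=3),
            nn.BatchNorm2d(out_channels)
        )

    def forward(self, x):
        b, c, h, w = x.shape
        # Channel attention: move channels to the last dimension, apply the MLP
        # per spatial position, restore the (B, C, H, W) layout, and gate the input.
        x_permute = x.permute(0, 2, 3, 1).view(b, -1, c)
        x_att_permute = self.channel_attention(x_permute).view(b, h, w, c)
        x_channel_att = x_att_permute.permute(0, 3, 1, 2)
        x = x * torch.sigmoid(x_channel_att)
        # Spatial attention: the convolutional branch produces a per-pixel gate.
        x_spatial_att = torch.sigmoid(self.spatial_attention(x))
        return x * x_spatial_att


# Quick shape check (hypothetical usage; names chosen for illustration):
if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 48)
    gam = GAM_Attention(in_channels=64, out_channels=64)
    print(gam(x).shape)  # expected: torch.Size([1, 64, 32, 48])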