Neural Network Fundamentals - Supplementary Neural Network Concepts - 63 - Residual Networks

Concept

A Residual Network (ResNet) is a deep convolutional neural network architecture designed to mitigate the vanishing- and exploding-gradient problems that arise when training deep networks, and to make very deep networks trainable in the first place. ResNet was proposed by He et al. in 2015; its core idea is the "residual block", which overcomes the optimization difficulties encountered when training deep networks.

The conventional view is that successive layers learn progressively higher-level feature representations, yet in practice simply stacking more layers can degrade performance, because very deep networks become difficult to optimize. ResNet resolves this by introducing "skip connections" (residual connections): rather than learning a desired mapping H(x) directly, each block learns the residual F(x) = H(x) - x, and the shortcut adds the input x back onto the block's output, so the block computes F(x) + x. Learning the residual is easier, because driving F(x) toward zero simply recovers an identity mapping.
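As a minimal sketch of the residual connection by itself (the names residual_forward and f here are illustrative, not part of any library API; f stands in for the block body, i.e. the stacked layers):

import torch
import torch.nn as nn

def residual_forward(f, x):
    # The block body learns the residual F(x); the skip connection
    # adds the input x back, so the output is F(x) + x.
    return f(x) + x

f = nn.Linear(8, 8)          # stand-in for a block body that preserves shape
x = torch.randn(4, 8)
y = residual_forward(f, x)   # same shape as x: (4, 8)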

Structure of a residual block

Input ---------------------------+
  |                              |
Convolution                      |
  |                              |
Batch Normalization              | shortcut
  |                              | (identity, or a 1x1 conv
ReLU                             |  when shapes differ)
  |                              |
Convolution                      |
  |                              |
Batch Normalization              |
  |                              |
Addition (residual connection) <-+
  |
ReLU
  |
Output

Code Implementation

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, in_channels, out_channels, stride=1):
        super(ResidualBlock, self).__init__()
        # Main path: two 3x3 convolutions, each followed by batch normalization
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)

        # Shortcut path: identity by default; if the spatial size or channel
        # count changes, use a 1x1 convolution to match dimensions
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(out_channels)
            )

    def forward(self, x):
        residual = x
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out)
        # Add the (possibly projected) input back onto the main-path output
        out += self.shortcut(residual)
        out = self.relu(out)
        return out

# Create a residual block instance
residual_block = ResidualBlock(in_channels=64, out_channels=128, stride=2)
print(residual_block)
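As a quick sanity check (the 32x32 input size is an arbitrary choice for illustration): a stride-2 block with 64 input and 128 output channels should halve the spatial resolution and double the channel count.

x = torch.randn(1, 64, 32, 32)   # dummy batch: (batch, channels, height, width)
y = residual_block(x)
print(x.shape, "->", y.shape)    # torch.Size([1, 64, 32, 32]) -> torch.Size([1, 128, 16, 16])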
