Classic Convolutional Neural Network Implementations (1): LeNet and AlexNet

I. LeNet

  1. The pioneering convolutional neural network.
  2. Network structure:
    (figure: LeNet network structure)
    1. A 7-layer network: two convolutional layers and two pooling layers, alternating, followed by three fully connected layers.
  3. PyTorch implementation
import torch
from torch import nn

class LeNet(nn.Module):
    def __init__(self):
        super(LeNet, self).__init__()
        # First convolution + pooling block; kernel_size=5 with no padding
        # gives the classic LeNet-5 shapes (kernel_size=3 with padding=1
        # would break the 16*5*5 input size of the first linear layer):
        # [b,1,32,32] => [b,6,28,28] => [b,6,14,14]
        self.layer1 = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),
            nn.MaxPool2d(2, 2)
        )
        # [b,6,14,14] => [b,16,10,10] => [b,16,5,5]
        self.layer2 = nn.Sequential(
            nn.Conv2d(6, 16, kernel_size=5),
            nn.MaxPool2d(2, 2)
        )
        # The last three layers are fully connected
        self.layer3 = nn.Sequential(
            nn.Linear(16*5*5,120),
            nn.Linear(120,84),
            nn.Linear(84,10)
        )
    
    def forward(self,x):
        # Convolution and pooling blocks
        x = self.layer1(x)
        x = self.layer2(x)
        # Flatten, then the fully connected layers
        x = x.view(x.size(0),-1)
        x = self.layer3(x)
        return x
if __name__ == '__main__':
    LeNet_model = LeNet()
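As a sanity check on the shapes in the comments above (and on the `16*5*5` input size of the first linear layer), the standard conv/pool output-size formula `floor((size + 2*padding - kernel) / stride) + 1` can be traced by hand. The snippet below is a small standalone sketch of that arithmetic, not part of the model:

```python
def out_size(size, kernel, stride=1, padding=0):
    # Output spatial size of a conv or pooling layer (square input, square kernel)
    return (size + 2 * padding - kernel) // stride + 1

size = 32                                  # LeNet input: 32x32
size = out_size(size, kernel=5)            # Conv2d(1, 6, kernel_size=5)  -> 28
size = out_size(size, kernel=2, stride=2)  # MaxPool2d(2, 2)              -> 14
size = out_size(size, kernel=5)            # Conv2d(6, 16, kernel_size=5) -> 10
size = out_size(size, kernel=2, stride=2)  # MaxPool2d(2, 2)              -> 5
print(16 * size * size)                    # flattened features: 400 = 16*5*5
```

This confirms why `x.view(x.size(0), -1)` produces exactly the 400 features that `nn.Linear(16*5*5, 120)` expects.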

II. AlexNet

  1. Compared with LeNet, AlexNet introduced the ReLU activation function and applied Dropout in the fully connected layers to reduce overfitting.
  2. Network structure
    (figure: AlexNet network structure)
  3. Code implementation
from torch import nn

class AlexNet(nn.Module):
    def __init__(self,num_classes):
        super(AlexNet, self).__init__()
        # Convolution, activation, and pooling layers
        self.features = nn.Sequential(
            nn.Conv2d(3,64,kernel_size=11,stride=4,padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3,stride=2),

            nn.Conv2d(64,192, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),

            nn.Conv2d(192,384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),

            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # final pooling: => [b,256,6,6] for a 224x224 input
            nn.MaxPool2d(kernel_size=3, stride=2)
        )

        self.classifier = nn.Sequential(
            nn.Dropout(),  # reduces overfitting
            nn.Linear(256*6*6,4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
            nn.Linear(4096,4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096,num_classes)
        )

    def forward(self, x):
        x = self.features(x)
        # Flatten before the fully connected classifier
        x = x.view(x.size(0), -1)
        x = self.classifier(x)
        return x

if __name__ == '__main__':
    AlexNet_model = AlexNet(num_classes=1000)  # e.g. 1000 classes for ImageNet
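The same output-size formula can confirm that a 224x224 input (an assumed input resolution; the structure above does not fix one) reaches the classifier as a [b, 256, 6, 6] tensor, matching `nn.Linear(256*6*6, 4096)`. A standalone sketch of that trace:

```python
def out_size(size, kernel, stride=1, padding=0):
    # Output spatial size of a conv or pooling layer (square input, square kernel)
    return (size + 2 * padding - kernel) // stride + 1

# (kernel, stride, padding) of each size-changing layer in self.features;
# ReLU leaves the spatial size unchanged, so it is omitted here.
feature_layers = [
    (11, 4, 2),  # Conv2d(3, 64, kernel_size=11, stride=4, padding=2)
    (3, 2, 0),   # MaxPool2d(kernel_size=3, stride=2)
    (5, 1, 2),   # Conv2d(64, 192, kernel_size=5, padding=2)
    (3, 2, 0),   # MaxPool2d(kernel_size=3, stride=2)
    (3, 1, 1),   # Conv2d(192, 384, kernel_size=3, padding=1)
    (3, 1, 1),   # Conv2d(384, 256, kernel_size=3, padding=1)
    (3, 1, 1),   # Conv2d(256, 256, kernel_size=3, padding=1)
    (3, 2, 0),   # MaxPool2d(kernel_size=3, stride=2)
]

size = 224  # assumed input resolution
for kernel, stride, padding in feature_layers:
    size = out_size(size, kernel, stride, padding)
print(size, 256 * size * size)  # 6 and 9216 = 256*6*6
```

Note that `nn.Dropout` is only active in training mode; calling `model.eval()` before inference disables it.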
