(8) Neural Networks - Linear Layers and Other Layers

[Note] Source: the bilibili video "PyTorch深度学习快速入门教程(绝对通俗易懂!)" by 小土堆 (PyTorch Deep Learning Quick Start Tutorial).

Normalization Layers

nn.BatchNorm2d

Applies batch normalization over the input, which speeds up training of the neural network.

The key argument is num_features, which corresponds to the number of channels of the input; the remaining arguments can usually be left at their defaults.

nn.BatchNorm2d(num_features, eps=1e-05, momentum=0.1, affine=True,
               track_running_stats=True, device=None, dtype=None)
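
A minimal sketch of applying this layer, assuming an input of shape [N, C, H, W] with C=3 (the batch size and image size here are made up for illustration):

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=3)   # num_features = number of channels C
x = torch.randn(4, 3, 32, 32)         # assumed batch: [N=4, C=3, H=32, W=32]
y = bn(x)
print(y.shape)                        # torch.Size([4, 3, 32, 32]) -- shape is unchanged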

Linear Layers

nn.Linear(in_features, out_features, bias=True)
# in_features is the number of neurons in the previous layer; out_features is the number of neurons in this layer


[Figure 1]

[Figure 2: VGG16 architecture]

For example, if an image is 5×5, flattening it into a row gives 25 values; passing them through a linear layer turns them into 3 values.
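
A small sketch of that example (the input values are random placeholders; 25 and 3 come from the sentence above):

import torch
from torch.nn import Linear

x = torch.randn(5, 5)        # one 5x5 "image"
x = torch.flatten(x)         # flatten into a row of 25 values
print(x.shape)               # torch.Size([25])

linear = Linear(25, 3)       # in_features=25, out_features=3
out = linear(x)
print(out.shape)             # torch.Size([3])

The CIFAR10 code below does the same kind of flattening on whole batches of images.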

import torch
import torchvision
from torch.utils.data import DataLoader

# CIFAR10 test set, converted to tensors (set download=True on the first run)
dataset = torchvision.datasets.CIFAR10("./data", train=False, transform=torchvision.transforms.ToTensor(), download=False)

dataloader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=0)

for data in dataloader:
    imgs, labels = data
    print(imgs.shape)                             # each full batch: torch.Size([64, 3, 32, 32])
    output = torch.reshape(imgs, (1, 1, 1, -1))   # stretch the whole batch into one long row
    print(output.shape)                           # torch.Size([1, 1, 1, 196608])

[Figure 3]

import torch
import torch.nn as nn
import torchvision
from torch.nn import Linear
from torch.utils.data import DataLoader

dataset = torchvision.datasets.CIFAR10("./data", train=False, transform=torchvision.transforms.ToTensor(), download=False)

# drop_last=True discards the final, smaller batch so every batch flattens to exactly 196608 values
dataloader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=0, drop_last=True)

class Mymodel(nn.Module):
    def __init__(self):
        super(Mymodel, self).__init__()
        self.linear = Linear(196608, 10)  # in_features=196608 (64*3*32*32), out_features=10

    def forward(self, input):
        output = self.linear(input)
        return output

mymodel = Mymodel()

for data in dataloader:
    imgs, labels = data
    print(imgs.shape)                # torch.Size([64, 3, 32, 32])

    output = torch.flatten(imgs)     # flatten the whole batch into a 1-D tensor
    print(output.shape)              # torch.Size([196608])

    output = mymodel(output)         # feed the flattened tensor into the linear layer
    print(output.shape)              # torch.Size([10])
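
For comparison with the reshape used earlier, torch.flatten gives the same elements as torch.reshape(imgs, (1, 1, 1, -1)), just as a 1-D tensor instead of a 4-D one. A quick sketch (the 64x3x32x32 shape matches the batches above):

import torch

imgs = torch.randn(64, 3, 32, 32)
a = torch.flatten(imgs)                  # torch.Size([196608])
b = torch.reshape(imgs, (1, 1, 1, -1))   # torch.Size([1, 1, 1, 196608])
print(torch.equal(a, b.reshape(-1)))     # True -- same data, just a different shape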

  
