PyTorch Deep Learning Notes (15): Loss Functions and Backpropagation

Study notes for the course; course link


import torch
from torch import nn
from torch.nn import L1Loss, MSELoss

inputs = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)

# Reshape to (N, C, H, W) so the shapes match a typical network output
inputs = torch.reshape(inputs, (1, 1, 1, 3))
targets = torch.reshape(targets, (1, 1, 1, 3))

# L1 loss with sum reduction: |1-1| + |2-2| + |3-5| = 2
loss = L1Loss(reduction='sum')
result = loss(inputs, targets)

# MSE loss with the default mean reduction: (0 + 0 + 2^2) / 3 ≈ 1.333
loss_mse = MSELoss()
result_mse = loss_mse(inputs, targets)

print(result)
print(result_mse)

# Cross-entropy: x holds raw class scores of shape (batch_size, num_classes),
# y holds the target class index
x = torch.tensor([0.1, 0.2, 0.3])
y = torch.tensor([1])
x = torch.reshape(x, (1, 3))
loss_cross = nn.CrossEntropyLoss()
result_cross = loss_cross(x, y)
print(result_cross)
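For a single sample, CrossEntropyLoss is equivalent to applying log-softmax followed by negative log-likelihood, i.e. loss = -x[class] + log(Σ_j exp(x_j)). A minimal sketch to verify the value printed above (the variable name manual is just for illustration):

# Manual cross-entropy for one sample: -x[class] + log(sum(exp(x)))
manual = -x[0, y.item()] + torch.log(torch.exp(x[0]).sum())
print(manual)  # should match result_cross, ≈ 1.1019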

Now, taking the CIFAR10 dataset as an example, we use CrossEntropyLoss as the loss function in the network built at the end of the previous article, to show how a loss function is used inside a neural network.

import torchvision
from torch import nn
from torch.nn import Sequential, Conv2d, MaxPool2d, Flatten, Linear
from torch.utils.data import DataLoader

# CIFAR10 test set, converted to tensors
dataset = torchvision.datasets.CIFAR10(r"D:\Code\Project\learn_pytorch\pytorch_p17-21\data", train=False,
                                       download=True, transform=torchvision.transforms.ToTensor())
dataloader = DataLoader(dataset, batch_size=4)

class Jiaolong(nn.Module):
    def __init__(self):
        super(Jiaolong, self).__init__()
        self.model1 = Sequential(
            Conv2d(in_channels=3, out_channels=32, kernel_size=5, padding=2),
            MaxPool2d(kernel_size=2),
            Conv2d(in_channels=32, out_channels=32, kernel_size=5, padding=2),
            MaxPool2d(kernel_size=2),
            Conv2d(in_channels=32, out_channels=64, kernel_size=5, padding=2),
            MaxPool2d(kernel_size=2),
            Flatten(),              # 64 x 4 x 4 = 1024 features
            Linear(1024, 64),
            Linear(64, 10)          # 10 classes in CIFAR10
        )

    def forward(self, x):
        x = self.model1(x)
        return x

loss = nn.CrossEntropyLoss()
jiaolong = Jiaolong()
for data in dataloader:
    imgs, targets = data
    outputs = jiaolong(imgs)              # raw class scores, shape (4, 10)
    result_loss = loss(outputs, targets)  # targets are class indices, shape (4,)
    print(result_loss)
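Since this note also covers backpropagation: calling backward() on the loss computes the gradient of the loss with respect to every parameter and stores it in each parameter's .grad attribute, which an optimizer can then use to update the weights. A minimal sketch based on the loop above (the print and break are just for illustration):

for data in dataloader:
    imgs, targets = data
    outputs = jiaolong(imgs)
    result_loss = loss(outputs, targets)
    result_loss.backward()   # fills the .grad attribute of each parameter
    # e.g. inspect the gradient of the first convolution's kernels
    print(jiaolong.model1[0].weight.grad.shape)   # torch.Size([32, 3, 5, 5])
    break   # one batch is enough to see the gradients appear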
