The 'reduction' parameter in PyTorch loss functions

Overview

When calling a PyTorch loss function, there is a 'reduction' parameter. This post shows the result you get with each of its values, using L1 loss as the example:
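As a quick orientation, here is a minimal sketch (reusing the same sample and target as the examples below) of the element-wise absolute errors that every reduction mode starts from; 'reduction' only decides how these values are aggregated.

import torch
sample = torch.ones(1, 4)
target = torch.tensor([[0., 1., 2., 3.]])
# element-wise L1 error |x - y|, before any reduction is applied
print(torch.abs(sample - target))   # tensor([[1., 0., 1., 2.]])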

reduction = 'mean'

With reduction='mean' (the older elementwise_mean value has been deprecated as of PyTorch 1.7.1), the element-wise losses are averaged over all elements and returned as a single scalar.

import torch
import torch.nn as nn
# 1x4 input of ones and a float target of the same shape
sample = torch.ones(1, 4)
target = torch.tensor([[0., 1., 2., 3.]])
# 'mean' averages the element-wise absolute errors into a single scalar
criterion = nn.L1Loss(reduction='mean')
loss = criterion(sample, target)
print(loss)

Output: tensor(1.)
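A quick manual check confirms the value; the averaging is over all four elements, not only over the batch dimension:

# manual check: average of the absolute errors
print((abs(1-0) + abs(1-1) + abs(1-2) + abs(1-3)) / 4)   # 1.0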

reduction = 'sum'

With reduction='sum', the element-wise losses are summed and returned as a single scalar.

import torch
import torch.nn as nn
# same input and target as above
sample = torch.ones(1, 4)
target = torch.tensor([[0., 1., 2., 3.]])
# 'sum' adds up the element-wise absolute errors
criterion = nn.L1Loss(reduction='sum')
loss = criterion(sample, target)
print(loss)

Output: tensor(4.)
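Again this matches a manual sum of the absolute errors:

# manual check: sum of the absolute errors
print(abs(1-0) + abs(1-1) + abs(1-2) + abs(1-3))   # 4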

reduction = 'none'

With reduction='none' (the string, not Python's None), the per-element losses are returned directly with no reduction applied, so the output has the same shape as the input.

import torch
import torch.nn as nn
# same input and target as above
sample = torch.ones(1, 4)
target = torch.tensor([[0., 1., 2., 3.]])
# 'none' returns the element-wise absolute errors unreduced
criterion = nn.L1Loss(reduction='none')
loss = criterion(sample, target)
print(loss)

Output: tensor([[1., 0., 1., 2.]])
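Since 'none' returns the unreduced per-element losses, the other two modes can be recovered from it. A minimal sketch, reusing the loss tensor from the example above:

# recover the reduced values from the unreduced per-element losses
print(loss.mean())   # tensor(1.), matches reduction='mean'
print(loss.sum())    # tensor(4.), matches reduction='sum'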

