torch.nn.NLLLoss vs torch.nn.CrossEntropyLoss

Both are commonly used in deep learning to compute the loss between predicted values and ground-truth labels.

torch.nn.CrossEntropyLoss is equivalent to softmax + log + NLLLoss (i.e. log-softmax followed by NLLLoss).
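This equivalence can be checked directly on arbitrary logits; a minimal sketch (the tensor shapes and seed here are illustrative, not from the original post):

```python
import torch
from torch import nn

torch.manual_seed(0)
logits = torch.randn(4, 5)            # a batch of 4 samples, 5 classes
target = torch.tensor([0, 2, 4, 1])   # one class index per sample

# CrossEntropyLoss consumes raw logits directly...
ce = nn.CrossEntropyLoss()(logits, target)

# ...while NLLLoss expects log-probabilities, so apply log-softmax first
nll = nn.NLLLoss()(torch.log_softmax(logits, dim=-1), target)

print(torch.allclose(ce, nll))  # True
```

The two losses agree to floating-point precision for any logits and labels.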

torch.nn.NLLLoss example

from torch import nn
import torch

nllloss = nn.NLLLoss()
predict = torch.Tensor([[2, 3, 1],
                        [6, 9, 8]])
predict = torch.log(torch.softmax(predict, dim=-1))
label = torch.tensor([1, 2])
nllloss(predict, label)

softmax:              tensor([[0.2447, 0.6652, 0.0900],     # each row sums to 1
                              [0.0351, 0.7054, 0.2595]])
softmax+log:          tensor([[-1.4076, -0.4076, -2.4076],   # element-wise natural log (ln)
                              [-3.3490, -0.3490, -1.3490]])
softmax+log+nllloss:  0.8783090710639954                     # per the labels: (0.4076 + 1.3490) / 2
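What NLLLoss itself does is simple: pick each sample's log-probability at its label index, negate it, and average over the batch. A minimal sketch reproducing the 0.8783 above (reusing the tensors from the snippet):

```python
import torch

predict = torch.Tensor([[2, 3, 1],
                        [6, 9, 8]])
label = torch.tensor([1, 2])

log_probs = torch.log(torch.softmax(predict, dim=-1))
# Index each row at its label, negate, then take the batch mean
manual = -log_probs[torch.arange(len(label)), label].mean()
print(manual.item())  # ≈ 0.8783
```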

torch.nn.CrossEntropyLoss

from torch import nn
import torch

cross_loss = nn.CrossEntropyLoss()
predict = torch.Tensor([[2, 3, 1],
                        [6, 9, 8]])
label = torch.tensor([1, 2])
cross_loss(predict, label)
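Run on the same inputs, CrossEntropyLoss returns the same 0.8783 as the NLLLoss pipeline above, even though here the raw logits are passed in without any manual softmax or log. A quick side-by-side check:

```python
import torch
from torch import nn

predict = torch.Tensor([[2, 3, 1],
                        [6, 9, 8]])
label = torch.tensor([1, 2])

ce = nn.CrossEntropyLoss()(predict, label)                            # raw logits in
nll = nn.NLLLoss()(torch.log(torch.softmax(predict, dim=-1)), label)  # log-probs in
print(ce.item())  # ≈ 0.8783, identical to nll
```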

