CrossEntropyLoss in PyTorch

in mathematics

  • softmax function (normalized exponential function): turns a vector of raw scores into a probability distribution

  • cross entropy: measures how far a predicted distribution is from the target distribution; see the formulas after this list
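
For reference, these are the standard textbook definitions, writing z for a vector of K raw scores (logits), p for the target distribution, and q for the predicted one:

    softmax: \sigma(z)_i = e^{z_i} / \sum_{j=1}^{K} e^{z_j}

    cross entropy: H(p, q) = -\sum_{i} p_i \log q_i

With a one-hot target for class c, H(p, q) reduces to -\log q_c, the negative log-probability of the correct class.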

in PyTorch

  • NLLLoss: the negative log likelihood loss; it expects log-probabilities as input

  • CrossEntropyLoss

    docs: "This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class."

import torch
import torch.nn.functional as F

# raw, unnormalized scores (logits) for a batch of 3 samples and 5 classes
output = torch.randn(3, 5, requires_grad=True)
# ground-truth class index for each sample
target = torch.tensor([1, 0, 4])

# F.cross_entropy applies log_softmax internally, so both lines compute the same loss
y1 = F.cross_entropy(output, target)
y2 = F.nll_loss(F.log_softmax(output, dim=1), target)

# y1 == y2
print(y1)
print(y2)
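
The same equivalence holds for the module-class API that the docs quote refers to. Below is a minimal sketch, reusing the output and target tensors from above and adding a manual computation of the same loss (y3, y4, y5 are names introduced here for illustration):

import torch.nn as nn

# module-class form of the equivalence shown above
ce_loss = nn.CrossEntropyLoss()      # combines LogSoftmax and NLLLoss
nll_loss = nn.NLLLoss()
log_softmax = nn.LogSoftmax(dim=1)

y3 = ce_loss(output, target)
y4 = nll_loss(log_softmax(output), target)

# manual computation: mean of -log(softmax probability of the true class)
probs = torch.softmax(output, dim=1)
y5 = -torch.log(probs[torch.arange(len(target)), target]).mean()

# y3 == y4 == y5, and all match y1 and y2 above (up to floating point error)
print(y3)
print(y4)
print(y5)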
