Summary of Commonly Used Loss Functions in PyTorch

  • torch.nn.MSELoss(): commonly used for regression models
  • torch.nn.CrossEntropyLoss(): commonly used for classification models
  • Smooth L1 loss (see the sketch in section 3 at the end of this post)

1. MSE (Mean Squared Error)


MSE loss = (1/n) * sum_i (x_i - y_i)^2

import torch

loss_func2 = torch.nn.MSELoss()
target = torch.tensor([[112.2396,  82.2253],
                       [115.4184,  76.4030]], dtype=torch.float32)
predict = torch.tensor([[113,  80],
                        [113,  80]], dtype=torch.float32)

# MSELoss averages the squared element-wise differences
loss = loss_func2(predict, target)
# manual check: mean of (predict - target)^2
loss_compare = torch.mul((predict - target), (predict - target)).mean()
print(loss_compare)
print(loss)

Output:
tensor(6.0792)
tensor(6.0792)
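
Note that MSELoss averages over all elements by default; its reduction argument switches this behavior. A minimal sketch reusing the tensors above:

import torch

target = torch.tensor([[112.2396,  82.2253],
                       [115.4184,  76.4030]])
predict = torch.tensor([[113.,  80.],
                        [113.,  80.]])

# reduction='none' keeps the per-element squared errors,
# reduction='sum' adds them, and the default 'mean' averages them
per_elem = torch.nn.MSELoss(reduction='none')(predict, target)
total = torch.nn.MSELoss(reduction='sum')(predict, target)
print(per_elem)  # 2x2 tensor of squared errors
print(total)     # ≈ tensor(24.3172), i.e. 4 * 6.0793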

2. CrossEntropy (Cross-Entropy)


For background on the concept of information entropy, see the article 信息熵,交叉熵及相对熵代码实现过程理解 (understanding information entropy, cross-entropy, and relative entropy through code implementations).
In PyTorch, cross-entropy over raw scores x and a target class index is computed as

CrossEntropyLoss(x, class) = -log( exp(x[class]) / sum_j exp(x[j]) )

i.e. a softmax followed by a negative log-likelihood, averaged over the batch.

# -*- coding: utf-8 -*-
import torch
import torch.nn.functional as F

# predict holds the raw scores output by the network
predict = torch.tensor([[1., 0.],
                        [0., 0.5]])
# label holds the class indices
label = torch.tensor([0, 1])

# Cross-entropy computed with PyTorch's built-in loss
loss_fun1 = torch.nn.CrossEntropyLoss()
loss1 = loss_fun1(predict, label)
print('CrossEntropyLoss() result:', loss1)
######################################
# Manual cross-entropy for a multi-dimensional tensor
### Build the true distribution p(x) from the labels (one-hot)
def one_hot(label, n_class):
    one_hot_label = torch.zeros((label.shape[0], n_class))
    one_hot_label[torch.arange(0, label.shape[0]), label] = 1
    return one_hot_label
p = one_hot(label, 2)

### Build the predicted distribution q(x) from predict
softmax = torch.nn.Softmax(dim=1)
q = softmax(predict)

### Cross-entropy formula: L = -(1/m) * sum(p(x) * log(q(x)))
loss2 = torch.sum(-(p * torch.log(q))) / label.size(0)

### Or use log_softmax() directly
log_softmax = torch.nn.LogSoftmax(dim=1)
q1 = log_softmax(predict)
loss3 = torch.sum(-(p * q1)) / label.size(0)

print('Manual result:', loss2, loss3)

Output:

CrossEntropyLoss() result: tensor(0.3937)
Manual result: tensor(0.3937) tensor(0.3937)
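
As a further cross-check (not part of the original code above): CrossEntropyLoss is equivalent to LogSoftmax followed by NLLLoss, which is exactly what the manual calculation reproduces. A minimal sketch:

import torch
import torch.nn.functional as F

predict = torch.tensor([[1., 0.],
                        [0., 0.5]])
label = torch.tensor([0, 1])

# cross_entropy == nll_loss applied to log-softmax scores
loss_nll = F.nll_loss(F.log_softmax(predict, dim=1), label)
print(loss_nll)  # tensor(0.3937), same value as CrossEntropyLoss()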
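
3. Smooth L1

The list at the top also mentions the Smooth L1 loss, which the two sections above do not demonstrate. Briefly: torch.nn.SmoothL1Loss behaves like MSE for small errors and like L1 for larger ones, which makes it less sensitive to outliers than MSE. With the default settings the element-wise formula is 0.5*(x-y)^2 if |x-y| < 1, else |x-y| - 0.5. A minimal sketch reusing the tensors from the MSE section:

import torch

loss_func3 = torch.nn.SmoothL1Loss()
target = torch.tensor([[112.2396,  82.2253],
                       [115.4184,  76.4030]])
predict = torch.tensor([[113.,  80.],
                        [113.,  80.]])

# every |error| here exceeds 1 except the first, so the loss is mostly L1-like
loss = loss_func3(predict, target)
print(loss)  # ≈ tensor(1.7574), much smaller than the MSE of 6.0792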
