The code in this post comes from: an image-classification codebase implemented in PyTorch
Hands-on example: flower image classification with PyTorch
import torch
import torch.nn as nn


class CrossEntropyLoss(nn.Module):
    """Thin wrapper around nn.CrossEntropyLoss with label smoothing and class weights."""

    def __init__(self, label_smoothing: float = 0.0, weight: torch.Tensor = None):
        super(CrossEntropyLoss, self).__init__()
        self.cross_entropy = nn.CrossEntropyLoss(weight=weight, label_smoothing=label_smoothing)

    def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # input: raw logits of shape (N, C); target: class indices of shape (N,)
        return self.cross_entropy(input, target)
Here label_smoothing is the label-smoothing factor and weight is the per-class weight tensor.
Suppose there are three classes and we want to set the class weights to 0.5, 0.8, and 1.5.
The code is then:
l = CrossEntropyLoss(weight=torch.tensor([0.5, 0.8, 1.5]))
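As a quick usage sketch (the logits and labels below are made up purely for illustration and are not from the original codebase), the wrapper is called like any other PyTorch loss module:

import torch

# Hypothetical batch: 4 samples, 3 classes.
logits = torch.randn(4, 3)              # raw, unnormalized model outputs
labels = torch.tensor([0, 2, 1, 2])     # ground-truth class indices

criterion = CrossEntropyLoss(label_smoothing=0.1, weight=torch.tensor([0.5, 0.8, 1.5]))
loss = criterion(logits, labels)        # scalar tensor
print(loss.item())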
import torch
import torch.nn as nn
import torch.nn.functional as F


class FocalLoss(nn.Module):
    """Multi-class focal loss with optional label smoothing and per-class weights."""

    def __init__(self, label_smoothing: float = 0.0, weight: torch.Tensor = None, gamma: float = 2.0):
        super(FocalLoss, self).__init__()
        self.label_smoothing = label_smoothing
        self.weight = weight
        self.gamma = gamma

    def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # One-hot encode the targets, then apply label smoothing.
        target_onehot = F.one_hot(target, num_classes=input.size(1))
        target_onehot_labelsmoothing = torch.clamp(
            target_onehot.float(),
            min=self.label_smoothing / (input.size(1) - 1),
            max=1.0 - self.label_smoothing,
        )
        # Softmax probabilities, with a small epsilon for numerical stability.
        input_softmax = F.softmax(input, dim=1) + 1e-7
        input_logsoftmax = torch.log(input_softmax)
        # Per-class cross entropy, scaled by the focusing factor (1 - p)^gamma.
        ce = -1 * input_logsoftmax * target_onehot_labelsmoothing
        fl = torch.pow((1 - input_softmax), self.gamma) * ce
        fl = fl.sum(1)
        # Re-weight each sample by the weight of its ground-truth class, if given.
        if self.weight is not None:
            fl = fl * self.weight.to(fl.device)[target.long()]
        return fl.mean()
This focal loss also supports label smoothing.
Here label_smoothing is the label-smoothing factor, and weight is the per-class weight (you can think of it as the alpha in the binary focal loss, since alpha is what balances the classes).
Suppose there are three classes and we want to set the class weights to 0.5, 0.8, and 1.5.
The code is then:
l = FocalLoss(weight=torch.tensor([0.5, 0.8, 1.5]))
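For completeness, here is a minimal sketch of calling it on an invented batch (with gamma=0, no smoothing, and no weights it reduces to plain cross entropy, up to the 1e-7 stability term):

import torch

# Invented batch for illustration: 4 samples, 3 classes.
logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 1])

focal = FocalLoss(label_smoothing=0.1, weight=torch.tensor([0.5, 0.8, 1.5]), gamma=2.0)
loss = focal(logits, labels)
print(loss.item())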
import torch
import torch.nn as nn
import torch.nn.functional as F


class PolyLoss(nn.Module):
    """
    PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions
    """

    def __init__(self, label_smoothing: float = 0.0, weight: torch.Tensor = None, epsilon: float = 2.0):
        super().__init__()
        self.epsilon = epsilon
        self.label_smoothing = label_smoothing
        self.weight = weight

    def forward(self, outputs: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # Standard cross entropy (mean over the batch), with label smoothing and class weights.
        ce = F.cross_entropy(outputs, targets, label_smoothing=self.label_smoothing, weight=self.weight)
        # pt is the predicted probability of the ground-truth class for each sample.
        pt = F.one_hot(targets, outputs.size(1)) * F.softmax(outputs, dim=1)
        # Poly-1 loss: cross entropy plus the leading polynomial term epsilon * (1 - pt).
        return (ce + self.epsilon * (1.0 - pt.sum(dim=1))).mean()
Here label_smoothing is the label-smoothing factor and weight is the per-class weight tensor.
Suppose there are three classes and we want to set the class weights to 0.5, 0.8, and 1.5.
The code is then:
l = PolyLoss(weight=torch.tensor([0.5, 0.8, 1.5]))
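The interface matches the other two losses; here is a minimal sketch on invented data, leaving epsilon at its default of 2.0:

import torch

# Invented batch for illustration: 4 samples, 3 classes.
logits = torch.randn(4, 3)
labels = torch.tensor([2, 0, 1, 2])

poly = PolyLoss(label_smoothing=0.1, weight=torch.tensor([0.5, 0.8, 1.5]))
loss = poly(logits, labels)
print(loss.item())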
Some readers may not know how to decide on the class weights (one common heuristic is sketched at the end of this post); class weights are generally needed in two situations:
And if your coding skills are not quite there yet, don't worry: the codebase linked at the beginning of this post already implements everything described above, along with richer visualizations and extra features, and that link includes a usage example with the full workflow.
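If you are unsure what concrete weight values to use, one common heuristic (just a sketch of a standard approach, not something taken from the codebase above) is to weight each class by the inverse of its training-set frequency, so that a perfectly balanced dataset gives every class a weight of 1:

import torch

# Hypothetical sample counts per class in the training set (numbers are illustrative).
class_counts = torch.tensor([1200.0, 750.0, 400.0])

# Inverse-frequency weights: rare classes get weights > 1, frequent classes < 1.
weights = class_counts.sum() / (len(class_counts) * class_counts)

l = FocalLoss(weight=weights)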