Semantic Segmentation Loss Functions

The following repository contains several segmentation loss functions:

GitHub - MichaelFan01/STDC-Seg: Source Code of our CVPR2021 paper "Rethinking BiSeNet For Real-time Semantic Segmentation" (https://github.com/MichaelFan01/STDC-Seg)

Loss function code:

#!/usr/bin/python
# -*- encoding: utf-8 -*-


import torch
import torch.nn as nn
import torch.nn.functional as F
from loss.util import enet_weighing
import numpy as np


class OhemCELoss(nn.Module):
    def __init__(self, thresh, n_min, ignore_lb=255, *args, **kwargs):
        super(OhemCELoss, self).__init__()
        # Pixels whose per-pixel CE loss exceeds -log(thresh) count as "hard" examples
        self.thresh = -torch.log(torch.tensor(thresh, dtype=torch.float)).cuda()
        # At least n_min pixels are always kept, even if few exceed the threshold
        self.n_min = n_min
        self.ignore_lb = ignore_lb
        # reduction='none' keeps the per-pixel losses so they can be sorted in forward()
        self.criteria = nn.CrossEntropyLoss(ignore_index=ignore_lb, reduction='none')

    def forward(self, logits, labels):
        # Per-pixel cross-entropy, flattened and sorted from hardest to easiest
        loss = self.criteria(logits, labels).view(-1)
        loss, _ = torch.sort(loss, descending=True)
        # Keep every pixel above the threshold, but never fewer than n_min pixels
        if loss[self.n_min] > self.thresh:
            loss = loss[loss > self.thresh]
        else:
            loss = loss[:self.n_min]
        return torch.mean(loss)
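For reference, here is a minimal usage sketch of the OHEM cross-entropy loss above. The class count, batch size, crop size, thresh, and n_min values are illustrative assumptions, not taken from the paper; n_min is commonly chosen as a fraction of the pixels in a batch so that at least that many of the hardest pixels always contribute to the gradient. Since the class moves its threshold to the GPU in __init__, this sketch assumes a CUDA device is available.

import torch

# Illustrative sizes: 19 classes (e.g. Cityscapes), batch of 2, 64x64 crops
n_classes, batch, h, w = 19, 2, 64, 64
# Keep at least 1/16 of the pixels in the batch as hard examples (assumed setting)
criterion = OhemCELoss(thresh=0.7, n_min=batch * h * w // 16)

# Random stand-ins for network outputs and ground-truth label maps
logits = torch.randn(batch, n_classes, h, w, device='cuda', requires_grad=True)
labels = torch.randint(0, n_classes, (batch, h, w), device='cuda')

loss = criterion(logits, labels)   # scalar mean over the selected hard pixels
loss.backward()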
