How SoftMarginLoss Works

1. Triplet loss (TripleLoss)

import torch.nn as nn
from torch.autograd import Variable


class TripleLoss(object):
    """Modified from Tong Xiao's open-reid (https://github.com/Cysu/open-reid).
    Related Triplet Loss theory can be found in the paper 'In Defense of the
    Triplet Loss for Person Re-Identification'."""

    def __init__(self, margin=None):
        self.margin = margin
        if margin is not None:
            # With a margin: hinge-style ranking loss.
            self.ranking_loss = nn.MarginRankingLoss(margin=margin)
        else:
            # Without a margin: smooth soft-margin formulation.
            self.ranking_loss = nn.SoftMarginLoss()

    def __call__(self, dist_ap, dist_an):
        """
        Args:
          dist_ap: pytorch Variable, distance between anchor and positive sample,
            shape [N]
          dist_an: pytorch Variable, distance between anchor and negative sample,
            shape [N]
        Returns:
          loss: pytorch Variable, with shape [1]
        """
        # Target y = 1 encodes that dist_an should be ranked larger than dist_ap.
        y = Variable(dist_an.data.new().resize_as_(dist_an.data).fill_(1))
        if self.margin is not None:
            loss = self.ranking_loss(dist_an, dist_ap, y)
        else:
            loss = self.ranking_loss(dist_an - dist_ap, y)
        return loss
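
A minimal usage sketch of the class above (the distance values are made up for illustration): passing a margin selects the hinge-style nn.MarginRankingLoss, while margin=None falls back to the smooth nn.SoftMarginLoss.

import torch

# Hypothetical anchor-positive and anchor-negative distances, shape [N].
dist_ap = torch.tensor([0.8, 1.2, 0.5, 0.9])
dist_an = torch.tensor([1.5, 1.0, 2.0, 0.7])

hard_loss = TripleLoss(margin=0.3)   # hinge: max(0, margin - (an - ap))
soft_loss = TripleLoss(margin=None)  # smooth: log(1 + exp(-(an - ap)))

print(hard_loss(dist_ap, dist_an))
print(soft_loss(dist_ap, dist_an))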

2. nn.SoftMarginLoss(input, target)

# nn.SoftMarginLoss math:
# \text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{\text{x.nelement}()}

# input = dist_ap - dist_an (or dist_an - dist_ap)
# target = -1 or +1, chosen with the same orientation as the input
# Whether the input is ap - an or an - ap, the target points in the matching
# direction (-1 or +1), so the resulting loss is the same either way.

\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i] * x[i]))}{\text{x.nelement}()}
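
A quick check of this formula against nn.SoftMarginLoss (values are made up; the default reduction averages over x.nelement(), matching the division above). It also confirms the sign remark: flipping both the input and the target leaves the loss unchanged.

import torch
import torch.nn as nn

dist_ap = torch.tensor([0.8, 1.2])
dist_an = torch.tensor([1.5, 1.0])

criterion = nn.SoftMarginLoss()

x = dist_an - dist_ap
y = torch.ones_like(x)

# Manual evaluation of the formula above.
manual = torch.log(1 + torch.exp(-y * x)).sum() / x.nelement()
print(torch.allclose(criterion(x, y), manual))  # True

# ap - an with target -1 gives the same loss as an - ap with target +1.
print(torch.allclose(criterion(dist_ap - dist_an, -y), criterion(x, y)))  # True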


3. nn.MarginRankingLoss

# nn.MarginRankingLoss math:
# \text{loss}(x_1, x_2, y) = \max(0, -y * (x_1 - x_2) + \text{margin})

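The same kind of check for nn.MarginRankingLoss (again with made-up values; the default reduction is the mean over the batch):

import torch
import torch.nn as nn

x1 = torch.tensor([1.5, 1.0])  # plays the role of dist_an
x2 = torch.tensor([0.8, 1.2])  # plays the role of dist_ap
y = torch.ones_like(x1)        # y = 1: x1 should be ranked above x2

margin = 0.3
criterion = nn.MarginRankingLoss(margin=margin)

# Manual evaluation of max(0, -y * (x1 - x2) + margin), then mean reduction.
manual = torch.clamp(-y * (x1 - x2) + margin, min=0).mean()
print(torch.allclose(criterion(x1, x2, y), manual))  # True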
