Common Feature Similarity and Distance Metrics in Machine Learning [PyTorch Implementation]

Computing cosine similarity, feature distances (norms), and KL divergence.

import torch

feat1 = torch.randn((3, 4))   # (batch, feature)
feat2 = torch.randn((3, 4))

# ========== Cosine similarity ==========
a_norm = torch.linalg.norm(feat1, dim=1)                        # per-row L2 norm, shape (3,)
b_norm = torch.linalg.norm(feat2, dim=1)
cos = ((feat1 * feat2).sum(dim=-1) / (a_norm * b_norm)).mean()  # per-row cosine, then averaged
cos = 0.5 * cos + 0.5   # map [-1, 1] --> [0, 1]
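# Quick sanity check: the same averaged, rescaled cosine via the built-in cosine_similarity
cos_ref = 0.5 * torch.nn.functional.cosine_similarity(feat1, feat2, dim=1).mean() + 0.5
# torch.allclose(cos, cos_ref) -> True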

# ========== Norms (per-row vector norms) ==========
l2 = torch.linalg.norm(feat1 - feat2, dim=-1).mean()                           # L2 (Euclidean) distance per row
l1 = torch.linalg.norm(feat1 - feat2, dim=-1, ord=1).mean()                    # L1 (Manhattan) distance per row
linf_max = torch.linalg.norm(feat1 - feat2, dim=-1, ord=float('inf')).mean()   # max |difference| per row
linf_min = torch.linalg.norm(feat1 - feat2, dim=-1, ord=-float('inf')).mean()  # min |difference| per row
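# Quick sanity check: these per-row norms match explicit elementwise computations
assert torch.allclose(l1, (feat1 - feat2).abs().sum(dim=-1).mean())
assert torch.allclose(linf_max, (feat1 - feat2).abs().amax(dim=-1).mean())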

# ========== KL divergence ==========
# kl_div expects the first argument as log-probabilities and the target as probabilities;
# 'batchmean' divides by the batch size, which matches the mathematical definition of KL.
kl = torch.nn.functional.kl_div(feat1.softmax(dim=-1).log(), feat2.softmax(dim=-1), reduction='batchmean')
kl2 = torch.nn.functional.kl_div(feat1.softmax(dim=-1).log(), feat2.softmax(dim=-1), reduction='sum')
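Note that kl_div computes KL(target || prediction), with the prediction passed in log space as the first argument. A minimal sketch of the equivalent elementwise computation, reusing feat1 and feat2 from the snippet above:

p = feat1.softmax(dim=-1)   # "prediction" distribution (first argument, in log space)
q = feat2.softmax(dim=-1)   # "target" distribution

# KL(q || p) per row, then averaged over the batch -- matches reduction='batchmean'
kl_manual = (q * (q.log() - p.log())).sum(dim=-1).mean()
# torch.allclose(kl, kl_manual) -> True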

For a 3-D tensor of shape batch * channel * feature, the same metrics can be computed per sample:

import torch

feat1 = torch.randn((2, 3, 4))   # (batch, channel, feature)
feat2 = torch.randn((2, 3, 4))

# ========== Cosine similarity ==========
a_norm = torch.linalg.norm(feat1, dim=(-2, -1))                       # Frobenius norm per sample, shape (2,)
b_norm = torch.linalg.norm(feat2, dim=(-2, -1))
cos = ((feat1 * feat2).sum(dim=(-2, -1)) / (a_norm * b_norm)).mean()  # cosine of the flattened matrices, then averaged
cos = 0.5 * cos + 0.5   # map [-1, 1] --> [0, 1]

# ========== Matrix norms ==========
lfro = torch.linalg.norm(feat1 - feat2, dim=(-2, -1)).mean()                          # Frobenius norm per sample
l1 = torch.linalg.norm(feat1 - feat2, dim=(-2, -1), ord=1).mean()                     # matrix 1-norm: max absolute column sum
linf_max = torch.linalg.norm(feat1 - feat2, dim=(-2, -1), ord=float('inf')).mean()    # matrix inf-norm: max absolute row sum
linf_min = torch.linalg.norm(feat1 - feat2, dim=(-2, -1), ord=-float('inf')).mean()   # min absolute row sum

# ========== KL divergence ==========
# softmax over the last dim gives one distribution per (batch, channel) row;
# note that 'batchmean' still divides only by the batch size (here 2).
kl = torch.nn.functional.kl_div(feat1.softmax(dim=-1).log(), feat2.softmax(dim=-1), reduction='batchmean')
kl2 = torch.nn.functional.kl_div(feat1.softmax(dim=-1).log(), feat2.softmax(dim=-1), reduction='sum')
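The per-sample cosine and Frobenius distance above are just the vector cosine and L2 distance of the flattened channel * feature matrices. A small cross-check sketch, reusing feat1 and feat2 from the snippet above:

flat1 = feat1.flatten(1)   # (batch, channel * feature) = (2, 12)
flat2 = feat2.flatten(1)

cos_flat = 0.5 * torch.nn.functional.cosine_similarity(flat1, flat2, dim=1).mean() + 0.5
lfro_flat = torch.linalg.norm(flat1 - flat2, dim=-1).mean()
# torch.allclose(cos, cos_flat) -> True
# torch.allclose(lfro, lfro_flat) -> True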

