Regression Loss Notes





MSE (mean squared error, i.e., the L2 loss). The snippet below accumulates an MSE term between two feature maps f1 and f2, as is typically done for a content loss:

content_loss += torch.mean((f1 - f2) ** 2)
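For reference, the standard definition over $n$ samples is

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2$$

and with the default reduction='mean', nn.MSELoss()(f1, f2) computes the same quantity as the torch.mean expression above.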


The Huber loss was proposed to address two problems at once: the L1 loss is not differentiable at zero, and the L2 loss is sensitive to outliers. It combines the advantages of both.

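The standard definition, with residual $a = y - \hat{y}$ and threshold $\delta$:

$$L_\delta(a) = \begin{cases} \frac{1}{2}a^2 & \text{if } |a| \le \delta \\ \delta\left(|a| - \frac{1}{2}\delta\right) & \text{otherwise} \end{cases}$$

So it is quadratic (L2-like) near zero and linear (L1-like) for large errors. Recent PyTorch versions provide this as nn.HuberLoss; nn.SmoothL1Loss coincides with it when delta = beta = 1:

loss_fn = nn.HuberLoss(delta=1.0)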


BCE loss is binary cross-entropy, i.e., the two-class case of cross-entropy. It is commonly used to train foreground/background and semantic segmentation networks:
loss_fn = nn.BCELoss()
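A minimal usage sketch (the tensor shapes are made up for illustration; BCELoss expects probabilities, so the raw outputs go through a sigmoid first):

import torch
import torch.nn as nn

loss_fn = nn.BCELoss()
logits = torch.randn(4, 1, 8, 8)                     # raw network outputs
target = torch.randint(0, 2, (4, 1, 8, 8)).float()   # binary ground-truth mask
loss = loss_fn(torch.sigmoid(logits), target)
# nn.BCEWithLogitsLoss fuses the sigmoid and the BCE in one numerically stabler step:
# loss = nn.BCEWithLogitsLoss()(logits, target)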

Perceptual loss treats a pretrained feature extractor as a stand-in for the human eye: the features the pretrained model extracts from the generated image should be as close as possible to those it extracts from the real image. In image generation, a perceptual loss tends to make the generated images look more realistic.
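A minimal sketch of this idea, assuming a frozen VGG16 backbone from torchvision (the cut-off at layer index 16 and the pretrained-weights argument are illustrative choices, not a fixed recipe):

import torch
import torch.nn as nn
from torchvision.models import vgg16

class PerceptualLoss(nn.Module):
    def __init__(self, cut=16):                      # hypothetical cut-off layer
        super().__init__()
        extractor = vgg16(pretrained=True).features[:cut]
        for p in extractor.parameters():
            p.requires_grad = False                  # the "eye" stays frozen
        self.extractor = extractor.eval()

    def forward(self, generated, real):
        f1 = self.extractor(generated)
        f2 = self.extractor(real)
        return torch.mean((f1 - f2) ** 2)            # the same MSE-on-features term as above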


Cosine similarity is another common way to compare two feature vectors. A helper that compares one vector against a batch of vectors:

import numpy as np


def cosine_similarities(vector_1, vectors_all):
    """Compute cosine similarities between one vector and a set of other vectors.

    Parameters
    ----------
    vector_1 : numpy.ndarray
        Vector from which similarities are to be computed, expected shape (dim,).
    vectors_all : numpy.ndarray
        For each row in vectors_all, distance from vector_1 is computed, expected shape (num_vectors, dim).

    Returns
    -------
    numpy.ndarray
        Contains cosine similarity between `vector_1` and each row in `vectors_all`, shape (num_vectors,).

    """
    norm = np.linalg.norm(vector_1)
    all_norms = np.linalg.norm(vectors_all, axis=1)
    dot_products = np.dot(vectors_all, vector_1)
    similarities = dot_products / (norm * all_norms)
    return similarities
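
A quick sanity check with hand-picked vectors (values chosen so the expected result is obvious):

v = np.array([1.0, 0.0])
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, 0.0]])
print(cosine_similarities(v, M))   # [ 1.  0. -1.]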
