Differences between BCELoss and MSELoss (PyTorch BCELoss and MSELoss)

BCELoss

torch.nn.BCELoss(weight=None, size_average=None, reduce=None, reduction='mean')
Measures the binary cross entropy between the predicted probabilities and the targets.

$l(x,y) = L = \{l_1, l_2, \dots, l_N\}^T,\quad l_n = -\omega_n\left[y_n \log x_n + (1-y_n)\log(1-x_n)\right]$

where $N$ is the batch size. The per-element losses $L$ are then reduced:

$l(x,y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction = 'mean'} \\ \operatorname{sum}(L), & \text{if reduction = 'sum'} \end{cases}$
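A minimal sketch of the formula above: BCELoss expects inputs already in $(0, 1)$, so a sigmoid is applied to raw logits first, and the result can be checked against the per-element formula $l_n = -[y_n \log x_n + (1-y_n)\log(1-x_n)]$ (the values below are made-up example data):

```python
import torch
import torch.nn as nn

# BCELoss expects probabilities in (0, 1), so squash logits with sigmoid first.
loss_fn = nn.BCELoss(reduction='mean')

logits = torch.tensor([0.8, -1.2, 2.5])       # example raw model outputs
probs = torch.sigmoid(logits)                  # x_n in (0, 1)
targets = torch.tensor([1.0, 0.0, 1.0])        # y_n in {0, 1}

loss = loss_fn(probs, targets)

# Manual check: mean of -[y*log(x) + (1-y)*log(1-x)] over the batch
manual = -(targets * torch.log(probs)
           + (1 - targets) * torch.log(1 - probs)).mean()
assert torch.allclose(loss, manual)
```

In practice, `nn.BCEWithLogitsLoss` combines the sigmoid and the BCE computation in one numerically stabler step.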

MSELoss

torch.nn.MSELoss(size_average=None, reduce=None, reduction='mean')
Measures the mean squared error between the predictions and the targets.
$l(x,y) = L = \{l_1, l_2, \dots, l_N\}^T,\quad l_n = (x_n - y_n)^2$

$l(x,y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction = 'mean'} \\ \operatorname{sum}(L), & \text{if reduction = 'sum'} \end{cases}$
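The MSE formula can be verified the same way; with `reduction='sum'` the result is simply $\sum_n (x_n - y_n)^2$ (example data below is made up):

```python
import torch
import torch.nn as nn

loss_fn = nn.MSELoss(reduction='sum')

preds = torch.tensor([1.0, 2.0, 3.0])
targets = torch.tensor([1.5, 2.0, 2.0])

loss = loss_fn(preds, targets)
# (1.0-1.5)^2 + (2.0-2.0)^2 + (3.0-2.0)^2 = 0.25 + 0 + 1.0 = 1.25
assert abs(loss.item() - 1.25) < 1e-6
```

Unlike BCELoss, MSELoss places no range restriction on its inputs, which is why it suits regression targets, while BCELoss suits probabilities for binary classification.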
