PyTorch NaN
How to check whether the loss has become NaN:
if torch.isnan(tmploss):
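For context, a minimal sketch of where such a check might sit in a training loop; model, loss_fn, optimizer and loader are placeholders, and skipping the offending batch is only one possible reaction:

import torch

def train_one_epoch(model, loss_fn, optimizer, loader):
    for inputs, targets in loader:
        optimizer.zero_grad()
        tmploss = loss_fn(model(inputs), targets)
        # a single NaN loss would poison every weight once backward() runs
        if torch.isnan(tmploss):
            print("NaN loss, skipping this batch")
            continue
        tmploss.backward()
        optimizer.step()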
Fix (guard against computing the loss on an empty selection, see the examples below):
loss_t_conf = 0
if target[target == 1].size() > torch.Size([0]):
    # only compute the BCE term when the mask actually selects something
    loss_t_conf = self.bce_loss(out[target == 1], target[target == 1])
3. For regression problems, a division by zero may have occurred; adding a small epsilon term to the denominator often fixes it.
4. The data itself may contain NaN; check both input and target with numpy.any(numpy.isnan(x)).
5. The target must be something the loss function can actually compute, e.g. with a sigmoid output and BCELoss the targets should lie in [0, 1]; check the dataset for this as well. (A short sketch of checks 3 to 5 follows below.)
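A rough sketch of checks 3 to 5; the tensors, eps and the array x are illustrative values, not names from the original post:

import numpy as np
import torch

# 3. Regression: add a small epsilon to denominators so 0/0 cannot occur
numerator = torch.tensor([1.0, 0.0])
denominator = torch.tensor([0.0, 2.0])   # contains a zero
eps = 1e-8
ratio = numerator / (denominator + eps)  # finite instead of nan/inf

# 4. Check the raw data for NaN before training
x = np.array([[0.1, 0.2], [np.nan, 0.4]])
if np.any(np.isnan(x)):
    print("input contains NaN")

# 5. Check that the targets are in the range the loss expects,
#    e.g. BCELoss needs values in [0, 1]
target = torch.tensor([0.0, 1.0, 1.0])
assert ((target >= 0) & (target <= 1)).all()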
This produces nan (BCELoss with the default mean reduction averages over zero elements, i.e. 0/0):
import torch

a = torch.FloatTensor([])        # empty prediction tensor
b = torch.FloatTensor([])        # empty target tensor
loss_fn = torch.nn.BCELoss()     # default reduction is 'mean'
x = loss_fn(a, b).item()
print(x)                         # nan
This also produces nan: the mask matches nothing, so the loss is again computed over empty tensors.
import torch

input = torch.Tensor([0, 0.5, 1, 1])
target = torch.Tensor([0, 0, 1, 1])
a = input[input == 2]            # empty: no element equals 2
b = target[target == 2]          # empty as well
loss_fn = torch.nn.BCELoss()
x = loss_fn(a, b)
print(x)                         # tensor(nan)
# The correct way to test for this: input[input == 2].size() == torch.Size([0])
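Putting the check and the fix together, a self-contained sketch; the original snippet uses self.bce_loss and out from inside a model class, so plain stand-ins are used here:

import torch

bce_loss = torch.nn.BCELoss()
out = torch.tensor([0.1, 0.7, 0.9, 0.8])
target = torch.tensor([0.0, 0.0, 1.0, 1.0])

mask = target == 1
loss_t_conf = torch.tensor(0.0)
if mask.any():            # i.e. target[mask].size() != torch.Size([0])
    loss_t_conf = bce_loss(out[mask], target[mask])
print(loss_t_conf)        # a finite value, never nan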
This case produces an empty loss, but not nan, because with elementwise reduction there is nothing left to average:
import torch
from torch import nn

# size_average/reduce are the legacy arguments; in recent PyTorch this is nn.BCELoss(reduction='none')
loss = nn.BCELoss(size_average=False, reduce=False)
input = torch.Tensor([0, 0.5, 1, 1])
target = torch.Tensor([0, 0, 1, 1])
output = loss(input[target == 2], target[target == 2])
print(output)                    # prints tensor([]), i.e. an empty tensor, not nan
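For comparison, a small sketch with the modern reduction argument, showing how each reduction behaves on an empty selection ('mean' is the default and is what produced the nan above):

import torch

empty_in = torch.tensor([])
empty_tgt = torch.tensor([])
for reduction in ('mean', 'sum', 'none'):
    loss_fn = torch.nn.BCELoss(reduction=reduction)
    print(reduction, loss_fn(empty_in, empty_tgt))
# mean -> tensor(nan), sum -> tensor(0.), none -> tensor([])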
Python's nan, NaN, and NAN
Taking the mean of an empty list with NumPy also produces nan:
import numpy as np

a = []
print(np.isnan(np.mean(a)))      # True (NumPy also emits a RuntimeWarning)
nan comes from numpy.nan and literally means Not a Number. Different code spells it nan, NaN, or NAN, but they are all the same object (note that the NaN and NAN aliases were removed in NumPy 2.0; only np.nan remains):
In [1]: import numpy as np
In [2]: np.nan is np.NaN is np.NAN
Out[2]: True
Checking for nan
None equals None, but nan does not equal nan:
In [8]: None == None
Out[8]: True
In [4]: np.nan == np.nan
Out[4]: False
To test for nan, use np.isnan:
In [3]: np.isnan(np.nan)
Out[3]: True
nan may bring None to mind, but printing its type shows that nan is actually a float:
In [5]: type(np.nan)
Out[5]: float
In [6]: type(None)
Out[6]: NoneType
Summation with np.nansum simply ignores the nan values:
In [7]: np.nansum([11,np.nan,123])
Out[7]: 134.0
Counting the nan values in an array (exploiting the fact that nan != nan is True; np.count_nonzero(np.isnan(a)) gives the same result):
In [9]: a = np.array([1,2,3,4,np.nan,5,np.nan,np.nan])
In [10]: a
Out[10]: array([ 1., 2., 3., 4., nan, 5., nan, nan])
In [11]: np.count_nonzero(a != a)
Out[11]: 3
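Back in PyTorch, the same ideas apply: torch.isnan gives an elementwise mask, and torch.nan_to_num (available since PyTorch 1.8) can replace NaN values when the goal is just to clean a tensor. A brief sketch:

import torch

t = torch.tensor([1.0, float('nan'), 3.0, float('nan')])
print(torch.isnan(t).sum().item())     # 2
print(torch.nan_to_num(t, nan=0.0))    # tensor([1., 0., 3., 0.])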
Reposted from: https://blog.csdn.net/jacke121/article/details/93085582