RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

import torch

x = torch.ones(1)  # requires_grad defaults to False
print(x.requires_grad)  # False
y = torch.ones(1)  # y's requires_grad flag is also False
z = x + y  # z's requires_grad is also False, so trying to backpropagate through z raises an error

z.backward()

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

w = torch.zeros(1, requires_grad=True)  # create a Tensor with requires_grad=True
print(w.requires_grad)  # True

total = w + z

print(total.requires_grad)  # True: total has w as an input, and w requires grad

total.backward()

print(w.grad)  # tensor([1.])
print(z.grad == x.grad == y.grad == None)  # True: tensors that never required grad have no .grad
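If the tensor you need gradients for already exists with `requires_grad=False`, you do not have to recreate it: the in-place `requires_grad_()` method toggles the flag on a leaf tensor. The sketch below (variable names are illustrative, not from the original post) shows the same `z = x + y` example fixed this way:

```python
import torch

x = torch.ones(1).requires_grad_(True)  # enable grad tracking in place
y = torch.ones(1)                       # still requires_grad=False

z = x + y      # z now has a grad_fn, because at least one input requires grad
z.backward()   # no RuntimeError this time

print(x.grad)  # tensor([1.]) -- dz/dx = 1
print(y.grad)  # None -- y never required grad, so no gradient is stored
```

Only one input needs `requires_grad=True` for the output to become part of the autograd graph; gradients are then accumulated only into the leaves that asked for them.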
