PyTorch: Understanding `backward(gradient)` and Gradient Accumulation via `tensor.grad`
1. Create an independent-variable tensor `x`:

```python
import torch
x = torch.ones(1, requires_grad=True)
print(x)
```

Output: `tensor([1.], requires_grad=True)`

2. Write a forward pass:

```python
import torch
x = torch.ones(1, requires_grad=True)
y = x ** 2
z = x ** 3
```

3. Call `backward` on both `y` and `z` (the original code for this step is cut off here):
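The original snippet for step 3 is truncated, but the steps above make the intent clear: run `backward` on both `y` and `z` and inspect `x.grad` after each call. A minimal sketch of that experiment, showing that gradients accumulate into `x.grad` rather than being overwritten:

```python
import torch

x = torch.ones(1, requires_grad=True)
y = x ** 2   # dy/dx = 2x, which is 2 at x = 1
z = x ** 3   # dz/dx = 3x^2, which is 3 at x = 1

y.backward()
print(x.grad)  # tensor([2.])

z.backward()
print(x.grad)  # tensor([5.]) -- 2 + 3: the new gradient is ADDED to x.grad
```

This is why training loops call `optimizer.zero_grad()` (or `x.grad.zero_()`) before each backward pass: without it, gradients from successive backward calls pile up in `.grad`.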
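The title also mentions the argument of `backward()`. When the output tensor has more than one element, PyTorch cannot implicitly create the initial gradient, so you must pass a `gradient` tensor of the same shape; it acts as a per-element weight on the output's gradients. A hedged sketch (the weight values `[1.0, 0.5]` are illustrative, not from the original):

```python
import torch

x = torch.ones(2, requires_grad=True)
y = x ** 2                             # non-scalar output, shape (2,)

# y.backward() alone would raise: grad can be implicitly
# created only for scalar outputs. Pass an explicit weight tensor:
y.backward(torch.tensor([1.0, 0.5]))

print(x.grad)  # tensor([2., 1.]) -- dy/dx = 2x = [2, 2], weighted elementwise
```

Passing `torch.ones_like(y)` recovers the unweighted gradients, which is what `backward()` does implicitly for single-element tensors.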