Usage examples of the .grad_fn attribute and .is_leaf on PyTorch tensors

Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x000001BA5FCFD330>
>>>
>>> a = torch.randn(3,5)
>>> a
tensor([[ 0.2824, -0.3715,  0.9088, -1.7601, -0.1806],
        [ 2.0937,  1.0406, -1.7651,  1.1216,  0.8440],
        [ 0.1783,  0.6859, -1.5942, -0.2006, -0.4050]])
>>> b = a + 1
>>> b
tensor([[ 1.2824,  0.6285,  1.9088, -0.7601,  0.8194],
        [ 3.0937,  2.0406, -0.7651,  2.1216,  1.8440],
        [ 1.1783,  1.6859, -0.5942,  0.7994,  0.5950]])
>>> print(b)
tensor([[ 1.2824,  0.6285,  1.9088, -0.7601,  0.8194],
        [ 3.0937,  2.0406, -0.7651,  2.1216,  1.8440],
        [ 1.1783,  1.6859, -0.5942,  0.7994,  0.5950]])
>>>
>>> a = torch.randn(3,5,requires_grad=True)
>>> a
tensor([[-0.5556,  0.9571,  0.7435, -0.2974, -2.2825],
        [-0.6627, -1.1902, -0.1748,  1.2125,  0.6630],
        [-0.5813, -0.1549, -0.4551, -0.4570,  0.4547]], requires_grad=True)
>>> b = a + 1
>>> b
tensor([[ 0.4444,  1.9571,  1.7435,  0.7026, -1.2825],
        [ 0.3373, -0.1902,  0.8252,  2.2125,  1.6630],
        [ 0.4187,  0.8451,  0.5449,  0.5430,  1.4547]], grad_fn=<AddBackward0>)
>>> c = a * 2
>>> c
tensor([[-1.1113,  1.9141,  1.4871, -0.5948, -4.5650],
        [-1.3254, -2.3805, -0.3497,  2.4250,  1.3261],
        [-1.1625, -0.3099, -0.9102, -0.9141,  0.9093]], grad_fn=<MulBackward0>)
>>> d = b / 2
>>> d
tensor([[ 0.2222,  0.9785,  0.8718,  0.3513, -0.6412],
        [ 0.1686, -0.0951,  0.4126,  1.1063,  0.8315],
        [ 0.2094,  0.4225,  0.2725,  0.2715,  0.7273]], grad_fn=<DivBackward0>)
>>>
>>>
>>> print(d)
tensor([[ 0.2222,  0.9785,  0.8718,  0.3513, -0.6412],
        [ 0.1686, -0.0951,  0.4126,  1.1063,  0.8315],
        [ 0.2094,  0.4225,  0.2725,  0.2715,  0.7273]], grad_fn=<DivBackward0>)
>>>
>>>
>>>
>>> a.is_leaf
True
>>> b.is_leaf
False
>>> c.is_leaf
False
>>> d.is_leaf
False
>>>
>>>
>>>
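What the session above shows: a tensor you create directly is a leaf (a.is_leaf is True) and has no grad_fn; a tensor produced by an operation on a requires_grad=True tensor is not a leaf and carries the recording node of that operation in grad_fn (AddBackward0, MulBackward0, DivBackward0); when requires_grad is False, autograd builds no graph at all, so the repr shows no grad_fn. The following is a minimal standalone sketch (not part of the original session) that prints these attributes explicitly instead of reading them off the tensor repr:

import torch

torch.manual_seed(20200910)

# A tensor created directly by the user is a leaf and has no grad_fn.
a = torch.randn(3, 5, requires_grad=True)
print(a.is_leaf)   # True
print(a.grad_fn)   # None

# A tensor produced by an operation on a requires_grad tensor is not a leaf;
# autograd records the producing operation in grad_fn.
b = a + 1
print(b.is_leaf)   # False
print(b.grad_fn)   # <AddBackward0 object at 0x...>

# Without requires_grad, no graph is recorded: the result is still a leaf
# by convention, and grad_fn stays None.
x = torch.randn(3, 5)            # requires_grad defaults to False
y = x + 1
print(y.is_leaf, y.grad_fn)      # True None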
