Dropout in PyTorch

nn.Dropout

import torch

a = torch.randn([2, 2, 2])

# tensor([[[ 2.3227,  1.2603],
#         [ 0.2886, -1.2973]],

#        [[-1.3519, -0.9159],
#         [-0.0327,  1.2753]]])

m = torch.nn.Dropout(0.5)
b = m(a)

# tensor([[[ 4.6453,  0.0000],
#          [ 0.5773, -0.0000]],

#         [[-2.7037, -0.0000],
#          [-0.0655,  2.5506]]])

The probability passed in here is the probability that each individual element is zeroed out.

Also, the elements that survive dropout change value because, in training mode, the output is scaled by a factor of 1/(1-p) so that the expected value of each element stays the same. With p = 0.5 the kept values are doubled, which matches the output above.
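
A quick sanity check (a minimal sketch of my own, not from the original post, with an arbitrary seed and shape) that the kept elements are simply the inputs scaled by 1/(1-p):

import torch

torch.manual_seed(0)
p = 0.5
m = torch.nn.Dropout(p)

a = torch.randn(2, 2, 2)
b = m(a)

mask = b != 0
# surviving elements are the original values scaled by 1 / (1 - p)
print(torch.allclose(b[mask], a[mask] / (1 - p)))   # True
# the fraction of zeroed elements is roughly p (exactly p only in expectation)
print((~mask).float().mean())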

nn.Dropout1d

m = torch.nn.Dropout1d(0.5)
b = m(a)

# tensor([[[ 0.0000,  0.0000],
#          [ 0.5773, -2.5947]],

#         [[-0.0000, -0.0000],
#          [-0.0000,  0.0000]]])

Entire channels are zeroed out at once.

It is typically placed after a 1D convolution layer.
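
For example, a minimal sketch of the typical placement (my own illustration; the layer sizes are made up):

import torch

net = torch.nn.Sequential(
    torch.nn.Conv1d(in_channels=8, out_channels=16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Dropout1d(0.5),   # zeroes whole channels of the (N, C, L) feature map
)

x = torch.randn(4, 8, 32)      # (batch, channels, length)
y = net(x)
print(y.shape)                 # torch.Size([4, 16, 32])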

nn.Dropout2d

a = torch.randn([2, 2, 2, 2])

# tensor([[[[ 0.2987,  1.2165],
#           [-1.4930, -0.9580]],

#          [[-1.2327, -0.6193],
#           [ 0.0632,  0.7169]]],


#         [[[ 0.6416, -1.3733],
#           [ 1.5038,  1.9098]],

#          [[ 0.5030,  1.0592],
#           [-0.4078,  0.3067]]]])

m = torch.nn.Dropout2d(0.5)
b = m(a)

# tensor([[[[ 0.5975,  2.4330],
#           [-2.9861, -1.9160]],

#          [[-2.4654, -1.2386],
#           [ 0.1263,  1.4337]]],


#         [[[ 0.0000, -0.0000],
#           [ 0.0000,  0.0000]],

#          [[ 1.0060,  2.1184],
#           [-0.8156,  0.6133]]]])

Randomly zeroes out entire channels, except that here each channel is a 2D feature map.

It is typically placed after a 2D convolution layer.
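
As with the 1D case, here is a minimal sketch of the typical placement (my own illustration; layer sizes are made up), plus a rough check that whole feature maps are dropped together:

import torch

block = torch.nn.Sequential(
    torch.nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Dropout2d(0.5),   # zeroes whole (H, W) feature maps of the (N, C, H, W) output
)

x = torch.randn(4, 3, 28, 28)  # (batch, channels, height, width)
y = block(x)

# dropped channels are exactly zero everywhere; with p = 0.5 roughly half of the
# 4 * 16 feature maps should show up here
zeroed = (y == 0).flatten(start_dim=2).all(dim=-1)   # shape (N, C)
print(zeroed.sum().item(), "of", zeroed.numel(), "feature maps were zeroed")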

nn.Dropout3d

By the same logic, it is used after 3D convolution layers and randomly zeroes out entire channels, where each channel is a 3D feature map.

a = torch.randn([2, 2, 2, 2, 2])
# tensor([[[[[ 1.0495,  0.6651],
#            [-0.1486, -0.0094]],

#           [[ 0.9391, -0.1916],
#            [-0.9792, -0.8239]]],


#          [[[ 0.0216, -0.8410],
#            [ 1.5924,  0.9085]],

#           [[-0.0662,  0.4695],
#            [ 1.0581, -0.0578]]]],



#         [[[[-0.7801,  0.2375],
#            [-1.3781, -0.7255]],

#           [[-0.2555, -0.0245],
#            [ 1.6996,  1.1204]]],


#          [[[-0.0411, -1.3937],
#            [-0.6560, -0.8227]],

#           [[ 0.5630, -1.3177],
#            [ 1.4010, -0.8008]]]]])


m = torch.nn.Dropout3d(0.3)
b = m(a)
# tensor([[[[[ 1.4992,  0.9501],
#            [-0.2122, -0.0134]],

#           [[ 1.3416, -0.2737],
#            [-1.3988, -1.1770]]],


#          [[[ 0.0309, -1.2015],
#            [ 2.2749,  1.2979]],

#           [[-0.0946,  0.6707],
#            [ 1.5115, -0.0825]]]],



#         [[[[-1.1145,  0.3392],
#            [-1.9686, -1.0365]],

#           [[-0.3651, -0.0350],
#            [ 2.4279,  1.6006]]],


#          [[[-0.0000, -0.0000],
#            [-0.0000, -0.0000]],

#           [[ 0.0000, -0.0000],
#            [ 0.0000, -0.0000]]]]])

By now a pattern emerges: reading the brackets from the inside out, Dropout1d zeroes everything inside one level of brackets, Dropout2d inside two levels, and Dropout3d inside three levels.
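
This pattern can be checked programmatically; a minimal sketch of my own (not from the original post), using Dropout3d with p = 0.3 as above:

import torch

a = torch.randn(2, 2, 2, 2, 2)          # (N, C, D, H, W)
m = torch.nn.Dropout3d(0.3)
b = m(a)

# each (D, H, W) block -- three bracket levels from the inside -- is either
# dropped as a whole or kept and scaled by 1 / (1 - 0.3)
for n in range(b.shape[0]):
    for c in range(b.shape[1]):
        if torch.all(b[n, c] == 0):
            print(n, c, "dropped")
        else:
            print(n, c, "kept, correctly scaled:", torch.allclose(b[n, c], a[n, c] / 0.7))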

nn.AlphaDropout

Rarely used. It is meant to be paired with the SELU activation: instead of plain zeroing and rescaling, it masks values in a way that preserves the input's mean and standard deviation (the self-normalizing property).

nn.FeatureAlphaDropout

Rarely used. It is the channel-wise counterpart of AlphaDropout, masking entire feature maps in the same spirit as the Dropout1d/2d/3d variants above.
