list(net.parameters())[0] in PyTorch

First, take a look at the LeNet code:

from torch import nn

class LeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 6, 5),  # in_channels, out_channels, kernel_size
            nn.Sigmoid(),

            nn.MaxPool2d(2, 2),  # kernel_size, stride
            nn.Conv2d(6, 16, 5),
            nn.Sigmoid(),
            nn.MaxPool2d(2, 2)
        )
        self.fc = nn.Sequential(
            nn.Linear(16*4*4, 120),
            nn.Sigmoid(),
            nn.Linear(120, 84),
            nn.Sigmoid(),
            nn.Linear(84, 10)
        )

    def forward(self, img):
        feature = self.conv(img)
        # img.shape[0] is the batch size
        output = self.fc(feature.view(img.shape[0], -1))
        return output


net = LeNet()
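As an aside, the 16*4*4 input size of the first Linear layer assumes a 28x28 input image (MNIST-sized; the text above does not state the input size): 28 → conv5 → 24 → pool2 → 12 → conv5 → 8 → pool2 → 4. A quick sketch verifying this with just the conv stack:

```python
import torch
from torch import nn

# The conv stack from LeNet above; feeding a 28x28 image through it
# shows where the 16*4*4 flattened size comes from.
conv = nn.Sequential(
    nn.Conv2d(1, 6, 5), nn.Sigmoid(), nn.MaxPool2d(2, 2),
    nn.Conv2d(6, 16, 5), nn.Sigmoid(), nn.MaxPool2d(2, 2),
)
x = torch.rand(1, 1, 28, 28)   # a batch of one single-channel 28x28 image
print(conv(x).shape)           # torch.Size([1, 16, 4, 4])
```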

Counting the FC layers, there are 5 parameter layers in total (2 convolutional + 3 linear, each with a weight and a bias). Run the following code:

print(list(net.parameters()))
[Parameter containing:
tensor([[[[ 0.0991,  0.1349, -0.0070, -0.1155, -0.0294],
          [-0.0658,  0.0957,  0.1981,  0.1192, -0.0146],
          [-0.1672,  0.1914,  0.1532, -0.1125,  0.0129],
          [ 0.1950, -0.0386,  0.1198, -0.0127, -0.0673],
          [ 0.1181,  0.1342,  0.1384, -0.0352,  0.1887]]],


        [[[ 0.0795, -0.1427,  0.1126, -0.0740,  0.1322],
          [ 0.0405, -0.0449,  0.1535, -0.1308, -0.0610],
          [-0.1037, -0.1754, -0.1544,  0.1937,  0.0915],
          [ 0.1464, -0.1385, -0.0502,  0.0471, -0.0091],
          [ 0.0878, -0.1831,  0.0909, -0.1313, -0.1462]]],


        [[[-0.0178,  0.0972, -0.1304,  0.0813, -0.0893],
          [ 0.1059,  0.0222, -0.0230, -0.1466, -0.0590],
          [-0.0327, -0.0503, -0.1373,  0.1634, -0.0452],
          [ 0.1720,  0.1130, -0.1299, -0.0550,  0.1723],
          [ 0.0730,  0.1294, -0.0941,  0.0520, -0.0408]]],


        [[[-0.1357, -0.1327, -0.0143,  0.1415,  0.0842],
          [ 0.0013,  0.0838, -0.1388,  0.0106,  0.0555],
          [-0.0590, -0.1105, -0.0641, -0.1874, -0.0973],
          [ 0.1863,  0.0678,  0.1571,  0.1057,  0.0562],
          [ 0.1442, -0.1517, -0.0078, -0.1562, -0.0867]]],


        [[[ 0.0950, -0.0273, -0.1122,  0.0195,  0.1027],
          [-0.1541,  0.0725, -0.1750, -0.1083, -0.0328],
          [-0.1178,  0.0273, -0.0396, -0.1906, -0.0434],
          [-0.1573,  0.1007, -0.0713, -0.0218, -0.1169],
          [-0.1945,  0.1164, -0.1634,  0.1479,  0.1212]]],


        [[[ 0.1842,  0.0659, -0.0614,  0.1092, -0.1675],
          [-0.1057,  0.0857, -0.0968,  0.1906,  0.1513],
          [-0.1912,  0.1350, -0.0484, -0.0251,  0.1643],
          [ 0.1141,  0.0235, -0.1294, -0.0091, -0.0069],
          [ 0.0336, -0.0086, -0.0673,  0.1272, -0.0220]]]], requires_grad=True), Parameter containing:
tensor([-0.0706, -0.0415, -0.1431, -0.1451, -0.1992, -0.1000],
       requires_grad=True),
Parameter containing:..........

These are the first convolutional layer's parameters: its weight (shape 6x1x5x5) and its 6 biases (one per output channel).
Now run the following code:

print(len(list(net.parameters())))
print(type(list(net.parameters())[0]))
10
<class 'torch.nn.parameter.Parameter'>

This means the list has 10 elements, each of type Parameter: the network has 5 parameter layers, and each contributes a weight and a bias, giving 10 Parameter elements in total.
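The same weight-plus-bias counting applies to any module. A minimal standalone sketch (a two-layer toy model, not the LeNet above):

```python
from torch import nn

# Two layers, each contributing one weight and one bias -> 2 * 2 = 4 Parameters.
net = nn.Sequential(
    nn.Conv2d(1, 6, 5),   # weight: (6, 1, 5, 5), bias: (6,)
    nn.Linear(84, 10),    # weight: (10, 84),     bias: (10,)
)

params = list(net.parameters())
print(len(params))        # 4
print(params[0].shape)    # torch.Size([6, 1, 5, 5]) -- conv weight
print(params[1].shape)    # torch.Size([6])          -- conv bias

# named_parameters() makes the layer/parameter pairing explicit:
for name, p in net.named_parameters():
    print(name, tuple(p.shape))
```

Using `named_parameters()` instead of `parameters()` is usually clearer, since it shows which layer each weight and bias belongs to.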

注意:

A BatchNorm layer also has two learnable parameters (gamma and beta).
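In PyTorch, BatchNorm's gamma and beta are exposed as `.weight` and `.bias`, so a BN layer adds two Parameters to the list; its running statistics are buffers and do not appear in `parameters()`. A quick check:

```python
from torch import nn

# BatchNorm exposes gamma/beta as .weight and .bias.
bn = nn.BatchNorm2d(6)
print(tuple(bn.weight.shape))       # (6,) -- gamma, initialized to 1
print(tuple(bn.bias.shape))         # (6,) -- beta, initialized to 0
print(len(list(bn.parameters())))   # 2

# running_mean / running_var are registered as buffers, not Parameters,
# so they do not show up in parameters():
print([name for name, _ in bn.named_buffers()])
```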
