An nn.Sequential() object is similar to a Keras feed-forward model: you add layers to it to build a feed-forward neural network. It can be thought of as a sequential container. Modules are added to the container in the order they are passed in; alternatively, an OrderedDict of named modules can be passed.
There are three ways to build a model with it:
① nn.Sequential(): create an empty container, then call object.add_module(name, layer instance) for each layer
import torch.nn as nn

net1 = nn.Sequential()
net1.add_module('conv1', nn.Conv2d(1, 3, 5))
net1.add_module('batchnorm', nn.BatchNorm2d(3))
net1.add_module('relu1', nn.ReLU())
② nn.Sequential(*modules): pass the n layer instances directly as positional arguments
net2 = nn.Sequential(
    nn.Conv2d(1, 3, 5),
    nn.BatchNorm2d(3),
    nn.ReLU()
)
③ nn.Sequential(OrderedDict([...])): pass an OrderedDict of n (name, layer instance) pairs
from collections import OrderedDict

net3 = nn.Sequential(OrderedDict([
    ('conv1', nn.Conv2d(1, 3, 5)),
    ('batchnorm', nn.BatchNorm2d(3)),
    ('relu1', nn.ReLU())
]))
To inspect a model's structure, just print the object:
print('net1:', net1)
print('net2:', net2)
print('net3:', net3)
Output:
net1: Sequential(
(conv1): Conv2d (1, 3, kernel_size=(5, 5), stride=(1, 1))
(batchnorm): BatchNorm2d(3, eps=1e-05, momentum=0.1, affine=True)
(relu1): ReLU()
)
net2: Sequential(
(0): Conv2d (1, 3, kernel_size=(5, 5), stride=(1, 1))
(1): BatchNorm2d(3, eps=1e-05, momentum=0.1, affine=True)
(2): ReLU()
)
net3: Sequential(
(conv1): Conv2d (1, 3, kernel_size=(5, 5), stride=(1, 1))
(batchnorm): BatchNorm2d(3, eps=1e-05, momentum=0.1, affine=True)
(relu1): ReLU()
)
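As the printouts show, net1 and net3 keep the layer names they were given, while net2 numbers its layers 0, 1, 2. Below is a minimal sketch (my addition; the dummy input size 1x1x28x28 is arbitrary, chosen only to match Conv2d(1, 3, 5)) showing how this affects submodule access, and that all three nets behave identically in a forward pass:

import torch

# Named submodules (net1, net3) can be fetched as attributes;
# unnamed ones (net2) are fetched by index.
print(net1.conv1)    # Conv2d registered via add_module('conv1', ...)
print(net2[0])       # first module passed positionally
print(net3.conv1)    # name comes from the OrderedDict key

# All three containers apply their layers in order during a forward pass
# and produce the same output shape for the same input.
x = torch.randn(1, 1, 28, 28)          # batch of 1, 1 channel, 28x28
print(net1(x).shape)                   # torch.Size([1, 3, 24, 24])
print(net2(x).shape)                   # torch.Size([1, 3, 24, 24])
print(net3(x).shape)                   # torch.Size([1, 3, 24, 24])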