Net in Caffe

The Net class represents a complete neural network.

Creating a Net

import caffe

cafferoot = '/home/abstractsky/caffe'
caffe.set_mode_gpu()                  # run on the GPU
deployfile = '/home/abstractsky/公共的/style-transfer-master/models/vgg16/VGG_ILSVRC_16_layers_deploy.prototxt'  # deploy (network definition) file
modelfile = '/home/abstractsky/公共的/style-transfer-master/models/vgg16/VGG_ILSVRC_16_layers.caffemodel'        # caffemodel (trained weights) file
net = caffe.Net(deployfile, modelfile, caffe.TEST)    # load the network in TEST phase
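
Once the Net object exists, an input can be pushed through it with net.forward(). The following is only a rough sketch under a few assumptions not in the original post: the image path is hypothetical, and the mean/scale/channel-swap preprocessing values are the ones commonly used with VGG on ImageNet.

import numpy as np
import caffe

image_path = '/tmp/example.jpg'    # hypothetical image; any RGB image on disk works

# caffe.io.Transformer prepares images for the 'data' blob:
# HxWxC float in [0,1] -> CxHxW BGR in [0,255] with the mean subtracted
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))                      # channels first
transformer.set_channel_swap('data', (2, 1, 0))                   # RGB -> BGR
transformer.set_raw_scale('data', 255)                            # [0,1] -> [0,255]
transformer.set_mean('data', np.array([104.0, 117.0, 123.0]))     # ImageNet BGR mean (assumed)

img = caffe.io.load_image(image_path)                 # HxWxC float image in [0,1]
net.blobs['data'].data[...] = transformer.preprocess('data', img)

net.forward()                                         # run all layers
print net.blobs['pool5'].data.shape                   # (1, 512, 7, 7), the deepest blob in this deploy file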

Data in a Net

All data in the network is carried by the Blob class, which can be viewed as a container for four-dimensional arrays. The blobs of a network and their shapes can be listed as follows (a sketch of reading an individual blob follows the output listing below).

print type(net.blobs)
print '\n'
for layer_name, blob in net.blobs.iteritems():
    print layer_name + '\t' + str(blob.data.shape)
Output:
<class 'collections.OrderedDict'>
  # net.blobs is an OrderedDict: an ordered mapping from blob name to Blob

data    (1, 3, 224, 224)
conv1_1 (1, 64, 224, 224)
conv1_2 (1, 64, 224, 224)
pool1   (1, 64, 112, 112)
conv2_1 (1, 128, 112, 112)
conv2_2 (1, 128, 112, 112)
pool2   (1, 128, 56, 56)
conv3_1 (1, 256, 56, 56)
conv3_2 (1, 256, 56, 56)
conv3_3 (1, 256, 56, 56)
pool3   (1, 256, 28, 28)
conv4_1 (1, 512, 28, 28)
conv4_2 (1, 512, 28, 28)
conv4_3 (1, 512, 28, 28)
pool4   (1, 512, 14, 14)
conv5_1 (1, 512, 14, 14)
conv5_2 (1, 512, 14, 14)
conv5_3 (1, 512, 14, 14)
pool5   (1, 512, 7, 7)      # the shape of each layer's activation blob in the network
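
Each of these blobs exposes its data as a NumPy array through the data field (and the corresponding gradients through diff), so individual activations can be inspected directly after a forward pass. A small sketch, using conv1_1 simply because it appears in the listing above:

import numpy as np

feat = net.blobs['conv1_1'].data          # NumPy view of the blob, shape (1, 64, 224, 224)
print feat.shape
print 'mean activation: %f' % feat.mean()
print 'max  activation: %f' % feat.max()

# gradients computed by net.backward() are stored alongside, with the same shape
print net.blobs['conv1_1'].diff.shape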

Parameters in a Net

for layer_name, param in net.params.iteritems():
    print layer_name + '\t' + str(param[0].data.shape), str(param[1].data.shape)

Output:
conv1_1 (64, 3, 3, 3) (64,)
conv1_2 (64, 64, 3, 3) (64,)
conv2_1 (128, 64, 3, 3) (128,)
conv2_2 (128, 128, 3, 3) (128,)
conv3_1 (256, 128, 3, 3) (256,)
conv3_2 (256, 256, 3, 3) (256,)
conv3_3 (256, 256, 3, 3) (256,)
conv4_1 (512, 256, 3, 3) (512,)
conv4_2 (512, 512, 3, 3) (512,)
conv4_3 (512, 512, 3, 3) (512,)
conv5_1 (512, 512, 3, 3) (512,)
conv5_2 (512, 512, 3, 3) (512,)
conv5_3 (512, 512, 3, 3) (512,)

Like net.blobs, net.params is a collections.OrderedDict. Each element of net.params is a container of Blobs (a vector in C++); here each one holds two blobs, the weights and the biases of a convolution layer. The four dimensions of the weight blob are, in order, the number of output channels, the number of input channels, the filter height, and the filter width; together with the biases these account for all of the layer's learnable parameters.
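
Putting that into numbers: the weight count of each layer is the product of its four dimensions, and the bias adds one value per output channel. A short sketch that tallies the totals from net.params (the figure in the final comment is just the result of this sum for VGG-16's convolution layers):

total = 0
for layer_name, param in net.params.iteritems():
    n_weights = param[0].data.size        # out_channels * in_channels * kH * kW
    n_biases = param[1].data.size         # one bias per output channel
    print '%s\t%d weights + %d biases' % (layer_name, n_weights, n_biases)
    total += n_weights + n_biases
print 'total learnable parameters: %d' % total    # about 14.7 million for these conv layers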

I have only just started learning Caffe; comments and corrections are welcome.
