Environment: PyCharm 2017.2, TensorFlow 2.0.0b0, Win7
Thoughts: I have recently been studying how convolutional neural networks are built and trained, and weights (parameters) are a central concept. A convolutional layer applies several different kernels, each holding m*n weights, to extract and strengthen image features; this mainly exploits the local correlation within an image. A fully connected layer, as the name suggests, connects every neuron to every neuron in the previous layer, and each connection carries a weight (plus one bias per neuron) used in the forward pass.
The convolutional network is built as follows:
import tensorflow as tf
from tensorflow.keras import datasets, models, layers


class CNN(object):
    def __init__(self):
        model = models.Sequential()
        # 1st convolutional layer: 32 kernels of size 3*3; 28*28 is the size of the training images
        model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
        # Pooling layer
        model.add(layers.MaxPooling2D((2, 2)))
        # 2nd convolutional layer: 64 kernels of size 3*3
        model.add(layers.Conv2D(64, (3, 3), activation='relu'))
        # Pooling layer
        model.add(layers.MaxPooling2D((2, 2)))
        # 3rd convolutional layer: 128 kernels of size 3*3
        model.add(layers.Conv2D(128, (3, 3), activation='relu'))
        # Pooling layer
        model.add(layers.MaxPooling2D((2, 2)))
        # Flatten to a 1-D vector
        model.add(layers.Flatten())
        # 4th layer: fully connected, 64 neurons
        model.add(layers.Dense(64, activation='relu'))
        # 5th layer: fully connected, 10 neurons; softmax is commonly used for classification
        model.add(layers.Dense(10, activation='softmax'))
        model.summary()
        self.model = model


if __name__ == "__main__":
    # app = Train()
    # app.train()
    CNN()
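The Train class referenced by the commented-out lines in __main__ is not shown in this post. As a minimal sketch of what it could look like (reusing the CNN class above; the MNIST dataset from tensorflow.keras.datasets and the class/method names are assumptions that simply match the commented-out calls), training the model might be done like this:

from tensorflow.keras import datasets


class Train(object):
    def __init__(self):
        self.cnn = CNN()

    def train(self):
        # Load MNIST, add the channel dimension (N, 28, 28, 1) and scale pixels to [0, 1]
        (x_train, y_train), (x_test, y_test) = datasets.mnist.load_data()
        x_train = x_train.reshape((-1, 28, 28, 1)).astype('float32') / 255.0
        x_test = x_test.reshape((-1, 28, 28, 1)).astype('float32') / 255.0
        # Labels are integer class indices, so use sparse categorical cross-entropy
        self.cnn.model.compile(optimizer='adam',
                               loss='sparse_categorical_crossentropy',
                               metrics=['accuracy'])
        self.cnn.model.fit(x_train, y_train, epochs=5, batch_size=64)
        self.cnn.model.evaluate(x_test, y_test)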
The network information printed by model.summary() is as follows:
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 11, 11, 64)        18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 5, 5, 64)          0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 3, 3, 128)         73856
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 1, 1, 128)         0
_________________________________________________________________
flatten (Flatten)            (None, 128)               0
_________________________________________________________________
dense (Dense)                (None, 64)                8256
_________________________________________________________________
dense_1 (Dense)              (None, 10)                650
=================================================================
Total params: 101,578
Trainable params: 101,578
Non-trainable params: 0
1. Parameter count of a convolutional layer: (kernel height * kernel width * number of input channels + 1) * number of kernels
2. Parameter count of a fully connected layer: (number of neurons in the previous layer + 1) * number of neurons in the current layer
The "+ 1" in both formulas is the bias, because every kernel and every neuron has one bias.
Convolutional layer 1: 320 = (3 * 3 * 1 + 1) * 32
Convolutional layer 2: 18496 = (3 * 3 * 32 + 1) * 64
Convolutional layer 3: 73856 = (3 * 3 * 64 + 1) * 128
Fully connected layer 1: 8256 = (128 + 1) * 64
Fully connected layer 2: 650 = (64 + 1) * 10
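These numbers can also be read back from the model itself with Keras's count_params(); a quick cross-check, reusing the CNN class defined above:

# Print each layer's parameter count and the total to verify the manual calculation
cnn = CNN()
for layer in cnn.model.layers:
    print(layer.name, layer.count_params())
# Expected: 320, 0, 18496, 0, 73856, 0, 0, 8256, 650 (pooling and flatten layers have no parameters)
print('total:', cnn.model.count_params())  # 101578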
Reference: https://blog.csdn.net/ybdesire/article/details/85217688