Building a Neural Network Regression Model with Keras in Python

Scroll down for the annotated version.

# Regressor example

import numpy as np
# for reproducibility
np.random.seed(1337)
from keras.models import Sequential
from keras.layers import Dense
import matplotlib.pyplot as plt

# create some data
X = np.linspace(-1, 1, 200)
# randomize the data
np.random.shuffle(X)
Y = 0.5*X + 2 + np.random.normal(0, 0.05, (200, ))
# plot data
plt.scatter(X, Y)
plt.show()

# first 160 data points
X_train, Y_train = X[:160], Y[:160]
# last 40 data points
X_test, Y_test = X[160:], Y[160:]

# build a neural network from the 1st layer to the last layer
model = Sequential()
model.add(Dense(units = 1, input_dim = 1))

# choose loss function and optimizing method
model.compile(loss = 'mse', optimizer = 'sgd')

# training
print('Training-----------')
for step in range(301):
    cost = model.train_on_batch(X_train, Y_train)
    if step % 100 == 0:
        print('train cost:', cost)

# test
print('\nTesting------------')
cost = model.evaluate(X_test, Y_test, batch_size=40)
print('test cost:', cost)
W, b = model.layers[0].get_weights()
print('Weights = ', W, '\nbiases = ', b)

# plotting the prediction
Y_pred = model.predict(X_test)
plt.scatter(X_test, Y_test)
plt.plot(X_test, Y_pred)
plt.show()

Output:

Using TensorFlow backend.

Training-----------

train cost: 4.0225005
train cost: 0.073238626
train cost: 0.00386274
train cost: 0.002643449

Testing------------

40/40 [==============================] - 0s 975us/step
test cost: 0.0031367032788693905
Weights =  [[0.4922711]] 
biases =  [1.9995022]

[Figure 1: scatter plot of the generated data]
[Figure 2: test data and the model's predictions]
Annotated version:

# Regressor example

import numpy as np
# for reproducibility
np.random.seed(1337)
# Sequential: stack layers in order
from keras.models import Sequential
# Dense: fully connected layer
from keras.layers import Dense
# matplotlib for visualization
import matplotlib.pyplot as plt

# Generate 200 evenly spaced floats between -1 and 1 and shuffle their order,
# then apply the linear function 0.5*X + 2 plus some random noise to produce
# the 200 target values Y.
# 160 of the samples are used for training and the remaining 40 for testing.
# create some data
X = np.linspace(-1, 1, 200)
# randomize the data
np.random.shuffle(X)
Y = 0.5*X + 2 + np.random.normal(0, 0.05, (200, ))
# plot data
plt.scatter(X, Y)
plt.show()

# first 160 data points
X_train, Y_train = X[:160], Y[:160]
# last 40 data points
X_test, Y_test = X[160:], Y[160:]

# Build a neural network model with the Keras API.
# The model has only one layer, with one input and one output; after building
# the network, choose a loss function and an optimizer.
# build a neural network from the 1st layer to the last layer
model = Sequential()
model.add(Dense(units = 1, input_dim = 1))
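# (units is the Keras 2 name for the older, deprecated output_dim argument.)
# model.summary() can be called here to confirm the structure: a single Dense
# layer with 2 trainable parameters, one weight and one bias.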

# choose loss function and optimizing method
# 'mse' is mean squared error; 'sgd' is stochastic gradient descent
model.compile(loss = 'mse', optimizer = 'sgd')
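# The string 'sgd' uses Keras's default SGD settings; an explicit optimizer
# object could be passed instead (a sketch, assuming the standalone Keras API;
# 0.01 is the default SGD learning rate):
# from keras.optimizers import SGD
# model.compile(loss = 'mse', optimizer = SGD(lr = 0.01))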

# Train in batches.
# More training steps are not always better: in this example the loss has
# essentially converged after about 1300 steps, and adding more steps no
# longer improves accuracy (see the log output below).
# training
print('Training-----------')
for step in range(1401):
    # train_on_batch trains on one batch of X_train, Y_train at a time and
    # returns the cost; print the result every 100 steps
    cost = model.train_on_batch(X_train, Y_train)
    if step % 100 == 0:
        print('train cost:', cost)

# After the 1400 batch training steps above, the network has fitted the linear
# function; now evaluate it on the remaining 40 test samples.
# The learned weight and bias correspond to the 0.5 and 2 in 0.5*X + 2:
# the closer the weight is to 0.5 and the bias to 2, the better the fit.
# test
print('\nTesting------------')
cost = model.evaluate(X_test, Y_test, batch_size=40)
print('test cost:', cost)
# take the first (and only) layer, the Dense fully connected layer
W, b = model.layers[0].get_weights()
print('Weights = ', W, '\nbiases = ', b)
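# A quick sanity check (a sketch, not in the original): the learned parameters
# should end up close to the true values used to generate the data (0.5 and 2)
# assert abs(W[0][0] - 0.5) < 0.1
# assert abs(b[0] - 2.0) < 0.1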

# Keras model predictions vs. the original test data:
# compare the network's predictions with the original test targets and plot
# them together.
# plotting the prediction
Y_pred = model.predict(X_test)
plt.scatter(X_test, Y_test)
plt.plot(X_test, Y_pred)
plt.show()

Output:

Using TensorFlow backend.

Training-----------

train cost: 4.0225005
train cost: 0.073238626
train cost: 0.00386274
train cost: 0.002643449
train cost: 0.0026218703
train cost: 0.0026214502
train cost: 0.002621432
train cost: 0.0026214297
train cost: 0.002621428
train cost: 0.0026214276
train cost: 0.0026214283
train cost: 0.0026214288
train cost: 0.0026214286
train cost: 0.0026214286
train cost: 0.0026214286

Testing------------

40/40 [==============================] - 0s 751us/step
test cost: 0.00324707361869514
Weights =  [[0.49136025]] 
biases =  [2.004053]

[Figure 3: test data and the model's predictions from the annotated run]
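
For comparison, the manual train_on_batch loop can also be replaced by model.fit, which shuffles and batches the training data internally. The following is a minimal sketch under the same standalone-Keras / TensorFlow-backend assumptions as above; the epoch count and batch size are illustrative choices, not values from the original run:

import numpy as np
np.random.seed(1337)
from keras.models import Sequential
from keras.layers import Dense

# same synthetic data as above: y = 0.5*x + 2 plus noise
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)
Y = 0.5*X + 2 + np.random.normal(0, 0.05, (200, ))
X_train, Y_train = X[:160], Y[:160]
X_test, Y_test = X[160:], Y[160:]

# same one-layer linear model
model = Sequential()
model.add(Dense(units = 1, input_dim = 1))
model.compile(loss = 'mse', optimizer = 'sgd')

# fit handles batching and shuffling; 100 epochs with a batch size of 40
# corresponds to roughly 400 parameter updates on the 160 training samples
model.fit(X_train, Y_train, epochs = 100, batch_size = 40, verbose = 0)

print('test cost:', model.evaluate(X_test, Y_test, batch_size = 40))
W, b = model.layers[0].get_weights()
print('Weights = ', W, '\nbiases = ', b)

With enough epochs the learned weight and bias should again approach 0.5 and 2, matching the train_on_batch results above.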
