Keras Learning (2): Regression

This article shows how to build a simple neural network with Keras and fit it to data. The network is a single Dense layer with one unit, which computes y = W*x + b, so training it with a mean-squared-error loss amounts to linear regression.

Example code:

import numpy as np
from keras import Sequential   # a model built by stacking layers in order
from keras.layers import Dense  # Dense: fully connected layer
import matplotlib.pyplot as plt

np.random.seed(1337)   # fix the seed so the random numbers are reproducible across runs

# generate data: a line with slope 0.5 and intercept 2, plus small Gaussian noise
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)  # shuffle the generated data
Y = 0.5 * X + 2 + np.random.normal(0, 0.005, (200,))

# plot the generated data
# plt.scatter(X, Y)
# plt.show()

# split the data into training and test sets (first 160 / last 40 points)
X_train, Y_train = X[:160], Y[:160]
X_test, Y_test = X[160:], Y[160:]

# build the neural network with Keras
model = Sequential()
model.add(Dense(units=1, input_dim=1))   # 'units' is the Keras 2 name for the old 'output_dim'
# model.add(Dense(units=1))   # for later layers, the input size defaults to the previous layer's output

# choose the loss function (mean squared error) and optimizer (stochastic gradient descent)
model.compile(loss='mse', optimizer='sgd')

# training: 301 gradient updates, each computed on the whole training set
print('Training....')
for step in range(301):
    cost = model.train_on_batch(X_train, Y_train)
    if step % 100 == 0:
        print('train cost', cost)

# testing
print('\nTesting....\n')
cost = model.evaluate(X_test, Y_test, batch_size=40)
print('test cost', cost)
W, b = model.layers[0].get_weights()
print('Weights=', W, '\nbiases=', b)

# plot the predictions against the test data
Y_pred = model.predict(X_test)
plt.scatter(X_test, Y_test)
plt.plot(X_test, Y_pred)
plt.show()
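
The loop above drives training by hand with train_on_batch. A minimal sketch of the same fit using the higher-level model.fit() API is shown below; the epoch count and batch size are illustrative choices, not taken from the original code:

import numpy as np
from keras import Sequential
from keras.layers import Dense

np.random.seed(1337)

# same synthetic data as above: y = 0.5x + 2 plus a little noise
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)
Y = 0.5 * X + 2 + np.random.normal(0, 0.005, (200,))

model = Sequential()
model.add(Dense(units=1, input_dim=1))
model.compile(loss='mse', optimizer='sgd')

# fit() runs the batching loop internally; epochs/batch_size here are assumptions
history = model.fit(X[:160], Y[:160], epochs=100, batch_size=40, verbose=0)
print('final training loss:', history.history['loss'][-1])
print('test loss:', model.evaluate(X[160:], Y[160:], batch_size=40, verbose=0))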

The generated data:

(figure: scatter plot of the generated data)

Fitting results:

Training....
2018-10-29 19:21:23.931121: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
train cost 4.0045943
train cost 0.070371404
train cost 0.0012619623
train cost 4.7929352e-05

Testing....


40/40 [==============================] - 0s 826us/step
test cost 3.964245115639642e-05
Weights= [[0.4990688]] 
biases= [1.9958383]
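
The learned weight and bias are close to the slope 0.5 and intercept 2 used to generate the data. As a quick sanity check (the evaluation point x = 0.3 is an arbitrary choice), comparing the learned line with the generating rule:

# learned line y = W*x + b vs. the rule that generated the data, at x = 0.3
print(0.4990688 * 0.3 + 1.9958383)   # learned model:   ~2.1456
print(0.5 * 0.3 + 2)                 # generating rule:  2.15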

 

(figure: test data with the fitted regression line)
