This lesson is really simple: it builds the most basic regression model step by step. The code is as follows.
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
import numpy as np
import matplotlib.pyplot as plt
Build the model
model = Sequential()
model.add(Dense(1, input_shape=[1]))
model.output_shape, model.input_shape
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
model.compile(optimizer, loss='mean_squared_error', metrics=['mse'])
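Before training, it helps to see what this one-layer model actually computes: a Dense(1) layer on a scalar input is just the linear map y = w*x + b. A minimal NumPy sketch with hypothetical weight values (w and b here are placeholders, not the layer's real initial weights):

```python
import numpy as np

# A Dense(1) layer on scalar input computes y = w * x + b per sample.
w, b = 0.8, 0.1                 # hypothetical weight and bias
x = np.array([0.0, 0.5, 1.0])
y = w * x + b                   # what the layer evaluates element-wise
print(y)                        # [0.1 0.5 0.9]
```

Training only adjusts w and b to minimize the mean squared error declared in compile().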
Build the data
x = np.linspace(1, 20, 20)
y = x * 50 # + np.random.randn(20) * 10
x = x / np.max(x)
y = y / np.max(y)
print('x: {}'.format(x), '\ny: {}'.format(y))
# plot the data
plt.scatter(x, y)
x: [0.05 0.1 0.15 0.2 0.25 0.3 0.35 0.4 0.45 0.5 0.55 0.6 0.65 0.7
0.75 0.8 0.85 0.9 0.95 1. ]
y: [0.05 0.1 0.15 0.2 0.25 0.3 0.35 0.4 0.45 0.5 0.55 0.6 0.65 0.7
0.75 0.8 0.85 0.9 0.95 1. ]
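Note why the printed x and y arrays come out identical: y is an exact multiple of x, so dividing each array by its own maximum maps both onto the same points in [0, 1]. A quick NumPy check of that step:

```python
import numpy as np

x = np.linspace(1, 20, 20)
y = x * 50
# Dividing each array by its own max rescales both to [0, 1];
# because y = 50 * x exactly, the normalized arrays coincide.
x_n = x / np.max(x)
y_n = y / np.max(y)
print(np.allclose(x_n, y_n))  # True
```

With noise added back in (the commented-out randn term), the two normalized arrays would no longer match exactly, and the fit becomes non-trivial.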
Training the model is just as simple:
model.fit(x, y, batch_size=1000, epochs=5)  # batch_size exceeds the 20 samples, so each epoch is one full-batch step
Epoch 1/5
20/20 [==============================] - 0s 6ms/step - loss: 0.0129 - mean_squared_error: 0.0129
Epoch 2/5
20/20 [==============================] - 0s 170us/step - loss: 0.0125 - mean_squared_error: 0.0125
Epoch 3/5
20/20 [==============================] - 0s 135us/step - loss: 0.0122 - mean_squared_error: 0.0122
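The loss drops slowly because only five epochs run. What fit() is doing under the hood can be sketched with hand-rolled gradient descent on the same MSE objective (plain SGD here as a stand-in for Adam, and more iterations than the five epochs above):

```python
import numpy as np

# Hand-rolled gradient descent on MSE for y = w*x + b,
# mirroring what model.fit optimizes for the Dense(1) layer.
x = np.linspace(1, 20, 20)
x = x / x.max()
y = x.copy()              # after normalization the targets equal the inputs
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    err = w * x + b - y
    w -= lr * 2 * np.mean(err * x)   # dMSE/dw
    b -= lr * 2 * np.mean(err)       # dMSE/db
print(w, b)                          # w approaches 1, b approaches 0
```

After enough steps the fitted line converges to w = 1, b = 0, which is exactly the relationship the normalized data encodes; the Keras run above would reach the same solution with more epochs.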