1. Linear regression with tf.keras

Environment:
python 3.7
tensorflow 2.3 gpu
numpy 1.20.3
pandas 1.3.3
matplotlib 3.4.2

import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
if __name__ == '__main__':
    print('Tensorflow Version: {}'.format(tf.__version__))  # print the TensorFlow version
    data = pd.read_csv('./dataset/Income1.csv')
    print(data)
    x = data.Education
    y = data.Income

    plt.scatter(x, y)  # draw a scatter plot
    plt.show()         # display the figure
    # Sequential() is a container that describes the network structure; the layers added to it define the model from the input layer to the output layer
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense(1,input_shape=(1,)))
    
    print(model.summary())  # print a summary of the model's parameters

    model.compile(
        optimizer='adam',  # optimization method
        loss='mse'         # loss function: mean squared error
    )
    history = model.fit(x, y, epochs=5000)  # train for 5000 epochs
    print(model.predict(pd.Series([20])))   # pd.Series is one-dimensional; the input x is one-dimensional, so the prediction input takes the same form
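As a sanity check on what `fit` converges toward, the same line can be recovered in closed form with ordinary least squares, since MSE on `y = ax + b` has an exact solution. A minimal sketch with `np.polyfit`, using synthetic linear data as a stand-in (the actual Income1.csv is linked at the end of the article):

```python
import numpy as np

# Synthetic stand-in for Income1.csv: y = 3x + 5 plus a little noise.
# (Illustrative values only -- not taken from the real dataset.)
rng = np.random.default_rng(0)
x = np.linspace(10, 22, 30)
y = 3.0 * x + 5.0 + rng.normal(0, 0.1, size=x.shape)

# Closed-form least squares minimizes the same MSE objective the Keras model trains on.
a, b = np.polyfit(x, y, deg=1)
print(a, b)  # should be close to 3 and 5
```

After 5000 epochs of Adam, the Dense layer's kernel and bias should land near the same slope and intercept this closed-form fit produces.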


tf.keras.layers.Dense(
    units,                                 # positive integer: dimensionality of the output space, i.e. the number of neurons in the layer
    activation=None,                       # activation function; if not specified, none is applied
    use_bias=True,                         # boolean: whether the layer uses a bias vector
    kernel_initializer='glorot_uniform',   # initializer for the kernel weights matrix
    bias_initializer='zeros',              # initializer for the bias vector
    kernel_regularizer=None,               # regularizer function applied to the kernel weights matrix
    bias_regularizer=None,                 # regularizer function applied to the bias vector
    activity_regularizer=None,             # regularizer function applied to the output of the layer (its "activation")
    kernel_constraint=None,                # constraint function applied to the kernel weights matrix
    bias_constraint=None, **kwargs         # constraint function applied to the bias vector
)

In tf.keras.layers.Dense(1, input_shape=(1,)), the 1 corresponds to units, and input_shape takes a tuple describing the shape of each input sample.
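The parameter count a Dense layer reports in `model.summary()` follows directly from the signature above: one kernel weight per (input feature, unit) pair, plus one bias per unit when `use_bias=True`. A small helper (`dense_param_count` is a hypothetical name, not a Keras function) makes the arithmetic explicit:

```python
def dense_param_count(units, input_dim, use_bias=True):
    """Trainable parameters in a Dense layer:
    a kernel weight for each (input, unit) pair, plus one bias per unit."""
    return units * input_dim + (units if use_bias else 0)

print(dense_param_count(1, 1))  # -> 2, matching "Total params: 2" below
```

For this model, 1 unit x 1 input feature + 1 bias = 2 parameters, which is exactly the slope and intercept of y = ax + b.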

Dataset

    Unnamed: 0  Education     Income
0            1  10.000000  26.658839
1            2  10.401338  27.306435
2            3  10.842809  22.132410
3            4  11.244147  21.169841
4            5  11.645485  15.192634
5            6  12.086957  26.398951
6            7  12.488294  17.435307
7            8  12.889632  25.507885
8            9  13.290970  36.884595
9           10  13.732441  39.666109
10          11  14.133779  34.396281
11          12  14.535117  41.497994
12          13  14.976589  44.981575
13          14  15.377926  47.039595
14          15  15.779264  48.252578
15          16  16.220736  57.034251
16          17  16.622074  51.490919
17          18  17.023411  61.336621
18          19  17.464883  57.581988
19          20  17.866221  68.553714
20          21  18.267559  64.310925
21          22  18.709030  68.959009
22          23  19.110368  74.614639
23          24  19.511706  71.867195
24          25  19.913043  76.098135
25          26  20.354515  75.775218
26          27  20.755853  72.486055
27          28  21.157191  77.355021
28          29  21.598662  72.118790
29          30  22.000000  80.260571

Drawing the scatter plot

[Figure 1: scatter plot of Education vs. Income]

model.summary(): the model fits y = ax + b

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 1)                 2         
=================================================================
Total params: 2
Trainable params: 2
Non-trainable params: 0
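The 2 trainable parameters above are the slope a (the kernel) and the intercept b (the bias); after training, `model.layers[0].get_weights()` returns them as a `[kernel, bias]` list. What Adam does to those two numbers can be sketched in pure numpy with plain gradient descent on the same MSE objective (a simplified stand-in for Adam, on assumed toy data rather than the real CSV):

```python
import numpy as np

# Toy linear data standing in for the dataset (true line: y = 3x + 5).
x = np.linspace(10, 22, 30)
y = 3.0 * x + 5.0

a, b = 0.0, 0.0   # the model's two trainable parameters, like the Dense kernel and bias
lr = 0.001        # learning rate for plain gradient descent
for _ in range(200000):
    err = a * x + b - y              # residuals of the prediction y_hat = a*x + b
    a -= lr * 2 * np.mean(err * x)   # gradient of MSE with respect to a
    b -= lr * 2 * np.mean(err)       # gradient of MSE with respect to b

print(round(a, 2), round(b, 2))  # should approach 3 and 5
```

Adam adds per-parameter adaptive step sizes and momentum on top of these same gradients, which is why it needs far fewer epochs in practice.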

Dataset link: https://pan.baidu.com/s/1SgK1gZdTEYkBtVboe8EBgA
Password: 1234
