Machine Learning 02 - (loss functions, gradient descent, linear regression, evaluating training results, model saving and loading, ridge regression, polynomial regression)

Machine Learning - 02

    • Regression models
      • Linear regression
        • Evaluating training result error (metrics)
        • Saving and loading models
      • Ridge regression
      • Polynomial regression
    • Code summary
      • Linear regression
        • Plotting how w0, w1, and loss change during training
        • Drawing the gradient descent process as a contour plot
        • Salary prediction
        • Evaluating error
        • Saving the trained model to a file
      • Loading the model
        • Wrapping the trained model in a prediction object that provides a salary prediction service
      • Ridge regression
        • How to choose a suitable hyperparameter C?
      • Polynomial regression
        • Training a polynomial regression model on this data
        • Case study: Boston housing price analysis and price prediction
          • Training a regression model to predict house prices

Regression Models

Linear Regression

Input		Output
0.5      5.0
0.6      5.5
0.8      6.0
1.1      6.8
1.4      7.0
...
y = f(x)

Prediction function: y = w0 + w1·x
x: input
y: output
w0 and w1: model parameters

Model training means finding, from the known x and y values, the best model parameters w0 and w1, so that the relationship between input and output is described as accurately as possible.

5.0 = w0 + w1 × 0.5
5.5 = w0 + w1 × 0.6

Single-sample error:

Using the prediction function, the predicted value for input x is y' = w0 + w1·x, and the single-sample error is (1/2)(y' - y)².

Total sample error:

Adding up all single-sample errors gives the total sample error: (1/2) Σ(y' - y)².

Loss function:

loss = (1/2) Σ(w0 + w1·x - y)²

The loss function is therefore the total sample error expressed as a function of the model parameters. It is a three-dimensional surface over (w0, w1), and training has to find the pair w0, w1 at which loss reaches its minimum.
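
Gradient descent looks for that minimum by repeatedly moving w0 and w1 a small step against the gradient of the loss. For the loss above, the partial derivatives are:

∂loss/∂w0 = Σ(w0 + w1·x - y)
∂loss/∂w1 = Σ(w0 + w1·x - y)·x

and each iteration updates the parameters with a learning rate lrate:

w0 ← w0 - lrate·∂loss/∂w0
w1 ← w1 - lrate·∂loss/∂w1

These are exactly the d0 and d1 terms used in the code below.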

Example: visualizing the gradient descent process

  1. Prepare the training data, implement the gradient descent rule by hand, solve for w0 and w1, and draw the regression line.
import numpy as np
import matplotlib.pyplot as mp
train_x = np.array([0.5, 0.6, 0.8, 1.1, 1.4])
train_y = np.array([5.0, 5.5, 6.0, 6.8, 7.0])
test_x = np.array([0.45, 0.55, 1.0, 1.3, 1.5])
test_y = np.array([4.8, 5.3, 6.4, 6.9, 7.3])

times = 1000	# number of gradient descent iterations
lrate = 0.01	# learning rate: step size of each parameter update
epoches = []	# records the index of each iteration
w0, w1, losses = [1], [1], []
for i in range(1, times + 1):
    epoches.append(i)
    loss = (((w0[-1] + w1[-1] * train_x) - train_y) ** 2).sum() / 2
    losses.append(loss)
    d0 = ((w0[-1] + w1[-1] * train_x) - train_y).sum()
    d1 = (((w0[-1] + w1[-1] * train_x) - train_y) * train_x).sum()
    print('{:4}> w0={:.8f}, w1={:.8f}, loss={:.8f}'.format(epoches[-1], w0[-1], w1[-1], losses[-1]))
    w0.append(w0[-1] - lrate * d0)
    w1.append(w1[-1] - lrate * d1)

pred_test_y = w0[-1] + w1[-1] * test_x
mp.figure('Linear Regression', facecolor='lightgray')
mp.title('Linear Regression', fontsize=20)
mp.xlabel('x', fontsize=14)
mp.ylabel('y', fontsize=14)
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.scatter(train_x, train_y, marker='s', c='dodgerblue', alpha=0.5, s=80, label='Training')
mp.scatter(test_x, test_y, marker='D', c='orangered', alpha=0.5, s=60, label='Testing')
mp.scatter(test_x, pred_test_y, c='orangered', alpha=0.5, s=80, label='Predicted')
mp.plot(test_x, pred_test_y, '--', c='limegreen', label='Regression', linewidth=1)
mp.legend()
mp.show()
  2. Plot how w0, w1, and loss change with each gradient descent iteration.
w0 = w0[:-1]
w1 = w1[:-1]

mp.figure('Training Progress', facecolor='lightgray')
mp.subplot(311)
mp.title('Training Progress', fontsize=20)
mp.ylabel('w0', fontsize=14)
mp.gca().xaxis.set_major_locator(mp.MultipleLocator(100))
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.plot(epoches, w0, c='dodgerblue', label='w0')
mp.legend()
mp.subplot(312)
mp.ylabel('w1', fontsize=14)
mp.gca().xaxis.set_major_locator(mp.MultipleLocator(100))
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.plot(epoches, w1, c='limegreen', label='w1')
mp.legend()

mp.subplot(313)
mp.xlabel('epoch', fontsize=14)
mp.ylabel('loss', fontsize=14)
mp.gca().xaxis.set_major_locator(mp.MultipleLocator(100))
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.plot(epoches, losses, c='orangered', label='loss')
mp.legend()
  3. Plot each point of the gradient descent path on the 3D loss surface.
import mpl_toolkits.mplot3d as axes3d

grid_w0, grid_w1 = np.meshgrid(
    np.linspace(0, 9, 500),
    np.linspace(0, 3.5, 500))

grid_loss = np.zeros_like(grid_w0)
for x, y in zip(train_x, train_y):
    grid_loss += ((grid_w0 + x*grid_w1 - y) ** 2) / 2

mp.figure('Loss Function')
ax = mp.axes(projection='3d')  # mp.gca(projection=...) was removed in newer Matplotlib versions
mp.title('Loss Function', fontsize=20)
ax.set_xlabel('w0', fontsize=14)
ax.set_ylabel('w1', fontsize=14)
ax.set_zlabel('loss', fontsize=14)
ax.plot_surface(grid_w0, grid_w1, grid_loss, rstride=10, cstride=10, cmap='jet')
ax.plot(w0, w1, losses, 'o-', c='orangered', label='BGD')
mp.legend()
  4. Draw the gradient descent process as a contour plot.
mp.figure('Batch Gradient Descent', facecolor='lightgray')
mp.title('Batch Gradient Descent', fontsize=20)
mp.xlabel('x', fontsize=14)
mp.ylabel('y', fontsize=14)
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.contourf(grid_w0, grid_w1, grid_loss, 10, cmap='jet')
cntr = mp.contour(grid_w0, grid_w1, grid_loss, 10,
                  colors='black', linewidths=0.5)
mp.clabel(cntr, inline_spacing=0.1, fmt='%.2f',
          fontsize=8)
mp.plot(w0, w1, 'o-', c='orangered', label='BGD')
mp.legend()
mp.show()

Linear regression API:

import sklearn.linear_model as lm
# create the model
model = lm.LinearRegression()
# train the model
# the input is a two-dimensional array: the sample matrix
# the output is the target value of each sample
model.fit(x, y)  # computes the model parameters (sklearn solves this by ordinary least squares)
# predict outputs
# array is a two-dimensional array: each row is a sample, each column is a feature
result = model.predict(array)


Concentration	Depth	Temperature	Corrosion rate
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002

0.002	 200      -2              ?
0.003	 300      -4              ?
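
With multiple features, nothing changes except that each sample becomes a row of the input matrix. A minimal sketch under that assumption (the numbers below are illustrative placeholders, not real measurements):

import numpy as np
import sklearn.linear_model as lm

# each row is one sample: [concentration, depth, temperature]
train_x = np.array([[0.001, 100, -1],
                    [0.002, 150, -2],
                    [0.004, 250, -3]])
train_y = np.array([0.0002, 0.0003, 0.0005])   # corrosion rate (made-up values)

model = lm.LinearRegression()
model.fit(train_x, train_y)
# predict the two unknown rows from the table above
print(model.predict([[0.002, 200, -2], [0.003, 300, -4]]))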



Example: train a linear regression model on the samples in single.txt and use the model to predict the test samples.

import numpy as np
import sklearn.linear_model as lm
import matplotlib.pyplot as mp
# load the data
x, y = np.loadtxt('../data/single.txt', delimiter=',', usecols=(0,1), unpack=True)
x = x.reshape(-1, 1)
# create the model
model = lm.LinearRegression()  # linear regression
# train the model
model.fit(x, y)
# predict outputs from the inputs
pred_y = model.predict(x)
mp.figure('Linear Regression', facecolor='lightgray')
mp.title('Linear Regression', fontsize=20)
mp.xlabel('x', fontsize=14)
mp.ylabel('y', fontsize=14)
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.scatter(x, y, c='dodgerblue', alpha=0.75, s=60, label='Sample')
mp.plot(x, pred_y, c='orangered', label='Regression')
mp.legend()
mp.show()

Evaluating training result error (metrics)

After a linear regression model has been trained, its error can be evaluated on a test set. sklearn.metrics provides several commonly used error metrics:

import sklearn.metrics as sm

# mean absolute error: 1/m Σ|actual - predicted|
sm.mean_absolute_error(y, pred_y)
# mean squared error: 1/m Σ(actual - predicted)²
sm.mean_squared_error(y, pred_y)
# median absolute error: MEDIAN(|actual - predicted|)
sm.median_absolute_error(y, pred_y)
# R² score: at most 1 (it can be negative for very poor models); the higher the score, the smaller the error
sm.r2_score(y, pred_y)

Example: evaluate the model error of the previous example with sm.

# mean absolute error: 1/m Σ|actual - predicted|
print(sm.mean_absolute_error(y, pred_y))
# mean squared error: 1/m Σ(actual - predicted)²
print(sm.mean_squared_error(y, pred_y))
# median absolute error: MEDIAN(|actual - predicted|)
print(sm.median_absolute_error(y, pred_y))
# R² score: at most 1; the higher the score, the smaller the error
print(sm.r2_score(y, pred_y))
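
A tiny worked example with made-up numbers, just to make what each metric measures concrete:

import numpy as np
import sklearn.metrics as sm

y = np.array([3.0, 5.0, 7.0, 9.0])         # actual outputs
pred_y = np.array([2.5, 5.0, 8.0, 9.5])    # predicted outputs
print(sm.mean_absolute_error(y, pred_y))   # (0.5+0+1+0.5)/4 = 0.5
print(sm.mean_squared_error(y, pred_y))    # (0.25+0+1+0.25)/4 = 0.375
print(sm.median_absolute_error(y, pred_y)) # median of |errors| = 0.5
print(sm.r2_score(y, pred_y))              # 1 - 1.5/20 = 0.925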

Saving and loading models

Model training is time-consuming, and a well-trained model is valuable. A trained model can be saved to disk and simply reloaded whenever it is needed, without retraining.

Model saving and loading API:

import pickle
pickle.dump(model_object, file_object) # save a model
model = pickle.load(file_object) # load a model

Example: save the trained model to disk.

# save the trained model object to a disk file
with open('../../data/linear.pkl', 'wb') as f:
    pickle.dump(model, f)

# load the model object from the disk file
with open('../../data/linear.pkl', 'rb') as f:
    model = pickle.load(f)
# predict outputs from the inputs
pred_y = model.predict(x)

Ridge Regression

An ordinary linear regression model finds the optimal parameters by minimizing the least-squares loss. In that process all training samples, including the few abnormal ones, influence the final parameters to the same degree, and the effect of outliers cannot be recognized during training. Ridge regression therefore adds a regularization term to the loss function used during iteration. This term limits how strongly the model parameters can adapt to abnormal samples and thus improves the fitting accuracy for the majority of normal samples.

import sklearn.linear_model as lm
# create the model
model = lm.Ridge(regularization_strength, fit_intercept=whether_to_fit_an_intercept, max_iter=maximum_iterations)
# train the model
# the input is a two-dimensional array: the sample matrix
# the output is the target value of each sample
model.fit(x, y)
# predict outputs
# array is a two-dimensional array: each row is a sample, each column is a feature
result = model.predict(array)

Example: load the data in abnormal.txt and train a regression model with the ridge regression algorithm.

import numpy as np
import sklearn.linear_model as lm
import matplotlib.pyplot as mp
# load the data
x, y = np.loadtxt('../data/single.txt', delimiter=',', usecols=(0,1), unpack=True)
x = x.reshape(-1, 1)
# create a linear regression model
model = lm.LinearRegression() 
# train the model
model.fit(x, y)
# predict outputs from the inputs
pred_y1 = model.predict(x)
# create a ridge regression model
model = lm.Ridge(150, fit_intercept=True, max_iter=10000) 
# train the model
model.fit(x, y)
# predict outputs from the inputs
pred_y2 = model.predict(x)

mp.figure('Linear & Ridge', facecolor='lightgray')
mp.title('Linear & Ridge', fontsize=20)
mp.xlabel('x', fontsize=14)
mp.ylabel('y', fontsize=14)
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.scatter(x, y, c='dodgerblue', alpha=0.75,
           s=60, label='Sample')
sorted_indices = x.T[0].argsort()
mp.plot(x[sorted_indices], pred_y1[sorted_indices],
        c='orangered', label='Linear')
mp.plot(x[sorted_indices], pred_y2[sorted_indices],
        c='limegreen', label='Ridge')
mp.legend()
mp.show()
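
The outline above asks how to choose a suitable value for the regularization strength. A hedged sketch of one common approach (the candidate values and the 80/20 holdout split below are arbitrary choices, not a fixed rule): train one Ridge model per candidate value and keep the one that scores best on held-out data.

import numpy as np
import sklearn.linear_model as lm
import sklearn.metrics as sm

x, y = np.loadtxt('../data/single.txt', delimiter=',', usecols=(0,1), unpack=True)
x = x.reshape(-1, 1)
# simple holdout split (assumed 80/20)
n = int(len(x) * 0.8)
train_x, test_x, train_y, test_y = x[:n], x[n:], y[:n], y[n:]
# compare candidate regularization strengths on the held-out data
for alpha in [1, 10, 50, 100, 150, 200]:
    model = lm.Ridge(alpha, fit_intercept=True, max_iter=10000)
    model.fit(train_x, train_y)
    score = sm.r2_score(test_y, model.predict(test_x))
    print('alpha={:>3}, test r2={:.4f}'.format(alpha, score))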

Polynomial Regression

Concentration	Depth	Temperature	Corrosion rate
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002
0.001	 100	  -1			0.0002

0.002	 200      -2              ?
0.003	 300      -4              ?

If you want the regression model to fit the training samples more closely, you can use a polynomial regressor.

Univariate polynomial regression

y = w0 + w1·x + w2·x² + w3·x³ + … + wd·x^d

Treating the higher-order terms as an expansion of the first-order feature gives:

y = w0 + w1·x1 + w2·x2 + w3·x3 + … + wd·xd, where x1 = x, x2 = x², x3 = x³, …, xd = x^d

Univariate polynomial regression can therefore be viewed as multivariate linear regression, so the LinearRegression model can be used to train it on the sample data.

So implementing univariate polynomial regression takes two steps (a small sketch of what the expansion produces follows this list):

  1. Convert the univariate polynomial regression problem into a multivariate linear regression problem (only the highest polynomial degree needs to be given).
  2. Treat the expanded features x1, x2, … produced in step 1 as sample features and hand them to a linear regressor to train a multivariate linear model.
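
A minimal sketch of what step 1 produces: PolynomialFeatures expands a single-column x into the columns [1, x, x², x³], which the linear regressor then treats as ordinary features.

import numpy as np
import sklearn.preprocessing as sp

x = np.array([[1.0], [2.0], [3.0]])
print(sp.PolynomialFeatures(3).fit_transform(x))
# [[ 1.  1.  1.  1.]
#  [ 1.  2.  4.  8.]
#  [ 1.  3.  9. 27.]]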

sklearn's pipeline can be used to execute the two steps in sequence:

import sklearn.pipeline as pl
import sklearn.preprocessing as sp
import sklearn.linear_model as lm

model = pl.make_pipeline(
    sp.PolynomialFeatures(10),  # polynomial feature expander
    lm.LinearRegression())      # linear regressor

Example:

import numpy as np
import sklearn.pipeline as pl
import sklearn.preprocessing as sp
import sklearn.linear_model as lm
import sklearn.metrics as sm
import matplotlib.pyplot as mp
# load the data
x, y = np.loadtxt('../data/single.txt', delimiter=',', usecols=(0,1), unpack=True)
x = x.reshape(-1, 1)
# create the model (pipeline)
model = pl.make_pipeline(
    sp.PolynomialFeatures(10),  # polynomial feature expander
    lm.LinearRegression())      # linear regressor
# train the model
model.fit(x, y)
# predict outputs from the inputs
pred_y = model.predict(x)
test_x = np.linspace(x.min(), x.max(), 1000).reshape(-1, 1)
pred_test_y = model.predict(test_x)
mp.figure('Polynomial Regression', facecolor='lightgray')
mp.title('Polynomial Regression', fontsize=20)
mp.xlabel('x', fontsize=14)
mp.ylabel('y', fontsize=14)
mp.tick_params(labelsize=10)
mp.grid(linestyle=':')
mp.scatter(x, y, c='dodgerblue', alpha=0.75, s=60, label='Sample')
mp.plot(test_x, pred_test_y, c='orangered', label='Regression')
mp.legend()
mp.show()

A model that is too simple cannot reach sufficiently high prediction accuracy on either the training data or the test data; this is called underfitting.

A model that is too complex can reach high prediction accuracy on the training data but usually scores much lower on the test data; this is called overfitting.

A model with acceptable performance should have similar prediction accuracy on the training data and the test data, and that accuracy should not be too low.

Training R²   Test R²
0.3           0.4        Underfitting: too simple, fails to capture the pattern in the data
0.9           0.2        Overfitting: too complex and too specialized, lacks generality
0.7           0.6        Acceptable: moderate complexity, captures the pattern while keeping generality
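
A hedged sketch of how this shows up on the single.txt data used earlier (the degrees and the 80/20 split are arbitrary choices): as the polynomial degree grows, the training R² tends to keep rising while the test R² typically stops improving or drops.

import numpy as np
import sklearn.pipeline as pl
import sklearn.preprocessing as sp
import sklearn.linear_model as lm
import sklearn.metrics as sm

x, y = np.loadtxt('../data/single.txt', delimiter=',', usecols=(0,1), unpack=True)
x = x.reshape(-1, 1)
n = int(len(x) * 0.8)
train_x, test_x, train_y, test_y = x[:n], x[n:], y[:n], y[n:]
for degree in [1, 4, 10, 20]:
    model = pl.make_pipeline(sp.PolynomialFeatures(degree), lm.LinearRegression())
    model.fit(train_x, train_y)
    r2_train = sm.r2_score(train_y, model.predict(train_x))
    r2_test = sm.r2_score(test_y, model.predict(test_x))
    print('degree={:>2}  train r2={:.3f}  test r2={:.3f}'.format(degree, r2_train, r2_test))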

Case study: predicting housing prices in the Boston area.

  1. Read the data, shuffle the original dataset, and split it into training and test sets.
import sklearn.datasets as sd
import sklearn.utils as su
# load the Boston housing price dataset
# (note: load_boston was removed in scikit-learn 1.2; this example needs an older version)
boston = sd.load_boston()
print(boston.feature_names)
# |CRIM|ZN|INDUS|CHAS|NOX|RM|AGE|DIS|RAD|TAX|PTRATIO|B|LSTAT|
# crime rate | residential land ratio | commercial land ratio | bounds the river | air quality (NOX) | rooms per dwelling | building age | distance to employment centers | road accessibility | property tax | pupil-teacher ratio | proportion of Black residents | lower-status population ratio |
# shuffle the inputs and outputs of the original dataset
x, y = su.shuffle(boston.data, boston.target, random_state=7)
# split into training and test sets
train_size = int(len(x) * 0.8)
train_x, test_x, train_y, test_y = \
    x[:train_size], x[train_size:], \
    y[:train_size], y[train_size:]
  2. Train models with ridge regression and polynomial regression and test their performance.
For the full code, see the Boston housing price analysis and prediction case in the code summary below; a condensed sketch follows.
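
A condensed sketch of that step, reusing train_x/train_y/test_x/test_y from step 1 (the degree 2 and the regularization strength 100 are illustrative values, not tuned results):

import sklearn.pipeline as pl
import sklearn.preprocessing as sp
import sklearn.linear_model as lm
import sklearn.metrics as sm

# polynomial feature expansion followed by ridge regression
model = pl.make_pipeline(
    sp.PolynomialFeatures(2),        # illustrative degree
    lm.Ridge(100, max_iter=10000))   # illustrative regularization strength
model.fit(train_x, train_y)
pred_test_y = model.predict(test_x)
print(sm.r2_score(test_y, pred_test_y))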

Code Summary

Linear Regression

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
x = np.array([0.5, 0.6, 0.8, 1.1, 1.4])
y = np.array([5.0, 5.5, 6.0, 6.8, 7.0])
data = pd.DataFrame({
     'x':x, 'y':y})
data.plot.scatter(x='x', y='y', s=80)

[Figure: scatter plot of the sample points x vs. y]

# Based on linear regression and gradient descent, find the line that best fits this sample data
w0, w1 = 1, 1  # model parameters
lrate = 0.01   # learning rate
times = 1000   # number of iterations

# collect values for later plotting
epoches, w0s, w1s, losses= [], [], [], []
for i in range(times):
    # compute the loss for the current w0 and w1
    loss = ((w0 + w1*x - y)**2).sum()/2
    epoches.append(i+1)
    w0s.append(w0)
    w1s.append(w1)
    losses.append(loss)
    # print how the model parameters change
    print('{:4}, w0:{:.8f}, w1:{:.8f}, loss:{:.8f}'.format(i+1, w0, w1, loss))
    # update w0 and w1 (compute the partial derivatives and plug them into the update rule)
    d0 = (w0 + w1*x - y).sum()
    d1 = (x*(w0 + w1*x - y)).sum()
    w0 = w0 - lrate * d0
    w1 = w1 - lrate * d1
   1, w0:1.00000000, w1:1.00000000, loss:44.17500000
   2, w0:1.20900000, w1:1.19060000, loss:36.53882794
   3, w0:1.39916360, w1:1.36357948, loss:30.23168666
   4, w0:1.57220792, w1:1.52054607, loss:25.02222743
   5, w0:1.72969350, w1:1.66296078, loss:20.71937337
   6, w0:1.87303855, w1:1.79215140, loss:17.16530917
   7, w0:2.00353196, w1:1.90932461, loss:14.22969110
   8, w0:2.12234508, w1:2.01557706, loss:11.80486494
   9, w0:2.23054244, w1:2.11190537, loss:9.80191627
  10, w0:2.32909148, w1:2.19921529, loss:8.14740839
  11, w0:2.41887143, w1:2.27832995, loss:6.78068803
  12, w0:2.50068134, w1:2.34999742, loss:5.65166010
  13, w0:2.57524739, w1:2.41489755, loss:4.71894976
  14, w0:2.64322953, w1:2.47364820, loss:3.94838447
  15, w0:2.70522753, w1:2.52681085, loss:3.31174023
  16, w0:2.76178648, w1:2.57489580, loss:2.78570611
  17, w0:2.81340174, w1:2.61836680, loss:2.35102901
  18, w0:2.86052351, w1:2.65764531, loss:1.99180729
  19, w0:2.90356094, w1:2.69311435, loss:1.69490738
  20, w0:2.94288586, w1:2.72512202, loss:1.44948190
  21, w0:2.97883620, w1:2.75398465, loss:1.24657173
  22, w0:3.01171907, w1:2.77998973, loss:1.07877728
  23, w0:3.04181357, w1:2.80339855, loss:0.93998705
  24, w0:3.06937335, w1:2.82444853, loss:0.82515337
  25, w0:3.09462895, w1:2.84335548, loss:0.73010724
  26, w0:3.11778986, w1:2.86031549, loss:0.65140537
  27, w0:3.13904648, w1:2.87550680, loss:0.58620385
  28, w0:3.15857186, w1:2.88909135, loss:0.53215381
  29, w0:3.17652325, w1:2.90121635, loss:0.48731526
  30, w0:3.19304357, w1:2.91201557, loss:0.45008589
  31, w0:3.20826270, w1:2.92161056, loss:0.41914232
  32, w0:3.22229870, w1:2.93011181, loss:0.39339153
  33, w0:3.23525885, w1:2.93761973, loss:0.37193074
  34, w0:3.24724064, w1:2.94422555, loss:0.35401433
  35, w0:3.25833268, w1:2.95001219, loss:0.33902647
  36, w0:3.26861551, w1:2.95505501, loss:0.32645850
  37, w0:3.27816232, w1:2.95942250, loss:0.31589033
  38, w0:3.28703961, w1:2.96317688, loss:0.30697497
  39, w0:3.29530785, w1:2.96637472, loss:0.29942583
  40, w0:3.30302197, w1:2.96906741, loss:0.29300619
  41, w0:3.31023190, w1:2.97130167, loss:0.28752056
  42, w0:3.31698303, w1:2.97311993, loss:0.28280743
  43, w0:3.32331660, w1:2.97456078, loss:0.27873344
  44, w0:3.32927010, w1:2.97565926, loss:0.27518842
  45, w0:3.33487759, w1:2.97644724, loss:0.27208136
  46, w0:3.34017003, w1:2.97695365, loss:0.26933710
  47, w0:3.34517557, w1:2.97720482, loss:0.26689356
  48, w0:3.34991978, w1:2.97722464, loss:0.26469945
  49, w0:3.35442590, w1:2.97703484, loss:0.26271241
  50, w0:3.35871508, w1:2.97665516, loss:0.26089744
  51, w0:3.36280649, w1:2.97610354, loss:0.25922564
  52, w0:3.36671761, w1:2.97539628, loss:0.25767311
  53, w0:3.37046430, w1:2.97454819, loss:0.25622012
  54, w0:3.37406096, w1:2.97357273, loss:0.25485037
  55, w0:3.37752071, w1:2.97248213, loss:0.25355040
  56, w0:3.38085546, w1:2.97128751, loss:0.25230905
  57, w0:3.38407604, w1:2.96999896, loss:0.25111715
  58, w0:3.38719228, w1:2.96862566, loss:0.24996707
  59, w0:3.39021314, w1:2.96717595, loss:0.24885253
  60, w0:3.39314674, w1:2.96565739, loss:0.24776834
  61, w0:3.39600048, w1:2.96407688, loss:0.24671019
  62, w0:3.39878107, w1:2.96244066, loss:0.24567453
  63, w0:3.40149463, w1:2.96075442, loss:0.24465841
  64, w0:3.40414670, w1:2.95902331, loss:0.24365939
  65, w0:3.40674234, w1:2.95725202, loss:0.24267546
  66, w0:3.40928614, w1:2.95544482, loss:0.24170494
  67, w0:3.41178226, w1:2.95360557, loss:0.24074644
  68, w0:3.41423450, w1:2.95173778, loss:0.23979882
  69, w0:3.41664631, w1:2.94984465, loss:0.23886112
  70, w0:3.41902083, w1:2.94792908, loss:0.23793253
  71, w0:3.42136091, w1:2.94599370, loss:0.23701241
  72, w0:3.42366914, w1:2.94404090, loss:0.23610019
  73, w0:3.42594789, w1:2.94207285, loss:0.23519540
  74, w0:3.42819929, w1:2.94009152, loss:0.23429768
  75, w0:3.43042530, w1:2.93809871, loss:0.23340668
  76, w0:3.43262769, w1:2.93609603, loss:0.23252214
  77, w0:3.43480808, w1:2.93408497, loss:0.23164382
  78, w0:3.43696793, w1:2.93206686, loss:0.23077153
  79, w0:3.43910860, w1:2.93004291, loss:0.22990510
  80, w0:3.44123128, w1:2.92801424, loss:0.22904438
  81, w0:3.44333709, w1:2.92598183, loss:0.22818926
  82, w0:3.44542703, w1:2.92394660, loss:0.22733961
  83, w0:3.44750203, w1:2.92190938, loss:0.22649536
  84, w0:3.44956292, w1:2.91987089, loss:0.22565641
  85, w0:3.45161045, w1:2.91783183, loss:0.22482269
  86, w0:3.45364533, w1:2.91579280, loss:0.22399414
  87, w0:3.45566818, w1:2.91375437, loss:0.22317069
  88, w0:3.45767958, w1:2.91171702, loss:0.22235230
  89, w0:3.45968005, w1:2.90968123, loss:0.22153891
  90, w0:3.46167007, w1:2.90764740, loss:0.22073048
  91, w0:3.46365008, w1:2.90561590, loss:0.21992696
  92, w0:3.46562048, w1:2.90358707, loss:0.21912831
  93, w0:3.46758162, w1:2.90156122, loss:0.21833450
  94, w0:3.46953385, w1:2.89953863, loss:0.21754549
  95, w0:3.47147746, w1:2.89751953, loss:0.21676124
  96, w0:3.47341273, w1:2.89550416, loss:0.21598172
  97, w0:3.47533991, w1:2.89349271, loss:0.21520689
  98, w0:3.47725923, w1:2.89148538, loss:0.21443674
  99, w0:3.47917091, w1:2.88948232, loss:0.21367121
 100, w0:3.48107515, w1:2.88748368, loss:0.21291029
 101, w0:3.48297211, w1:2.88548960, loss:0.21215395
 102, w0:3.48486196, w1:2.88350018, loss:0.21140215
 103, w0:3.48674485, w1:2.88151555, loss:0.21065487
 104, w0:3.48862093, w1:2.87953579, loss:0.20991208
 105, w0:3.49049030, w1:2.87756099, loss:0.20917375
 106, w0:3.49235311, w1:2.87559122, loss:0.20843985
 107, w0:3.49420944, w1:2.87362655, loss:0.20771036
 108, w0:3.49605940, w1:2.87166704, loss:0.20698525
 109, w0:3.49790308, w1:2.86971274, loss:0.20626449
 110, w0:3.49974056, w1:2.86776370, loss:0.20554806
 111, w0:3.50157193, w1:2.86581996, loss:0.20483593
 112, w0:3.50339726, w1:2.86388156, loss:0.20412807
 113, w0:3.50521661, w1:2.86194851, loss:0.20342446
 114, w0:3.50703004, w1:2.86002086, loss:0.20272507
 115, w0:3.50883762, w1:2.85809861, loss:0.20202988
 116, w0:3.51063940, w1:2.85618180, loss:0.20133886
 117, w0:3.51243543, w1:2.85427043, loss:0.20065199
 118, w0:3.51422576, w1:2.85236452, loss:0.19996924
 119, w0:3.51601043, w1:2.85046407, loss:0.19929059
 120, w0:3.51778949, w1:2.84856910, loss:0.19861600
 121, w0:3.51956298, w1:2.84667961, loss:0.19794547
 122, w0:3.52133093, w1:2.84479560, loss:0.19727896
 123, w0:3.52309337, w1:2.84291707, loss:0.19661645
 124, w0:3.52485035, w1:2.84104403, loss:0.19595791
 125, w0:3.52660190, w1:2.83917647, loss:0.19530333
 126, w0:3.52834804, w1:2.83731439, loss:0.19465267
 127, w0:3.53008880, w1:2.83545778, loss:0.19400592
 128, w0:3.53182422, w1:2.83360663, loss:0.19336305
 129, w0:3.53355432, w1:2.83176096, loss:0.19272403
 130, w0:3.53527912, w1:2.82992073, loss:0.19208885
 131, w0:3.53699865, w1:2.82808595, loss:0.19145748
 132, w0:3.53871294, w1:2.82625661, loss:0.19082990
 133, w0:3.54042200, w1:2.82443270, loss:0.19020608
 134, w0:3.54212586, w1:2.82261421, loss:0.18958601
 135, w0:3.54382454, w1:2.82080112, loss:0.18896966
 136, w0:3.54551807, w1:2.81899343, loss:0.18835700
 137, w0:3.54720645, w1:2.81719113, loss:0.18774802
 138, w0:3.54888972, w1:2.81539420, loss:0.18714270
 139, w0:3.55056789, w1:2.81360263, loss:0.18654100
 140, w0:3.55224098, w1:2.81181640, loss:0.18594292
 141, w0:3.55390901, w1:2.81003551, loss:0.18534843
 142, w0:3.55557200, w1:2.80825995, loss:0.18475750
 143, w0:3.55722996, w1:2.80648969, loss:0.18417012
 144, w0:3.55888291, w1:2.80472473, loss:0.18358626
 145, w0:3.56053088, w1:2.80296505, loss:0.18300591
 146, w0:3.56217387, w1:2.80121063, loss:0.18242904
 147, w0:3.56381191, w1:2.79946147, loss:0.18185563
 148, w0:3.56544501, w1:2.79771755, loss:0.18128566
 149, w0:3.56707319, w1:2.79597886, loss:0.18071911
 150, w0:3.56869646, w1:2.79424537, loss:0.18015596
 151, w0:3.57031484, w1:2.79251708, loss:0.17959618
 152, w0:3.57192835, w1:2.79079397, loss:0.17903977
 153, w0:3.57353699, w1:2.78907603, loss:0.17848670
 154, w0:3.57514080, w1:2.78736324, loss:0.17793694
 155, w0:3.57673978, w1:2.78565559, loss:0.17739048
 156, w0:3.57833394, w1:2.78395307, loss:0.17684730
 157, w0:3.57992331, w1:2.78225565, loss:0.17630738
 158, w0:3.58150790, w1:2.78056332, loss:0.17577070
 159, w0:3.58308772, w1:2.77887607, loss:0.17523724
 160, w0:3.58466278, w1:2.77719389, loss:0.17470698
 161, w0:3.58623311, w1:2.77551676, loss:0.17417991
 162, w0:3.58779872, w1:2.77384466, loss:0.17365599
 163, w0:3.58935962, w1:2.77217758, loss:0.17313522
 164, w0:3.59091582, w1:2.77051551, loss:0.17261757
 165, w0:3.59246735, w1:2.76885843, loss:0.17210303
 166, w0:3.59401421, w1:2.76720632, loss:0.17159158
 167, w0:3.59555642, w1:2.76555918, loss:0.17108319
 168, w0:3.59709400, w1:2.76391698, loss:0.17057786
 169, w0:3.59862695, w1:2.76227971, loss:0.17007555
 170, w0:3.60015530, w1:2.76064737, loss:0.16957626
 171, w0:3.60167905, w1:2.75901992, loss:0.16907997
 172, w0:3.60319822, w1:2.75739736, loss:0.16858665
 173, w0:3.60471282, w1:2.75577968, loss:0.16809630
 174, w0:3.60622288, w1:2.75416685, loss:0.16760889
 175, w0:3.60772839, w1:2.75255887, loss:0.16712440
 176, w0:3.60922938, w1:2.75095572, loss:0.16664281
 177, w0:3.61072586, w1:2.74935738, loss:0.16616412
 178, w0:3.61221784, w1:2.74776385, loss:0.16568830
 179, w0:3.61370534, w1:2.74617510, loss:0.16521534
 180, w0:3.61518837, w1:2.74459113, loss:0.16474521
 181, w0:3.61666694, w1:2.74301191, loss:0.16427790
 182, w0:3.61814107, w1:2.74143744, loss:0.16381340
 183, w0:3.61961077, w1:2.73986770, loss:0.16335168
 184, w0:3.62107605, w1:2.73830267, loss:0.16289274
 185, w0:3.62253693, w1:2.73674235, loss:0.16243655
 186, w0:3.62399342, w1:2.73518671, loss:0.16198309
 187, w0:3.62544554, w1:2.73363575, loss:0.16153236
 188, w0:3.62689329, w1:2.73208944, loss:0.16108433
 189, w0:3.62833669, w1:2.73054778, loss:0.16063899
 190, w0:3.62977575, w1:2.72901076, loss:0.16019632
 191, w0:3.63121049, w1:2.72747835, loss:0.15975630
 192, w0:3.63264092, w1:2.72595055, loss:0.15931893
 193, w0:3.63406705, w1:2.72442733, loss:0.15888418
 194, w0:3.63548889, w1:2.72290869, loss:0.15845204
 195, w0:3.63690647, w1:2.72139462, loss:0.15802249
 196, w0:3.63831978, w1:2.71988509, loss:0.15759552
 197, w0:3.63972885, w1:2.71838010, loss:0.15717112
 198, w0:3.64113368, w1:2.71687963, loss:0.15674925
 199, w0:3.64253429, w1:2.71538367, loss:0.15632992
 200, w0:3.64393070, w1:2.71389220, loss:0.15591311
 201, w0:3.64532290, w1:2.71240522, loss:0.15549879
 202, w0:3.64671093, w1:2.71092270, loss:0.15508697
 203, w0:3.64809478, w1:2.70944463, loss:0.15467761
 204, w0:3.64947448, w1:2.70797101, loss:0.15427071
 205, w0:3.65085003, w1:2.70650181, loss:0.15386625
 206, w0:3.65222145, w1:2.70503703, loss:0.15346422
 207, w0:3.65358875, w1:2.70357665, loss:0.15306460
 208, w0:3.65495194, w1:2.70212066, loss:0.15266737
 209, w0:3.65631103, w1:2.70066904, loss:0.15227254
 210, w0:3.65766604, w1:2.69922178, loss:0.15188007
 211, w0:3.65901698, w1:2.69777887, loss:0.15148995
 212, w0:3.66036386, w1:2.69634030, loss:0.15110218
 213, w0:3.66170670, w1:2.69490605, loss:0.15071673
 214, w0:3.66304550, w1:2.69347611, loss:0.15033359
 215, w0:3.66438027, w1:2.69205046, loss:0.14995276
 216, w0:3.66571104, w1:2.69062910, loss:0.14957420
 217, w0:3.66703781, w1:2.68921201, loss:0.14919792
 218, w0:3.66836059, w1:2.68779917, loss:0.14882390
 219, w0:3.66967939, w1:2.68639058, loss:0.14845212
 220, w0:3.67099424, w1:2.68498623, loss:0.14808258
 221, w0:3.67230513, w1:2.68358609, loss:0.14771525
 222, w0:3.67361209, w1:2.68219016, loss:0.14735012
 223, w0:3.67491512, w1:2.68079842, loss:0.14698718
 224, w0:3.67621423, w1:2.67941087, loss:0.14662643
 225, w0:3.67750944, w1:2.67802748, loss:0.14626783
 226, w0:3.67880076, w1:2.67664825, loss:0.14591139
 227, w0:3.68008820, w1:2.67527316, loss:0.14555709
 228, w0:3.68137177, w1:2.67390221, loss:0.14520491
 229, w0:3.68265148, w1:2.67253537, loss:0.14485485
 230, w0:3.68392735, w1:2.67117264, loss:0.14450688
 231, w0:3.68519939, w1:2.66981401, loss:0.14416101
 232, w0:3.68646760, w1:2.66845946, loss:0.14381721
 233, w0:3.68773201, w1:2.66710898, loss:0.14347547
 234, w0:3.68899261, w1:2.66576255, loss:0.14313578
 235, w0:3.69024943, w1:2.66442017, loss:0.14279813
 236, w0:3.69150247, w1:2.66308182, loss:0.14246251
 237, w0:3.69275175, w1:2.66174750, loss:0.14212890
 238, w0:3.69399727, w1:2.66041718, loss:0.14179729
 239, w0:3.69523905, w1:2.65909086, loss:0.14146767
 240, w0:3.69647710, w1:2.65776853, loss:0.14114003
 241, w0:3.69771143, w1:2.65645017, loss:0.14081436
 242, w0:3.69894205, w1:2.65513577, loss:0.14049064
 243, w0:3.70016897, w1:2.65382532, loss:0.14016886
 244, w0:3.70139221, w1:2.65251880, loss:0.13984901
 245, w0:3.70261177, w1:2.65121621, loss:0.13953108
 246, w0:3.70382767, w1:2.64991754, loss:0.13921506
 247, w0:3.70503992, w1:2.64862277, loss:0.13890094
 248, w0:3.70624852, w1:2.64733188, loss:0.13858870
 249, w0:3.70745349, w1:2.64604488, loss:0.13827833
 250, w0:3.70865484, w1:2.64476174, loss:0.13796983
 251, w0:3.70985258, w1:2.64348246, loss:0.13766317
 252, w0:3.71104672, w1:2.64220702, loss:0.13735836
 253, w0:3.71223728, w1:2.64093542, loss:0.13705538
 254, w0:3.71342426, w1:2.63966763, loss:0.13675421
 255, w0:3.71460767, w1:2.63840365, loss:0.13645485
 256, w0:3.71578752, w1:2.63714347, loss:0.13615729
 257, w0:3.71696384, w1:2.63588708, loss:0.13586151
 258, w0:3.71813661, w1:2.63463446, loss:0.13556750
 259, w0:3.71930586, w1:2.63338561, loss:0.13527527
 260, w0:3.72047160, w1:2.63214051, loss:0.13498478
 261, w0:3.72163384, w1:2.63089915, loss:0.13469604
 262, w0:3.72279259, w1:2.62966152, loss:0.13440903
 263, w0:3.72394785, w1:2.62842760, loss:0.13412374
 264, w0:3.72509964, w1:2.62719740, loss:0.13384016
 265, w0:3.72624798, w1:2.62597089, loss:0.13355829
 266, w0:3.72739286, w1:2.62474806, loss:0.13327810
 267, w0:3.72853430, w1:2.62352891, loss:0.13299960
 268, w0:3.72967231, w1:2.62231343, loss:0.13272277
 269, w0:3.73080691, w1:2.62110159, loss:0.13244760
 270, w0:3.73193809, w1:2.61989340, loss:0.13217408
 271, w0:3.73306588, w1:2.61868883, loss:0.13190220
 272, w0:3.73419028, w1:2.61748789, loss:0.13163195
 273, w0:3.73531129, w1:2.61629055, loss:0.13136333
 274, w0:3.73642895, w1:2.61509681, loss:0.13109631
 275, w0:3.73754324, w1:2.61390666, loss:0.13083090
 276, w0:3.73865418, w1:2.61272008, loss:0.13056708
 277, w0:3.73976179, w1:2.61153707, loss:0.13030484
 278, w0:3.74086607, w1:2.61035761, loss:0.13004418
 279, w0:3.74196703, w1:2.60918170, loss:0.12978508
 280, w0:3.74306469, w1:2.60800932, loss:0.12952754
 281, w0:3.74415904, w1:2.60684046, loss:0.12927154
 282, w0:3.74525011, w1:2.60567511, loss:0.12901707
 283, w0:3.74633790, w1:2.60451327, loss:0.12876414
 284, w0:3.74742242, w1:2.60335492, loss:0.12851272
 285, w0:3.74850368, w1:2.60220004, loss:0.12826281
 286, w0:3.74958170, w1:2.60104864, loss:0.12801440
 287, w0:3.75065647, w1:2.59990069, loss:0.12776748
 288, w0:3.75172802, w1:2.59875620, loss:0.12752204
 289, w0:3.75279634, w1:2.59761514, loss:0.12727807
 290, w0:3.75386146, w1:2.59647751, loss:0.12703557
 291, w0:3.75492338, w1:2.59534330, loss:0.12679452
 292, w0:3.75598210, w1:2.59421250, loss:0.12655492
 293, w0:3.75703765, w1:2.59308510, loss:0.12631676
 294, w0:3.75809002, w1:2.59196108, loss:0.12608002
 295, w0:3.75913923, w1:2.59084044, loss:0.12584471
 296, w0:3.76018529, w1:2.58972316, loss:0.12561081
 297, w0:3.76122821, w1:2.58860925, loss:0.12537831
 298, w0:3.76226799, w1:2.58749868, loss:0.12514721
 299, w0:3.76330465, w1:2.58639144, loss:0.12491749
 300, w0:3.76433819, w1:2.58528754, loss:0.12468915
 301, w0:3.76536863, w1:2.58418695, loss:0.12446218
 302, w0:3.76639597, w1:2.58308966, loss:0.12423658
 303, w0:3.76742023, w1:2.58199568, loss:0.12401232
 304, w0:3.76844141, w1:2.58090498, loss:0.12378941
 305, w0:3.76945952, w1:2.57981756, loss:0.12356784
 306, w0:3.77047457, w1:2.57873340, loss:0.12334760
 307, w0:3.77148657, w1:2.57765251, loss:0.12312868
 308, w0:3.77249553, w1:2.57657486, loss:0.12291108
 309, w0:3.77350146, w1:2.57550044, loss:0.12269478
 310, w0:3.77450437, w1:2.57442926, loss:0.12247978
 311, w0:3.77550427, w1:2.57336129, loss:0.12226606
 312, w0:3.77650115, w1:2.57229654, loss:0.12205363
 313, w0:3.77749505, w1:2.57123498, loss:0.12184248
 314, w0:3.77848596, w1:2.57017661, loss:0.12163259
 315, w0:3.77947389, w1:2.56912142, loss:0.12142396
 316, w0:3.78045885, w1:2.56806940, loss:0.12121658
 317, w0:3.78144086, w1:2.56702055, loss:0.12101045
 318, w0:3.78241991, w1:2.56597484, loss:0.12080555
 319, w0:3.78339602, w1:2.56493228, loss:0.12060189
 320, w0:3.78436920, w1:2.56389285, loss:0.12039944
 321, w0:3.78533945, w1:2.56285654, loss:0.12019821
 322, w0:3.78630679, w1:2.56182334, loss:0.11999819
 323, w0:3.78727123, w1:2.56079325, loss:0.11979937
 324, w0:3.78823276, w1:2.55976626, loss:0.11960174
 325, w0:3.78919141, w1:2.55874235, loss:0.11940529
 326, w0:3.79014718, w1:2.55772151, loss:0.11921003
 327, w0:3.79110007, w1:2.55670375, loss:0.11901593
 328, w0:3.79205010, w1:2.55568904, loss:0.11882300
 329, w0:3.79299728, w1:2.55467738, loss:0.11863123
 330, w0:3.79394161, w1:2.55366876, loss:0.11844061
 331, w0:3.79488310, w1:2.55266317, loss:0.11825113
 332, w0:3.79582177, w1:2.55166060, loss:0.11806279
 333, w0:3.79675762, w1:2.55066104, loss:0.11787558
 334, w0:3.79769065, w1:2.54966449, loss:0.11768950
 335, w0:3.79862088, w1:2.54867093, loss:0.11750453
 336, w0:3.79954831, w1:2.54768036, loss:0.11732067
 337, w0:3.80047296, w1:2.54669276, loss:0.11713791
 338, w0:3.80139483, w1:2.54570813, loss:0.11695625
 339, w0:3.80231393, w1:2.54472646, loss:0.11677568
 340, w0:3.80323027, w1:2.54374773, loss:0.11659619
 341, w0:3.80414386, w1:2.54277195, loss:0.11641778
 342, w0:3.80505470, w1:2.54179910, loss:0.11624044
 343, w0:3.80596280, w1:2.54082917, loss:0.11606416
 344, w0:3.80686818, w1:2.53986216, loss:0.11588894
 345, w0:3.80777084, w1:2.53889805, loss:0.11571478
 346, w0:3.80867078, w1:2.53793684, loss:0.11554166
 347, w0:3.80956802, w1:2.53697852, loss:0.11536957
 348, w0:3.81046256, w1:2.53602308, loss:0.11519852
 349, w0:3.81135442, w1:2.53507050, loss:0.11502850
 350, w0:3.81224360, w1:2.53412079, loss:0.11485949
 351, w0:3.81313010, w1:2.53317394, loss:0.11469150
 352, w0:3.81401394, w1:2.53222992, loss:0.11452452
 353, w0:3.81489513, w1:2.53128875, loss:0.11435854
 354, w0:3.81577367, w1:2.53035040, loss:0.11419355
 355, w0:3.81664957, w1:2.52941487, loss:0.11402956
 356, w0:3.81752284, w1:2.52848215, loss:0.11386655
 357, w0:3.81839348, w1:2.52755224, loss:0.11370452
 358, w0:3.81926151, w1:2.52662511, loss:0.11354346
 359, w0:3.82012693, w1:2.52570078, loss:0.11338336
 360, w0:3.82098975, w1:2.52477922, loss:0.11322423
 361, w0:3.82184997, w1:2.52386043, loss:0.11306605
 362, w0:3.82270762, w1:2.52294440, loss:0.11290882
 363, w0:3.82356268, w1:2.52203112, loss:0.11275253
 364, w0:3.82441518, w1:2.52112059, loss:0.11259719
 365, w0:3.82526511, w1:2.52021279, loss:0.11244277
 366, w0:3.82611249, w1:2.51930772, loss:0.11228928
 367, w0:3.82695733, w1:2.51840537, loss:0.11213671
 368, w0:3.82779963, w1:2.51750573, loss:0.11198506
 369, w0:3.82863939, w1:2.51660879, loss:0.11183431
 370, w0:3.82947664, w1:2.51571455, loss:0.11168447
 371, w0:3.83031137, w1:2.51482299, loss:0.11153553
 372, w0:3.83114359, w1:2.51393412, loss:0.11138749
 373, w0:3.83197330, w1:2.51304791, loss:0.11124033
 374, w0:3.83280053, w1:2.51216437, loss:0.11109406
 375, w0:3.83362527, w1:2.51128348, loss:0.11094866
 376, w0:3.83444754, w1:2.51040524, loss:0.11080413
 377, w0:3.83526733, w1:2.50952964, loss:0.11066047
 378, w0:3.83608466, w1:2.50865666, loss:0.11051768
 379, w0:3.83689953, w1:2.50778631, loss:0.11037574
 380, w0:3.83771196, w1:2.50691858, loss:0.11023465
 381, w0:3.83852194, w1:2.50605345, loss:0.11009441
 382, w0:3.83932949, w1:2.50519092, loss:0.10995501
 383, w0:3.84013462, w1:2.50433099, loss:0.10981645
 384, w0:3.84093732, w1:2.50347363, loss:0.10967872
 385, w0:3.84173762, w1:2.50261886, loss:0.10954181
 386, w0:3.84253551, w1:2.50176665, loss:0.10940573
 387, w0:3.84333100, w1:2.50091700, loss:0.10927046
 388, w0:3.84412410, w1:2.50006991, loss:0.10913600
 389, w0:3.84491482, w1:2.49922536, loss:0.10900235
 390, w0:3.84570316, w1:2.49838334, loss:0.10886951
 391, w0:3.84648914, w1:2.49754386, loss:0.10873746
 392, w0:3.84727275, w1:2.49670690, loss:0.10860620
 393, w0:3.84805401, w1:2.49587245, loss:0.10847573
 394, w0:3.84883292, w1:2.49504051, loss:0.10834604
 395, w0:3.84960949, w1:2.49421107, loss:0.10821713
 396, w0:3.85038373, w1:2.49338413, loss:0.10808900
 397, w0:3.85115564, w1:2.49255967, loss:0.10796163
 398, w0:3.85192524, w1:2.49173768, loss:0.10783503
 399, w0:3.85269252, w1:2.49091816, loss:0.10770918
 400, w0:3.85345749, w1:2.49010111, loss:0.10758410
 401, w0:3.85422017, w1:2.48928651, loss:0.10745976
 402, w0:3.85498055, w1:2.48847436, loss:0.10733617
 403, w0:3.85573865, w1:2.48766465, loss:0.10721332
 404, w0:3.85649448, w1:2.48685737, loss:0.10709120
 405, w0:3.85724803, w1:2.48605252, loss:0.10696982
 406, w0:3.85799932, w1:2.48525008, loss:0.10684917
 407, w0:3.85874835, w1:2.48445006, loss:0.10672924
 408, w0:3.85949513, w1:2.48365244, loss:0.10661003
 409, w0:3.86023966, w1:2.48285722, loss:0.10649154
 410, w0:3.86098196, w1:2.48206438, loss:0.10637376
 411, w0:3.86172203, w1:2.48127393, loss:0.10625668
 412, w0:3.86245988, w1:2.48048585, loss:0.10614031
 413, w0:3.86319551, w1:2.47970014, loss:0.10602464
 414, w0:3.86392892, w1:2.47891680, loss:0.10590966
 415, w0:3.86466014, w1:2.47813580, loss:0.10579536
 416, w0:3.86538916, w1:2.47735715, loss:0.10568176
 417, w0:3.86611598, w1:2.47658084, loss:0.10556884
 418, w0:3.86684063, w1:2.47580687, loss:0.10545659
 419, w0:3.86756309, w1:2.47503522, loss:0.10534502
 420, w0:3.86828339, w1:2.47426588, loss:0.10523411
 421, w0:3.86900152, w1:2.47349886, loss:0.10512388
 422, w0:3.86971750, w1:2.47273415, loss:0.10501430
 423, w0:3.87043132, w1:2.47197173, loss:0.10490538
 424, w0:3.87114300, w1:2.47121160, loss:0.10479712
 425, w0:3.87185254, w1:2.47045375, loss:0.10468950
 426, w0:3.87255994, w1:2.46969819, loss:0.10458253
 427, w0:3.87326523, w1:2.46894489, loss:0.10447620
 428, w0:3.87396839, w1:2.46819385, loss:0.10437051
 429, w0:3.87466944, w1:2.46744508, loss:0.10426546
 430, w0:3.87536839, w1:2.46669855, loss:0.10416103
 431, w0:3.87606523, w1:2.46595426, loss:0.10405723
 432, w0:3.87675998, w1:2.46521222, loss:0.10395406
 433, w0:3.87745264, w1:2.46447240, loss:0.10385150
 434, w0:3.87814323, w1:2.46373480, loss:0.10374956
 435, w0:3.87883173, w1:2.46299942, loss:0.10364823
 436, w0:3.87951817, w1:2.46226625, loss:0.10354750
 437, w0:3.88020255, w1:2.46153528, loss:0.10344739
 438, w0:3.88088487, w1:2.46080651, loss:0.10334787
 439, w0:3.88156514, w1:2.46007993, loss:0.10324895
 440, w0:3.88224337, w1:2.45935553, loss:0.10315062
 441, w0:3.88291955, w1:2.45863331, loss:0.10305289
 442, w0:3.88359371, w1:2.45791325, loss:0.10295574
 443, w0:3.88426584, w1:2.45719536, loss:0.10285917
 444, w0:3.88493595, w1:2.45647963, loss:0.10276318
 445, w0:3.88560405, w1:2.45576605, loss:0.10266777
 446, w0:3.88627014, w1:2.45505461, loss:0.10257293
 447, w0:3.88693423, w1:2.45434531, loss:0.10247866
 448, w0:3.88759633, w1:2.45363814, loss:0.10238495
 449, w0:3.88825643, w1:2.45293310, loss:0.10229181
 450, w0:3.88891456, w1:2.45223017, loss:0.10219923
 451, w0:3.88957070, w1:2.45152936, loss:0.10210720
 452, w0:3.89022487, w1:2.45083065, loss:0.10201572
 453, w0:3.89087708, w1:2.45013404, loss:0.10192480
 454, w0:3.89152733, w1:2.44943953, loss:0.10183442
 455, w0:3.89217562, w1:2.44874710, loss:0.10174458
 456, w0:3.89282197, w1:2.44805675, loss:0.10165528
 457, w0:3.89346637, w1:2.44736847, loss:0.10156651
 458, w0:3.89410884, w1:2.44668226, loss:0.10147828
 459, w0:3.89474938, w1:2.44599812, loss:0.10139058
 460, w0:3.89538800, w1:2.44531603, loss:0.10130340
 461, w0:3.89602469, w1:2.44463599, loss:0.10121675
 462, w0:3.89665947, w1:2.44395799, loss:0.10113061
 463, w0:3.89729235, w1:2.44328203, loss:0.10104500
 464, w0:3.89792332, w1:2.44260810, loss:0.10095990
 465, w0:3.89855240, w1:2.44193620, loss:0.10087530
 466, w0:3.89917958, w1:2.44126631, loss:0.10079122
 467, w0:3.89980489, w1:2.44059844, loss:0.10070764
 468, w0:3.90042831, w1:2.43993257, loss:0.10062456
 469, w0:3.90104986, w1:2.43926871, loss:0.10054198
 470, w0:3.90166955, w1:2.43860684, loss:0.10045990
 471, w0:3.90228737, w1:2.43794696, loss:0.10037830
 472, w0:3.90290333, w1:2.43728906, loss:0.10029720
 473, w0:3.90351745, w1:2.43663313, loss:0.10021659
 474, w0:3.90412972, w1:2.43597918, loss:0.10013645
 475, w0:3.90474015, w1:2.43532719, loss:0.10005680
 476, w0:3.90534874, w1:2.43467717, loss:0.09997763
 477, w0:3.90595551, w1:2.43402909, loss:0.09989893
 478, w0:3.90656046, w1:2.43338296, loss:0.09982070
 479, w0:3.90716358, w1:2.43273877, loss:0.09974295
 480, w0:3.90776490, w1:2.43209652, loss:0.09966566
 481, w0:3.90836441, w1:2.43145620, loss:0.09958883
 482, w0:3.90896211, w1:2.43081780, loss:0.09951246
 483, w0:3.90955802, w1:2.43018132, loss:0.09943656
 484, w0:3.91015214, w1:2.42954676, loss:0.09936110
 485, w0:3.91074448, w1:2.42891409, loss:0.09928611
 486, w0:3.91133504, w1:2.42828333, loss:0.09921156
 487, w0:3.91192382, w1:2.42765447, loss:0.09913745
 488, w0:3.91251083, w1:2.42702749, loss:0.09906380
 489, w0:3.91309608, w1:2.42640240, loss:0.09899058
 490, w0:3.91367957, w1:2.42577919, loss:0.09891780
 491, w0:3.91426131, w1:2.42515785, loss:0.09884547
 492, w0:3.91484130, w1:2.42453837, loss:0.09877356
 493, w0:3.91541954, w1:2.42392076, loss:0.09870209
 494, w0:3.91599605, w1:2.42330500, loss:0.09863104
 495, w0:3.91657083, w1:2.42269110, loss:0.09856042
 496, w0:3.91714388, w1:2.42207903, loss:0.09849023
 497, w0:3.91771521, w1:2.42146881, loss:0.09842045
 498, w0:3.91828482, w1:2.42086042, loss:0.09835110
 499, w0:3.91885272, w1:2.42025386, loss:0.09828216
 500, w0:3.91941892, w1:2.41964912, loss:0.09821363
 501, w0:3.91998341, w1:2.41904619, loss:0.09814552
 502, w0:3.92054621, w1:2.41844508, loss:0.09807781
 503, w0:3.92110731, w1:2.41784577, loss:0.09801051
 504, w0:3.92166673, w1:2.41724827, loss:0.09794362
 505, w0:3.92222447, w1:2.41665256, loss:0.09787712
 506, w0:3.92278054, w1:2.41605864, loss:0.09781103
 507, w0:3.92333493, w1:2.41546650, loss:0.09774533
 508, w0:3.92388766, w1:2.41487615, loss:0.09768002
 509, w0:3.92443872, w1:2.41428757, loss:0.09761511
 510, w0:3.92498813, w1:2.41370075, loss:0.09755059
 511, w0:3.92553589, w1:2.41311570, loss:0.09748645
 512, w0:3.92608201, w1:2.41253241, loss:0.09742270
 513, w0:3.92662648, w1:2.41195087, loss:0.09735933
 514, w0:3.92716932, w1:2.41137107, loss:0.09729634
 515, w0:3.92771053, w1:2.41079302, loss:0.09723373
 516, w0:3.92825011, w1:2.41021671, loss:0.09717150
 517, w0:3.92878807, w1:2.40964212, loss:0.09710964
 518, w0:3.92932441, w1:2.40906927, loss:0.09704815
 519, w0:3.92985914, w1:2.40849813, loss:0.09698702
 520, w0:3.93039227, w1:2.40792871, loss:0.09692627
 521, w0:3.93092379, w1:2.40736100, loss:0.09686588
 522, w0:3.93145372, w1:2.40679500, loss:0.09680585
 523, w0:3.93198205, w1:2.40623070, loss:0.09674618
 524, w0:3.93250880, w1:2.40566809, loss:0.09668687
 525, w0:3.93303396, w1:2.40510717, loss:0.09662792
 526, w0:3.93355755, w1:2.40454794, loss:0.09656932
 527, w0:3.93407956, w1:2.40399039, loss:0.09651107
 528, w0:3.93460001, w1:2.40343451, loss:0.09645317
 529, w0:3.93511889, w1:2.40288031, loss:0.09639562
 530, w0:3.93563621, w1:2.40232777, loss:0.09633842
 531, w0:3.93615198, w1:2.40177689, loss:0.09628155
 532, w0:3.93666619, w1:2.40122766, loss:0.09622503
 533, w0:3.93717887, w1:2.40068009, loss:0.09616885
 534, w0:3.93769000, w1:2.40013416, loss:0.09611300
 535, w0:3.93819960, w1:2.39958987, loss:0.09605749
 536, w0:3.93870766, w1:2.39904721, loss:0.09600231
 537, w0:3.93921420, w1:2.39850619, loss:0.09594747
 538, w0:3.93971922, w1:2.39796679, loss:0.09589295
 539, w0:3.94022272, w1:2.39742901, loss:0.09583876
 540, w0:3.94072471, w1:2.39689285, loss:0.09578490
 541, w0:3.94122519, w1:2.39635830, loss:0.09573135
 542, w0:3.94172416, w1:2.39582535, loss:0.09567813
 543, w0:3.94222164, w1:2.39529401, loss:0.09562523
 544, w0:3.94271762, w1:2.39476426, loss:0.09557265
 545, w0:3.94321211, w1:2.39423611, loss:0.09552038
 546, w0:3.94370512, w1:2.39370954, loss:0.09546842
 547, w0:3.94419664, w1:2.39318455, loss:0.09541678
 548, w0:3.94468669, w1:2.39266114, loss:0.09536545
 549, w0:3.94517527, w1:2.39213931, loss:0.09531442
 550, w0:3.94566237, w1:2.39161904, loss:0.09526370
 551, w0:3.94614802, w1:2.39110033, loss:0.09521329
 552, w0:3.94663220, w1:2.39058318, loss:0.09516318
 553, w0:3.94711493, w1:2.39006759, loss:0.09511337
 554, w0:3.94759621, w1:2.38955355, loss:0.09506385
 555, w0:3.94807604, w1:2.38904105, loss:0.09501464
 556, w0:3.94855444, w1:2.38853009, loss:0.09496572
 557, w0:3.94903139, w1:2.38802066, loss:0.09491709
 558, w0:3.94950691, w1:2.38751277, loss:0.09486876
 559, w0:3.94998100, w1:2.38700640, loss:0.09482071
 560, w0:3.95045367, w1:2.38650155, loss:0.09477295
 561, w0:3.95092492, w1:2.38599822, loss:0.09472548
 562, w0:3.95139475, w1:2.38549640, loss:0.09467830
 563, w0:3.95186317, w1:2.38499609, loss:0.09463140
 564, w0:3.95233019, w1:2.38449729, loss:0.09458478
 565, w0:3.95279580, w1:2.38399998, loss:0.09453843
 566, w0:3.95326001, w1:2.38350416, loss:0.09449237
 567, w0:3.95372282, w1:2.38300984, loss:0.09444658
 568, w0:3.95418425, w1:2.38251700, loss:0.09440107
 569, w0:3.95464429, w1:2.38202564, loss:0.09435583
 570, w0:3.95510295, w1:2.38153576, loss:0.09431087
 571, w0:3.95556023, w1:2.38104735, loss:0.09426617
 572, w0:3.95601613, w1:2.38056041, loss:0.09422174
 573, w0:3.95647067, w1:2.38007493, loss:0.09417758
 574, w0:3.95692384, w1:2.37959091, loss:0.09413368
 575, w0:3.95737565, w1:2.37910834, loss:0.09409004
 576, w0:3.95782610, w1:2.37862722, loss:0.09404667
 577, w0:3.95827519, w1:2.37814755, loss:0.09400356
 578, w0:3.95872294, w1:2.37766932, loss:0.09396070
 579, w0:3.95916934, w1:2.37719253, loss:0.09391811
 580, w0:3.95961441, w1:2.37671717, loss:0.09387577
 581, w0:3.96005813, w1:2.37624323, loss:0.09383368
 582, w0:3.96050052, w1:2.37577072, loss:0.09379185
 583, w0:3.96094158, w1:2.37529964, loss:0.09375026
 584, w0:3.96138132, w1:2.37482996, loss:0.09370893
 585, w0:3.96181974, w1:2.37436170, loss:0.09366784
 586, w0:3.96225683, w1:2.37389484, loss:0.09362700
 587, w0:3.96269262, w1:2.37342939, loss:0.09358641
 588, w0:3.96312710, w1:2.37296534, loss:0.09354606
 589, w0:3.96356027, w1:2.37250268, loss:0.09350595
 590, w0:3.96399213, w1:2.37204141, loss:0.09346608
 591, w0:3.96442271, w1:2.37158152, loss:0.09342645
 592, w0:3.96485198, w1:2.37112302, loss:0.09338706
 593, w0:3.96527997, w1:2.37066590, loss:0.09334791
 594, w0:3.96570667, w1:2.37021014, loss:0.09330899
 595, w0:3.96613209, w1:2.36975576, loss:0.09327030
 596, w0:3.96655624, w1:2.36930275, loss:0.09323185
 597, w0:3.96697910, w1:2.36885109, loss:0.09319362
 598, w0:3.96740070, w1:2.36840079, loss:0.09315563
 599, w0:3.96782103, w1:2.36795185, loss:0.09311786
 600, w0:3.96824010, w1:2.36750425, loss:0.09308032
 601, w0:3.96865791, w1:2.36705800, loss:0.09304301
 602, w0:3.96907446, w1:2.36661308, loss:0.09300592
 603, w0:3.96948976, w1:2.36616951, loss:0.09296905
 604, w0:3.96990381, w1:2.36572727, loss:0.09293240
 605, w0:3.97031662, w1:2.36528636, loss:0.09289598
 606, w0:3.97072819, w1:2.36484677, loss:0.09285977
 607, w0:3.97113852, w1:2.36440850, loss:0.09282378
 608, w0:3.97154762, w1:2.36397155, loss:0.09278800
 609, w0:3.97195550, w1:2.36353591, loss:0.09275244
 610, w0:3.97236214, w1:2.36310158, loss:0.09271709
 611, w0:3.97276756, w1:2.36266856, loss:0.09268196
 612, w0:3.97317177, w1:2.36223683, loss:0.09264704
 613, w0:3.97357476, w1:2.36180641, loss:0.09261232
 614, w0:3.97397654, w1:2.36137728, loss:0.09257781
 615, w0:3.97437711, w1:2.36094943, loss:0.09254352
 616, w0:3.97477648, w1:2.36052287, loss:0.09250942
 617, w0:3.97517465, w1:2.36009760, loss:0.09247553
 618, w0:3.97557162, w1:2.35967360, loss:0.09244185
 619, w0:3.97596740, w1:2.35925088, loss:0.09240836
 620, w0:3.97636200, w1:2.35882942, loss:0.09237508
 621, w0:3.97675540, w1:2.35840923, loss:0.09234200
 622, w0:3.97714763, w1:2.35799031, loss:0.09230911
 623, w0:3.97753867, w1:2.35757264, loss:0.09227643
 624, w0:3.97792854, w1:2.35715623, loss:0.09224394
 625, w0:3.97831724, w1:2.35674107, loss:0.09221164
 626, w0:3.97870477, w1:2.35632715, loss:0.09217954
 627, w0:3.97909114, w1:2.35591448, loss:0.09214763
 628, w0:3.97947634, w1:2.35550305, loss:0.09211591
 629, w0:3.97986039, w1:2.35509286, loss:0.09208438
 630, w0:3.98024329, w1:2.35468390, loss:0.09205304
 631, w0:3.98062503, w1:2.35427616, loss:0.09202189
 632, w0:3.98100563, w1:2.35386966, loss:0.09199093
 633, w0:3.98138508, w1:2.35346437, loss:0.09196015
 634, w0:3.98176340, w1:2.35306030, loss:0.09192956
 635, w0:3.98214057, w1:2.35265745, loss:0.09189915
 636, w0:3.98251662, w1:2.35225580, loss:0.09186892
 637, w0:3.98289153, w1:2.35185536, loss:0.09183888
 638, w0:3.98326532, w1:2.35145613, loss:0.09180901
 639, w0:3.98363798, w1:2.35105810, loss:0.09177932
 640, w0:3.98400953, w1:2.35066126, loss:0.09174982
 641, w0:3.98437996, w1:2.35026561, loss:0.09172048
 642, w0:3.98474927, w1:2.34987115, loss:0.09169133
 643, w0:3.98511748, w1:2.34947788, loss:0.09166235
 644, w0:3.98548458, w1:2.34908579, loss:0.09163354
 645, w0:3.98585057, w1:2.34869487, loss:0.09160491
 646, w0:3.98621547, w1:2.34830514, loss:0.09157645
 647, w0:3.98657927, w1:2.34791657, loss:0.09154816
 648, w0:3.98694198, w1:2.34752917, loss:0.09152003
 649, w0:3.98730360, w1:2.34714293, loss:0.09149208
 650, w0:3.98766413, w1:2.34675786, loss:0.09146430
 651, w0:3.98802357, w1:2.34637394, loss:0.09143668
 652, w0:3.98838194, w1:2.34599117, loss:0.09140923
 653, w0:3.98873923, w1:2.34560956, loss:0.09138194
 654, w0:3.98909545, w1:2.34522909, loss:0.09135481
 655, w0:3.98945060, w1:2.34484976, loss:0.09132785
 656, w0:3.98980468, w1:2.34447158, loss:0.09130105
 657, w0:3.99015770, w1:2.34409453, loss:0.09127442
 658, w0:3.99050965, w1:2.34371861, loss:0.09124794
 659, w0:3.99086055, w1:2.34334382, loss:0.09122162
 660, w0:3.99121040, w1:2.34297016, loss:0.09119545
 661, w0:3.99155919, w1:2.34259762, loss:0.09116945
 662, w0:3.99190693, w1:2.34222620, loss:0.09114360
 663, w0:3.99225363, w1:2.34185590, loss:0.09111791
 664, w0:3.99259929, w1:2.34148671, loss:0.09109237
 665, w0:3.99294391, w1:2.34111863, loss:0.09106698
 666, w0:3.99328750, w1:2.34075165, loss:0.09104175
 667, w0:3.99363005, w1:2.34038578, loss:0.09101666
 668, w0:3.99397157, w1:2.34002101, loss:0.09099173
 669, w0:3.99431207, w1:2.33965733, loss:0.09096695
 670, w0:3.99465154, w1:2.33929474, loss:0.09094231
 671, w0:3.99499000, w1:2.33893325, loss:0.09091783
 672, w0:3.99532744, w1:2.33857284, loss:0.09089349
 673, w0:3.99566386, w1:2.33821351, loss:0.09086930
 674, w0:3.99599927, w1:2.33785526, loss:0.09084525
 675, w0:3.99633368, w1:2.33749809, loss:0.09082134
 676, w0:3.99666708, w1:2.33714200, loss:0.09079758
 677, w0:3.99699947, w1:2.33678697, loss:0.09077397
 678, w0:3.99733087, w1:2.33643301, loss:0.09075049
 679, w0:3.99766128, w1:2.33608011, loss:0.09072715
 680, w0:3.99799069, w1:2.33572827, loss:0.09070396
 681, w0:3.99831911, w1:2.33537749, loss:0.09068090
 682, w0:3.99864655, w1:2.33502777, loss:0.09065799
 683, w0:3.99897300, w1:2.33467909, loss:0.09063521
 684, w0:3.99929847, w1:2.33433146, loss:0.09061256
 685, w0:3.99962296, w1:2.33398488, loss:0.09059005
 686, w0:3.99994648, w1:2.33363934, loss:0.09056768
 687, w0:4.00026902, w1:2.33329484, loss:0.09054544
 688, w0:4.00059060, w1:2.33295137, loss:0.09052334
 689, w0:4.00091121, w1:2.33260893, loss:0.09050137
 690, w0:4.00123085, w1:2.33226752, loss:0.09047953
 691, w0:4.00154954, w1:2.33192714, loss:0.09045782
 692, w0:4.00186727, w1:2.33158778, loss:0.09043624
 693, w0:4.00218404, w1:2.33124944, loss:0.09041479
 694, w0:4.00249987, w1:2.33091212, loss:0.09039347
 695, w0:4.00281474, w1:2.33057581, loss:0.09037227
 696, w0:4.00312867, w1:2.33024051, loss:0.09035121
 697, w0:4.00344165, w1:2.32990622, loss:0.09033027
 698, w0:4.00375369, w1:2.32957293, loss:0.09030945
 699, w0:4.00406480, w1:2.32924064, loss:0.09028877
 700, w0:4.00437497, w1:2.32890936, loss:0.09026820
 701, w0:4.00468421, w1:2.32857906, loss:0.09024776
 702, w0:4.00499252, w1:2.32824976, loss:0.09022744
 703, w0:4.00529991, w1:2.32792145, loss:0.09020724
 704, w0:4.00560637, w1:2.32759413, loss:0.09018717
 705, w0:4.00591191, w1:2.32726779, loss:0.09016721
 706, w0:4.00621653, w1:2.32694243, loss:0.09014738
 707, w0:4.00652024, w1:2.32661805, loss:0.09012766
 708, w0:4.00682303, w1:2.32629464, loss:0.09010806
 709, w0:4.00712492, w1:2.32597220, loss:0.09008858
 710, w0:4.00742589, w1:2.32565073, loss:0.09006922
 711, w0:4.00772597, w1:2.32533023, loss:0.09004997
 712, w0:4.00802514, w1:2.32501069, loss:0.09003084
 713, w0:4.00832341, w1:2.32469211, loss:0.09001182
 714, w0:4.00862079, w1:2.32437449, loss:0.08999292
 715, w0:4.00891727, w1:2.32405783, loss:0.08997413
 716, w0:4.00921286, w1:2.32374211, loss:0.08995545
 717, w0:4.00950757, w1:2.32342734, loss:0.08993689
 718, w0:4.00980138, w1:2.32311352, loss:0.08991843
 719, w0:4.01009432, w1:2.32280064, loss:0.08990009
 720, w0:4.01038638, w1:2.32248870, loss:0.08988186
 721, w0:4.01067755, w1:2.32217770, loss:0.08986373
 722, w0:4.01096786, w1:2.32186764, loss:0.08984572
 723, w0:4.01125729, w1:2.32155850, loss:0.08982781
 724, w0:4.01154585, w1:2.32125029, loss:0.08981001
 725, w0:4.01183354, w1:2.32094301, loss:0.08979232
 726, w0:4.01212037, w1:2.32063666, loss:0.08977473
 727, w0:4.01240634, w1:2.32033122, loss:0.08975725
 728, w0:4.01269145, w1:2.32002670, loss:0.08973988
 729, w0:4.01297570, w1:2.31972310, loss:0.08972261
 730, w0:4.01325910, w1:2.31942041, loss:0.08970544
 731, w0:4.01354165, w1:2.31911862, loss:0.08968837
 732, w0:4.01382335, w1:2.31881775, loss:0.08967141
 733, w0:4.01410420, w1:2.31851778, loss:0.08965455
 734, w0:4.01438421, w1:2.31821870, loss:0.08963779
 735, w0:4.01466337, w1:2.31792053, loss:0.08962113
 736, w0:4.01494170, w1:2.31762326, loss:0.08960457
 737, w0:4.01521919, w1:2.31732687, loss:0.08958811
 738, w0:4.01549585, w1:2.31703138, loss:0.08957175
 739, w0:4.01577168, w1:2.31673678, loss:0.08955549
 740, w0:4.01604668, w1:2.31644306, loss:0.08953932
 741, w0:4.01632085, w1:2.31615022, loss:0.08952326
 742, w0:4.01659420, w1:2.31585826, loss:0.08950728
 743, w0:4.01686672, w1:2.31556718, loss:0.08949141
 744, w0:4.01713843, w1:2.31527698, loss:0.08947563
 745, w0:4.01740932, w1:2.31498765, loss:0.08945994
 746, w0:4.01767940, w1:2.31469918, loss:0.08944435
 747, w0:4.01794867, w1:2.31441158, loss:0.08942885
 748, w0:4.01821712, w1:2.31412485, loss:0.08941345
 749, w0:4.01848477, w1:2.31383898, loss:0.08939813
 750, w0:4.01875162, w1:2.31355397, loss:0.08938291
 751, w0:4.01901766, w1:2.31326981, loss:0.08936778
 752, w0:4.01928291, w1:2.31298651, loss:0.08935274
 753, w0:4.01954736, w1:2.31270405, loss:0.08933780
 754, w0:4.01981101, w1:2.31242245, loss:0.08932294
 755, w0:4.02007387, w1:2.31214170, loss:0.08930817
 756, w0:4.02033594, w1:2.31186178, loss:0.08929348
 757, w0:4.02059723, w1:2.31158271, loss:0.08927889
 758, w0:4.02085773, w1:2.31130448, loss:0.08926439
 759, w0:4.02111744, w1:2.31102708, loss:0.08924997
 760, w0:4.02137638, w1:2.31075051, loss:0.08923564
 761, w0:4.02163454, w1:2.31047478, loss:0.08922139
 762, w0:4.02189192, w1:2.31019987, loss:0.08920723
 763, w0:4.02214853, w1:2.30992580, loss:0.08919315
 764, w0:4.02240437, w1:2.30965254, loss:0.08917916
 765, w0:4.02265944, w1:2.30938011, loss:0.08916526
 766, w0:4.02291374, w1:2.30910849, loss:0.08915143
 767, w0:4.02316728, w1:2.30883769, loss:0.08913769
 768, w0:4.02342006, w1:2.30856770, loss:0.08912403
 769, w0:4.02367208, w1:2.30829853, loss:0.08911046
 770, w0:4.02392334, w1:2.30803016, loss:0.08909696
 771, w0:4.02417384, w1:2.30776260, loss:0.08908355
 772, w0:4.02442360, w1:2.30749585, loss:0.08907021
 773, w0:4.02467260, w1:2.30722989, loss:0.08905696
 774, w0:4.02492086, w1:2.30696474, loss:0.08904379
 775, w0:4.02516836, w1:2.30670038, loss:0.08903069
 776, w0:4.02541513, w1:2.30643681, loss:0.08901767
 777, w0:4.02566115, w1:2.30617404, loss:0.08900474
 778, w0:4.02590644, w1:2.30591206, loss:0.08899188
 779, w0:4.02615099, w1:2.30565086, loss:0.08897909
 780, w0:4.02639480, w1:2.30539045, loss:0.08896639
 781, w0:4.02663788, w1:2.30513082, loss:0.08895376
 782, w0:4.02688023, w1:2.30487197, loss:0.08894120
 783, w0:4.02712185, w1:2.30461390, loss:0.08892872
 784, w0:4.02736275, w1:2.30435660, loss:0.08891632
 785, w0:4.02760292, w1:2.30410008, loss:0.08890399
 786, w0:4.02784237, w1:2.30384433, loss:0.08889173
 787, w0:4.02808110, w1:2.30358935, loss:0.08887955
 788, w0:4.02831911, w1:2.30333513, loss:0.08886744
 789, w0:4.02855641, w1:2.30308167, loss:0.08885540
 790, w0:4.02879300, w1:2.30282898, loss:0.08884344
 791, w0:4.02902887, w1:2.30257705, loss:0.08883154
 792, w0:4.02926404, w1:2.30232587, loss:0.08881972
 793, w0:4.02949850, w1:2.30207545, loss:0.08880797
 794, w0:4.02973225, w1:2.30182578, loss:0.08879629
 795, w0:4.02996531, w1:2.30157686, loss:0.08878468
 796, w0:4.03019766, w1:2.30132869, loss:0.08877314
 797, w0:4.03042931, w1:2.30108127, loss:0.08876167
 798, w0:4.03066027, w1:2.30083459, loss:0.08875027
 799, w0:4.03089054, w1:2.30058865, loss:0.08873893
 800, w0:4.03112011, w1:2.30034344, loss:0.08872767
 801, w0:4.03134899, w1:2.30009898, loss:0.08871647
 802, w0:4.03157719, w1:2.29985525, loss:0.08870534
 803, w0:4.03180470, w1:2.29961225, loss:0.08869428
 804, w0:4.03203152, w1:2.29936998, loss:0.08868328
 805, w0:4.03225767, w1:2.29912844, loss:0.08867235
 806, w0:4.03248313, w1:2.29888763, loss:0.08866148
  ...    (iterations 807-999 omitted: w0 and w1 keep inching toward the optimum while the loss falls slowly)
1000, w0:4.06556693, w1:2.26355153, loss:0.08742142
# Visualisation
# Draw the fitted line using the final w0 and w1
pred_y = w0 + w1 * x
data['pred_y'] = pred_y

ax = data.plot(x='x', y='pred_y', color='orangered')
data.plot.scatter(x='x', y='y', s=80, ax=ax)

[Figure 2: fitted regression line over the training points]

Plotting how w0, w1 and loss change during training

# Three stacked subplots: w0, w1 and loss versus epoch
plt.subplot(3,1,1)
plt.grid(linestyle=':')
plt.ylabel('w0')
plt.plot(epoches, w0s, color='dodgerblue', label='w0')
plt.subplot(3,1,2)
plt.grid(linestyle=':')
plt.ylabel('w1')
plt.plot(epoches, w1s, color='dodgerblue', label='w1')
plt.subplot(3,1,3)
plt.grid(linestyle=':')
plt.ylabel('loss')
plt.plot(epoches, losses, color='orangered', label='loss')
plt.tight_layout()

[Figure 3: w0, w1 and loss versus epoch]

Drawing the gradient-descent path on a contour map of the loss

import mpl_toolkits.mplot3d as axes3d

grid_w0, grid_w1 = np.meshgrid(
    np.linspace(0, 9, 500),
    np.linspace(0, 3.5, 500))

# Evaluate the total loss at every grid point (w0, w1)
grid_loss = np.zeros_like(grid_w0)
for xs, ys in zip(x, y):
    grid_loss += ((grid_w0 + xs*grid_w1 - ys) ** 2) / 2

plt.figure('Batch Gradient Descent', facecolor='lightgray')
plt.title('Batch Gradient Descent', fontsize=20)
plt.xlabel('w0', fontsize=14)
plt.ylabel('w1', fontsize=14)
plt.tick_params(labelsize=10)
plt.grid(linestyle=':')
plt.contourf(grid_w0, grid_w1, grid_loss, 10, cmap='jet')
cntr = plt.contour(grid_w0, grid_w1, grid_loss, 10,
                  colors='black', linewidths=0.5)
plt.clabel(cntr, inline_spacing=0.1, fmt='%.2f',
          fontsize=8)
plt.plot(w0s, w1s, 'o-', c='orangered', label='BGD')
plt.legend()
plt.show()

[Figure 4: gradient-descent path over the loss contours]

Salary prediction

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Load the data
data = pd.read_csv('../data/Salary_Data.csv')
data.plot.scatter(x='YearsExperience', y='Salary')

[Figure 5: scatter of Salary vs YearsExperience]

# Train a linear regression model
import sklearn.linear_model as lm
model = lm.LinearRegression()
x, y = data.loc[:, :'YearsExperience'], data['Salary']
model.fit(x, y)

# Draw the regression line produced by the trained model
pred_y = model.predict(x)  # predicted outputs for all 30 samples
data['pred_y'] = pred_y

ax = data.plot.scatter(x='YearsExperience', y='Salary')
data.plot(x='YearsExperience', y='pred_y', color='orangered', ax=ax)

[Figure 6: fitted regression line over the salary data]

# Predict the salary for 15 and 18 years of experience
# test_x = [15, 18]      # a flat list will not work: predict() expects a 2-D array
test_x = [[15], [18]]    # shape (n_samples, n_features)
model.predict(test_x)
array([167541.63502049, 195891.52198486])

Evaluating the error

Treat the 30 training samples as test samples: output predictions for them and evaluate the error.

import sklearn.metrics as sm
print(sm.mean_absolute_error(y, pred_y))
# print(sm.mean_squared_error(y, pred_y))
print(sm.median_absolute_error(y, pred_y))
4644.2012894435375
4017.9270292179935

The size of these errors tells us whether the model meets the business requirements. If it does, the model can be prepared for deployment; if not, look for the cause and improve the model or the training data. Note that the scores here are computed on the very samples the model was trained on, which tends to be optimistic; evaluating on a held-out split (see the sketch below) gives a more honest estimate.

sm.r2_score(y, pred_y)
0.9569566641435086
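A minimal sketch of evaluating on a held-out split instead of the training data. It is an addition to the original notes; the 80/20 split and the random_state value are illustrative assumptions.

import sklearn.linear_model as lm
import sklearn.metrics as sm
import sklearn.model_selection as ms

# Hold out 20% of the rows, train on the rest, score on the held-out part
tr_x, te_x, tr_y, te_y = ms.train_test_split(
    data[['YearsExperience']], data['Salary'],
    test_size=0.2, random_state=7)
m = lm.LinearRegression()
m.fit(tr_x, tr_y)
print(sm.r2_score(te_y, m.predict(te_x)))
print(sm.mean_absolute_error(te_y, m.predict(te_x)))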

Saving the trained model to a file

import pickle
with open('salary_model.pkl', 'wb') as f:
    pickle.dump(model, f)
print('dump success!')
dump success!

Loading the model

import numpy as np
import pickle
import sklearn.linear_model as lm
# Load the saved model
with open('salary_model.pkl', 'rb') as f:
    model = pickle.load(f)

model.predict([[15.5]])
array([172266.61618122])

Wrapping the model in an object that provides a salary-prediction service

class SalaryPredictionModel():
    
    def __init__(self):
        with open('salary_model.pkl', 'rb') as f:
            self.model = pickle.load(f)
        
    def predict(self, exps):
        """
        预测薪水
        exps:接收一个array(存储每个人的工作年限)
        """
        exps = np.array(exps).reshape(-1, 1)  
        return self.model.predict(exps)
# Use the wrapped object to serve predictions
model = SalaryPredictionModel()
model.predict([5, 6, 7, 8, 9, 10, 14, 13.5, 5.3, 1.2])
array([ 73042.01180594,  82491.9741274 ,  91941.93644885, 101391.89877031,
       110841.86109176, 120291.82341322, 158091.67269904, 153366.69153831,
        75877.00050238,  37132.15498441])

Ridge regression

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Load the data
data = pd.read_csv('../data/Salary_Data2.csv')
data.head(3)
YearsExperience Salary
0 1.1 39343
1 1.3 46205
2 1.5 37731
# Train an ordinary linear regression model first and inspect the fit
import sklearn.linear_model as lm

x, y = data.loc[:, :'YearsExperience'], data['Salary']
model = lm.LinearRegression()
model.fit(x, y)
pred_y = model.predict(x)
plt.grid(linestyle=':')
plt.scatter(data['YearsExperience'], data['Salary'], s=60, label='points')
plt.plot(data['YearsExperience'], pred_y, color='orangered', label='regression')
plt.legend()

[Figure 7: ordinary linear regression fit on Salary_Data2]

# Train a ridge regression model; the regularisation makes it less sensitive to outliers, so it fits the ordinary samples better
model = lm.Ridge(100)
model.fit(x, y)
pred_y = model.predict(x)
plt.grid(linestyle=':')
plt.scatter(data['YearsExperience'], data['Salary'], s=60, label='points')
plt.plot(data['YearsExperience'], pred_y, color='orangered', label='regression')
plt.legend()

[Figure 8: ridge regression (alpha=100) fit on Salary_Data2]

How do we choose a suitable regularization strength (the alpha argument of lm.Ridge)?

import sklearn.metrics as sm
# Hold out some rows as test data, train one model per candidate alpha, and compare the r2 scores (a loop doing exactly that is sketched after the output below)
test_data = data.iloc[0:-3:3]
test_x, test_y = test_data.loc[:, :'YearsExperience'], test_data['Salary']
pred_test_y = model.predict(test_x)
sm.r2_score(test_y, pred_test_y)
0.9196733030163875
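A minimal sketch of the alpha search described above. It is an addition to the original notes: the candidate alpha values are illustrative, and unlike the cell above it trains only on the rows that were not held out for testing.

# Candidate regularization strengths (illustrative values)
train_data = data.drop(test_data.index)             # rows not used for testing
tr_x, tr_y = train_data[['YearsExperience']], train_data['Salary']
best_alpha, best_r2 = None, -np.inf
for alpha in [0.1, 1, 10, 50, 100, 200, 500]:
    m = lm.Ridge(alpha)
    m.fit(tr_x, tr_y)
    r2 = sm.r2_score(test_y, m.predict(test_x))
    print(alpha, r2)
    if r2 > best_r2:
        best_alpha, best_r2 = alpha, r2
print('best alpha:', best_alpha)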

Polynomial regression

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Load the data
data = pd.read_csv('../data/Salary_Data.csv')
data.head(3)
YearsExperience Salary
0 1.1 39343
1 1.3 46205
2 1.5 37731
# Train an ordinary linear regression model first and inspect the fit
import sklearn.linear_model as lm

x, y = data.loc[:, :'YearsExperience'], data['Salary']
model = lm.LinearRegression()
model.fit(x, y)
pred_y = model.predict(x)
plt.grid(linestyle=':')
plt.scatter(data['YearsExperience'], data['Salary'], s=60, label='points')
plt.plot(data['YearsExperience'], pred_y, color='orangered', label='regression')
plt.legend()

[Figure 9: linear regression fit on the salary data]

Training a polynomial regression model on this data

import sklearn.pipeline as pl
import sklearn.preprocessing as sp
import sklearn.metrics as sm

# Chain two steps through a pipeline: feature expansion -> regression model
model = pl.make_pipeline(
    sp.PolynomialFeatures(10), lm.LinearRegression())
model.fit(x, y)
# Print the model's r2 score
pred_y = model.predict(x)
print(sm.r2_score(y, pred_y))

# Take 200 evenly spaced x values between min(x) and max(x), predict the corresponding y values and draw the model curve
test_x = np.linspace(x.min(), x.max(), 200)
pred_test_y = model.predict(test_x.reshape(-1, 1))
# Visualisation
plt.grid(linestyle=':')
plt.scatter(data['YearsExperience'], data['Salary'], s=60, label='points')
plt.plot(test_x, pred_test_y, color='orangered', label='poly regression')
plt.legend()
0.980983738515142



[Figure 10: degree-10 polynomial regression curve over the salary data]

Case study: analysing the Boston housing dataset and predicting house prices

import numpy as np
import matplotlib.pyplot as plt
import sklearn.datasets as sd
import sklearn.utils as su
import pandas as pd
# Load the dataset
boston = sd.load_boston()
# print(boston.DESCR)
x, y, header = boston.data, boston.target, boston.feature_names
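# Note (environment assumption): load_boston() was removed in scikit-learn 1.2.
# On newer versions the same 506x13 dataset can be rebuilt from the original
# source, roughly as scikit-learn's deprecation notice suggests:
#   raw = pd.read_csv('http://lib.stat.cmu.edu/datasets/boston',
#                     sep=r'\s+', skiprows=22, header=None)
#   x = np.hstack([raw.values[::2, :], raw.values[1::2, :2]])
#   y = raw.values[1::2, 2]
# Treat this as a sketch, not part of the original notes.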
# A quick exploratory look at the dataset
data = pd.DataFrame(x, columns=header)
data['y'] = y
data.describe()
CRIM ZN INDUS CHAS NOX RM AGE DIS RAD TAX PTRATIO B LSTAT y
count 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000 506.000000
mean 3.593761 11.363636 11.136779 0.069170 0.554695 6.284634 68.574901 3.795043 9.549407 408.237154 18.455534 356.674032 12.653063 22.532806
std 8.596783 23.322453 6.860353 0.253994 0.115878 0.702617 28.148861 2.105710 8.707259 168.537116 2.164946 91.294864 7.141062 9.197104
min 0.006320 0.000000 0.460000 0.000000 0.385000 3.561000 2.900000 1.129600 1.000000 187.000000 12.600000 0.320000 1.730000 5.000000
25% 0.082045 0.000000 5.190000 0.000000 0.449000 5.885500 45.025000 2.100175 4.000000 279.000000 17.400000 375.377500 6.950000 17.025000
50% 0.256510 0.000000 9.690000 0.000000 0.538000 6.208500 77.500000 3.207450 5.000000 330.000000 19.050000 391.440000 11.360000 21.200000
75% 3.647423 12.500000 18.100000 0.000000 0.624000 6.623500 94.075000 5.188425 24.000000 666.000000 20.200000 396.225000 16.955000 25.000000
max 88.976200 100.000000 27.740000 1.000000 0.871000 8.780000 100.000000 12.126500 24.000000 711.000000 22.000000 396.900000 37.970000 50.000000
# Mean house price grouped by CHAS (whether the tract bounds the Charles River)
data.pivot_table(index='CHAS', values='y')
y
CHAS
0.0 22.093843
1.0 28.440000
# Box plot of DIS (weighted distance to employment centres) to inspect its spread
data['DIS'].plot.box()

[Figure 11: box plot of the DIS feature]

# Examine how each field relates to the house price; RM (average rooms per dwelling) is shown here,
# and a sketch for screening all fields follows the figure below
data.plot.scatter(x='RM', y='y')

[Figure 12: scatter of RM against the house price y]
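A minimal sketch (an addition to the original notes) for screening every field at once rather than plotting them one by one: rank the features by their correlation with the price, or scatter each one against y.

# Correlation of every column with the price (y itself will show 1.0)
print(data.corr()['y'].sort_values(ascending=False))

# Or draw one scatter plot per feature
for col in header:
    data.plot.scatter(x=col, y='y', title=col)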

Training regression models to predict house prices
  1. Prepare the dataset (inputs and outputs).
  2. Shuffle the data and split it into a training set and a test set.
  3. Choose a model, train it on the training set, and evaluate it on the test set.
# Prepare the dataset (inputs and outputs)
x, y = data.iloc[:, :-1], data['y']
# Shuffle the dataset (su is sklearn.utils)
# random_state is the random seed: the same seed always produces the same shuffle
x, y = su.shuffle(x, y, random_state=7)
# Split into training and test sets (90% / 10%)
train_size = int(len(x) * 0.9)
train_x, test_x, train_y, test_y = \
    x[:train_size], x[train_size:], y[:train_size], y[train_size:]
train_x.shape, test_x.shape, train_y.shape, test_y.shape
((455, 13), (51, 13), (455,), (51,))
# Linear model: train on the training set, evaluate on the test set
import sklearn.linear_model as lm
import sklearn.metrics as sm

model = lm.LinearRegression()
model.fit(train_x, train_y)  # train on the training set
pred_test_y = model.predict(test_x)  # predict on the test set
print(sm.r2_score(test_y, pred_test_y))
print(sm.mean_absolute_error(test_y, pred_test_y))
0.8188356183218533
2.405641089772746
# Ridge regression: train on the training set, evaluate on the test set
model = lm.Ridge(3)
model.fit(train_x, train_y)  # train on the training set
pred_test_y = model.predict(test_x)  # predict on the test set
print(sm.r2_score(test_y, pred_test_y))
print(sm.mean_absolute_error(test_y, pred_test_y))
0.8106201351332835
2.512282622308928
# Polynomial regression (degree-2 features + ridge): train and evaluate the same way
import sklearn.pipeline as pl
import sklearn.preprocessing as sp
model = pl.make_pipeline(sp.PolynomialFeatures(2), lm.Ridge(50))
model.fit(train_x, train_y)  # train on the training set
pred_test_y = model.predict(test_x)  # predict on the test set
print(sm.r2_score(test_y, pred_test_y))
print(sm.mean_absolute_error(test_y, pred_test_y))
0.8936132765852787
2.008395111436512
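The same three-model comparison can also be written as a single loop; the sketch below is an addition for readability and reuses the alpha values and polynomial degree chosen above.

# Fit each model on the training set and report r2 / MAE on the test set
models = {
    'linear': lm.LinearRegression(),
    'ridge(3)': lm.Ridge(3),
    'poly(2)+ridge(50)': pl.make_pipeline(sp.PolynomialFeatures(2), lm.Ridge(50)),
}
for name, m in models.items():
    m.fit(train_x, train_y)
    p = m.predict(test_x)
    print(name, sm.r2_score(test_y, p), sm.mean_absolute_error(test_y, p))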
