Gradient Descent for Neural Networks: Python Implementation 02

0. Preface

The previous post used gradient descent to find the minimum of a quadratic function. Below, we implement gradient descent in code to solve a fitting problem.
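As a quick recap of that idea, here is a minimal sketch (a hypothetical example, not taken from the previous post) of gradient descent minimizing f(x) = (x - 3)^2, whose minimum sits at x = 3:

```python
# Gradient descent on f(x) = (x - 3)**2; the derivative is f'(x) = 2 * (x - 3)
x = 0.0      # starting point (arbitrary)
alpha = 0.1  # step size
for _ in range(100):
    x -= alpha * 2 * (x - 3)  # move against the gradient
print(x)  # converges toward 3
```

Each step shrinks the distance to the minimum by a constant factor (1 - 2 * alpha), so the iterate approaches x = 3 geometrically.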

1. Gradient Descent Implementation

First we generate some random scatter points, then fit a straight line to them.

import random
import matplotlib.pyplot as plt

# Generate data
X = [i / 100 for i in range(100)]  # 100 points in [0, 0.99]
Y = [3 * x + 4 + random.random() for x in X]

w = random.random()  # weight
b = random.random()  # bias
alpha = 0.1  # learning rate
for iter_num in range(30):  # 30 passes over the data
    for x, y in zip(X, Y):
        z = x * w + b  # prediction for this sample
        loss = (z - y) ** 2  # squared error for this sample
        # Gradients of the loss with respect to w and b
        dw = 2 * (z - y) * x
        db = 2 * (z - y) * 1

        w = w - alpha * dw
        b = b - alpha * db
        print("w = {};b = {};loss = {}".format(w, b, loss))

Partial output:

w = 0.5161362777293207;b = 1.091370998862002;loss = 15.09026402614829
w = 0.5239581798497372;b = 1.8735612109036524;loss = 15.295538195344054
w = 0.5359989816198797;b = 2.4756012994107754;loss = 9.06130670424162
w = 0.5459542091584253;b = 2.8074422173622944;loss = 2.7529598706726652
w = 0.5612593985604557;b = 3.190071952413056;loss = 3.660137853625399
w = 0.5753001129531091;b = 3.4708862402661245;loss = 1.9714166065606513
w = 0.5939481217192544;b = 3.7816863863685453;loss = 2.41491827043215
w = 0.606378859277197;b = 3.9592683514820126;loss = 0.7883838583390176
w = 0.6150067755024674;b = 4.067117304297892;loss = 0.29078491558704467
w = 0.6207930794920327;b = 4.131409570848618;loss = 0.10333738845573845
w = 0.6293291946204466;b = 4.216770722132757;loss = 0.18216315371384226
# finally:
w = 3.031373924113116;b = 4.455633887500919;loss = 0.004845875634096102
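For comparison, this particular fitting problem also has a closed-form answer. The sketch below applies the ordinary least squares formulas to the same data model; the fixed random seed is an assumption added here for reproducibility, not something from the original code:

```python
import random

random.seed(0)  # fixed seed so the result is reproducible (assumption)
X = [i / 100 for i in range(100)]
Y = [3 * x + 4 + random.random() for x in X]

n = len(X)
mean_x = sum(X) / n
mean_y = sum(Y) / n
# Closed-form ordinary least squares for a line y = w*x + b:
# w = cov(X, Y) / var(X),  b = mean(Y) - w * mean(X)
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y)) \
    / sum((x - mean_x) ** 2 for x in X)
b = mean_y - w * mean_x
print(w, b)  # roughly w ≈ 3, b ≈ 4.5
```

Since the noise is uniform on [0, 1) with mean 0.5, the best-fit line should be close to y = 3x + 4.5, which matches what the gradient descent loop converges toward.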

Visualization

# Visualization
plt.plot(X, Y, "ro")  # plot the sample points
z = [w * x + b for x in X]
plt.plot(X, z, "g-")  # plot the fitted line
plt.show()

Visualization result:

[Figure 1: the scatter points and the fitted line]
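Note that the loop above updates w and b after every single sample, i.e. stochastic gradient descent. A batch variant, which averages the gradients over the whole data set before each update, can be sketched as follows (the seed, learning rate, and iteration count here are assumptions, not taken from the original code):

```python
import random

random.seed(0)  # assumed fixed seed for reproducibility
X = [i / 100 for i in range(100)]
Y = [3 * x + 4 + random.random() for x in X]

w, b = random.random(), random.random()
alpha = 0.5  # batch gradients are averaged, so a larger step still converges
n = len(X)
for _ in range(2000):
    # Average the per-sample gradients over the whole data set
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(X, Y)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(X, Y)) / n
    w -= alpha * dw
    b -= alpha * db
print(w, b)  # should land near w ≈ 3, b ≈ 4.5
```

Batch updates trace a smoother path to the minimum at the cost of one full pass over the data per update, while the per-sample version in the post makes noisier but much cheaper steps.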
