[8] The Road to Machine Learning: Implementing Gradient Descent in Python

  The earlier post 线性回归python实现 (Linear Regression in Python) showed how to fit data with the LinearRegression() class that ships with sklearn. In this post we look at how to fit the same data with gradient descent.
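
  As a quick refresher, a minimal sketch of what that sklearn fit looks like is shown below. This is my own reconstruction on the data used later in this post, not the exact code from that earlier article:

# Reminder sketch of the sklearn approach from the earlier post (illustrative, not the original code)
import numpy as np
from sklearn.linear_model import LinearRegression

x_train = [100,80,120,75,60,43,140,132,63,55,74,44,88]
y_train = [120,92,143,87,60,50,167,147,80,60,90,57,99]

reg = LinearRegression()
reg.fit(np.array(x_train).reshape(-1, 1), y_train)  # sklearn expects a 2-D feature array
print(reg.coef_[0], reg.intercept_)                 # slope and intercept, useful for comparison later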

  Do you still remember how gradient descent works? If not, you can look back at the earlier post 线性回归 (Linear Regression). We will again use the data from 线性回归python实现 for this walkthrough.

  Suppose we want to fit this data with $y = \theta_1 x + \theta_0$. In the gradient descent algorithm we keep changing the values of $\theta$ over many iterations and stop once the cost $J(\theta)$ reaches the range we expect. In general gradient descent converges after enough iterations. From the expression for $J(\theta)$,

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x_i) - y_i \bigr)^2$$


  we can see that the cost function has the shape of an upward-opening bowl, so with enough iterations it is guaranteed to converge to a minimum. The update rule derived earlier is
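
  To make the cost concrete, here is a small sketch of how $J(\theta)$ can be evaluated on the data used below (the names x_train, y_train, theta0, theta1 match the code later in the post; the helper function itself is my illustration):

# Sketch: evaluating the cost J(theta) for a given theta0, theta1 (illustrative helper)
x_train = [100,80,120,75,60,43,140,132,63,55,74,44,88]
y_train = [120,92,143,87,60,50,167,147,80,60,90,57,99]

def cost(theta0, theta1):
    m = len(x_train)
    # J(theta) = 1/(2m) * sum((h(x_i) - y_i)^2)
    return sum((theta1 * x + theta0 - y) ** 2 for x, y in zip(x_train, y_train)) / (2 * m)

print(cost(0, 0))    # cost at the initial guess theta0 = theta1 = 0
print(cost(0, 1.1))  # a slope close to the data's trend gives a much smaller cost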

$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)$$


  Since the function we are fitting is $y = \theta_1 x + \theta_0$, all we need are the partial derivatives $\frac{\partial}{\partial \theta_j} J(\theta)$. The differentiation is straightforward, so I will not spell it out in full; you can verify it with pen and paper (a short derivation is sketched right after the two results below):

$$\frac{\partial}{\partial \theta_1} J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x_i) - y_i \bigr) \, x_i$$


$$\frac{\partial}{\partial \theta_0} J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x_i) - y_i \bigr)$$
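
  For completeness, the chain-rule step behind these two results looks like this (my own filling-in of the derivation the post skips, in the same notation):

$$\frac{\partial}{\partial \theta_j} J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} 2 \bigl( h_\theta(x_i) - y_i \bigr) \frac{\partial h_\theta(x_i)}{\partial \theta_j}, \qquad \frac{\partial h_\theta(x_i)}{\partial \theta_1} = x_i, \quad \frac{\partial h_\theta(x_i)}{\partial \theta_0} = 1.$$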


  The next step is to implement this in code, so without further ado, here it is. (Yesterday's post stopped here because my code broke and refused to converge; I spent a long time on it and could not figure out why. I am posting my code below so everyone can take a look.)

import matplotlib.pyplot as plt
# data I made up myself, deliberately shaped like a monotonically increasing linear function
x_train = [100,80,120,75,60,43,140,132,63,55,74,44,88]
y_train = [120,92,143,87,60,50,167,147,80,60,90,57,99]
# learning rate (step size)
alpha = 0.00001
diff = [0,0]
cnt = 0
# number of training samples
m = len(x_train)
# initialize the parameters; the fitted function is y = theta0 + theta1*x
theta0 = 0
theta1 = 0
# errors from the previous and current iterations
error0=0
error1=0
# threshold on the change in error between two iterations; below it we stop
epsilon=0.000001

def h(x):
    return theta1*x+theta0
# start iterating
while True:
    cnt=cnt+1
    diff = [0,0]
    # gradient descent: accumulate the batch gradient over all samples
    for i in range(m):
        diff[0]+=h(x_train[i])-y_train[i]
        diff[1]+=(h(x_train[i])-y_train[i])*x_train[i]
    theta0=theta0-alpha/m*diff[0]
    theta1=theta1-alpha/m*diff[1]

    error1=0
    # compute this iteration's error; if the change from the previous iteration is below the threshold, stop and output the fit
    for i in range(len(x_train)):
        error1 += (y_train[i] - (theta0 + theta1 * x_train[i])) ** 2 / 2

    if abs(error1 - error0) < epsilon:
        break
    else:
        error0 = error1
plt.plot(x_train,[h(x) for x in x_train])
plt.plot(x_train,y_train,'bo')
print(theta1,theta0)
plt.show()


The final plot is shown below.

(Figure: the fitted straight line plotted over the blue data points)

  The line is the converged result and the blue dots are my data; as you can see, it fits the data reasonably well.
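
  If you prefer numpy, the same batch update can be written in a few vectorized lines. The sketch below is my own rewrite of the loop above (same data, learning rate, and stopping rule, plus a hard iteration cap so it always terminates); it is not code from the original post:

# Vectorized sketch of the same batch gradient descent (illustrative rewrite, not the original code)
import numpy as np

x = np.array([100,80,120,75,60,43,140,132,63,55,74,44,88], dtype=float)
y = np.array([120,92,143,87,60,50,167,147,80,60,90,57,99], dtype=float)

alpha, epsilon, m = 0.00001, 0.000001, len(x)
theta0 = theta1 = 0.0
prev_error = 0.0

for step in range(2000000):                   # hard cap so the loop always terminates
    residual = theta1 * x + theta0 - y        # h(x_i) - y_i for every sample at once
    theta0 -= alpha / m * residual.sum()      # simultaneous update of both parameters
    theta1 -= alpha / m * (residual * x).sum()
    error = ((theta1 * x + theta0 - y) ** 2).sum() / 2
    if abs(error - prev_error) < epsilon:
        break
    prev_error = error

print(theta1, theta0, step)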

  I then found online an explanation of gradient descent written by a foreign author, together with a Python implementation. A CSDN blogger translated that explanation; see 梯度下降法实现线性回归 (implementing linear regression with gradient descent). I copied that code, loaded my own data, and it also fit nicely. The code is as follows:

#x = [100,80,120,75,60,43,140,132,63,55,74,44,88]
#y = [120,92,143,87,60,50,167,147,80,60,90,57,99]
from numpy import *
import matplotlib.pyplot as plt
import pandas as pd
# y = mx + b
# m is slope, b is y-intercept
# cost function: squared difference between prediction and actual value, averaged over all points
def compute_error_for_line_given_points(b, m, points):
    totalError = 0
    # split each data point into x and y
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        totalError += (y - (m * x + b)) ** 2
    return totalError / float(len(points))

# one step of gradient descent: returns the updated parameters after a single iteration
def step_gradient(b_current, m_current, points, learningRate):
    b_gradient = 0
    m_gradient = 0
    N = float(len(points))
    for i in range(0, len(points)):
        x = points[i, 0]
        y = points[i, 1]
        b_gradient += -(2/N) * (y - ((m_current * x) + b_current))
        m_gradient += -(2/N) * x * (y - ((m_current * x) + b_current))
    new_b = b_current - (learningRate * b_gradient)
    new_m = m_current - (learningRate * m_gradient)
    return [new_b, new_m]



def gradient_descent_runner(points, starting_b, starting_m, learning_rate, num_iterations):
    b = starting_b
    m = starting_m
    for i in range(num_iterations):
        b, m = step_gradient(b, m, array(points), learning_rate)
    return [b, m]


def run():
    points = genfromtxt('/Users/cailei/Cai_Lei/AI/Testdata/GradientDescentData1.csv', delimiter=",")
    data = pd.read_csv('/Users/cailei/Cai_Lei/AI/Testdata/GradientDescentData1.csv',names=['x','y'])
    learning_rate = 0.0001
    initial_b = 0 # initial y-intercept guess
    initial_m = 0 # initial slope guess
    num_iterations = 1000
    print("Starting gradient descent at b = {0}, m = {1}, error = {2}".format(initial_b, initial_m, compute_error_for_line_given_points(initial_b, initial_m, points)))
    print("Running...")
    [b, m] = gradient_descent_runner(points, initial_b, initial_m, learning_rate, num_iterations)
    print("After {0} iterations b = {1}, m = {2}, error = {3}".format(num_iterations, b, m, compute_error_for_line_given_points(b, m, points)))

    plt.plot(data.x,data.y,'bo')
    plt.plot(data.x,data.x*m+b)
    plt.show()


if __name__ == '__main__':
    run()


The final output plot is shown below:

(Figure: the fitted line over the data points, produced by the code above)
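
  If you do not have the CSV file on disk, the runner can also be driven straight from the in-memory lists commented out at the top of that block. A minimal usage sketch (my own example; it assumes the functions above have already been defined):

# Sketch: running the downloaded implementation on the in-memory lists instead of the CSV
from numpy import column_stack

x = [100,80,120,75,60,43,140,132,63,55,74,44,88]
y = [120,92,143,87,60,50,167,147,80,60,90,57,99]
points = column_stack((x, y))   # shape (13, 2): column 0 is x, column 1 is y

b, m = gradient_descent_runner(points, 0, 0, 0.0001, 1000)
print("b = {0}, m = {1}, error = {2}".format(b, m, compute_error_for_line_given_points(b, m, points)))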
