动手实践一个两层网络的前向传播和反向传播

「Trask帅哥建议自己动手撸一撸代码,所以就撸起袖子从头完全地实践一遍前向传播和反向传播」


动手实践一个两层网络的前向传播和反向传播_第1张图片

图来自Amazon李沐大神的教程

实现一个两层神经网络(不包括输入层)的前向传播和反向传播(忽略了偏置项b):

In [57]:
import numpy as np
In [58]:
def sigmoid(x):
    return (1/(1+np.exp(-x)))
In [96]:
# inputs
x = np.array([[1,4,5]])

# output
y = 0.8

# weights
w1 = np.random.random((2,3))*0.01
w2 = np.random.random((1,2))*0.01


# learning rate 
lr = 0.8
                     

for iter in range(100):
                          
    # forward propagation
    z1 = np.dot(x,w1.T) 
    a1 = sigmoid(z1)
    z2 = np.dot(a1,w2.T) 
    yhat = sigmoid(z2)

    # back propagation
    error = y-yhat
    
    if (iter%10) ==0:
        print ("Error:" + str(np.mean(np.abs(error))))

    # error term of output layer
    error_term = error*sigmoid(z2)*(1-sigmoid(z2))

    # error term of hidden layer
    hidden_error_term = error_term*w2*sigmoid(z1)*(1-sigmoid(z1))

    # update weights
    w2 += lr*error_term*z1
    w1 += lr*np.dot(hidden_error_term.T,x)

Error:0.298930739945
Error:0.281807054731
Error:0.196754899056
Error:0.073289097735
Error:0.0263131702631
Error:0.0102680484412
Error:0.00416908599532
Error:0.00172122476079
Error:0.000715569521438
Error:0.000298348787412


= = 调了很久error一直不变化,才发现脑残的我没把前向放进去迭代

你可能感兴趣的:(动手实践一个两层网络的前向传播和反向传播)