Backpropagation and Gradient Descent

Relation

Backpropagation is the concrete way gradient descent is implemented on a deep network.

For each training instance, the backpropagation algorithm first makes a prediction (forward pass) and measures the error, then goes through each layer in reverse to measure the error contribution from each connection (backward pass), and finally slightly tweaks the connection weights to reduce the error (gradient descent step).
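This predict–measure–tweak cycle can be sketched for a single linear neuron; the toy data point, the learning rate, and the squared-error loss are illustrative assumptions:

```python
# One forward pass + one gradient descent step for a single
# linear neuron y = w*x + b with loss E = 0.5 * (y - target)^2.
# The data point and learning rate below are illustrative assumptions.

w, b = 0.5, 0.0          # initial weight and bias
x, target = 2.0, 3.0     # one training instance
lr = 0.1                 # learning rate

# Forward pass: make a prediction and measure the error
y = w * x + b            # prediction = 1.0
error = y - target       # dE/dy = -2.0

# Backward pass: error contribution of each parameter
grad_w = error * x       # dE/dw
grad_b = error           # dE/db

# Gradient descent step: slightly tweak the parameters
w -= lr * grad_w         # w: 0.5 -> 0.9
b -= lr * grad_b         # b: 0.0 -> 0.2
```

After one step the prediction moves from 1.0 to 2.0, closer to the target 3.0; repeating the cycle keeps shrinking the error.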



BP

  • from https://flashgene.com/archives/35118.html
    (figure: backpropagation and gradient descent diagram)

  • from Backpropagation

Assign all network inputs and outputs
Initialize all weights with small random numbers, typically between -1 and 1

repeat

    for every pattern in the training set

        Present the pattern to the network

        // Propagate the input forward through the network:
            for each layer in the network
                for every node in the layer
                    1. Calculate the weighted sum of the inputs to the node
                    2. Add the threshold (bias) to the sum
                    3. Calculate the activation for the node
                end
            end

        // Propagate the errors backward through the network:
            for every node in the output layer
                calculate the error signal
            end

            for all hidden layers
                for every node in the layer
                    1. Calculate the node's error signal
                    2. Update each of the node's weights
                end
            end

        // Calculate the global error:
            Calculate the Error Function

    end

while ((number of iterations < specified maximum) AND
       (Error Function > specified threshold))
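The pseudocode above can be turned into a runnable sketch. The following trains a tiny 2-4-1 sigmoid network on the XOR patterns, updating weights after every pattern as the pseudocode does; the layer sizes, learning rate, epoch limit, and XOR data are illustrative assumptions, not part of the original pseudocode:

```python
# Minimal sketch of the backpropagation pseudocode above.
# Network shape (2-4-1), learning rate, and XOR data are assumptions.
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training set: the four XOR patterns (inputs, target)
patterns = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
            ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

N_HIDDEN = 4

# Initialize all weights with small random numbers between -1 and 1
w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(N_HIDDEN)]
b_hidden = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]
w_out = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]
b_out = random.uniform(-1, 1)

def forward(inputs):
    """Propagate the input forward through the network."""
    hidden = [sigmoid(sum(w * x for w, x in zip(w_hidden[j], inputs)) + b_hidden[j])
              for j in range(N_HIDDEN)]
    out = sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)
    return hidden, out

lr = 0.5
initial_error = None
for epoch in range(20000):
    total_error = 0.0
    for inputs, target in patterns:
        hidden, out = forward(inputs)

        # Error signal of the output node (sigmoid' = out * (1 - out))
        delta_out = (target - out) * out * (1 - out)

        # Error signals of the hidden nodes, propagated backward
        delta_hidden = [delta_out * w_out[j] * hidden[j] * (1 - hidden[j])
                        for j in range(N_HIDDEN)]

        # Gradient descent step: output-layer weights first...
        for j in range(N_HIDDEN):
            w_out[j] += lr * delta_out * hidden[j]
        b_out += lr * delta_out
        # ...then hidden-layer weights
        for j in range(N_HIDDEN):
            for i in range(2):
                w_hidden[j][i] += lr * delta_hidden[j] * inputs[i]
            b_hidden[j] += lr * delta_hidden[j]

        # Accumulate the global error (squared error)
        total_error += 0.5 * (target - out) ** 2

    if initial_error is None:
        initial_error = total_error   # error after the first epoch
    if total_error < 1e-3:            # stop when the global error is small
        break
```

After training, `forward(inputs)[1]` gives the network's prediction for a pattern; the global error decreases well below its value after the first epoch.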



Good References

  • 神经网络中误差反向传播(back propagation)算法的工作原理: simple and clear, with derivation equations
  • book Neural Networks and Deep Learning: soooooooooo detailed!
    • Chapter 1: Using neural nets to recognize handwritten digits
    • Chapter 2: How the backpropagation algorithm works
  • scikit-learn 1.17. Neural network models (supervised): the package I used
  • Principles of training multi-layer neural network using backpropagation: computation with a worked example
  • 深度学习(三)梯度下降和反向传播算法: helps understand derivatives (导数) and gradients (梯度)
  • 零基础入门深度学习(二):神经网络和反向传播算法: explained with images and equations
  • 反向传播算法(过程及公式推导): pure derivation
  • 神经网络(二):反向传播步骤(BP): explained with slides from Andrew Ng's machine learning videos on Bilibili, P43-P56 https://www.bilibili.com/video/BV164411S78V?p=56
  • 反向传播算法推导: ends with a summary of the backpropagation algorithm for CNNs
