Updating the weights in a neural network

The most common way to determine the values of the weights in a neural network is the backpropagation method. In backpropagation, the aim is to minimize the difference between the network's output and the desired output by gradient descent. This difference between the output and the desired result is called the error of the network, and the process of minimizing that error by tuning the weights is called learning.
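The following is a minimal sketch of this idea in Python. The particular choices here (a single-layer network, a sigmoid activation, squared error, a learning rate of 0.5, and a toy logical-OR dataset) are illustrative assumptions, not details from the text above; they are only meant to show how the error is computed and how gradient descent adjusts the weights.

```python
import numpy as np

# Minimal sketch of training a one-layer network by backpropagation.
# Sigmoid activation, squared-error loss, and the learning rate are
# illustrative assumptions, not prescribed by the article.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    # derivative of sigmoid, written in terms of its output y = sigmoid(x)
    return y * (1.0 - y)

# toy data: inputs and desired outputs (logical OR)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
target = np.array([[0], [1], [1], [1]], dtype=float)

rng = np.random.default_rng(0)
weights = rng.normal(size=(2, 1))  # the weights to be learned
bias = np.zeros((1,))
learning_rate = 0.5

for epoch in range(5000):
    # forward pass: compute the network's output
    output = sigmoid(X @ weights + bias)

    # error: difference between the output and the desired result
    error = output - target

    # backward pass: gradient of the squared error w.r.t. the weights
    grad_output = error * sigmoid_deriv(output)
    grad_weights = X.T @ grad_output
    grad_bias = grad_output.sum(axis=0)

    # gradient descent step: adjust the weights to reduce the error
    weights -= learning_rate * grad_weights
    bias -= learning_rate * grad_bias

print("learned outputs:", sigmoid(X @ weights + bias).ravel())
```

After training, the printed outputs should be close to the desired targets (near 0 for the input [0, 0] and near 1 otherwise), which is exactly the sense in which the error has been minimized.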
