Machine Learning No.5: Neural networks

1. advantage: when the number of features is very large, the previous algorithms (e.g. logistic regression with many polynomial terms) are not a good way to learn complex nonlinear hypotheses; neural networks handle this case better.

2. representation

"activation" of unit i in layer j

matrix of weights controlling function mapping from layer j to layer j+1

3. sample

[Figure 1: sample neural network]

we have the following expressions for the activations:

[Figure 2: activation expressions for the sample network]
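The figure presumably shows the standard three-layer example (three inputs, one hidden layer of three units, one output unit); under that assumption the activations would be

a_1^{(2)} = g\big(\Theta_{10}^{(1)} x_0 + \Theta_{11}^{(1)} x_1 + \Theta_{12}^{(1)} x_2 + \Theta_{13}^{(1)} x_3\big)
a_2^{(2)} = g\big(\Theta_{20}^{(1)} x_0 + \Theta_{21}^{(1)} x_1 + \Theta_{22}^{(1)} x_2 + \Theta_{23}^{(1)} x_3\big)
a_3^{(2)} = g\big(\Theta_{30}^{(1)} x_0 + \Theta_{31}^{(1)} x_1 + \Theta_{32}^{(1)} x_2 + \Theta_{33}^{(1)} x_3\big)
h_\Theta(x) = a_1^{(3)} = g\big(\Theta_{10}^{(2)} a_0^{(2)} + \Theta_{11}^{(2)} a_1^{(2)} + \Theta_{12}^{(2)} a_2^{(2)} + \Theta_{13}^{(2)} a_3^{(2)}\big)

where g(z) = 1/(1 + e^{-z}) is the sigmoid activation and x_0 = a_0^{(2)} = 1 are bias units.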

if the network has s_j units in layer j and s_{j+1} units in layer j+1, then Θ^(j) will be of dimension s_{j+1} × (s_j + 1). For example, with 2 units in layer 1 and 4 units in layer 2, Θ^(1) is 4 × 3.

4. forward propagation:

add a bias unit a_0^(j) = 1 to each layer before computing the next one; vectorized, z^(j+1) = Θ^(j) a^(j) and a^(j+1) = g(z^(j+1)), as in the sketch below.
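As a concrete sketch (not from the original post), vectorized forward propagation for a three-layer network with sigmoid units might look like this in NumPy; the names Theta1, Theta2 and forward_propagation are illustrative:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(x, Theta1, Theta2):
    """Compute h_Theta(x) for a 3-layer network.
    x      : input vector of shape (n,)
    Theta1 : weights mapping layer 1 -> layer 2, shape (s2, n + 1)
    Theta2 : weights mapping layer 2 -> layer 3, shape (s3, s2 + 1)
    """
    a1 = np.concatenate(([1.0], x))             # add bias unit x_0 = 1
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))   # add bias unit a_0^(2) = 1
    z3 = Theta2 @ a2
    a3 = sigmoid(z3)                            # h_Theta(x)
    return a3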

5. cost function

L: total no. of layers in network

s_l: no. of units (not counting the bias unit) in layer l

[Figure 3: neural network cost function]
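The figure most likely contains the regularized cost function; reconstructed here in its standard form for a K-output network (treat the exact indexing as an assumption):

J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \Big[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k + \big(1 - y_k^{(i)}\big) \log\big(1 - (h_\Theta(x^{(i)}))_k\big) \Big] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \big(\Theta_{ji}^{(l)}\big)^2

i.e. the logistic-regression cost summed over all K output units, plus regularization over all non-bias weights.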

6. gradient computation

need code to compute J(Θ) and the partial derivatives ∂J(Θ)/∂Θ_ij^(l):

backpropagation algorithm:

sample network:

[Figures 4–5: sample network for backpropagation]
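Assuming the sample network follows the usual four-layer presentation, the backpropagated "error" terms δ_j^{(l)} (the error of node j in layer l) would be

\delta^{(4)} = a^{(4)} - y
\delta^{(3)} = (\Theta^{(3)})^T \delta^{(4)} \odot g'(z^{(3)})
\delta^{(2)} = (\Theta^{(2)})^T \delta^{(3)} \odot g'(z^{(2)})

with g'(z^{(l)}) = a^{(l)} \odot (1 - a^{(l)}) and no \delta^{(1)}, since the inputs carry no error. Ignoring regularization, \frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta) = a_j^{(l)} \delta_i^{(l+1)}.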

Steps:

[Figures 6–7: backpropagation algorithm]
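A minimal sketch of the gradient computation via backpropagation for a three-layer network, assuming NumPy, one-hot labels, and sigmoid units; all names (backprop, Delta1, ...) are illustrative, not the post's own code:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(X, Y, Theta1, Theta2, lam):
    """Accumulate gradients over m examples for a 3-layer network.
    X : (m, n) inputs, Y : (m, K) one-hot labels, lam : regularization strength."""
    m = X.shape[0]
    Delta1 = np.zeros_like(Theta1)
    Delta2 = np.zeros_like(Theta2)
    for i in range(m):
        # forward pass
        a1 = np.concatenate(([1.0], X[i]))
        z2 = Theta1 @ a1
        a2 = np.concatenate(([1.0], sigmoid(z2)))
        a3 = sigmoid(Theta2 @ a2)
        # backward pass: output error, then hidden-layer error (bias term dropped)
        d3 = a3 - Y[i]
        d2 = (Theta2.T @ d3)[1:] * sigmoid(z2) * (1.0 - sigmoid(z2))
        # accumulate the gradient contributions
        Delta2 += np.outer(d3, a2)
        Delta1 += np.outer(d2, a1)
    Theta1_grad = Delta1 / m
    Theta2_grad = Delta2 / m
    # regularize every column except the bias column
    Theta1_grad[:, 1:] += (lam / m) * Theta1[:, 1:]
    Theta2_grad[:, 1:] += (lam / m) * Theta2[:, 1:]
    return Theta1_grad, Theta2_grad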

7. gradient checking

[Figure 8: gradient checking]
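Gradient checking compares the backpropagation gradient against a numerical two-sided difference, (J(θ + ε e_i) − J(θ − ε e_i)) / (2ε) with ε ≈ 10^-4. A sketch, where cost_fn is a hypothetical function returning J(Θ) for an unrolled parameter vector:

import numpy as np

def numerical_gradient(cost_fn, theta, eps=1e-4):
    """Approximate dJ/dtheta_i by (J(theta + eps*e_i) - J(theta - eps*e_i)) / (2*eps)."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        perturb = np.zeros_like(theta)
        perturb[i] = eps
        grad[i] = (cost_fn(theta + perturb) - cost_fn(theta - perturb)) / (2.0 * eps)
    return grad

# compare this against the backprop gradient, then turn checking off for training
# (it is far too slow to run at every iteration)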

8. random initialization

[Figure 9: random initialization]
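Initializing all weights to zero makes every hidden unit compute the same function (symmetry), so each Θ_ij^(l) is instead drawn uniformly from [−ε_init, ε_init]. A sketch (the value 0.12 is a common heuristic, assumed here):

import numpy as np

def rand_initialize_weights(L_in, L_out, epsilon_init=0.12):
    """Random weights for a layer with L_in inputs and L_out outputs,
    including the bias column, drawn from U(-epsilon_init, epsilon_init)."""
    return np.random.uniform(-epsilon_init, epsilon_init, size=(L_out, L_in + 1))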

9. summary

[Figures 10–12: summary of training a neural network]
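Putting the pieces together, a hedged sketch of the overall training procedure (random initialization, cost and gradient via forward and backward propagation, then an advanced optimizer), using the helpers sketched above plus a hypothetical compute_cost; SciPy's minimize accepts a function returning both cost and gradient when jac=True:

import numpy as np
from scipy.optimize import minimize

def train(X, Y, n_input, n_hidden, n_labels, lam):
    # 1. randomly initialize, then unroll the parameters into one vector
    Theta1 = rand_initialize_weights(n_input, n_hidden)
    Theta2 = rand_initialize_weights(n_hidden, n_labels)
    theta0 = np.concatenate([Theta1.ravel(), Theta2.ravel()])

    def cost_and_grad(theta):
        # reshape, then run forward prop + cost + backprop (sketched above)
        T1 = theta[:Theta1.size].reshape(Theta1.shape)
        T2 = theta[Theta1.size:].reshape(Theta2.shape)
        J = compute_cost(X, Y, T1, T2, lam)        # hypothetical helper
        g1, g2 = backprop(X, Y, T1, T2, lam)
        return J, np.concatenate([g1.ravel(), g2.ravel()])

    # 2. minimize J(Theta) with an advanced optimizer instead of plain gradient descent
    res = minimize(cost_and_grad, theta0, jac=True, method="L-BFGS-B",
                   options={"maxiter": 200})
    return res.x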

 
