Notes: Coursera Machine Learning

These are my notes for the Machine Learning course on Coursera. I will keep updating this post.


------------------------------(a not-so-fancy divider)-----------------------------------------

* A classmate shared a link that includes the videos, PPTs, and PDFs for this course.

* Since the PPTs cover everything I would have written here, I'll just post the link and skip writing it up myself.


* https://class.coursera.org/ml-005/lecture

----------------------------------------------------------------------------------------------


1. Gradient descent

[Image 1]


- Be careful of local optima.

[Image 2]


-------------------------------------------***********************************---------------------------------------------------------------


[Image 3]


-------------------------------------------------*******************************--------------------------------------------

[Image 4]
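
As a quick reference for the slides above, here is a minimal NumPy sketch of batch gradient descent for linear regression (the names `gradient_descent`, `alpha`, etc. are my own, not from the course code):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression.

    X : (m, n) feature matrix with a leading column of ones.
    y : (m,) target vector.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        predictions = X @ theta                 # h_theta(x) for every example
        gradient = (X.T @ (predictions - y)) / m
        theta -= alpha * gradient               # simultaneous update of all thetas
    return theta
```

Because the squared-error cost for linear regression is convex, it has a single global optimum; the local-optimum caveat above matters for more general cost functions.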



2. Linear Algebra

This part is relatively easy.

scalar multiplication

identity matrix
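
Both concepts are one-liners in NumPy; a tiny illustration of my own:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(3 * A)          # scalar multiplication: every entry is multiplied by 3
print(np.eye(2))      # 2x2 identity matrix
print(A @ np.eye(2))  # multiplying by the identity leaves A unchanged
```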



3.Multivariate Linear Regression

the idea of vector and matrix

---------------------#####################-------------------------------------------------------
[Image 5]

------------------------------**************************-------------------------------

[Image 6]

---------------------------------------******************-------------------------------------------
Feature normalization

Get every feature into approximately the -1 <= x <= 1 range.

[Image 7]
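
A minimal sketch of this kind of feature scaling in NumPy (mean normalization; my own version of the idea, not the course code):

```python
import numpy as np

def normalize_features(X):
    """Scale each column to roughly the [-1, 1] range.

    Subtract the column mean and divide by the column standard deviation,
    so that gradient descent converges faster.
    """
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma
```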



About the learning rate:
If gradient descent is not working (the cost increases or oscillates), use a smaller learning rate.

For a sufficiently small learning rate, the cost function should decrease on every iteration. But if the learning rate is too small, gradient descent can be slow to converge.
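
In practice one tries a few learning rates roughly 3x apart and keeps the largest one for which the cost still decreases on every iteration; a rough sketch, reusing the `gradient_descent` helper sketched earlier and assuming `X` and `y` are already loaded:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost: J(theta) = 1/(2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    return ((X @ theta - y) ** 2).sum() / (2 * m)

# Try learning rates roughly 3x apart; plotting J per iteration is the
# usual way to confirm the cost really decreases every step.
for alpha in (0.001, 0.003, 0.01, 0.03, 0.1, 0.3):
    theta = gradient_descent(X, y, alpha=alpha, num_iters=400)
    print(f"alpha={alpha}: final cost J={compute_cost(X, y, theta):.4f}")
```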

The choice of features is an art.



4. Logistic regression

The reason to use logistic regression (rather than linear regression) for classification is the range of the output: the hypothesis always stays between 0 and 1.

[Image 8]

The main component is the sigmoid function.
The sigmoid function and the logistic function are the same thing.
Objective: fit theta to the data.
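
A minimal NumPy sketch of the sigmoid and of the logistic regression hypothesis (the helper names are my own):

```python
import numpy as np

def sigmoid(z):
    """Logistic / sigmoid function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h_theta(x) = g(theta^T x), interpreted as P(y = 1 | x; theta)."""
    return sigmoid(X @ theta)
```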

[Image 9]

--------------------------*****************-------------------------------------------------

[Image 10]


------------------------------------***********----------------------------------------------

[Image 11]


The cost function for logistic regression is quite different from the one for linear regression.

[Image 12]


[Image 13]


This trick of folding the two cases of the cost function into a single expression is very common.
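
In code the combined expression is a single line; a sketch that reuses the `sigmoid` helper from above:

```python
import numpy as np

def logistic_cost(theta, X, y):
    """Cross-entropy cost:
    J(theta) = -1/m * sum( y*log(h) + (1-y)*log(1-h) )
    The y / (1-y) factors select the correct branch for each example.
    """
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h)).sum() / m
```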


[Image 14]


---------------------------------*****************-------------------------------------------

[Image 15]


---------------------------------********************---------------------------------

When we deal with large datasets, these advanced optimization algorithms (e.g., conjugate gradient, BFGS, L-BFGS) can be much faster than plain gradient descent.

[Image 16]
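
The course demonstrates this in Octave with `fminunc`; in Python, `scipy.optimize.minimize` plays a similar role. A sketch, assuming the `logistic_cost` and `sigmoid` helpers from above and data `X`, `y`:

```python
import numpy as np
from scipy.optimize import minimize

def logistic_gradient(theta, X, y):
    """Gradient of the cross-entropy cost: 1/m * X^T (h - y)."""
    m = len(y)
    return X.T @ (sigmoid(X @ theta) - y) / m

# Let an advanced optimizer (here BFGS) choose the step size itself;
# no learning rate has to be picked by hand.
result = minimize(logistic_cost, x0=np.zeros(X.shape[1]),
                  args=(X, y), jac=logistic_gradient, method='BFGS')
theta_opt = result.x
```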


--------------------------------------------*******************---------------------------------------

Multiclass Classification


[Image 17]
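
The usual way to handle this with logistic regression is one-vs-all (one-vs-rest): train one binary classifier per class and predict the class whose classifier outputs the highest probability. A minimal prediction sketch (the `Theta` matrix with one row of parameters per class and the `sigmoid` helper are assumptions of mine):

```python
import numpy as np

def predict_one_vs_all(Theta, X):
    """Theta: (num_classes, n) parameters, one binary classifier per class.
    For each example, pick the class with the highest h_theta(x)."""
    probabilities = sigmoid(X @ Theta.T)   # shape (m, num_classes)
    return probabilities.argmax(axis=1)
```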


--------------------------------------------***************-----------------------------------


[Image 18]




5. Regularization


Regularization can help to reduce overfitting.


[Image 19]


-----------------------------------------------------------------*********************************-----------------------------------------------------


[Image 20]


In a real task it is hard to judge which features are useful, so we shrink all the thetas except theta 0 (in practice, whether theta 0 is regularized makes little difference).
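
As a sketch of that idea for linear regression, the L2 penalty is added to the squared-error cost for theta 1..n only (names like `lambda_` are my own):

```python
import numpy as np

def regularized_cost(theta, X, y, lambda_):
    """Squared-error cost plus an L2 penalty on theta[1:]; theta[0] is not shrunk."""
    m = len(y)
    error = X @ theta - y
    penalty = lambda_ * (theta[1:] ** 2).sum()
    return (error @ error + penalty) / (2 * m)
```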


[Image 21]


----------------------------------------------***********************------------------------------------------

[Image 22]


-----------------------------------------------******************************----------------------------------------------------------


Regularized Logistic Regression


[Image 23]
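
Combining the cross-entropy cost from earlier with the same penalty gives the regularized logistic regression cost; a short sketch reusing the `sigmoid` helper:

```python
import numpy as np

def regularized_logistic_cost(theta, X, y, lambda_):
    """Cross-entropy cost plus lambda/(2m) * sum(theta[1:]^2); theta[0] is excluded."""
    m = len(y)
    h = sigmoid(X @ theta)
    cross_entropy = -(y * np.log(h) + (1 - y) * np.log(1 - h)).sum() / m
    penalty = lambda_ * (theta[1:] ** 2).sum() / (2 * m)
    return cross_entropy + penalty
```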




6. Neural Networks


Finally we get to NN. With the boom of deep learning, neural networks are back in the spotlight. Even in this course, recorded in 2011, Andrew Ng showed how much he valued NN and mentioned that he was doing research in this area. Google Brain and his later work and achievements speak for themselves.


[Image 24]


-----------------------------------------********************************------------------------------------------------------------------


This picture shows the vectorized implementation of forward propagation.


[Image 25]
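
The vectorized computation in that picture amounts to a couple of matrix products per layer; a minimal sketch for a three-layer network (the weight matrices `Theta1`, `Theta2` and the `sigmoid` helper are assumed given):

```python
import numpy as np

def forward_propagation(Theta1, Theta2, X):
    """Vectorized forward pass for a 3-layer network (input, hidden, output)."""
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])      # add bias unit to the input layer
    a2 = sigmoid(a1 @ Theta1.T)               # hidden layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])     # add bias unit to the hidden layer
    a3 = sigmoid(a2 @ Theta2.T)               # output layer = the hypothesis
    return a3
```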



---------------------------------------------------------********************************-----------------------------------------------------------------


[Image 26]


Neural networks learn their own features.

----------------------------------------------***********************--------------------------------------------

[Image 27]

-------------------------------------------************************-------------------------------------


It is a clever trick to use hidden layers to build up more complex computations.
[Image 28]
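
A classic illustration from the lectures is building XNOR out of simpler units: one hidden unit computes x1 AND x2, another computes (NOT x1) AND (NOT x2), and the output unit ORs them. A sketch with the hand-picked weights shown in the slides (reusing the `sigmoid` helper from earlier):

```python
import numpy as np

# Hand-picked weights: row 0 computes x1 AND x2, row 1 computes (NOT x1) AND (NOT x2).
Theta1 = np.array([[-30.0,  20.0,  20.0],
                   [ 10.0, -20.0, -20.0]])
# The output unit ORs the two hidden units, giving x1 XNOR x2.
Theta2 = np.array([[-10.0, 20.0, 20.0]])

for x1 in (0, 1):
    for x2 in (0, 1):
        a1 = np.array([1.0, x1, x2])                        # input with bias
        a2 = np.concatenate([[1.0], sigmoid(Theta1 @ a1)])  # hidden layer with bias
        a3 = sigmoid(Theta2 @ a2)                           # output ~= x1 XNOR x2
        print(x1, x2, round(float(a3[0])))
```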

----------------------------------------------*****************---------------------------------------

[Image 29]
