Note: Introduction to Machine Learning - 01 - Baby steps towards linear regression


Recommended book: Mathematics for Machine Learning (free online): https://mml-book.github.io/


Simple linear regression

Simple linear regression models a response $y$ as a linear function of a single predictor $x$:

$$y_i = \beta_0 + \beta_1 x_i + \epsilon_i,$$

where $\beta_0$ is the intercept, $\beta_1$ is the slope, and $\epsilon_i$ is noise.

Loss function (mean squared error):

$$\mathcal{L}(\beta_0, \beta_1) = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \beta_0 - \beta_1 x_i\right)^2$$
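
As a concrete illustration, here is a minimal NumPy sketch of this loss. The function name `mse_loss` and the toy data are my own additions, not part of the lecture.

```python
import numpy as np

def mse_loss(beta0, beta1, x, y):
    """Mean squared error of the line beta0 + beta1 * x against the targets y."""
    residuals = y - (beta0 + beta1 * x)
    return np.mean(residuals ** 2)

# Hypothetical toy data roughly following y = 1 + 2x plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(x.size)

print(mse_loss(0.0, 0.0, x, y))  # a poor guess: large loss
print(mse_loss(1.0, 2.0, x, y))  # near the true parameters: much smaller loss
```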

Baby linear regression

Consider a slope-only model (no intercept):

$$y_i = \beta x_i + \epsilon_i$$

The loss:

$$\mathcal{L}(\beta) = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \beta x_i\right)^2$$


How do we find the $\beta$ that minimizes this loss?
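
One way to build intuition (my own brute-force sketch, not the method used in the lecture): evaluate the loss on a grid of candidate slopes and pick the smallest value. This also shows that the loss is a simple parabola in $\beta$.

```python
import numpy as np

# Hypothetical toy data following y = 2x plus noise (no intercept).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(x.size)

def slope_only_loss(beta, x, y):
    """L(beta) = mean((y - beta * x)^2) for the slope-only model."""
    return np.mean((y - beta * x) ** 2)

# Evaluate the loss on a grid of candidate slopes and keep the best one.
betas = np.linspace(-1.0, 5.0, 601)
losses = np.array([slope_only_loss(b, x, y) for b in betas])
print(f"grid-search estimate of beta: {betas[np.argmin(losses)]:.2f}")  # close to 2
```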

Baby gradient descent


Start from an initial guess. Based on the derivative at the current guess, we know which direction to move to decrease the loss.

Update rule:

$$\beta \leftarrow \beta - \eta \, \frac{d\mathcal{L}}{d\beta}$$

Here $\eta$ is the learning rate.

It is tricky to find the global minimum of a non-convex function: gradient descent may only reach a local minimum.

If your learning rate is too large, the updates can overshoot the minimum and the iterates may diverge.

Baby gradient descent cont.

We need to compute the derivative of the loss:

$$\frac{d\mathcal{L}}{d\beta} = \frac{d}{d\beta} \left[ \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \beta x_i\right)^2 \right]$$

We get:

$$\frac{d\mathcal{L}}{d\beta} = -\frac{2}{n} \sum_{i=1}^{n} \left(y_i - \beta x_i\right) x_i$$
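
A minimal sketch of baby gradient descent using this derivative, on hypothetical toy data; the learning rate $\eta = 0.1$ and the step count are my own choices, not prescribed by the lecture.

```python
import numpy as np

# Hypothetical toy data following y = 2x plus noise (no intercept).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(x.size)

def grad(beta, x, y):
    """dL/dbeta = -(2/n) * sum((y - beta * x) * x)."""
    return -2.0 * np.mean((y - beta * x) * x)

beta = 0.0   # initial guess
eta = 0.1    # learning rate; a much larger value would make the iterates diverge
for step in range(200):
    beta -= eta * grad(beta, x, y)   # update rule: beta <- beta - eta * dL/dbeta

print(f"gradient-descent estimate: {beta:.3f}")  # close to the true slope 2
```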

Baby analytical solution

At the minimum, the derivative is zero:

$$\frac{d\mathcal{L}}{d\beta} = -\frac{2}{n} \sum_{i=1}^{n} \left(y_i - \hat{\beta} x_i\right) x_i = 0$$

We obtain:

$$\hat{\beta} = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}$$
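
A quick sketch checking the closed-form answer on the same kind of toy data (my own illustration); it should agree with the gradient-descent estimate above.

```python
import numpy as np

# Same kind of hypothetical toy data as before.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(x.size)

# Closed-form solution for the slope-only model: beta_hat = sum(x*y) / sum(x^2).
beta_hat = np.sum(x * y) / np.sum(x ** 2)
print(f"analytical estimate: {beta_hat:.3f}")  # matches the gradient-descent result
```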

Back to simple linear regression

Introducing partial derivatives

To take the partial derivative with respect to $\beta_0$, you first fix $\beta_1$ and treat it as a constant (and vice versa when differentiating with respect to $\beta_1$).

Update rules for each parameter:

$$\begin{aligned} \beta_0 &\leftarrow \beta_0 - \eta \, \frac{\partial \mathcal{L}}{\partial \beta_0} \\ \beta_1 &\leftarrow \beta_1 - \eta \, \frac{\partial \mathcal{L}}{\partial \beta_1} \end{aligned}$$

In vector form, with $\boldsymbol{\beta} = (\beta_0, \beta_1)^\top$:

$$\boldsymbol{\beta} \leftarrow \boldsymbol{\beta} - \eta \, \nabla_{\boldsymbol{\beta}} \mathcal{L}$$

Compute the gradient

We need the partial derivatives:

$$\begin{aligned} \frac{\partial \mathcal{L}}{\partial \beta_{0}} &=-\frac{2}{n} \sum_{i=1}^{n}\left(y_{i}-\beta_{0}-\beta_{1} x_{i}\right) \\ \frac{\partial \mathcal{L}}{\partial \beta_{1}} &=-\frac{2}{n} \sum_{i=1}^{n}\left(y_{i}-\beta_{0}-\beta_{1} x_{i}\right) x_{i} \end{aligned}$$
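
Putting the pieces together, here is a minimal sketch of gradient descent for simple linear regression using these two partial derivatives and the vector-form update; the toy data, learning rate, and step count are my own choices.

```python
import numpy as np

# Hypothetical toy data following y = 1 + 2x plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(x.size)

def gradient(beta, x, y):
    """Gradient of L = mean((y - beta0 - beta1*x)^2) w.r.t. (beta0, beta1)."""
    beta0, beta1 = beta
    residuals = y - beta0 - beta1 * x
    return np.array([-2.0 * np.mean(residuals),        # dL/dbeta0
                     -2.0 * np.mean(residuals * x)])   # dL/dbeta1

beta = np.zeros(2)   # initial guess (beta0, beta1)
eta = 0.5            # learning rate
for step in range(2000):
    beta = beta - eta * gradient(beta, x, y)   # vector-form update

print(f"estimated intercept and slope: {beta[0]:.3f}, {beta[1]:.3f}")  # roughly (1, 2)
```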

Reference

https://www.youtube.com/watch?v=lWGdFeMsjzg&feature=youtu.be
