Linear Regression and How to Solve It

Linear Regression


Hypothesis:

h_θ(x) = θ₀ + θ₁x

Parameters:

θ₀, θ₁

Cost Function:

J(θ₀, θ₁) = (1/2m) Σ_{i=1}^{m} (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²

Goal:

minimize over θ₀, θ₁ the cost J(θ₀, θ₁)
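The cost function above can be sketched directly in Python; `compute_cost` is an illustrative name, and plain lists are assumed for the training data.

```python
# A minimal sketch of the squared-error cost J(theta0, theta1),
# assuming the training set is given as two lists xs, ys of equal length.
def compute_cost(theta0, theta1, xs, ys):
    m = len(xs)
    total = 0.0
    for x, y in zip(xs, ys):
        h = theta0 + theta1 * x   # hypothesis h_theta(x)
        total += (h - y) ** 2     # squared error for this example
    return total / (2 * m)
```

When the hypothesis fits the data exactly, the cost is zero, which is why minimizing J drives the line toward the data.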

Gradient Descent


Outline:

  • Start with some initial θ₀, θ₁
  • Keep changing θ₀, θ₁ to reduce J(θ₀, θ₁), until we end up at a minimum

Algorithm:

repeat until convergence {
    θⱼ := θⱼ − α ∂/∂θⱼ J(θ₀, θ₁)    (for j = 0 and j = 1)
}

tips:

  • Simultaneously update θ₀ and θ₁: compute both new values before assigning either
  • Learning rate α:
    if α is too small, gradient descent can be slow
    if α is too large, gradient descent can overshoot the minimum; it may fail to converge or even diverge
[Figure: explanation of the partial-derivative term in the update rule]
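The update loop above can be sketched as follows; `gradient_descent` and its defaults (α = 0.1, 2000 iterations) are illustrative choices, not values from the notes.

```python
# Sketch of batch gradient descent for one-variable linear regression.
def gradient_descent(xs, ys, alpha=0.1, num_iters=2000):
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(num_iters):
        # Compute BOTH partial derivatives before updating either theta,
        # so that the update is simultaneous.
        grad0 = sum((theta0 + theta1 * x - y) for x, y in zip(xs, ys)) / m
        grad1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1
```

On data generated from y = 2x, the parameters converge toward θ₀ ≈ 0, θ₁ ≈ 2; with a much larger α the same loop diverges, illustrating the learning-rate tip above.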

Multivariate Linear Regression


Hypothesis:

If the problem has n features, the hypothesis is:

h_θ(x) = θ₀ + θ₁x₁ + θ₂x₂ + … + θₙxₙ

Letting x₀ = 1 and writing both vectors as x = [x₀, x₁, …, xₙ]ᵀ and θ = [θ₀, θ₁, …, θₙ]ᵀ, in matrix form:

h_θ(x) = θᵀx

Cost Function:

J(θ) = (1/2m) Σ_{i=1}^{m} (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²
Gradient Descent for Multiple Variables


Algorithm:

repeat {
    θⱼ := θⱼ − α (1/m) Σ_{i=1}^{m} (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) xⱼ⁽ⁱ⁾
}    (simultaneously update θⱼ for j = 0, 1, …, n)
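The multivariate update is usually vectorized; this is a sketch with NumPy, assuming the design matrix X already has the x₀ = 1 column prepended, and with illustrative defaults for α and the iteration count.

```python
import numpy as np

# Vectorized sketch of the multivariate update:
#   theta := theta - alpha * (1/m) * X^T (X theta - y)
def gradient_descent_multi(X, y, alpha=0.1, num_iters=2000):
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        grad = X.T @ (X @ theta - y) / m   # all partial derivatives at once
        theta -= alpha * grad              # simultaneous update of every theta_j
    return theta
```

One matrix-vector product computes every ∂J/∂θⱼ at once, which also makes the simultaneous-update requirement automatic.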

Feature Scaling


Goal:

Get every feature into approximately a −1 ≤ xᵢ ≤ 1 range

Algorithm:

xᵢ := xᵢ / sᵢ    (where sᵢ is the range of feature i, i.e. max − min, or its standard deviation)
[Figure: feature scaling]

Mean Normalization


Goal

Replace xᵢ with xᵢ − μᵢ to make features have approximately zero mean (do not apply this to x₀ = 1)

Algorithm

xᵢ := (xᵢ − μᵢ) / sᵢ
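The two preprocessing steps are usually combined; a minimal sketch (`normalize_features` is an illustrative name) using the range max − min as sᵢ:

```python
import numpy as np

# Sketch of mean normalization combined with feature scaling:
#   x_i := (x_i - mu_i) / s_i,  with s_i = max - min of feature i.
def normalize_features(X):
    mu = X.mean(axis=0)                  # per-feature mean
    s = X.max(axis=0) - X.min(axis=0)    # per-feature range
    return (X - mu) / s, mu, s
```

The returned mu and s must be kept, because any new example fed to the trained hypothesis has to be normalized with the same values.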
Polynomial Regression


Polynomial regression fits curves such as h_θ(x) = θ₀ + θ₁x + θ₂x² by treating the powers of x as separate features, so the multivariate machinery above applies unchanged; feature scaling becomes especially important here, because x, x², x³ have very different ranges.
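The feature-mapping idea can be sketched as follows; the function name, the degree, and the sample quadratic are all illustrative, and the fit here uses a pseudo-inverse rather than gradient descent.

```python
import numpy as np

# Map a 1-D input to polynomial features [1, x, x^2, ..., x^degree],
# turning polynomial regression into multivariate linear regression.
def poly_features(x, degree=3):
    return np.vander(x, degree + 1, increasing=True)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 0.5 * x ** 2              # a known quadratic, for illustration
X = poly_features(x, degree=2)
theta = np.linalg.pinv(X.T @ X) @ X.T @ y  # recovers [1, 2, 0.5]
```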

Advantages and disadvantages of gradient descent versus the normal equation:

  • Gradient descent: must choose the learning rate α and run many iterations, but it works well even when the number of features n is large.
  • Normal equation: no α and no iterations, but it must compute (XᵀX)⁻¹, roughly O(n³), so it becomes slow when n is very large.

Normal Equation


Setting every partial derivative of the cost function to zero, ∂J/∂θⱼ = 0 for j = 0, …, n, and solving for θ gives the closed-form solution:

θ = (XᵀX)⁻¹ Xᵀy
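The closed-form solution can be sketched in a few lines; `normal_equation` is an illustrative name, X is assumed to include the column of ones for the intercept, and solving the linear system is used instead of forming the inverse explicitly.

```python
import numpy as np

# Sketch of the normal equation theta = (X^T X)^{-1} X^T y,
# implemented by solving (X^T X) theta = X^T y.
def normal_equation(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.c_[np.ones_like(x), x]   # prepend x0 = 1
y = 1 + 2 * x                   # an exact line, for illustration
theta = normal_equation(X, y)   # approximately [1, 2]
```

No learning rate and no iterations are needed, matching the comparison above; the trade-off is the cost of solving the n×n system.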
