Machine Learning Study Notes (1): Multivariate Linear Regression

Multivariate linear regression:

When a linear regression problem takes multiple feature variables (multiple features) as input, it is called multivariate linear regression.

Notation

Hypothesis function:

The multivariable form of the hypothesis function accommodating these multiple features is as follows:

h_{\theta}(x) = \theta_{0} + \theta_{1}x_{1} + \theta_{2}x_{2} + \cdots + \theta_{n}x_{n}

Using the definition of matrix multiplication, our multivariable hypothesis function can be concisely represented as:

h_{\theta}(x) = \theta^{T}x

where we define x_{0} = 1, so that \theta and x are both (n+1)-dimensional vectors.
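The vectorized form θᵀx is a single dot product. A minimal sketch in NumPy (the variable names and numbers here are illustrative, not from the notes):

```python
import numpy as np

# theta = [theta_0, theta_1, theta_2]; x = [x_0, x_1, x_2] with x_0 = 1 (bias term)
theta = np.array([1.0, 2.0, 3.0])
x = np.array([1.0, 4.0, 5.0])

# h_theta(x) = theta^T x, computed as a dot product
h = theta @ x
print(h)  # 1*1 + 2*4 + 3*5 = 24.0
```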

Cost function:

J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)^{2}
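The cost function translates directly into a few lines of NumPy. This is a sketch with made-up toy data; the design matrix X is assumed to carry a leading column of ones for x_0:

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = (1/2m) * sum((h_theta(x_i) - y_i)^2), vectorized."""
    m = len(y)
    residuals = X @ theta - y
    return residuals @ residuals / (2 * m)

# Toy data: two examples, first column is x_0 = 1
X = np.array([[1.0, 1.0],
              [1.0, 2.0]])
y = np.array([3.0, 5.0])

# theta = [1, 2] fits this data exactly, so the cost is zero
J = cost(np.array([1.0, 2.0]), X, y)
print(J)  # 0.0
```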

Problem statement

Find the value of \theta that minimizes J(\theta):

\min_{\theta} J(\theta)

 

Gradient descent:

Gradient Descent for Multiple Variables:

The gradient descent equation itself is generally the same form; we just have to repeat it for our 'n' features:

repeat until convergence: \quad \theta_{j} := \theta_{j} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{j}^{(i)} \quad \text{for } j = 0, \dots, n

In other words:

\theta_{0} := \theta_{0} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{0}^{(i)}

\theta_{1} := \theta_{1} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{1}^{(i)}

\theta_{2} := \theta_{2} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{2}^{(i)}

\cdots

with all \theta_{j} updated simultaneously on each iteration.
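The update rule above can be sketched in vectorized NumPy, where Xᵀ(Xθ − y)/m computes all n+1 partial derivatives at once and the single `theta -=` line performs the simultaneous update. The learning rate, iteration count, and toy data below are illustrative choices, not values from the notes:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for linear regression.

    Repeats theta_j := theta_j - alpha * (1/m) * sum((h(x_i) - y_i) * x_ij)
    for all j simultaneously, via the vectorized gradient X^T (X theta - y) / m.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        gradient = X.T @ (X @ theta - y) / m
        theta -= alpha * gradient  # simultaneous update of every theta_j
    return theta

# Toy data generated from y = 1 + 2*x_1 (first column of X is x_0 = 1)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

theta = gradient_descent(X, y)
print(theta)  # converges close to [1., 2.]
```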

Normal equation:

The normal equation formula is given below:

\theta = (X^{T}X)^{-1}X^{T}y

Note: when the number of features is very large (roughly more than 10,000), the normal equation becomes slow, because computing (X^{T}X)^{-1} costs on the order of O(n^{3}); gradient descent is preferable in that regime.
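In code, the normal equation is one linear solve. This sketch reuses illustrative toy data; `np.linalg.solve` is used on the system (XᵀX)θ = Xᵀy rather than forming the explicit inverse, which is numerically more stable and cheaper:

```python
import numpy as np

# Toy data generated from y = 1 + 2*x_1 (first column of X is x_0 = 1)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Normal equation: theta = (X^T X)^{-1} X^T y,
# solved as (X^T X) theta = X^T y without an explicit inverse
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # [1. 2.], the exact fit, with no learning rate or iterations
```

Unlike gradient descent, this gives the exact minimizer in one step, with no learning rate to tune.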
