[Deep Learning] Multivariate Linear Regression

Multivariate Linear Regression

    • I. Problem Description
    • II. Overview
    • III. Code Implementation (.m)
    • IV. Code Download

I. Problem Description

Fit a linear regression model that predicts a target variable (house price) from multiple input variables (house size and number of bedrooms).
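
From the way the main script indexes the loaded matrix (data(:, 1:2) for the features, data(:, 3) for the target), ex1data2.txt is expected to hold one training example per line as comma-separated values: house size in square feet, number of bedrooms, and price. The rows below are illustrative only, not necessarily the actual file contents:

2104,3,399900
1600,3,329900
2400,3,369000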

II. Overview

1. Hypothesis function (with two features, matching the code below):
$h_{\theta}(x) = \theta_{0} + \theta_{1}x_{1} + \theta_{2}x_{2} = \theta^{T}x$

2. Cost function:
$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left( h_{\theta}(x^{(i)}) - y^{(i)} \right)^{2}$
3. Training method: gradient descent (see the vectorized sketch after this overview)
$\theta_{j} := \theta_{j} - \alpha \frac{\partial}{\partial \theta_{j}}J(\theta) = \theta_{j} - \alpha \frac{1}{m}\sum_{i=1}^{m}\left( h_{\theta}(x^{(i)}) - y^{(i)} \right)x_{j}^{(i)}$
4. Feature scaling: mean normalization (z-score standardization; note this is not min-max scaling, since each feature is centered by its mean and divided by its standard deviation)
$x_{j} := \frac{x_{j} - \mu_{j}}{\sigma_{j}}$
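
The following minimal sketch ties the four formulas above together on a toy data set. It assumes nothing beyond base MATLAB/Octave; the variable names and data values are made up for illustration:

% Toy data: 4 examples, 2 features (size in sq-ft, bedrooms), target = price
X = [2100 3; 1600 3; 2400 4; 1400 2];
y = [400; 330; 369; 232] * 1e3;
m = size(X, 1);

% 4. Feature scaling: z-score standardization, column by column
mu    = mean(X);
sigma = std(X);
Xn    = (X - repmat(mu, m, 1)) ./ repmat(sigma, m, 1);

% Design matrix with a leading column of ones for theta_0
A = [ones(m, 1), Xn];

theta = zeros(3, 1);
alpha = 0.1;
for it = 1:100
    h     = A * theta;                        % 1. hypothesis h_theta(x) = theta' * x
    theta = theta - (alpha/m) * A' * (h - y); % 3. gradient descent step (all j at once)
end
J = (A*theta - y)' * (A*theta - y) / (2*m);   % 2. cost after training
disp(theta');   % learned parameters
disp(J);        % final cost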

III. Code Implementation (.m)

1. Main script
ex2.m

clear ; close all; clc
%% ================ Part 1: Feature Normalization ================
% Load Data
data = load('files\ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);
% Feature Normalization
[X_norm, mu, sigma] = featureNormalize(X);
x = [ones(m,1), X_norm];
%% ================ Part 2: Gradient Descent ================
% Choose some alpha value
alpha = 0.01;
num_iteration = 400;
% Init Theta and Run Gradient Descent 
theta = zeros(3, 1);
[theta_group, J_group] = gradientDescent(x, y, theta, alpha, num_iteration);
% Plot the convergence graph
figure;
plot(1:numel(J_group), J_group, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta_group(:,num_iteration));
fprintf('\n');
%% ================ Part 3: Estimation ================
% Estimate the price of a 1650 sq-ft, 3 br house
% Feature Normalization
X = [1650,3];
X_norm = (X-mu) ./ sigma;
x = [1, X_norm];
price = x * theta_group(:,num_iteration); 
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);
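
Since alpha is picked somewhat arbitrarily above, a quick way to sanity-check it is to rerun gradient descent with a few learning rates and overlay the convergence curves. A minimal sketch, assuming the data, featureNormalize, and gradientDescent defined in this post; the specific alpha values are illustrative:

% Rebuild the normalized design matrix (x and X_norm were overwritten in Part 3)
Xn     = featureNormalize(data(:, 1:2));
x_all  = [ones(m, 1), Xn];
alphas = [0.3, 0.1, 0.03, 0.01];
styles = {'-r', '-g', '-b', '-k'};
figure; hold on;
for k = 1:numel(alphas)
    [~, J_k] = gradientDescent(x_all, y, zeros(3, 1), alphas(k), 50);
    plot(1:numel(J_k), J_k, styles{k}, 'LineWidth', 2);
end
xlabel('Number of iterations');
ylabel('Cost J');
legend('\alpha = 0.3', '\alpha = 0.1', '\alpha = 0.03', '\alpha = 0.01');
hold off;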

2. Helper functions
featureNormalize.m

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Standardize each column of X to zero mean and unit variance,
%returning the per-column mean and standard deviation for later reuse.
%Init
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

for i = 1:size(X,2)
    %Mean
    mu(i) = mean(X(:,i));
    %Standard deviation
    sigma(i) = std(X(:,i));
    %Z-score standardization (not min-max scaling)
    X_norm(:,i) = (X(:,i) - mu(i)) / sigma(i);
end
end
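
The per-column loop can also be collapsed into a single vectorized statement. A minimal alternative sketch (the function name featureNormalizeVec is hypothetical; bsxfun is used so it also runs on older MATLAB releases without implicit expansion):

function [X_norm, mu, sigma] = featureNormalizeVec(X)
% Vectorized z-score standardization of every column of X
mu     = mean(X);
sigma  = std(X);
X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);
end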

computeCost.m

function J = computeCost(x,y,theta)
%COMPUTECOST Compute the least-squares cost J(theta) for linear regression
%Number of examples
m = length(y);
%Hypothesis
h = x * theta;
%Cost function
J = (h-y)' * (h-y) / (2*m);
end
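
A quick sanity check of computeCost on a tiny hand-made example (values are illustrative): with theta chosen so the predictions match y exactly the cost is 0, while with theta = 0 it reduces to sum(y.^2)/(2*m):

x_toy = [1 0; 1 1; 1 2];        % design matrix: bias column plus one feature
y_toy = [1; 3; 5];              % exactly y = 1 + 2*x

J0 = computeCost(x_toy, y_toy, [1; 2]);   % perfect fit -> 0
J1 = computeCost(x_toy, y_toy, [0; 0]);   % zero model  -> (1+9+25)/(2*3) = 35/6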

gradientDescent.m

function [theta_group,J_group] = gradientDescent(x,y,theta,alpha,num_iteration)
%GRADIENTDESCENT Run batch gradient descent, storing theta and the cost J
%at every iteration so convergence can be plotted afterwards.
%Number of features & examples
num_feature = size(x,2);
m = length(y);
%Preallocate theta_group & J_group
theta_group = zeros(num_feature,num_iteration);
J_group = zeros(num_iteration,1);
%Record the initial theta and its cost
theta_group(:,1) = theta;
J_group(1) = computeCost(x,y,theta);

for i = 2:num_iteration
    %Hypothesis for the previous theta (kept fixed for a simultaneous update)
    h = x * theta_group(:,i-1);
    %Gradient descent update, one component at a time
    for j = 1:num_feature
        theta_group(j,i) = theta_group(j,i-1) - alpha * (h - y)'*x(:,j) / m;
    end
    %Cost after this update
    J_group(i) = computeCost(x,y,theta_group(:,i));
end
end
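
The inner loop over j can be replaced by one matrix expression, which is equivalent because h is computed from the previous theta before any component is updated. A minimal sketch of the replacement lines:

% Inside the outer loop of gradientDescent, instead of the for-j loop:
h = x * theta_group(:,i-1);
theta_group(:,i) = theta_group(:,i-1) - (alpha/m) * x' * (h - y);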

3. Results

For a house of 1650 feet² with 3 bedrooms, the predicted price is $289263.855733.

IV. Code Download

Link: https://pan.baidu.com/s/1Vdy5z8aHOnIK1ZC0Le-X2w
Extraction code: 8r4c
