Andrew Ng Machine Learning MATLAB Code Notes (1): Gradient Descent

Univariate Linear Regression

1. Plotting the Data

fprintf('Plotting Data\n')
data = load('D:\代码笔记\吴恩达机器学习\machine-learning-ex1\machine-learning-ex1\ex1\ex1data1.txt');
X = data(:, 1); % first column of the file
Y = data(:, 2); % second column of the file
size(X)
size(Y)
m = length(Y); % number of training examples

% Plot the data as a scatter plot
scatter(X, Y, '.')

Result:
[Figure 1: scatter plot of the training data]
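A small optional follow-up: labeling the axes makes the scatter plot self-explanatory. The label strings below are my assumption based on the usual ex1 dataset (city population vs. profit), not from the original notes:

```matlab
% Optional: label the axes (axis meanings assumed from the ex1 exercise data)
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');
title('Training data');
```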

2. Cost Function (computeCost)

function J = computeCost(X, Y, theta)
% Compute cost for linear regression
% J = computeCost(X, Y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and Y

% Initialize some useful values
m = length(Y);        % number of training examples
X = [ones(m, 1), X];  % add a column of ones for the intercept term

% Compute the cost for the given parameters theta (1x2 row vector)
J = 0;
for i = 1:m
    h = X(i, :) * theta';    % hypothesis h(x) = theta0 + theta1*x
    J = J + (h - Y(i))^2;    % accumulate the squared error
end
J = J / (2*m);               % half of the mean squared error
end
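The loop above can also be written in vectorized form, which is the usual MATLAB idiom. A sketch, assuming theta remains a 1x2 row vector (the function name computeCostVec is mine, not from the exercise):

```matlab
function J = computeCostVec(X, Y, theta)
% Vectorized cost for linear regression (equivalent to the loop version)
m   = length(Y);
Xb  = [ones(m, 1), X];      % design matrix with an intercept column
err = Xb * theta' - Y;      % m-by-1 vector of residuals
J   = (err' * err) / (2*m); % half of the mean squared error
end
```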

3. Univariate Gradient Descent (gradient_descent)
function theta = gradient_descent(X, Y, theta, alpha)
% Train the parameters theta by batch gradient descent
% alpha is the learning rate; theta is a 1x2 row vector [theta0, theta1]
m = length(Y);
U = Inf;                      % cost from the previous iteration
J = computeCost(X, Y, theta); % current cost

while J < U                   % stop when the cost no longer decreases
    U = J;
    H = 0;                    % gradient accumulator for theta0
    G = 0;                    % gradient accumulator for theta1
    for i = 1:m
        err = theta(1) + theta(2)*X(i) - Y(i);
        H = H + err;
        G = G + err * X(i);
    end
    H = H / m;
    G = G / m;
    theta(1) = theta(1) - alpha * H;
    theta(2) = theta(2) - alpha * G;
    J = computeCost(X, Y, theta);
end

end

gradient_descent(X, Y, theta, alpha)

ans =

0.0153    0.1249
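As a sanity check, the same fit can be computed in closed form with the normal equation; gradient descent, run to convergence with a suitable alpha, should approach this solution. The variable names below are my own:

```matlab
% Closed-form least-squares fit via the normal equation
Xb = [ones(length(Y), 1), X];        % design matrix with intercept column
theta_ne = ((Xb' * Xb) \ (Xb' * Y))' % 1x2 row vector [theta0, theta1]
```

The backslash operator solves the linear system X'X * theta = X'y directly, which is both faster and more numerically stable than explicitly inverting X'X.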
