ML Programming Assignment: Linear Regression

The final programming assignment of week 2 of the 2017 Coursera course. I did it two months late (oops).

Summary

Week 2 mainly covers linear regression, with hypothesis h(θ) = θ' * X.
There are two ways to solve for θ: gradient descent (Gradient Descent) and the normal equation (Normal Equation).
The normal equation requires computing pinv(X' * X), which is O(n^3) in the number of features, so it is slow when n is large.
Gradient descent is cheaper per step but requires tuning the learning rate α.
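In the course's notation, the two approaches can be written out as follows (m examples, n features, X with a leading bias column):

```latex
% Hypothesis and cost function
h_\theta(x) = \theta^T x, \qquad
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big)^2

% Gradient descent: repeat until convergence, updating all j simultaneously
\theta_j := \theta_j - \alpha \cdot \frac{1}{m} \sum_{i=1}^{m}
            \big(h_\theta(x^{(i)}) - y^{(i)}\big)\, x_j^{(i)}

% Normal equation: closed form, O(n^3) because of the matrix inverse
\theta = (X^T X)^{-1} X^T y
```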

Course forum link

My solutions

All of them pass the submission tests.

1. warmUpExercise

A = eye(5); % MATLAB warm-up: return a 5x5 identity matrix

2. plotData

plot(x, y, 'r.');
xlabel('population');
ylabel('profit');

3. computeCost
Compute the cost function: J(θ) = sum((X * theta - y).^2) / (2 * m).

sli = X * theta;              % predictions h_theta(x)
sli2 = sli - y;               % residuals
J = sum(sli2 .^ 2) / (2 * m);
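For comparison, here is a minimal NumPy sketch of the same cost computation (the assignment itself is in Octave; the function and variable names below are my own):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Mean squared error cost: J(theta) = 1/(2m) * sum((X@theta - y)^2)."""
    m = len(y)
    err = X @ theta - y               # residuals
    return float(err @ err) / (2 * m)

# Tiny example: a perfect fit y = 1 + 2x gives zero cost
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
theta = np.array([1.0, 2.0])
print(compute_cost(X, y, theta))  # → 0.0
```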

4. gradientDescent


    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta); % record the cost history

    % Compute theta1 and theta2 (the multivariate version below does the
    % same thing more elegantly for any number of parameters)
    theta1 = theta(1) - alpha * sum((X * theta - y) .* X(:, 1)) / m;
    theta2 = theta(2) - alpha * sum((X * theta - y) .* X(:, 2)) / m;
    % simultaneous update of theta1 and theta2
    theta = [theta1; theta2];
    % converged: stop early if the cost reaches zero
    if 0 == computeCost(X, y, theta)
        return
    end
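The two per-parameter updates above can be collapsed into a single vectorized step. A NumPy sketch of the same batch gradient descent (names and the toy data are my own):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent with a simultaneous, vectorized theta update."""
    m = len(y)
    J_history = []
    for _ in range(num_iters):
        grad = X.T @ (X @ theta - y) / m       # gradient of J(theta)
        theta = theta - alpha * grad           # updates every theta_j at once
        err = X @ theta - y
        J_history.append(float(err @ err) / (2 * m))
    return theta, J_history

# Recover y = 1 + 2x from three points
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(np.round(theta, 3))  # close to [1. 2.]
```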

Optional exercises

5. computeCostMulti

sli = X * theta;              % predictions h_theta(x)
sli2 = sli - y;               % residuals
J = sum(sli2 .^ 2) / (2 * m);

6. gradientDescentMulti

numtheta = size(X, 2); % number of parameters = number of columns of X
for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta. 
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %

    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCostMulti(X, y, theta);
    mytheta = zeros(numtheta, 1);
    for i = 1 : numtheta
        mytheta(i) = theta(i) - alpha * sum((X * theta - y) .* X(:, i)) / m;
    end
    % simultaneous update: assign all parameters at once
    theta = mytheta;
    if  0 == computeCostMulti(X, y, theta)
        return
    end
end

7. featureNormalize
Here, normalization means taking each feature column, subtracting that column's mean, and dividing by that column's standard deviation.

for i = 1 : size(X, 2)
    mu(1, i) = mean(X(:, i));
    sigma(1, i) = std(X(:, i)) + eps; % eps avoids division by zero for constant features
    X_norm(:, i) = (X_norm(:, i) - mu(1, i)) ./ sigma(1, i);
end
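The same column-wise z-score normalization in NumPy (a sketch with names of my own; `ddof=1` matches Octave's sample standard deviation):

```python
import numpy as np

def feature_normalize(X):
    """Z-score each column: subtract its mean, divide by its standard deviation."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)        # ddof=1 matches Octave's std()
    sigma = sigma + np.finfo(float).eps  # avoid division by zero
    return (X - mu) / sigma, mu, sigma

# House sizes and bedroom counts on very different scales
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm.mean(axis=0))  # each column now has mean ~0
```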

8. normalEqn
The normal equation gives the closed-form solution; it can be derived with basic linear algebra.

theta = pinv(X' * X) * X' * y;
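The derivation is one line of matrix calculus: write the cost in matrix form and set its gradient to zero (pinv is used instead of inv so that a singular X'X still yields a solution).

```latex
J(\theta) = \frac{1}{2m}\,(X\theta - y)^T (X\theta - y)

\nabla_\theta J = \frac{1}{m}\, X^T (X\theta - y) = 0
\;\Longrightarrow\; X^T X\,\theta = X^T y
\;\Longrightarrow\; \theta = (X^T X)^{-1} X^T y
```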

Wrap-up

The linear regression formulas are mostly in the lecture-slide screenshots.
