[Matlab Study Notes] Multivariate Nonlinear Regression

This post demonstrates two approaches to multivariate nonlinear regression: lsqcurvefit and the Adagrad method.

lsqcurvefit is a built-in MATLAB function for least-squares curve-fitting problems. Adagrad is a gradient-based iterative method whose distinguishing feature is an adaptive step size: each parameter's effective learning rate shrinks as its squared gradients accumulate over iterations.
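To illustrate the Adagrad idea in isolation, here is a minimal NumPy sketch on a toy quadratic objective (the objective, step size, and epsilon below are illustrative assumptions, not values from the MATLAB script that follows):

```python
import numpy as np

# Minimize the toy objective f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([4.0, -2.0])
G = np.zeros_like(w)       # running sum of squared gradients
alpha, eps = 1.0, 1e-8     # base step size and numerical stabilizer

for _ in range(500):
    grad = w                                   # gradient of the toy objective
    G += grad ** 2                             # accumulate squared gradients
    w -= alpha * grad / np.sqrt(G + eps)       # per-coordinate adaptive step

print(np.abs(w).max())     # drives w toward the minimizer 0
```

Note how each coordinate gets its own effective step size alpha / sqrt(G + eps); the same scaling appears in the MATLAB loop below via `diag(1 ./ sqrt(G + e))`.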

clear; clc;
% Multivariate nonlinear regression model:
% z = w1 * exp(-x / w2) + w3 * y
% true parameters below: w1 = 10, w2 = 2, w3 = 0.5
len = 20;
rng('default');
x = randi(len + 1, len, 1) / 5;
y = randi(len + 1, len, 1) / 5;
z = 10 * exp(-x / 2) + 0.5 * y;
ratio = 0.0;
z = z + ratio * max(z) * rand(len, 1);
X = [x, y];
fun = @(var, X)var(1) * exp(-X(:, 1) / var(2)) + var(3) * X(:, 2);
w = lsqcurvefit(fun, [1, 1, 1], X, z);
disp(['lsqcurvefit result: ', num2str(w)]);
% Gradient-descent (Adagrad) learning
% obj = 1 / (2 * len) * (w1 * exp(-x / w2) + w3 * y - z)' * (w1 * exp(-x / w2) + w3 * y - z)
alpha = 5;   % a larger learning rate converges faster but may oscillate
iteMax = 10000;
w1 = 1;
w2 = 1;
w3 = 1;
initW = [w1; w2; w3];
err = 1e-6;
J = zeros(iteMax, 1);   % objective history
G = zeros(3, 1);        % accumulated squared gradients (Adagrad)
e = 0.1;                % stabilizer to avoid division by zero
for i = 1 : iteMax
    gradW1 = 1 / len * (exp(-x / w2))' * (w1 * exp(-x / w2) + w3 * y - z);
    gradW2 = 1 / len * (w1 * x .* exp(-x / w2) / w2^2)' * (w1 * exp(-x / w2) + w3 * y - z);
    gradW3 = 1 / len * y' * (w1 * exp(-x / w2) + w3 * y - z);
    grad = [gradW1; gradW2; gradW3];
    % Adagrad update: per-component step size alpha ./ sqrt(G + e)
    G = G + grad.^2;
    newW = initW - alpha * diag(1 ./ sqrt(G + e)) * grad;
    if norm(newW - initW) < err
        J(i : end) = [];   % J(i) is never assigned on the converged pass
        disp(['Gradient descent iterations: ', num2str(i)]);
        disp(['Gradient descent result: ', num2str(newW')]);
        break;
    else
        initW = newW;
        w1 = newW(1);
        w2 = newW(2);
        w3 = newW(3);
        J(i) = 1 / (2 * len) * (w1 * exp(-x / w2) + w3 * y - z)' * (w1 * exp(-x / w2) + w3 * y - z);
    end
end
% 绘图
subplot(1, 2, 1)
loglog(J, 'LineWidth', 2)
legend(['alpha = ', num2str(alpha)]);
xFit1 = linspace(min(x), max(x), 30);
yFit1 = linspace(min(y), max(y), 30);
[xFit2, yFit2] = meshgrid(xFit1, yFit1);
zFit = w1 * exp(-xFit2 / w2) + w3 * yFit2;
subplot(1, 2, 2)
scatter3(x, y, z, 'filled');
hold on
mesh(xFit2, yFit2, zFit);
hold off
legend('Points', 'Fitting', 'Location','NorthOutside');
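For readers cross-checking outside MATLAB, the same fit can be sketched with SciPy's curve_fit (an illustrative port, not part of the original post; the NumPy seed is an assumption and does not reproduce MATLAB's rng('default') stream, so the sample differs while the recovered parameters should not):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)        # assumed seed; differs from rng('default')
n = 20
x = rng.integers(1, 22, n) / 5        # mimics randi(len + 1, len, 1) / 5
y = rng.integers(1, 22, n) / 5
z = 10 * np.exp(-x / 2) + 0.5 * y     # noiseless data, as ratio = 0.0 above

def model(X, w1, w2, w3):
    x, y = X
    return w1 * np.exp(-x / w2) + w3 * y

w, _ = curve_fit(model, (x, y), z, p0=[1.0, 1.0, 1.0])
print(np.round(w, 4))
```

With noiseless data and the same starting point [1, 1, 1], the solver should recover roughly w1 = 10, w2 = 2, w3 = 0.5, matching the lsqcurvefit result above.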

Output


Local minimum found.

Optimization completed because the size of the gradient is less than
the default value of the function tolerance.



lsqcurvefit result: 10           2         0.5
Gradient descent iterations: 1934
Gradient descent result: 10            2          0.5

(Figure 1: log-log convergence curve of the objective J)

(Figure 2: sample points and fitted surface)
