A simple example of using gradient descent to find the minimum of an objective function

A good article that explains gradient descent simply and thoroughly: 梯度下降法

Suppose the objective function is: y = (x1-3)^2 + x2^2 - 50

Then the partial derivative of the objective with respect to x1 is dx1 = 2*(x1-3), and the partial derivative with respect to x2 is dx2 = 2*x2.
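
The iteration in the code below then repeatedly applies the standard gradient descent update with learning rate alpha (restated here for clarity, using the same notation as the code):

x1 := x1 - alpha * 2*(x1-3)
x2 := x2 - alpha * 2*x2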

After setting the initial values, iterate. The code is as follows and can be run directly:

clc
clear

% Objective (loss) function: y = (x1-3)^2 + x2^2 - 50
% Its partial derivatives: dy/dx1 = 2*(x1-3), dy/dx2 = 2*x2

res = [-20, 10];      % initial guess for [x1, x2]
alpha = 0.05;         % learning rate (step size)
y = zeros(1, 1000);   % objective value recorded at each iteration

for i = 1:1000
    dx1 = 2*(res(1) - 3);          % partial derivative w.r.t. x1
    dx2 = 2*res(2);                % partial derivative w.r.t. x2
    res = res - alpha*[dx1, dx2];  % gradient descent update
    y(i) = (res(1)-3)^2 + res(2)^2 - 50;
    if i > 1
        if abs(y(i) - y(i-1)) <= 0.00001
            % change the tolerance to get different results
            fprintf("i stop in %d and min y = %f", i, y(i));
            fprintf("\nfinal [x1,x2] = [%f,%f]\n", res);
            break
        end
    end
end
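
As a quick sanity check (not part of the original code), the analytic minimum of this objective is y = -50 at [x1, x2] = [3, 0], so the printed result can be compared against that. A minimal sketch for visualizing convergence, assuming the loop above has run and i holds the stopping iteration:

% Optional convergence plot; y and i come from the loop above.
% The analytic minimum is y = -50 at [3, 0].
figure;
plot(1:i, y(1:i));
xlabel('iteration');
ylabel('objective value y');
title('Gradient descent on y = (x1-3)^2 + x2^2 - 50');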

