The Conjugate Gradient Method

Solving linear systems with the conjugate gradient method:

        Reference: 第八章 计算方法——共轭梯度法求解线性方程组.pdf

        When the coefficient matrix A of the linear system Ax=b is positive definite, the system can be solved by the conjugate gradient method. It can be shown that the minimizer x of the quadratic function

f(x)=\frac{1}{2}x^{T}Ax-b^{T}x

is exactly the solution of Ax=b. The conjugate gradient method thus turns solving a linear system into minimizing an equivalent quadratic function. Starting from an arbitrary initial point, it performs line searches along a set of directions that are mutually conjugate with respect to A; assuming no rounding error, at most n iterations (n being the order of A) are needed to reach the minimizer of the quadratic function, i.e., the solution of Ax=b. The iteration is:

x_{k+1}=x_{k}+\alpha_{k}d_{k}
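A one-line check of the claimed equivalence (a standard computation, spelled out here for completeness): for symmetric A the gradient of f is

\nabla f(x)=Ax-b

so any stationary point of f satisfies Ax=b, and positive definiteness of A guarantees that this stationary point is the unique global minimizer.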

        The two key ingredients of the method are the optimal step length \alpha_{k} and the search direction d_{k} in the iteration formula. The step length \alpha_{k} is obtained by minimizing the one-dimensional function \phi(\alpha)=f(x_{k}+\alpha d_{k}):

 \alpha_{k}=\frac{r_{k}^{T}d_{k}}{d_{k}^{T}Ad_{k}}

where the residual is r_{k}=b-Ax_{k}.
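The step-length formula follows by setting the derivative of \phi to zero (a routine calculation, written out here):

\phi'(\alpha)=d_{k}^{T}\left(A(x_{k}+\alpha d_{k})-b\right)=\alpha\,d_{k}^{T}Ad_{k}-d_{k}^{T}r_{k}=0

which rearranges to the expression for \alpha_{k} above.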

        Taking d_{0}=r_{0}=b-Ax_{0} and d_{k+1}=r_{k+1}+\beta_{k}d_{k}, we require d_{k+1} and d_{k} to be conjugate with respect to A, i.e. d_{k+1}^{T}Ad_{k}=0, which gives:

\beta_{k}=-\frac{r_{k+1}^{T}Ad_{k}}{d_{k}^{T}Ad_{k}}
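The substitution behind this formula, written out:

d_{k+1}^{T}Ad_{k}=(r_{k+1}+\beta_{k}d_{k})^{T}Ad_{k}=r_{k+1}^{T}Ad_{k}+\beta_{k}\,d_{k}^{T}Ad_{k}=0

and solving for \beta_{k} gives the expression above.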

The complete pseudocode:

(1) Choose an initial vector x_{0} and a tolerance \varepsilon>0

(2) Compute r_{0}=b-Ax_{0} and set d_{0}=r_{0}

(3) for k=0 to n-1

        \alpha_{k}=\frac{r_{k}^{T}d_{k}}{d_{k}^{T}Ad_{k}}

        x_{k+1}=x_{k}+\alpha_{k}d_{k}

        r_{k+1}=b-Ax_{k+1}

        If \left \| r_{k+1} \right \| \leqslant \varepsilon or k+1=n, output the approximate solution x_{k+1} and stop; otherwise, continue;

        \beta_{k}=-\frac{r_{k+1}^{T}Ad_{k}}{d_{k}^{T}Ad_{k}}

        d_{k+1}=r_{k+1}+\beta_{k}d_{k}
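Steps (1)–(3) translate directly into NumPy. The following sketch (the function name cg_solve and the small 2×2 test system are illustrative choices, not from the original post) follows the pseudocode line by line:

```python
import numpy as np

def cg_solve(A, b, x0, eps=1e-10):
    """Conjugate gradient for a symmetric positive definite A,
    following steps (1)-(3) of the pseudocode above."""
    x = x0.astype(float)
    r = b - A @ x            # r_0 = b - A x_0
    d = r.copy()             # d_0 = r_0
    n = A.shape[0]
    for k in range(n):
        Ad = A @ d
        alpha = (r @ d) / (d @ Ad)    # alpha_k = r_k^T d_k / d_k^T A d_k
        x = x + alpha * d             # x_{k+1} = x_k + alpha_k d_k
        r = b - A @ x                 # r_{k+1} = b - A x_{k+1}
        if np.linalg.norm(r) <= eps:  # stop when the residual is small
            break
        beta = -(r @ Ad) / (d @ Ad)   # beta_k = -r_{k+1}^T A d_k / d_k^T A d_k
        d = r + beta * d              # d_{k+1} = r_{k+1} + beta_k d_k
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_solve(A, b, np.zeros(2))
print(np.allclose(A @ x, b))  # True
```

On an n×n symmetric positive definite system the loop terminates in at most n iterations up to rounding error, which is why the for-loop bound is n.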

    

MATLAB code:

% ep = 0.001
% r_0 = b - A * x_0
% d_0 = r_0
% for k = 0 to n-1
%     alpha_k = (r_k^T * d_k) / (d_k^T * A * d_k)
%     x_(k+1) = x_k + alpha_k * d_k
%     r_(k+1) = r_k - alpha_k * A * d_k     % equivalent to b - A * x_(k+1)
%     if norm(r_(k+1)) <= ep || k+1 == n
%         break
%     end
%     beta_k = -(r_(k+1)^T * A * d_k) / (d_k^T * A * d_k)
%     d_(k+1) = r_(k+1) + beta_k * d_k
% end

% Solves the linear system A * x = b, where A is a symmetric positive definite matrix (guaranteeing a unique solution)
% Equivalently, finds the minimizer of the quadratic f(x) = (1/2) * x^T * A * x - b^T * x
% f'(x) = A * x - b

% A = [9,18,9,-27;18,45,0,-45;9,0,126,9;-27,-45,9,135];
% b = [1;2;16;8];
% ep = 0.000001;
% x = [0;0;0;0];
% x = cg(A, b, ep, x);

function x = cg(A, b, ep, x)
% A: symmetric positive definite matrix; b: right-hand side; ep: tolerance; x: initial guess
[n, ~] = size(A);
r = b - A * x;
d = r;
for k = 0:n-1
    Ad = A * d;                      % cache the matrix-vector product
    alpha = (r' * r) / (d' * Ad);    % r'*r equals r'*d here
    x = x + alpha * d;
    r = r - alpha * Ad;              % equivalent to b - A*x
    if norm(r) <= ep                 % stop when the residual norm is small
        break;
    end
    beta = -(r' * Ad) / (d' * Ad);
    d = r + beta * d;
end
end
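As a quick sanity check on the commented example above (this check uses NumPy and is not part of the original post), one can confirm that the example matrix is symmetric positive definite, which is the precondition for cg, and compute the solution directly for comparison:

```python
import numpy as np

# The 4x4 system from the commented example above
A = np.array([[9.0, 18, 9, -27],
              [18, 45, 0, -45],
              [9, 0, 126, 9],
              [-27, -45, 9, 135]])
b = np.array([1.0, 2, 16, 8])

np.linalg.cholesky(A)        # raises LinAlgError unless A is symmetric positive definite
x = np.linalg.solve(A, b)    # direct solution, for comparison with cg's output
print(np.linalg.norm(b - A @ x) < 1e-9)  # True: x solves the system
```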
