VINS-Mono Source Code Analysis 5 — vins_estimator Part 4 (Marginalization)

  • 1. Marginalization Theory
  • 2. Marginalization Code Walkthrough

1. Marginalization Theory

Before reading this post, I recommend first going through Section 6 of Cui's article "VINS论文推导及代码解析" (VINS Paper Derivation and Code Analysis) and Section 10.2.3 of Gao Xiang's "视觉SLAM十四讲" (14 Lectures on Visual SLAM). Marginalization is one of the harder parts of the VINS-Mono system to understand, and my own understanding is not entirely thorough either; I simply want to share some modest insights here, and I would welcome corrections from more experienced readers.
Marginalization is the process of removing old states from the sliding window while retaining the constraints those old states contributed — in effect, converting that constraint information into prior information on the remaining optimization variables.
We know that solving an optimization problem mainly comes down to solving

$$H\Delta x = g$$

where $H = J^TJ$ and $g = J^Te$, so

$$J^TJ\,\Delta x = J^Te$$

Now partition $\Delta x=\begin{bmatrix}\Delta x_1\\\Delta x_2\end{bmatrix}$, $H=\begin{bmatrix}H_{11}&H_{12}\\H_{21}&H_{22}\end{bmatrix}$, $g=\begin{bmatrix}g_1\\g_2\end{bmatrix}$, with $\Delta x_2$ the part to be marginalized. Then from

$$\begin{bmatrix}H_{11}&H_{12}\\H_{21}&H_{22}\end{bmatrix}\begin{bmatrix}\Delta x_1\\\Delta x_2\end{bmatrix}=\begin{bmatrix}g_1\\g_2\end{bmatrix}$$

eliminating $\Delta x_2$ with the Schur complement gives

$$(H_{11}-H_{12}H_{22}^{-1}H_{21})\,\Delta x_1=g_1-H_{12}H_{22}^{-1}g_2$$

Let $H^*=H_{11}-H_{12}H_{22}^{-1}H_{21}=J_\alpha^TJ_\alpha$ and $g^*=g_1-H_{12}H_{22}^{-1}g_2=J_\alpha^Te_\alpha$. The increment equation that enters the subsequent iterations is then

$$H^*\Delta x_1=g^*$$
$$J_\alpha^TJ_\alpha\,\Delta x_1=J_\alpha^Te_\alpha$$

Moreover, the Jacobian $J_\alpha$ above is held fixed throughout the marginalization iterations. We now derive the residual $e_\alpha$. A first-order Taylor expansion of $g=J^Te$ gives

$$g=g_0+\left.\frac{\partial g}{\partial x}\right|_{x=x_0}\mathrm{d}x=g_0+\left.\frac{\partial g}{\partial e}\frac{\partial e}{\partial x}\right|_{x=x_0}\mathrm{d}x=g_0+J^TJ\,\mathrm{d}x=g_0+H\,\mathrm{d}x$$

Hence,

$$g=\begin{bmatrix}g_1\\g_2\end{bmatrix}=\begin{bmatrix}g_{01}\\g_{02}\end{bmatrix}+\begin{bmatrix}H_{11}&H_{12}\\H_{21}&H_{22}\end{bmatrix}\begin{bmatrix}\mathrm{d}x_1\\\mathrm{d}x_2\end{bmatrix}$$

$$g_1=g_{01}+H_{11}\,\mathrm{d}x_1+H_{12}\,\mathrm{d}x_2$$
$$g_2=g_{02}+H_{21}\,\mathrm{d}x_1+H_{22}\,\mathrm{d}x_2$$

Since $g^*=g_1-H_{12}H_{22}^{-1}g_2=J_\alpha^Te_\alpha$, we get

$$g^*=g_{01}+H_{11}\,\mathrm{d}x_1+H_{12}\,\mathrm{d}x_2-H_{12}H_{22}^{-1}(g_{02}+H_{21}\,\mathrm{d}x_1+H_{22}\,\mathrm{d}x_2)$$
$$g^*=(g_{01}-H_{12}H_{22}^{-1}g_{02})+(H_{11}-H_{12}H_{22}^{-1}H_{21})\,\mathrm{d}x_1$$

Let $g_0^*=g_{01}-H_{12}H_{22}^{-1}g_{02}$; then

$$g^*=g_0^*+H^*\,\mathrm{d}x_1=J_\alpha^T(J_\alpha^T)^+g_0^*+J_\alpha^TJ_\alpha\,\mathrm{d}x_1=J_\alpha^T\left[(J_\alpha^T)^+g_0^*+J_\alpha\,\mathrm{d}x_1\right]$$

where $(J_\alpha^T)^+$ denotes the pseudo-inverse of $J_\alpha^T$. So the residual equation that enters the marginalization iterations is

$$e_\alpha=(J_\alpha^T)^+g_0^*+J_\alpha\,\mathrm{d}x_1$$

At this point we have obtained both $J_\alpha$ and $e_\alpha$.

2. Marginalization Code Walkthrough

First look at the following code in the optimization() function of estimator.cpp:

if (marginalization_flag == MARGIN_OLD)
{
    ......
}
else
{
    ......
}

The code above splits into two cases: marginalizing the oldest frame and marginalizing the second-newest frame. When marginalizing the oldest frame,

//marginalization_info stores all the information for this round of marginalization
MarginalizationInfo *marginalization_info = new MarginalizationInfo();
//pass parameters along, fetch inverse depths, convert from vector to double arrays
vector2double();
//if the previous marginalization info pointer last_marginalization_info is non-null
if (last_marginalization_info)
{
    vector<int> drop_set;
    for (int i = 0; i < static_cast<int>(last_marginalization_parameter_blocks.size()); i++)
    {
         //mark para_Pose[0] and para_SpeedBias[0] in
         //last_marginalization_parameter_blocks as parameters to marginalize
        if (last_marginalization_parameter_blocks[i] == para_Pose[0] ||
            last_marginalization_parameter_blocks[i] == para_SpeedBias[0])
            drop_set.push_back(i);
    }
    // build marginalization_factor from the previous marginalization info
    MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);
    //build residual_block_info from the previous marginalization info
    ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL,
                                                                   last_marginalization_parameter_blocks,
                                                                   drop_set);
    //add the residual block info carried over from the previous marginalization
    marginalization_info->addResidualBlockInfo(residual_block_info);
}

This is mainly groundwork for building the H matrix later: marking which parameter blocks to keep and which to marginalize, and assigning the variables that record marginalization information, such as the following members of

marginalization_info

factors:
a container, vector<ResidualBlockInfo *>

parameter_block_size
parameter_block_size[address of parameter block] = size of the parameter block

parameter_block_idx
parameter_block_idx[address of parameter block] = position in the Hessian matrix
In storage order, the first m entries are to be marginalized and the remaining n are kept.

Continuing through the code,

if (pre_integrations[1]->sum_dt < 10.0)
{
    IMUFactor* imu_factor = new IMUFactor(pre_integrations[1]);
    ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(imu_factor, NULL,
                                                               vector<double *>{para_Pose[0], para_SpeedBias[0], para_Pose[1], para_SpeedBias[1]},
                                                               vector<int>{0, 1});
    marginalization_info->addResidualBlockInfo(residual_block_info);
}

This adds the IMU residual block info, serving the same purpose as above.

int feature_index = -1;
for (auto &it_per_id : f_manager.feature)
{
    it_per_id.used_num = it_per_id.feature_per_frame.size();
    if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
        continue;

    ++feature_index;

    int imu_i = it_per_id.start_frame, imu_j = imu_i - 1;
    if (imu_i != 0)
        continue;

    Vector3d pts_i = it_per_id.feature_per_frame[0].point;

    for (auto &it_per_frame : it_per_id.feature_per_frame)
    {
        imu_j++;
        if (imu_i == imu_j)
            continue;

        Vector3d pts_j = it_per_frame.point;
        if (ESTIMATE_TD)
        {
           //not executed by default
           ......
        }
        else
        {
            ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);
            ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(f, loss_function,
                                                                           vector<double *>{para_Pose[imu_i], para_Pose[imu_j], para_Ex_Pose[0], para_Feature[feature_index]},
                                                                           vector<int>{0, 3});
            marginalization_info->addResidualBlockInfo(residual_block_info);
        }
    }
}

This adds the visual residual block info, again for the same purpose. One more note: every feature first observed in the oldest frame is marginalized along with it.

marginalization_info->preMarginalize();
void MarginalizationInfo::preMarginalize()
{
    for (auto it : factors)
    {
        it->Evaluate();

        std::vector<int> block_sizes = it->cost_function->parameter_block_sizes();
        for (int i = 0; i < static_cast<int>(block_sizes.size()); i++)
        {
            long addr = reinterpret_cast<long>(it->parameter_blocks[i]);
            int size = block_sizes[i];
            if (parameter_block_data.find(addr) == parameter_block_data.end())
            {
                double *data = new double[size];
                memcpy(data, it->parameter_blocks[i], sizeof(double) * size);
                parameter_block_data[addr] = data;
            }
        }
    }
}

preMarginalize() evaluates each factor to obtain its Jacobian matrix $J$ and residual, and fills in the following member:

parameter_block_data
parameter_block_data[address of parameter block] = copy of the parameter block's data

Reading on,

marginalization_info->marginalize();
void MarginalizationInfo::marginalize()
{
    ......

    TicToc t_summing;
    Eigen::MatrixXd A(pos, pos);
    Eigen::VectorXd b(pos);
    A.setZero();
    b.setZero();
 
    ......

    //TODO
    Eigen::MatrixXd Amm = 0.5 * (A.block(0, 0, m, m) + A.block(0, 0, m, m).transpose());
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(Amm);

    //ROS_ASSERT_MSG(saes.eigenvalues().minCoeff() >= -1e-4, "min eigenvalue %f", saes.eigenvalues().minCoeff());

    Eigen::MatrixXd Amm_inv = saes.eigenvectors() * Eigen::VectorXd((saes.eigenvalues().array() > eps).select(saes.eigenvalues().array().inverse(), 0)).asDiagonal() * saes.eigenvectors().transpose();
    //printf("error1: %f\n", (Amm * Amm_inv - Eigen::MatrixXd::Identity(m, m)).sum());

    Eigen::VectorXd bmm = b.segment(0, m);
    Eigen::MatrixXd Amr = A.block(0, m, m, n);
    Eigen::MatrixXd Arm = A.block(m, 0, n, m);
    Eigen::MatrixXd Arr = A.block(m, m, n, n);
    Eigen::VectorXd brr = b.segment(m, n);
    A = Arr - Arm * Amm_inv * Amr;
    b = brr - Arm * Amm_inv * bmm;

    
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes2(A);
    Eigen::VectorXd S = Eigen::VectorXd((saes2.eigenvalues().array() > eps).select(saes2.eigenvalues().array(), 0));
    Eigen::VectorXd S_inv = Eigen::VectorXd((saes2.eigenvalues().array() > eps).select(saes2.eigenvalues().array().inverse(), 0));

    Eigen::VectorXd S_sqrt = S.cwiseSqrt();
    Eigen::VectorXd S_inv_sqrt = S_inv.cwiseSqrt();

    linearized_jacobians = S_sqrt.asDiagonal() * saes2.eigenvectors().transpose();
    linearized_residuals = S_inv_sqrt.asDiagonal() * saes2.eigenvectors().transpose() * b;
}

marginalize() builds the $H$ matrix from the individual Jacobians $J$, performs the Schur complement, and recovers $J_\alpha$ and $(J_\alpha^T)^+g_0^*$ via an eigendecomposition, i.e. linearized_jacobians and linearized_residuals.

for (int i = 1; i <= WINDOW_SIZE; i++)
{
    addr_shift[reinterpret_cast<long>(para_Pose[i])] = para_Pose[i - 1];
    addr_shift[reinterpret_cast<long>(para_SpeedBias[i])] = para_SpeedBias[i - 1];
}
for (int i = 0; i < NUM_OF_CAM; i++)
    addr_shift[reinterpret_cast<long>(para_Ex_Pose[i])] = para_Ex_Pose[i];
if (ESTIMATE_TD)
{
    addr_shift[reinterpret_cast<long>(para_Td[0])] = para_Td[0];
}
//get the parameter blocks to be kept
vector<double *> parameter_blocks = marginalization_info->getParameterBlocks(addr_shift);

if (last_marginalization_info)
    delete last_marginalization_info;
//hand the new marginalization info over
last_marginalization_info = marginalization_info;
last_marginalization_parameter_blocks = parameter_blocks;

This part is mainly cleanup work, assigning the following variables:

keep_block_size
keep_block_size[address of kept parameter block] = size of the parameter block

keep_block_idx
keep_block_idx[address of kept parameter block] = position in the Hessian matrix

keep_block_data
keep_block_data[address of kept parameter block] = the parameter block's data

It also assigns marginalization_info to last_marginalization_info and obtains last_marginalization_parameter_blocks, preparing the marginalization constraint for the next round of nonlinear optimization.

The marginalization-constraint part of the next nonlinear optimization:

if (last_marginalization_info)
{
    // construct new marginlization_factor
    MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);
    problem.AddResidualBlock(marginalization_factor, NULL,
                             last_marginalization_parameter_blocks);
}
bool MarginalizationFactor::Evaluate(double const *const *parameters, double *residuals, double **jacobians) const
{
    int n = marginalization_info->n;
    int m = marginalization_info->m;
    Eigen::VectorXd dx(n);
    for (int i = 0; i < static_cast<int>(marginalization_info->keep_block_size.size()); i++)
    {
        int size = marginalization_info->keep_block_size[i];
        int idx = marginalization_info->keep_block_idx[i] - m;
        Eigen::VectorXd x = Eigen::Map<const Eigen::VectorXd>(parameters[i], size);
        Eigen::VectorXd x0 = Eigen::Map<const Eigen::VectorXd>(marginalization_info->keep_block_data[i], size);

        if (size != 7)
            dx.segment(idx, size) = x - x0;
        else
        {
            dx.segment<3>(idx + 0) = x.head<3>() - x0.head<3>();
            dx.segment<3>(idx + 3) = 2.0 * Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();
            if (!((Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).w() >= 0))
            {
                dx.segment<3>(idx + 3) = 2.0 * -Utility::positify(Eigen::Quaterniond(x0(6), x0(3), x0(4), x0(5)).inverse() * Eigen::Quaterniond(x(6), x(3), x(4), x(5))).vec();
            }
        }
    }
    Eigen::Map<Eigen::VectorXd>(residuals, n) = marginalization_info->linearized_residuals + marginalization_info->linearized_jacobians * dx;

    if (jacobians)
    {

        for (int i = 0; i < static_cast<int>(marginalization_info->keep_block_size.size()); i++)
        {
            if (jacobians[i])
            {
                int size = marginalization_info->keep_block_size[i], local_size = marginalization_info->localSize(size);
                int idx = marginalization_info->keep_block_idx[i] - m;
                Eigen::Map<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>> jacobian(jacobians[i], n, size);
                jacobian.setZero();
                jacobian.leftCols(local_size) = marginalization_info->linearized_jacobians.middleCols(idx, local_size);
            }
        }
    }
    return true;
}

The residuals computed here are exactly $e_\alpha=(J_\alpha^T)^+g_0^*+J_\alpha\,\mathrm{d}x_1$ from the theory section, with linearized_residuals playing the role of $(J_\alpha^T)^+g_0^*$.

When marginalizing the second-newest frame, the key code is as follows,

if (last_marginalization_info)
{
    vector<int> drop_set;
    for (int i = 0; i < static_cast<int>(last_marginalization_parameter_blocks.size()); i++)
    {
        ROS_ASSERT(last_marginalization_parameter_blocks[i] != para_SpeedBias[WINDOW_SIZE - 1]);
        //mark para_Pose[WINDOW_SIZE - 1], i.e. para_Pose[9], in
        //last_marginalization_parameter_blocks as a parameter to marginalize
        if (last_marginalization_parameter_blocks[i] == para_Pose[WINDOW_SIZE - 1])
            drop_set.push_back(i);
    }
    // construct new marginlization_factor
    MarginalizationFactor *marginalization_factor = new MarginalizationFactor(last_marginalization_info);
    ResidualBlockInfo *residual_block_info = new ResidualBlockInfo(marginalization_factor, NULL,
                                                                   last_marginalization_parameter_blocks,
                                                                   drop_set);

    marginalization_info->addResidualBlockInfo(residual_block_info);
}
