After initialization is complete, the backend solve formally begins.
This section focuses on how each ResidualBlock is constructed, plus marginalization.
This part and Section 1.7 both belong to the marginalization step, which is covered separately in the next post.
//compensate the feature coordinates with a constant-velocity model
//pts_i_td: the corner's coordinates on the normalized plane after accounting for the time offset and the rolling-shutter readout time
//TR / ROW * row_i is the time the rolling shutter needs to reach this row (pixels in the same row are assumed to be exposed at the same time)
Eigen::Vector3d pts_i_td, pts_j_td;
pts_i_td = pts_i - (td - td_i + TR / ROW * row_i) * velocity_i;
pts_j_td = pts_j - (td - td_j + TR / ROW * row_j) * velocity_j;
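For context, a minimal sketch of how the compensated coordinates then enter the reprojection residual, following the transform chain in ProjectionTdFactor::Evaluate (Qi/Pi and Qj/Pj are the body poses of frames i and j, qic/tic the camera-IMU extrinsics, inv_dep_i the inverse depth; all of these come from the parameter blocks):
//back-project the compensated observation of frame i, carry it into camera j,
//and compare with the compensated observation of frame j
Eigen::Vector3d pts_camera_i = pts_i_td / inv_dep_i;              //camera i (metric)
Eigen::Vector3d pts_imu_i    = qic * pts_camera_i + tic;          //body i
Eigen::Vector3d pts_w        = Qi * pts_imu_i + Pi;               //world
Eigen::Vector3d pts_imu_j    = Qj.inverse() * (pts_w - Pj);       //body j
Eigen::Vector3d pts_camera_j = qic.inverse() * (pts_imu_j - tic); //camera j
Eigen::Map<Eigen::Vector2d> residual(residuals);
double dep_j = pts_camera_j.z();
residual = (pts_camera_j / dep_j).head<2>() - pts_j_td.head<2>();
residual = sqrt_info * residual;                                  //whiten with the square-root information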
The detailed derivation of this part can be found in the referenced blog post, which is very thorough.
One issue I ran into while deriving it by hand:
I was thrown off by the $-R_{wb_j}^T p_{wb_j}$ term and started worrying about the $R_{wb_j}^T$ that appears later on.
//Jacobian w.r.t. the time offset (2x1)
if (jacobians[4])
{
Eigen::Map<Eigen::Vector2d> jacobian_td(jacobians[4]);
jacobian_td = reduce * ric.transpose() * Rj.transpose() * Ri * ric * velocity_i / inv_dep_i * -1.0 +
sqrt_info * velocity_j.head(2); //the second term exists because pts_j_td also depends on td; see the derivation below
}
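As a quick check of where the second term comes from (a sketch using the code's notation: $\pi$ is the projection onto the normalized plane, $P_{c_j}$ the point in camera $j$, $R_c=$ ric, $\lambda_i=$ inv_dep_i, $v_i=$ velocity_i, $v_j=$ velocity_j, and reduce $=\text{sqrt\_info}\cdot\partial\pi/\partial P_{c_j}$):

$$
r = \text{sqrt\_info}\,\big(\pi(P_{c_j}) - \text{pts\_j\_td}_{1:2}\big),\qquad
\frac{\partial\,\text{pts\_i\_td}}{\partial td} = -v_i,\qquad
\frac{\partial\,\text{pts\_j\_td}}{\partial td} = -v_j
$$

$$
\frac{\partial r}{\partial td}
= \text{reduce}\; R_c^{T} R_{wb_j}^{T} R_{wb_i} R_c\,\frac{-v_i}{\lambda_i}
\;+\;\text{sqrt\_info}\; v_{j,1:2}
$$

The $j$-frame velocity appears simply because pts_j_td is also corrected by td, and it enters the residual with a minus sign.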
//covariance of the visual measurement residual (used as square-root information)
void Estimator::setParameter()
{
for (int i = 0; i < NUM_OF_CAM; i++)
{
tic[i] = TIC[i];
ric[i] = RIC[i];
}
f_manager.setRic(ric);
//assume a reprojection error of 1.5 pixel when calibrating the camera intrinsics: Sigma = (1.5/f)^2 * I(2x2) on the normalized plane, so sqrt_info = Sigma^(-1/2) = f/1.5 * I(2x2)
ProjectionFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity();
ProjectionTdFactor::sqrt_info = FOCAL_LENGTH / 1.5 * Matrix2d::Identity();
td = TD;
}
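Written out, a 1.5 pixel standard deviation on the image corresponds on the normalized plane to

$$
\Sigma_{vis} = \Big(\frac{1.5}{f}\Big)^{2} I_{2\times 2},
\qquad
\text{sqrt\_info} = \Sigma_{vis}^{-1/2} = \frac{f}{1.5}\, I_{2\times 2},
$$

which is exactly the FOCAL_LENGTH / 1.5 factor set above.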
The detailed derivation can be found in the VIO course notes or in the referenced blog post.
All Jacobians of the IMU factor:
//first-order correction of the pre-integrated q for the change in gyro bias, cf. Section IV-A of the paper
Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg));
jacobian_pose_i.block<3, 3>(O_R, O_R) = -(Utility::Qleft(Qj.inverse() * Qi) * Utility::Qright(corrected_delta_q)).bottomRightCorner<3, 3>();
...
Eigen::Quaterniond corrected_delta_q = pre_integration->delta_q * Utility::deltaQ(dq_dbg * (Bgi - pre_integration->linearized_bg));
jacobian_pose_j.block<3, 3>(O_R, O_R) = Utility::Qleft(corrected_delta_q.inverse() * Qi.inverse() * Qj).bottomRightCorner<3, 3>();
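For reference, the rotation residual these two blocks differentiate is (with $\gamma_{b_ib_j}$ the bias-corrected pre-integrated rotation, i.e. corrected_delta_q, and $J^{\gamma}_{b_g}=$ dq_dbg):

$$
r_{\theta} = 2\left[\gamma_{b_ib_j}^{-1}\otimes q_{wb_i}^{-1}\otimes q_{wb_j}\right]_{xyz},
\qquad
\gamma_{b_ib_j} = \hat{\gamma}_{b_ib_j}\otimes
\begin{bmatrix}1\\ \tfrac{1}{2}J^{\gamma}_{b_g}\,\delta b_g\end{bmatrix}
$$

The two lines above are $\partial r_\theta/\partial\delta\theta_{b_i}$ and $\partial r_\theta/\partial\delta\theta_{b_j}$ written with the left/right quaternion-product matrices Utility::Qleft / Utility::Qright; the factor 2 in the residual cancels the 1/2 from the perturbation $q\otimes[1,\ \tfrac{1}{2}\delta\theta]$.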
A visualization makes this more intuitive.
In midPointIntegration:
//Jacobian propagation
jacobian = F * jacobian;
//covariance propagation
covariance = F * covariance * F.transpose() + V * noise * V.transpose();
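Written out, with $J_0 = I$ and $P_0 = 0$ at the start of each pre-integration segment:

$$
J_{k+1} = F_k\,J_k,\qquad
P_{k+1} = F_k\,P_k\,F_k^{T} + V_k\,Q\,V_k^{T}
$$

Here $Q$ is the constant 18x18 noise covariance assembled from the IMU noise parameters (the noise matrix), $F_k$ and $V_k$ are the state-transition and noise Jacobians of the midpoint integration, and bias Jacobians such as dq_dbg used earlier are sub-blocks of $J$.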
This corresponds to Section 7.2 of 崔华坤's PDF and needs to be understood together with the pose graph; the key points are how the relocalization msg is interpreted and how the points for the residual blocks are chosen (see the sketch below). Details are in the pose_graph post.
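A condensed sketch of that point-selection strategy, based on the relocalization block in Estimator::optimization() (match_points comes from the relocalization msg and stores each matched point's normalized coordinates plus its feature_id):
//add reprojection factors between the loop-closure pose relo_Pose and the landmarks
//that are observed by the matched window frame (index relo_frame_local_index)
if (relocalization_info)
{
    problem.AddParameterBlock(relo_Pose, SIZE_POSE, new PoseLocalParameterization());
    int retrive_feature_index = 0, feature_index = -1;
    for (auto &it_per_id : f_manager.feature)
    {
        it_per_id.used_num = it_per_id.feature_per_frame.size();
        if (!(it_per_id.used_num >= 2 && it_per_id.start_frame < WINDOW_SIZE - 2))
            continue;
        ++feature_index;
        int start = it_per_id.start_frame;
        if (start <= relo_frame_local_index)
        {
            //both lists are sorted by feature_id, so advance the msg index until the ids meet
            while ((int)match_points[retrive_feature_index].z() < it_per_id.feature_id)
                retrive_feature_index++;
            if ((int)match_points[retrive_feature_index].z() == it_per_id.feature_id)
            {
                //observation in the loop frame (from the msg) vs. the landmark's first observation
                Vector3d pts_j(match_points[retrive_feature_index].x(),
                               match_points[retrive_feature_index].y(), 1.0);
                Vector3d pts_i = it_per_id.feature_per_frame[0].point;
                ProjectionFactor *f = new ProjectionFactor(pts_i, pts_j);
                problem.AddResidualBlock(f, loss_function, para_Pose[start], relo_Pose,
                                         para_Ex_Pose[0], para_Feature[feature_index]);
                retrive_feature_index++;
            }
        }
    }
}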
To keep the optimization result from drifting in the null space, the pose of the first frame is fixed. (Why only yaw and not tilt? Roll and pitch are observable thanks to gravity, so only yaw and position lie in the null space.)
This includes computing the yaw-only rotation rot_diff. The outputs for fast relocalization are also processed here for later use:
drift_correct_r, drift_correct_t
relo_relative_t, relo_relative_q, relo_relative_yaw
rot_diff is the rotation matrix corresponding to the yaw of $R_{before\_after}$; after optimization it is used to keep the yaw of frame [0] unchanged (compensating Rs[0], Ps[0], Vs[0]) so the estimate does not drift in the null space, and the R, P, V of the later frames are adjusted together with frame [0].
For other ways of fixing frame [0], see Section 4 of the earlier post.
// After one round of optimization, compute the transform of the first frame before vs. after optimization, to be applied to the whole trajectory.
// This also corrects the result so it does not drift in the null space, based on the first frame's pose difference before/after optimization.
void Estimator::double2vector()
{
//pose of the first frame in the window before optimization
Vector3d origin_R0 = Utility::R2ypr(Rs[0]);//R[0]
Vector3d origin_P0 = Ps[0];
if (failure_occur)
{
origin_R0 = Utility::R2ypr(last_R0);
origin_P0 = last_P0;
failure_occur = 0;
}
//pose of the first frame in the window after optimization; the Quaterniond constructor takes (w, x, y, z)
Vector3d origin_R00 = Utility::R2ypr(Quaterniond(para_Pose[0][6],
para_Pose[0][3],
para_Pose[0][4],
para_Pose[0][5]).toRotationMatrix());
double y_diff = origin_R0.x() - origin_R00.x();//yaw of (R_before_after): before minus after, i.e. the rotation that maps "after" back to "before"
//TODO: understand the Euler-angle singularity (gimbal lock when pitch is near ±90°)
Matrix3d rot_diff = Utility::ypr2R(Vector3d(y_diff, 0, 0));
if (abs(abs(origin_R0.y()) - 90) < 1.0 || abs(abs(origin_R00.y()) - 90) < 1.0)
{
ROS_DEBUG("euler singular point!");
rot_diff = Rs[0] * Quaterniond(para_Pose[0][6],
para_Pose[0][3],
para_Pose[0][4],
para_Pose[0][5]).toRotationMatrix().transpose();
}
// Correct with the pose difference so the first frame keeps its pose across the optimization (only yaw and translation are restored; roll/pitch are observable from gravity and need no fixing)
for (int i = 0; i <= WINDOW_SIZE; i++)
{
Rs[i] = rot_diff * Quaterniond(para_Pose[i][6], para_Pose[i][3], para_Pose[i][4], para_Pose[i][5]).normalized().toRotationMatrix();
Ps[i] = rot_diff * Vector3d(para_Pose[i][0] - para_Pose[0][0],
para_Pose[i][1] - para_Pose[0][1],
para_Pose[i][2] - para_Pose[0][2]) + origin_P0;
Vs[i] = rot_diff * Vector3d(para_SpeedBias[i][0],
para_SpeedBias[i][1],
para_SpeedBias[i][2]);
Bas[i] = Vector3d(para_SpeedBias[i][3],
para_SpeedBias[i][4],
para_SpeedBias[i][5]);
Bgs[i] = Vector3d(para_SpeedBias[i][6],
para_SpeedBias[i][7],
para_SpeedBias[i][8]);
}
//extrinsics
for (int i = 0; i < NUM_OF_CAM; i++)
{
tic[i] = Vector3d(para_Ex_Pose[i][0],
para_Ex_Pose[i][1],
para_Ex_Pose[i][2]);
ric[i] = Quaterniond(para_Ex_Pose[i][6],
para_Ex_Pose[i][3],
para_Ex_Pose[i][4],
para_Ex_Pose[i][5]).toRotationMatrix();
}
//write back the inverse depths and set the solve flag
VectorXd dep = f_manager.getDepthVector();
for (int i = 0; i < f_manager.getFeatureCount(); i++)
dep(i) = para_Feature[i][0];
f_manager.setDepth(dep);
//time offset
if (ESTIMATE_TD)
td = para_Td[0][0];
// relative info between the two loop frames
if(relocalization_info)
{
Matrix3d relo_r;
Vector3d relo_t;
relo_r = rot_diff * Quaterniond(relo_Pose[6], relo_Pose[3], relo_Pose[4], relo_Pose[5]).normalized().toRotationMatrix();
relo_t = rot_diff * Vector3d(relo_Pose[0] - para_Pose[0][0],
relo_Pose[1] - para_Pose[0][1],
relo_Pose[2] - para_Pose[0][2]) + origin_P0;//after fixing frame [0], adding origin_P0 brings the translation back into the world frame
//drift of the loop-closure frame before/after optimization: T_prev_now
double drift_correct_yaw;
//yaw drift of the loop-closure frame before/after optimization: prev - now = (R_prev_now).yaw
drift_correct_yaw = Utility::R2ypr(prev_relo_r).x() - Utility::R2ypr(relo_r).x();
//rotation drift R_prev_now
drift_correct_r = Utility::ypr2R(Vector3d(drift_correct_yaw, 0, 0));
//translation drift: t_prev_now = w_t_prev - R_prev_now * w_t_now
drift_correct_t = prev_relo_t - drift_correct_r * relo_t;
//relative pose between the loop-closure frame and the matched frame inside the window (T_loop_cur), possibly used for fast relocalization
//relo_r^T * ( (w)t_w_cur - (w)t_w_loop ): the translation of the matched window frame expressed in the loop-closure frame
relo_relative_t = relo_r.transpose() * (Ps[relo_frame_local_index] - relo_t);
//relo_r^T * R_w_cur = R_loop_cur
relo_relative_q = relo_r.transpose() * Rs[relo_frame_local_index];
//yaw of R_loop_cur
relo_relative_yaw = Utility::normalizeAngle(Utility::R2ypr(Rs[relo_frame_local_index]).x() - Utility::R2ypr(relo_r).x());
//cout << "vins relo " << endl;
//cout << "vins relative_t " << relo_relative_t.transpose() << endl;
//cout << "vins relative_yaw " <
relocalization_info = 0;
}
}
The marginalization part is fairly involved; it is covered separately and in detail in the marginalization post, so it is not repeated here.
Remove landmarks whose triangulation did not yield a positive depth, as sketched below.
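f_manager.removeFailures() is roughly the following (a sketch; setDepth() marks solve_flag = 2 when the solved depth is not positive):
void FeatureManager::removeFailures()
{
    for (auto it = feature.begin(), it_next = feature.begin();
         it != feature.end(); it = it_next)
    {
        it_next++;
        if (it->solve_flag == 2)   //depth came out non-positive after optimization
            feature.erase(it);
    }
}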
The rest needs no further elaboration; brief comments:
//publish, after backend optimization: 1. P, Q, V  2. the time-stamped pose (P, Q)  3. the relocalized pose
pubOdometry(estimator, header);//"odometry"
//translations of the historical keyframes (presumably the 10 big red dots)
pubKeyPoses(estimator, header);//"key_poses"
//convert to the camera pose using the extrinsics
pubCameraPose(estimator, header);//"camera_pose"
//publish the current landmarks and those about to be marginalized
pubPointCloud(estimator, header);//"history_cloud"
//publish the transforms Twb and Tic
pubTF(estimator, header);//"extrinsic" Tbc
//when the second-newest frame is a keyframe, publish its Twb and its features' 2D, normalized 3D, and camera-frame 3D coordinates
pubKeyframe(estimator);//"keyframe_point", "keyframe_pose"
//relo_relative_t and relo_relative_q computed in double2vector() give the relative pose to the loop-closure frame (T_loop_cur), used later for relocalization
if (relo_msg != NULL)
pubRelocalization(estimator);//"relo_relative_pose"
This mainly updates the IMU state [P, Q, V, ba, bg, acc_0, gyr_0].
Because the IMU rate is high, after the current measurement was taken out earlier the buffer may still hold some IMU samples; these are propagated again through predict(), and the results are stored in the globals tmp_P, tmp_Q, tmp_V, tmp_Ba, tmp_Bg, acc_0, gyr_0 (the starting values for the next propagation), as sketched below.
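Roughly, update() in estimator_node.cpp does the following (a sketch; the names are the node's globals):
//restart the high-rate prediction from the newest window state,
//then re-integrate every IMU message still sitting in the buffer
latest_time = current_time;
tmp_P  = estimator.Ps[WINDOW_SIZE];
tmp_Q  = estimator.Rs[WINDOW_SIZE];
tmp_V  = estimator.Vs[WINDOW_SIZE];
tmp_Ba = estimator.Bas[WINDOW_SIZE];
tmp_Bg = estimator.Bgs[WINDOW_SIZE];
acc_0  = estimator.acc_0;
gyr_0  = estimator.gyr_0;

queue<sensor_msgs::ImuConstPtr> tmp_imu_buf = imu_buf;
for (; !tmp_imu_buf.empty(); tmp_imu_buf.pop())
    predict(tmp_imu_buf.front());   //propagate each remaining IMU sample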
The next post focuses on marginalization.