/-----------------------------------
This article was first published on my Zhihu column:
https://zhuanlan.zhihu.com/QIQI-HITwh
/---------------------------------------------------------------------------------
After a two-week break, I'm back.
As the title says, this post is about the ZED 2 and ORB_SLAM2. The ZED 2 SDK offers quite a few features, including positional tracking, real-time mapping, and so on. Since it is a commercial product I cannot look at its source code, but judging from how it behaves, the ZED 2 fuses IMU and visual information internally, with the IMU data carrying the larger weight in the estimate; I will explain why I think so later on. As for ORB_SLAM2, it provides what is currently one of the better solutions for real-time localization and sparse map building, and it supports monocular, stereo, and RGB-D cameras. However, most practical SLAM applications today are about navigation and obstacle avoidance, and a sparse map alone does not meet that requirement, so I decided to make full use of the information from both to get the best result.
Contents
1 Install the ZED 2 SDK and ORB_SLAM2
2 Develop the ROS project with Qt Creator
3 Adapt ORB_SLAM2 to the ZED 2 stereo camera
4 Create a package to save the SLAM map and the 3D point cloud
5 Configure RVIZ and the URDF
Installing the ZED 2 SDK and the ZED 2 ROS package
ZED SDK 3.1 - Download
https://www.stereolabs.com/docs/installation/jetson/
https://github.com/stereolabs/zed-ros-wrapper
Installing ORB_SLAM2
If you only plan to use ORB_SLAM2 inside ROS, you just need the following package:
https://github.com/appliedAI-Initiative/orb_slam_2_ros
Otherwise, also download and build the ORB_SLAM2 source code:
https://github.com/raulmur/ORB_SLAM2
Since ros_qtc_plugin does not support arm64, we develop directly with Qt Creator.
At this point you should have downloaded the two packages above. Put them into your existing ROS workspace and build:
$ rosdep install --from-paths src --ignore-src -r -y
$ catkin_make -DCMAKE_BUILD_TYPE=Release
$ source ./devel/setup.bash
When configuring the project in Qt Creator later, pass the following CMake arguments so the build outputs go to the workspace's devel/ and install/ directories:
-DCATKIN_DEVEL_PREFIX=…/devel -DCMAKE_INSTALL_PREFIX=…/install
This assumes Qt Creator is already installed and configured; the following document walks you through it:
http://wiki.ros.org/IDEs#QtCreator
1. Edit the Qt Creator .desktop file
$ sudo vim /usr/share/applications/org.qt-project.qtcreator.desktop
2. Edit CMakeLists.txt and add the following at the top of the file, so that all files show up in the Qt Creator project tree
$ sudo vim catkin_qi/src/CMakeLists.txt
project(catkin_qi)
#Add custom (non compiling) targets so launch scripts and python files show up in QT Creator's project view.
file(GLOB_RECURSE EXTRA_FILES */*)
# 'ROS_Packages' below is just a name I chose; all the packages will show up under this target in the project tree
add_custom_target(ROS_Packages ALL WORKING_DIRECTORY ${PROJECT_SOURCE_DIR} SOURCES ${EXTRA_FILES})
3. Open Qt Creator, open the CMakeLists.txt, and wait for CMake to finish; the project tree will then be shown automatically. Try 'Build' to verify that the setup works.
orb_slam_2_ros currently only supports cameras such as the RealSense series, so we need to modify the source ourselves to adapt it to the ZED 2. There are not many files to change; I list them below:
Add your own launch file in the 'launch' directory; start by copying one of the existing 'stereo' templates. The final file looks like this:
<?xml version="1.0"?>
<launch>
<!-- launch the zed_ros_wrapper node-->
<include file="$(find zed_wrapper)/launch/zed2.launch">
</include>
<node name="orb_slam2_stereo" pkg="orb_slam2_ros"
type="orb_slam2_ros_stereo" output="screen">
<remap from="image_left/image_color_rect" to="/zed2/zed_node/left/image_rect_color" />
<remap from="image_right/image_color_rect" to="/zed2/zed_node/right/image_rect_color" />
<param name="publish_pointcloud" type="bool" value="true" />
<param name="publish_pose" type="bool" value="true" />
<param name="localize_only" type="bool" value="false" />
<param name="reset_map" type="bool" value="false" />
<!-- static parameters -->
<param name="load_map" type="bool" value="true" />
<param name="map_file" type="string" value="/home/qi/catkin_qi/src/tx2_slam/map/bin/zed2_slam_Map.bin" />
<param name="settings_file" type="string" value="$(find tx2_slam)/config/Zed2Stereo.yaml" />
<param name="voc_file" type="string" value="$(find tx2_slam)/Vocabulary/ORBvoc.txt" />
<param name="pointcloud_frame_id" type="string" value="map" />
<param name="camera_frame_id" type="string" value="camera_link" />
<param name="min_num_kf_in_map" type="int" value="5" />
</node>
<!-- Position with respect to the base frame (i.e. "base_link") -->
<arg name="cam_pos_x" default="0.0" />
<arg name="cam_pos_y" default="0.0" />
<arg name="cam_pos_z" default="0.0" />
<!-- Orientation with respect to the base frame (i.e. "base_link") -->
<arg name="cam_roll" default="0.0" />
<arg name="cam_pitch" default="0.0" />
<arg name="cam_yaw" default="0.0" />
<!-- ROS URDF description of the ZED -->
<group if="true">
<param name="zed2_description"
command="$(find xacro)/xacro '$(find tx2_slam)/urdf/zed_descr.urdf.xacro'
camera_name:=tx2zed2
camera_model:=zed2
base_frame:=zed2_base_link
cam_pos_x:=$(arg cam_pos_x)
cam_pos_y:=$(arg cam_pos_y)
cam_pos_z:=$(arg cam_pos_z)
cam_roll:=$(arg cam_roll)
cam_pitch:=$(arg cam_pitch)
cam_yaw:=$(arg cam_yaw)"
/>
<node name="zed2_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" output="screen" required="true">
<remap from="robot_description" to="zed2_description" />
</node>
</group>
<!-- Launch rviz display -->
<node name="rviz" pkg="rviz" type="rviz" args="-d $(find tx2_slam)/config/rviz_config.rviz" output="screen" />
<node name="MapBuild" pkg="tx2_slam" type="MapBuild" output="screen">
<param name="prefix" type="string" value="$(find tx2_slam)/map/pcd_files/" />
</node>
</launch>
The parts you need to change are: including the ZED 2 launch file, remapping the ROS topic names (so that the orb_slam2 node receives the left and right images from the zed2 node), and adding the other nodes.
Next, configure a .yaml settings file for the ZED 2; you can use the following as a reference:
Here fx, fy, cx, cy, k1, k2, k3, p1, p2 are the camera intrinsics and distortion-correction parameters; you can find the file that stores them in the directory where the ZED SDK is installed. The image resolution and frame rate depend on your situation, but they must stay consistent with the ZED 2's settings (how to set the ZED 2 output resolution is explained further below).
Of course, no two cameras leave the factory with exactly the same parameters, so if you can, it is better to calibrate the camera yourself with camera_calibration.
%YAML:1.0
# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 527.2025
Camera.fy: 527.245
Camera.cx: 618.31
Camera.cy: 367.1445
# get the rectified image from zed-ros-wrapper, so set all zero
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.k3: 0.0
# for more, see https://www.stereolabs.com/docs/video/camera-controls/
Camera.width: 2560
Camera.height: 720
Camera.fps: 30.0
# stereo baseline(m) times fx
Camera.bf: 63.1965
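# sanity check: 63.1965 / 527.2025 ≈ 0.12 m, i.e. the ZED 2's ~12 cm baseline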
# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
# Close/Far threshold. Baseline times.
# set it according to the actual effect
ThDepth: 35
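# e.g. with the ~0.12 m baseline this is roughly 35 x 0.12 ≈ 4.2 m; stereo points closer than this are treated as reliable 'close' points in ORB_SLAM2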
# the depth used in the program equals the real depth times DepthMapFactor
#DepthMapFactor: 1000.0
#--------------------------------------------------------------------------------------------
# Stereo Rectification. Only if you need to pre-rectify the images.
# Camera.fx, .fy, etc must be the same as in LEFT.P
#--------------------------------------------------------------------------------------------
LEFT.height: 720
LEFT.width: 2560
LEFT.D: !!opencv-matrix
rows: 1
cols: 5
dt: d
data: [0.0, 0.0, 0.0, 0.0, 0.0]
LEFT.K: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [527.2025, 0.0, 618.31, 0.0, 527.245, 367.1445, 0.0, 0.0, 1.0]
LEFT.R: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
LEFT.P: !!opencv-matrix
rows: 3
cols: 4
dt: d
data: [527.2025, 0.0, 618.31, 0.0, 0.0, 527.245, 367.1445, 0.0, 0.0, 0.0, 1.0, 0.0]
RIGHT.height: 720
RIGHT.width: 2560
RIGHT.D: !!opencv-matrix
rows: 1
cols: 5
dt: d
data: [0.0, 0.0, 0.0, 0.0, 0.0]
RIGHT.K: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [527.2025, 0.0, 618.31, 0.0, 527.245, 367.1445, 0.0, 0.0, 1.0]
RIGHT.R: !!opencv-matrix
rows: 3
cols: 3
dt: d
data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
RIGHT.P: !!opencv-matrix
rows: 3
cols: 4
dt: d
data: [527.2025, 0.0, 618.31, 0.0, 0.0, 527.245, 367.1445, 0.0, 0.0, 0.0, 1.0, 0.0]
#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------
# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1200
# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2
# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8
# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
Set the ZED 2 output resolution and frame rate. In general, a higher resolution gives better results, but you also need to keep the load in mind; then again, we should have some faith in our TX2, right? /doge
Add the following two lines to zed_wrapper_nodelet.cpp in the zed-ros-wrapper package, just before the camera is opened:
// set the ZED 2 output resolution and frame rate
mZedParams.camera_resolution = sl::RESOLUTION::HD720;
mZedParams.camera_fps = 30;
mConnStatus = mZed.open(mZedParams);
Saving the 3D point cloud
Here I simply save the point cloud from the ZED 2 node as .pcd files for later use. Create a new package with the node below. Note that it subscribes to the '/zed2/zed_node/mapping/fused_cloud' topic; run the node and take the last .pcd file it writes as the final point cloud map. Next time you only need to load that .pcd file and publish its data to rviz (a sketch of this step follows the node code below).
Below is the result viewed with 'pcl_viewer'.
#pragma once
#include <ros/ros.h>
#include <pcl/io/io.h>
#include <pcl/io/pcd_io.h>
#include <pcl/PCLPointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <tf2_ros/buffer.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_eigen/tf2_eigen.h>
class MapBuild
{
public:
~MapBuild();
MapBuild();
private:
};
class PointCloudToPCD
{
protected:
ros::NodeHandle nh_;
private:
std::string prefix_;
std::string filename_;
bool binary_;
bool compressed_;
std::string fixed_frame_;
tf2_ros::Buffer tf_buffer_;
tf2_ros::TransformListener tf_listener_;
public:
std::string cloud_topic_;
ros::Subscriber sub_;
////////////////////////////////////////////////////////////////////////////////
// Callback
void cloud_cb (const pcl::PCLPointCloud2::ConstPtr& cloud)
{
if ((cloud->width * cloud->height) == 0)
return;
ROS_INFO ("Received %d data points in frame %s with the following fields: %s",
(int)cloud->width * cloud->height,
cloud->header.frame_id.c_str (),
pcl::getFieldsList (*cloud).c_str ());
Eigen::Vector4f v = Eigen::Vector4f::Zero ();
Eigen::Quaternionf q = Eigen::Quaternionf::Identity ();
if (!fixed_frame_.empty ()) {
if (!tf_buffer_.canTransform (fixed_frame_, cloud->header.frame_id, pcl_conversions::fromPCL (cloud->header.stamp), ros::Duration (3.0))) {
ROS_WARN("Could not get transform!");
return;
}
Eigen::Affine3d transform;
transform = tf2::transformToEigen (tf_buffer_.lookupTransform (fixed_frame_, cloud->header.frame_id, pcl_conversions::fromPCL (cloud->header.stamp)));
v = Eigen::Vector4f::Zero ();
v.head<3> () = transform.translation ().cast<float> ();
q = transform.rotation ().cast<float> ();
}
std::stringstream ss;
if (filename_ != "")
{
ss << filename_ << ".pcd";
}
else
{
ss << prefix_ << cloud->header.stamp << ".pcd";
}
ROS_INFO ("Data saved to %s", ss.str ().c_str ());
pcl::PCDWriter writer;
if(binary_)
{
if(compressed_)
{
writer.writeBinaryCompressed (ss.str (), *cloud, v, q);
}
else
{
writer.writeBinary (ss.str (), *cloud, v, q);
}
}
else
{
writer.writeASCII (ss.str (), *cloud, v, q, 8);
}
}
////////////////////////////////////////////////////////////////////////////////
PointCloudToPCD () : binary_(false), compressed_(false), tf_listener_(tf_buffer_)
{
// Check if a prefix parameter is defined for output file names.
ros::NodeHandle priv_nh("~");
if (priv_nh.getParam ("prefix", prefix_))
{
ROS_INFO_STREAM ("PCD file prefix is: " << prefix_);
}
else if (nh_.getParam ("prefix", prefix_))
{
ROS_WARN_STREAM ("Non-private PCD prefix parameter is DEPRECATED: "
<< prefix_);
}
priv_nh.getParam ("fixed_frame", fixed_frame_);
priv_nh.getParam ("binary", binary_);
priv_nh.getParam ("compressed", compressed_);
priv_nh.getParam ("filename", filename_);
if(binary_)
{
if(compressed_)
{
ROS_INFO_STREAM ("Saving as binary compressed PCD");
}
else
{
ROS_INFO_STREAM ("Saving as binary uncompressed PCD");
}
}
else
{
ROS_INFO_STREAM ("Saving as ASCII PCD");
}
if (filename_ != "")
{
ROS_INFO_STREAM ("Saving to fixed filename: " << filename_);
}
cloud_topic_ = "/zed2/zed_node/mapping/fused_cloud";
sub_ = nh_.subscribe (cloud_topic_, 1, &PointCloudToPCD::cloud_cb, this);
ROS_INFO ("Listening for incoming data on topic %s",
nh_.resolveName (cloud_topic_).c_str ());
}
};
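As mentioned above, reloading the map later is just a matter of reading the saved .pcd file and publishing it to rviz. Below is a minimal sketch of that step; the file path and the 'map_cloud' topic name are placeholders, and the 'map' frame id is assumed to match pointcloud_frame_id in the launch file above.
// Minimal sketch: load a saved .pcd and publish it as a latched PointCloud2 for rviz.
#include <ros/ros.h>
#include <pcl/io/pcd_io.h>
#include <pcl/PCLPointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <sensor_msgs/PointCloud2.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "pcd_publisher");
  ros::NodeHandle nh;
  // Latched publisher: late subscribers (e.g. rviz) still receive the single message.
  ros::Publisher pub = nh.advertise<sensor_msgs::PointCloud2>("map_cloud", 1, true);

  pcl::PCLPointCloud2 cloud;
  if (pcl::io::loadPCDFile("/path/to/your_map.pcd", cloud) != 0)  // placeholder path
  {
    ROS_ERROR("Failed to load the .pcd file");
    return 1;
  }

  sensor_msgs::PointCloud2 msg;
  pcl_conversions::fromPCL(cloud, msg);
  msg.header.frame_id = "map";           // must match the Fixed Frame used in rviz
  msg.header.stamp = ros::Time::now();
  pub.publish(msg);

  ros::spin();                           // keep the node alive so the latched message stays available
  return 0;
}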
Saving the ORB_SLAM2 map
The orb_slam2_ros package already provides map saving and loading. A quick look at its source shows that you can simply create a ROS service client and call the service offered by orb_slam2_ros:
This way, next time you run in the same environment you can load the previously built map directly (remember to change the map-loading parameters in the launch file).
// Standalone client that asks orb_slam2_ros to serialize the current map.
#include <ros/ros.h>
#include <orb_slam2_ros/SaveMap.h>

int main(int argc, char** argv)
{
  ros::init (argc, argv, "MapBuild", ros::init_options::AnonymousName);
  ros::NodeHandle n;
  ros::ServiceClient client = n.serviceClient<orb_slam2_ros::SaveMap>("/orb_slam2_stereo/save_map");
  orb_slam2_ros::SaveMap srv;
  srv.request.name = "/home/qi/catkin_qi/src/tx2_slam/map/bin/zed2_slam_Map.bin";
  if (client.call(srv))
  {
    ROS_INFO("Ready to create Map");
  }
  else
  {
    ROS_ERROR("Failed to call service SaveMap");
    return 1;
  }
  return 0;
}
For the URDF, refer to the configuration in the launch file above; remember to publish the corresponding TF in your node so that the camera model moves along with the estimated pose.
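Here is a minimal sketch of such a TF publisher. It assumes the orb_slam2_ros node publishes its camera pose as a geometry_msgs/PoseStamped on /orb_slam2_stereo/pose (check the actual topic name with rostopic list) and, for simplicity, that the URDF base frame zed2_base_link sits at the tracked camera frame; adjust the names to your setup.
// Minimal sketch: republish the ORB_SLAM2 pose as a TF so the URDF model follows it in rviz.
#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/TransformStamped.h>
#include <tf2_ros/transform_broadcaster.h>

void poseCallback(const geometry_msgs::PoseStamped::ConstPtr& msg)
{
  static tf2_ros::TransformBroadcaster br;   // created on first callback, after ros::init
  geometry_msgs::TransformStamped tf_msg;
  tf_msg.header.stamp = msg->header.stamp;
  tf_msg.header.frame_id = "map";            // pointcloud_frame_id in the launch file
  tf_msg.child_frame_id = "zed2_base_link";  // URDF base_frame (assumed to coincide with the camera)
  tf_msg.transform.translation.x = msg->pose.position.x;
  tf_msg.transform.translation.y = msg->pose.position.y;
  tf_msg.transform.translation.z = msg->pose.position.z;
  tf_msg.transform.rotation = msg->pose.orientation;
  br.sendTransform(tf_msg);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "pose_to_tf");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/orb_slam2_stereo/pose", 10, poseCallback);  // assumed topic name
  ros::spin();
  return 0;
}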
For the RVIZ configuration, add the displays and subscribe to the topics that suit your own setup. Since there is a lot of it, I will only show the configuration as screenshots:
See you next time.