Running ORB-SLAM in Real Time with a Daheng Camera on Linux

Platform: Ubuntu 16.04

Camera: MER-301-125U3M (other Daheng USB camera models work as well)

This article adds a Daheng camera example on top of the original ORB-SLAM source code, so that ORB-SLAM can be run in real time with a Daheng camera.

Two main changes are involved:

- Image acquisition: the original example reads local images from disk and passes them to ORB_SLAM2::System; this is replaced by frames grabbed in real time from the Daheng camera.

- Camera parameters: the original example uses the parameters that ship with the dataset; these are replaced by the actual parameters of your own camera.

1. Install the Daheng SDK

Download the Linux installer package from the Daheng Imaging website; it is named something like Galaxy_Linux-x86_Gige-U3_32bits-64bits_1.2.1911.9122.tar.gz.

Extract it, enter the Galaxy_Linux-x86_Gige-U3_32bits-64bits_1.2.1911.9122 directory,

and run ./Galaxy_camera.run in a terminal; this creates the installation directory Galaxy_camera.

Daheng header and library files:

Headers: in the ./Galaxy_camera/inc directory, containing the two files DxImageProc.h and GxIAPI.h.

Library: installed into the system library directory as /usr/lib/libgxiapi.so, so CMake can link against it directly.
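To confirm that the SDK is installed correctly before touching ORB-SLAM, a small standalone check can help. The sketch below is not part of the Daheng samples; it only uses GXInitLib, GXUpdateDeviceList and GXCloseLib from GxIAPI.h, and the file name test_gx.cpp is arbitrary. Compile it with g++ test_gx.cpp -o test_gx -lgxiapi.

#include <cstdio>
#include <cstdint>

#include "GxIAPI.h"

int main()
{
    // Initialize the Galaxy SDK
    GX_STATUS status = GXInitLib();
    if (status != GX_STATUS_SUCCESS)
    {
        printf("GXInitLib failed, status = %d\n", (int)status);
        return -1;
    }

    // Enumerate connected devices with a 1000 ms timeout
    uint32_t nDeviceNum = 0;
    status = GXUpdateDeviceList(&nDeviceNum, 1000);
    if (status != GX_STATUS_SUCCESS)
        printf("GXUpdateDeviceList failed, status = %d\n", (int)status);
    else
        printf("Found %u Daheng device(s)\n", nDeviceNum);

    // Release the SDK
    GXCloseLib();
    return 0;
}

If the program prints a device count of at least 1, the SDK and the USB connection are working.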

2. Download the ORB-SLAM2 source code

Download the source:

git clone https://github.com/raulmur/ORB_SLAM2.git ORB_SLAM2

All of the following operations use the downloaded ORB_SLAM2 path as the root directory.

First follow the build instructions on the project's GitHub page: install the dependencies and compile the source code.

Make sure the unmodified source builds without problems before moving on to the following steps.

3. Run ORB-SLAM with the Daheng camera

Based on the existing examples, we add one that runs ORB-SLAM with a Daheng camera.

3.1 Directory structure of the modified and new files

.
├── CMakeLists.txt            (modified)
├── DxImageProc.h             (new)
├── GxIAPI.h                  (new)
└── Examples
    └── Monocular
        ├── mono_daheng.cc    (new)
        └── daheng.yaml       (new)

. is the root directory ORB_SLAM2.

GxIAPI.h and DxImageProc.h are the Daheng header files, copied from the inc directory of the Daheng installation.

mono_daheng.cc is the new source file that runs ORB-SLAM with the Daheng camera.

daheng.yaml is the new parameter file for the Daheng camera.

3.2 Modify CMakeLists.txt in the root directory

Append the following to the end of the existing CMakeLists.txt:

add_executable(mono_daheng
        Examples/Monocular/mono_daheng.cc)
target_link_libraries(mono_daheng ${PROJECT_NAME} gxiapi)

3.3 daheng.yaml

Create the file daheng.yaml in the ./Examples/Monocular/ directory. It is the parameter file for the Daheng camera;

enter the content below, adjusting the camera parameters to match your own camera.

For camera calibration, you can write your own OpenCV calibration program (a minimal sketch is given right after the parameter file below),

or use an existing calibration tool, for example the camera_calibration package in ROS.

%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 2360.89
Camera.fy: 2363.79
Camera.cx: 1029.18
Camera.cy: 821.37

Camera.k1: -0.095
Camera.k2: 0.353
Camera.p1: 0.0
Camera.p2: 0.0

# Camera frames per second
Camera.fps: 20.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000

# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7

#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500
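As mentioned above, the intrinsics (Camera.fx/fy/cx/cy) and distortion coefficients (Camera.k1/k2/p1/p2) can be obtained with a small OpenCV calibration program. The following is only a minimal sketch, not the program used to produce the values above; it assumes chessboard images with hypothetical file names calib_0.png, calib_1.png, ... and a 9x6 inner-corner pattern with 25 mm squares, all of which you would replace with your own setup.

#include <cstdio>
#include <iostream>
#include <vector>

#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/highgui/highgui.hpp>

int main()
{
    const cv::Size patternSize(9, 6);   // inner corners of the chessboard (assumption)
    const float squareSize = 0.025f;    // square size in meters (assumption)
    const int numImages = 20;           // number of calibration images (assumption)

    // 3D corner positions on the chessboard plane, reused for every view
    std::vector<cv::Point3f> objectCorners;
    for (int i = 0; i < patternSize.height; i++)
        for (int j = 0; j < patternSize.width; j++)
            objectCorners.push_back(cv::Point3f(j * squareSize, i * squareSize, 0.f));

    std::vector<std::vector<cv::Point3f> > objectPoints;
    std::vector<std::vector<cv::Point2f> > imagePoints;
    cv::Size imageSize;

    for (int k = 0; k < numImages; k++)
    {
        char name[64];
        sprintf(name, "calib_%d.png", k);        // hypothetical file names
        cv::Mat img = cv::imread(name, 0);       // 0 = load as grayscale
        if (img.empty())
            continue;
        imageSize = img.size();

        // Detect and refine the chessboard corners in this view
        std::vector<cv::Point2f> corners;
        if (!cv::findChessboardCorners(img, patternSize, corners))
            continue;
        cv::cornerSubPix(img, corners, cv::Size(11, 11), cv::Size(-1, -1),
                         cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));

        objectPoints.push_back(objectCorners);
        imagePoints.push_back(corners);
    }

    // cameraMatrix holds fx, fy, cx, cy; distCoeffs holds k1, k2, p1, p2, (k3)
    cv::Mat cameraMatrix, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                     cameraMatrix, distCoeffs, rvecs, tvecs);

    printf("RMS reprojection error: %f\n", rms);
    std::cout << "cameraMatrix:\n" << cameraMatrix << std::endl;
    std::cout << "distCoeffs:\n" << distCoeffs << std::endl;
    return 0;
}

The printed values would then be copied into daheng.yaml: fx and fy are cameraMatrix(0,0) and (1,1), cx and cy are (0,2) and (1,2), and the first four entries of distCoeffs are k1, k2, p1, p2.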

3.4 mono_daheng.cc

Create the file mono_daheng.cc in the ./Examples/Monocular/ directory. It is the source file that runs ORB-SLAM with the Daheng camera;

enter the content below:

#include <iostream>
#include <algorithm>
#include <fstream>
#include <chrono>
#include <sys/time.h>

#include <opencv2/core/core.hpp>

#include <System.h>

#include "GxIAPI.h"
#include "DxImageProc.h"

using namespace std;

//Show error message
#define GX_VERIFY(emStatus) \
    if (emStatus != GX_STATUS_SUCCESS) \
    { \
        GetErrorString(emStatus); \
        return emStatus; \
    }

//Show error message, close device and lib
#define GX_VERIFY_EXIT(emStatus) \
    if (emStatus != GX_STATUS_SUCCESS) \
    { \
        GetErrorString(emStatus); \
        GXCloseDevice(hDevice); \
        hDevice = NULL; \
        GXCloseLib(); \
        printf("<App exit>\n"); \
        return emStatus; \
    }

//Get description of error
void GetErrorString(GX_STATUS);

int main(int argc, char **argv)
{
    if(argc != 3)
    {
        cerr << endl << "Usage: ./mono_daheng path_to_vocabulary path_to_settings" << endl;
        return 1;
    }

    int nImages = 1500; // use 1500 images

    // Create SLAM system. It initializes all system threads and gets ready to process frames.
    ORB_SLAM2::System SLAM(argv[1],argv[2],ORB_SLAM2::System::MONOCULAR,true);

    // Vector for tracking time statistics
    vector<float> vTimesTrack;
    vTimesTrack.resize(nImages);

    cout << endl << "-------" << endl;
    cout << "Start processing sequence ..." << endl;
    cout << "The number of images to be used: " << nImages << endl << endl;

    // daheng
    GX_STATUS status = GX_STATUS_SUCCESS;
    GX_DEV_HANDLE hDevice = NULL;
    uint32_t nDeviceNum = 0;

    //Initialize library
    status = GXInitLib();
    if (status != GX_STATUS_SUCCESS) {
        GetErrorString(status);
        GXCloseLib();
        return 0;
    }

    //Get device enumerated number
    status = GXUpdateDeviceList(&nDeviceNum, 1000);
    if (status != GX_STATUS_SUCCESS) {
        GetErrorString(status);
        GXCloseLib();
        return 0;
    }

    //If no device found, app exit
    if(nDeviceNum <= 0) {
        printf("<No device found>\n");
        GXCloseLib();
        return 0;
    }

    //Open first device enumerated
    status = GXOpenDeviceByIndex(1, &hDevice);
    if (status != GX_STATUS_SUCCESS) {
        GetErrorString(status);
        GXCloseLib();
        return 0;
    }

    //Set acquisition mode
    status = GXSetEnum(hDevice, GX_ENUM_ACQUISITION_MODE, GX_ACQ_MODE_CONTINUOUS);
    GX_VERIFY_EXIT(status);

    //Set trigger mode
    status = GXSetEnum(hDevice, GX_ENUM_TRIGGER_MODE, GX_TRIGGER_MODE_OFF);
    GX_VERIFY_EXIT(status);

    //Device start acquisition
    status = GXStreamOn(hDevice);
    if(status != GX_STATUS_SUCCESS) {
        GX_VERIFY_EXIT(status);
    }

    PGX_FRAME_BUFFER pFrameBuffer = NULL;

    // Main loop
    cv::Mat im;
    struct timeval tv;
    for(int ni=0; ni<nImages; ni++)
    {
        // Get a frame from the acquisition queue
        status = GXDQBuf(hDevice, &pFrameBuffer, 1000);
        if(status != GX_STATUS_SUCCESS) {
            if (status == GX_STATUS_TIMEOUT) {
                continue;
            }
            else {
                GetErrorString(status);
                break;
            }
        }

        if(pFrameBuffer->nStatus != GX_FRAME_STATUS_SUCCESS) {
            printf("<Abnormal acquisition, frame status: %d>\n", pFrameBuffer->nStatus);
        }

        // image
        int width = pFrameBuffer->nWidth;
        int height = pFrameBuffer->nHeight;
        // cout << width << " x " << height << endl;
        cv::Mat image(cv::Size(width, height), CV_8UC1, (void*)pFrameBuffer->pImgBuf, cv::Mat::AUTO_STEP);
        im = image.clone();

        gettimeofday(&tv, NULL);
        double tframe = (double)(tv.tv_sec) + (tv.tv_usec*1.0)/1000000.0;

        if(im.empty())
        {
            cerr << endl << "im is empty" << endl;
            continue;
        }

#ifdef COMPILEDWITHC11
        std::chrono::steady_clock::time_point t1 = std::chrono::steady_clock::now();
#else
        std::chrono::monotonic_clock::time_point t1 = std::chrono::monotonic_clock::now();
#endif

        // Pass the image to the SLAM system
        SLAM.TrackMonocular(im,tframe);

#ifdef COMPILEDWITHC11
        std::chrono::steady_clock::time_point t2 = std::chrono::steady_clock::now();
#else
        std::chrono::monotonic_clock::time_point t2 = std::chrono::monotonic_clock::now();
#endif

        double ttrack = std::chrono::duration_cast<std::chrono::duration<double> >(t2 - t1).count();

        vTimesTrack[ni]=ttrack;

        // // Wait to load the next frame (only needed when replaying a dataset with known timestamps)
        // double T=0;
        // if(ni<nImages-1)
        //     T = vTimestamps[ni+1]-tframe;
        // else if(ni>0)
        //     T = tframe-vTimestamps[ni-1];
        // if(ttrack<T)
        //     usleep((T-ttrack)*1e6);

        // Put the frame buffer back into the queue (GXQBuf) to continue grabbing images
        status = GXQBuf(hDevice, pFrameBuffer);
        if(status != GX_STATUS_SUCCESS) {
            GetErrorString(status);
            break;
        }
    }

    //Device stop acquisition
    status = GXStreamOff(hDevice);
    if(status != GX_STATUS_SUCCESS) {
        GX_VERIFY_EXIT(status);
    }

    //Close device
    status = GXCloseDevice(hDevice);
    if(status != GX_STATUS_SUCCESS) {
        GetErrorString(status);
        hDevice = NULL;
        GXCloseLib();
        return status;
    }

    //Release library
    status = GXCloseLib();
    if(status != GX_STATUS_SUCCESS) {
        GetErrorString(status);
        return status;
    }

    // Stop all threads
    SLAM.Shutdown();

    // Tracking time statistics
    sort(vTimesTrack.begin(),vTimesTrack.end());
    float totaltime = 0;
    for(int ni=0; ni<nImages; ni++)
    {
        totaltime+=vTimesTrack[ni];
    }
    cout << "-------" << endl << endl;
    cout << "median tracking time: " << vTimesTrack[nImages/2] << endl;
    cout << "mean tracking time: " << totaltime/nImages << endl;

    // Save camera trajectory
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");

    return 0;
}

//----------------------------------------------------------------------------------
/**
\brief   Get description of input error code
\param   emErrorStatus  error code
\return  void
*/
//----------------------------------------------------------------------------------
void GetErrorString(GX_STATUS emErrorStatus)
{
    char *error_info = NULL;
    size_t size = 0;
    GX_STATUS emStatus = GX_STATUS_SUCCESS;

    // Get length of error description
    emStatus = GXGetLastError(&emErrorStatus, NULL, &size);
    if(emStatus != GX_STATUS_SUCCESS)
    {
        printf("<Error when calling GXGetLastError>\n");
        return;
    }

    // Alloc error resources
    error_info = new char[size];
    if (error_info == NULL)
    {
        printf("<Failed to allocate memory>\n");
        return;
    }

    // Get error description
    emStatus = GXGetLastError(&emErrorStatus, error_info, &size);
    if (emStatus != GX_STATUS_SUCCESS)
    {
        printf("<Error when calling GXGetLastError>\n");
    }
    else
    {
        printf("%s\n", error_info);
    }

    // Release error resources
    if (error_info != NULL)
    {
        delete []error_info;
        error_info = NULL;
    }
}

3.5 Build and run

In the root directory, run the following script to rebuild the modified source code:

./build.sh

After a successful build, the executable mono_daheng is generated in the ./Examples/Monocular/ directory; this is the program that runs ORB-SLAM with the Daheng camera.

Run it with the following arguments:

./Examples/Monocular/mono_daheng Vocabulary/ORBvoc.txt Examples/Monocular/daheng.yaml

Make sure the Daheng camera is connected when running, otherwise no images can be acquired.

4. References

On using Daheng cameras under Linux, see an earlier blog post:

Capturing images with a Daheng camera on Linux
https://www.jianshu.com/p/eb490fae2cf7

On camera calibration, here is a blog post that uses the ROS camera_calibration package; I have not tried it myself, but you may want to give it a try:

Ubuntu 16.04 ROS camera calibration, pitfall-free
https://blog.csdn.net/qq_36804363/article/details/89269776
