Robot Vision Project: Visual Detection and Recognition + Robot Following (21)

Installing the Kinect drivers on ROS Indigo:
OpenNI, SensorKinect, NITE
First install the dependencies:
sudo apt-get install git-core cmake freeglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev doxygen graphviz mono-complete
【1】Install OpenNI:
mkdir ~/kinect
cd ~/kinect
git clone https://github.com/OpenNI/OpenNI.git
cd OpenNI
git checkout unstable
cd Platform/Linux/CreateRedist/
chmod +x RedistMaker
./RedistMaker
cd ../Redist/OpenNI-Bin-Dev-Linux-x64-v1.5.8.5
sudo ./install.sh
Running ./RedistMaker will fail with an error:
/bin/sh: 1: javac: not found
make[1]: *** [../../../Bin/x64-Release/org.OpenNI.jar] Error 127
make: *** [Wrappers/OpenNI.java] Error 2
The cause is that the Java development kit and runtime are not installed; install them:
sudo apt-get install openjdk-7-jdk openjdk-7-jre
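To confirm the JDK is now available before re-running the build (a quick sanity check, not part of the original steps):
javac -version
(This should print a 1.7.x version string for OpenJDK 7.)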
Once the installation finishes, rerun ./RedistMaker and continue with the remaining OpenNI install steps.
【2】Install SensorKinect:
cd ~/kinect/
git clone https://github.com/ph4m/SensorKinect.git
cd SensorKinect
git checkout unstable
cd Platform/Linux/CreateRedist/
chmod +x RedistMaker
./RedistMaker
cd ../Redist/Sensor-Bin-Linux-x64-v5.1.2.1/
chmod +x install.sh
sudo ./install.sh
【3】Install NITE:
Since OpenNI was acquired (or something along those lines), the official site www.openni.org is no longer available, so NITE has to be downloaded ahead of time from elsewhere. I downloaded NITE-Bin-Linux-x86-v1.5.2.23.tar.zip from: pan.baidu.com/s/1gd9XdIV
cd ~/kinect
tar -xvjpf nite-bin-linux-x64-v1.5.2.23.tar.bz2
(Extracting the downloaded .zip yields this .tar.bz2 archive.)
cd NITE-Bin-Dev-Linux-x64-v1.5.2.23/Data
The Data folder contains three files: Sample-Scene.xml, Sample-Tracking.xml, and Sample-User.xml. Each of the three files needs the same modification, shown below:
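The exact before/after snippet is not reproduced above; the edit most commonly documented for NITE 1.5.2.x (stated here as an assumption, not something confirmed by this post) is to make sure the <License> element in each of the three XML files carries the public PrimeSense key:
<!-- assumed public PrimeSense key; adjust if your NITE release documents a different one -->
<License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>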

Then run:
cd ..
sudo ./install.sh

After the steps above, the Kinect drivers are installed. Run the bundled samples to verify that the installation succeeded:
cd ~/kinect/OpenNI/Platform/Linux/Bin/x64-Release
./Sample-NiSimpleSkeleton
(With the Kinect plugged into the computer and a person standing in front of it, the head coordinates should be printed.)
cd ~/kinect/NITE-Bin-Dev-Linux-x64-v1.5.2.23/Samples/Bin/x64-Release
./Sample-PointViewer
(With the Kinect plugged in and a person standing in front of it, a depth image should be displayed.)
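If neither sample shows anything, it can help to first check that the Kinect is enumerated on USB at all (a generic check, not from the original post; exact device names vary by Kinect revision):
lsusb
(For a Kinect for Xbox 360 you would typically see Microsoft Corp. Xbox NUI Motor / NUI Camera / NUI Audio entries.)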

Next, install two ROS packages:
         sudo apt-get install ros-indigo-openni-launch
         sudo apt-get install ros-indigo-openni-camera
Then install openni_tracker, the key package that provides the skeleton-tracking functionality:
         cd ~/catkin_ws/src
         git clone https://github.com/ros-drivers/openni_tracker.git
         (If git cloning on Ubuntu is too slow, download the repository archive from GitHub directly and build it instead.)
         cd ..
         catkin_make
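Once catkin_make finishes, a quick way to confirm that ROS can locate the new package (a generic check, not from the original post) is:
rospack find openni_tracker
(This should print /home/user_name/catkin_ws/src/openni_tracker; if it fails, see checks 1 and 2 below about ROS_PACKAGE_PATH and sourcing the workspace.)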
That is everything that needs to be installed. Before putting it to use, there are three things to check:
1. Check that the path containing the openni_tracker package has been added to ROS_PACKAGE_PATH; if it has not, later steps will fail with an error saying the openni_tracker package cannot be found.
    In a terminal run: echo $ROS_PACKAGE_PATH. If the path containing openni_tracker is not listed, run:
    export ROS_PACKAGE_PATH=/home/user_name/catkin_ws/src:/opt/ros/indigo/share:/opt/ros/indigo/stacks
    (i.e. take the existing value of ROS_PACKAGE_PATH and add the openni_tracker path, separated by ":")
2. source /home/user_name/catkin_ws/devel/setup.bash. This line can also be added to .bashrc so it does not have to be typed every time a new terminal is opened.
3. rosrun rqt_reconfigure rqt_reconfigure
This opens the configuration window; select driver under camera and tick depth_registration (a command-line alternative is sketched just below). The details and purpose are explained in the official tutorial: http://wiki.ros.org/openni_launch/Tutorials/QuickStart
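As an alternative to the rqt_reconfigure GUI, depth_registration can also be enabled from the command line with dynamic_reconfigure (a sketch; the /openni/driver namespace assumes the camera is launched with camera:=openni as in command (1) below, otherwise use /camera/driver):
rosrun dynamic_reconfigure dynparam set /openni/driver depth_registration true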

Once all of the above checks pass, run the following to bring the system up:
roslaunch openni_launch openni.launch camera:=openni    (1)
rosrun openni_tracker openni_tracker    (2)
rosrun rviz rviz    (3)
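Before configuring rviz, it can be worth confirming that the camera topics are actually being published (a generic check; the /openni prefix assumes camera:=openni was passed in (1)):
rostopic list | grep openni
(Once depth_registration is enabled, /openni/depth_registered/points should appear among the topics.)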

In rviz, change Global Options > Fixed Frame to openni_depth_optical_frame (if camera:=openni was not appended to command (1), only camera_depth_optical_frame will be available);
Click Add (bottom left), select PointCloud2, and confirm;
Set PointCloud2 > Topic to /openni/depth_registered/points;
Click Add again and select TF;
Adjust the viewing angle and zoom of the 3D view as needed and stand in front of the Kinect; after a short wait the skeleton frames should appear (the displayed joints look a bit cluttered because they are not connected into a full skeleton outline, which the Windows Kinect SDK does manage to do).
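To read one of the skeleton frames numerically, for example as the input to a person-following controller, the transform published by openni_tracker can be echoed directly (a sketch; the frame names assume openni_tracker's default fixed frame openni_depth_frame and the first calibrated user, torso_1):
rosrun tf tf_echo openni_depth_frame torso_1
(This prints the translation of the first tracked user's torso relative to the depth frame, which is the quantity a following behaviour would consume.)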

 
