Installing TensorRT on the Jetson TX2

Table of Contents

  • 1. The deepstream-l4t image
    • 1.1 Pulling the image
    • 1.2 Starting the container
    • 1.3 Switching apt mirrors
  • 2. Installing software
    • 2.1 Upgrading packages
    • 2.2 Installing GStreamer
    • 2.3 Installing OpenCV
  • 3. Installing TensorRT
    • 3.1 Installation reference
    • 3.2 Downloading the packages
    • 3.3 Installing CUDA
    • 3.4 Installing cuDNN
    • 3.5 Installing TensorRT
  • 4. Testing the installation

1. The deepstream-l4t image

1.1 Pulling the image

Pull the image with the following command:

docker pull nvcr.io/nvidia/deepstream-l4t:5.0.1-20.09-samples

The l4t in the image name stands for Linux for Tegra.

For descriptions of the Base, Samples, and IoT image variants, see https://ngc.nvidia.com/catalog/containers/nvidia:deepstream-l4t

1.2 Starting the container

docker run --gpus all --name=jetson_test --privileged --ipc=host -p 23222:22 -p 34389:3389 -itd -v /data/yzm_iavs/:/data/yzm_iavs nvcr.io/nvidia/deepstream-l4t:5.0.1-20.09-samples /bin/bash
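The command above maps the container's SSH (22) and RDP (3389) ports to host ports, mounts a host data directory, and detaches (`-d`). To get a shell in the running container afterwards, a minimal sketch (guarded so it only prints a message on hosts where docker or the `jetson_test` container is absent):

```shell
# Attach an interactive shell to the detached container started above.
if command -v docker >/dev/null 2>&1; then
  docker exec -it jetson_test /bin/bash || echo "container jetson_test is not running"
else
  echo "docker is not available on this host"
fi
```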

1.3 Switching apt mirrors

By editing /etc/apt/sources.list you can switch the image's apt sources to a mirror in China (Tsinghua TUNA):

deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic main restricted universe multiverse
# deb-src https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic main restricted universe multiverse
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-updates main restricted universe multiverse
# deb-src https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-updates main restricted universe multiverse
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-backports main restricted universe multiverse
# deb-src https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-backports main restricted universe multiverse
deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-security main restricted universe multiverse
# deb-src https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-security main restricted universe multiverse
deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ xenial main multiverse restricted universe # for opencv

# deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-proposed main restricted universe multiverse
# deb-src https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic-proposed main restricted universe multiverse
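Before overwriting the file, keep a backup so the stock sources can be restored. A minimal backup-then-replace sketch, demonstrated on a scratch copy (the paths and the single mirror line are stand-ins; on the device you would point `SRC` at /etc/apt/sources.list and write the full list above):

```shell
# Demonstrate backup-then-replace on a scratch copy of sources.list.
SRC=/tmp/sources_demo/sources.list
mkdir -p "$(dirname "$SRC")"
# stand-in for the original file contents
printf 'deb http://ports.ubuntu.com/ubuntu-ports/ bionic main\n' > "$SRC"
cp "$SRC" "$SRC.bak"   # keep a restorable backup
# write the mirror entry (one line shown; use the full list above on the device)
printf 'deb https://mirrors.tuna.tsinghua.edu.cn/ubuntu-ports/ bionic main restricted universe multiverse\n' > "$SRC"
grep tuna "$SRC"
```

After replacing the real file, run `apt-get update` to refresh the package indexes.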

2. Installing software

2.1 Upgrading packages

apt-get update && apt-get upgrade -y

2.2 Installing GStreamer

apt-get install libgstreamer-plugins-base1.0-dev libgstreamer1.0-dev libgstrtspserver-1.0-dev libx11-dev

2.3 Installing OpenCV

  • Build OpenCV only after CUDA and cuDNN (Section 3) are installed, since the configuration below enables WITH_CUDA=ON.
  • Install the OpenCV build dependencies:
apt-get install cmake git libgtk2.0-dev pkg-config  libavcodec-dev libavformat-dev libswscale-dev
apt-get install libtbb2  libtbb-dev libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev
add-apt-repository "deb http://security.ubuntu.com/ubuntu xenial-security main"
apt-get update
apt install libjasper1 libjasper-dev 
apt-get install qtbase5-dev qtdeclarative5-dev
  • Configure with CMake (run from a build directory inside the OpenCV source tree, with opencv_contrib checked out alongside it):
cmake -D WITH_QT=ON \
      -D WITH_CUDA=ON \
      -D BUILD_TIFF=ON \
      -D BUILD_TESTS=OFF \
      -D BUILD_PERF_TESTS=OFF \
      -D OPENCV_GENERATE_PKGCONFIG=ON \
      -D CMAKE_INSTALL_PREFIX=/usr/local \
      -D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules/ \
      -D BUILD_opencv_xfeatures2d=OFF  ..
  • Build and install:
make -j4
make install
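Because the configuration sets OPENCV_GENERATE_PKGCONFIG=ON, a quick way to confirm the install landed under /usr/local is to query pkg-config (the `opencv4` module name assumes an OpenCV 4.x source tree; the check falls back to a message where OpenCV is not installed):

```shell
# Query the installed OpenCV version via the generated opencv4.pc file.
pkg-config --modversion opencv4 2>/dev/null || echo "opencv4 not found by pkg-config"
```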

3. Installing TensorRT

3.1 Installation reference

NVIDIA's installation instructions for the 20.09 Jetson CUDA-X AI Developer Preview (signed download link, may expire):

https://developer.download.nvidia.com/assets/embedded/docs/JP_4.5_DP/20.09-Jetson-CUDA-X-AI-Developer-Preview-Installation-Instructions.pdf?Gs3vQ_4IJu3viKzX7vFd0sk11MkIJy_o3nRlQ2mp-GdTRuA4nFdxQMJ2rGcUuqUAiFBn7Hnph_TmrAzEZvj1oCVw-7k2yVfc358KOHTd4BHm54xMjkPELxMp1S2yKPvSJ26plQCKMd5940KCDhjkLjy6HtB43j6rSgqJNvCuUQJD8Eo8Ihw9jcXaO21XgrqOzvdXko4MRI9pzhc2mPURjlvkB7c

3.2 Downloading the packages

Download the package archive from https://developer.nvidia.com/20.09_Jetson_CUDA-X_AI_DP_Xavier, then extract it.

3.3 Installing CUDA

dpkg -i cuda-repo-l4t-10-2-local-10.2.89_1.0-1_arm64.deb
apt-key add /var/cuda-repo-10-2-local-10.2.89/7fa2af80.pub
apt-get -y update
apt-get -y install cuda-toolkit-10-2 libgomp1 libfreeimage-dev libopenmpi-dev
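The repo package installs the toolkit under /usr/local/cuda-10.2, which is not on PATH by default. A sketch of the usual environment setup plus an `nvcc` check (the install paths are assumed from the package version; the check only prints a message on machines without the toolkit):

```shell
# Put the CUDA 10.2 toolkit on PATH (paths assumed) and verify nvcc is reachable.
export PATH=/usr/local/cuda-10.2/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda-10.2/lib64:${LD_LIBRARY_PATH:-}
nvcc --version 2>/dev/null || echo "nvcc not found (CUDA toolkit not installed here)"
```

Adding the two `export` lines to ~/.bashrc makes the setting persistent across shells.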

3.4 Installing cuDNN

dpkg -i libcudnn8_8.0.2.39-1+cuda10.2_arm64.deb
dpkg -i libcudnn8-dev_8.0.2.39-1+cuda10.2_arm64.deb
dpkg -i libcudnn8-doc_8.0.2.39-1+cuda10.2_arm64.deb
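The libcudnn8-doc package ships the cuDNN samples; building and running mnistCUDNN is the usual smoke test. A guarded sketch (the samples path is the package default and assumes the -doc package above was installed; on other machines it just prints a message):

```shell
# Build and run the mnistCUDNN sample as a cuDNN smoke test.
SAMPLES=/usr/src/cudnn_samples_v8
if [ -d "$SAMPLES/mnistCUDNN" ]; then
  cp -r "$SAMPLES" /tmp/ &&
  cd /tmp/cudnn_samples_v8/mnistCUDNN &&
  make clean && make && ./mnistCUDNN   # expected to end with "Test passed!"
else
  echo "cuDNN samples not found; install libcudnn8-doc first"
fi
```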

3.5 Installing TensorRT

dpkg -i libnvinfer7_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvinfer-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvinfer-plugin7_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvinfer-plugin-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvonnxparsers7_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvonnxparsers-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvparsers7_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvparsers-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvinfer-bin_7.2.0-1+cuda10.2_arm64.deb
dpkg -i libnvinfer-doc_7.2.0-1+cuda10.2_all.deb
dpkg -i libnvinfer-samples_7.2.0-1+cuda10.2_all.deb
dpkg -i tensorrt_7.2.0.14-1+cuda10.2_arm64.deb
dpkg -i python-libnvinfer_7.2.0-1+cuda10.2_arm64.deb
dpkg -i python-libnvinfer-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i python3-libnvinfer_7.2.0-1+cuda10.2_arm64.deb
dpkg -i python3-libnvinfer-dev_7.2.0-1+cuda10.2_arm64.deb
dpkg -i graphsurgeon-tf_7.2.0-1+cuda10.2_arm64.deb
dpkg -i uff-converter-tf_7.2.0-1+cuda10.2_arm64.deb
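After installing the .deb packages, a quick way to confirm they registered and that the Python bindings work (guarded so it only reports, rather than fails, on machines without TensorRT):

```shell
# List installed TensorRT packages, then try importing the Python binding.
dpkg -l 2>/dev/null | grep -i tensorrt || echo "no TensorRT packages listed by dpkg"
python3 -c "import tensorrt; print(tensorrt.__version__)" 2>/dev/null \
  || echo "python3 tensorrt binding not importable"
```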

4. Testing the installation

  • Download the TensorRT demo code:
git clone https://github.com/linghu8812/tensorrt_inference.git
  • Test the yolov5 demo
    After converting the yolov5 PyTorch model to an ONNX model, build and run the test:
cd tensorrt_inference/yolov5
mkdir build && cd build
cmake ..
make -j
./yolov5_trt ../config.yaml ../samples
