The last step errored out again. This is for team study and discussion only, not a polished experience post.
What is OpenVINO?
Intel’s OpenVINO is an acceleration library for optimized computing with Intel’s hardware portfolio.
OpenVINO supports Intel CPUs, GPUs, FPGAs, and VPUs.
Deep learning libraries you've come to rely upon, such as TensorFlow, Caffe, and MXNet, are supported by OpenVINO.
Intel has even optimized OpenCV’s DNN module to support its hardware for deep learning.
In fact, many newer smart cameras use Intel’s hardware along with the OpenVINO toolkit. OpenVINO is edge computing and IoT at its finest — it enables resource-constrained devices like the Raspberry Pi to work with the Movidius coprocessor to perform deep learning at speeds that are useful for real-world applications.
$ sudo apt-get update && sudo apt-get upgrade
$ sudo apt-get install build-essential cmake unzip pkg-config
Next, let's install a selection of image and video I/O libraries; these are key to working with image and video files:
$ sudo apt-get install libjpeg-dev libpng-dev libtiff-dev
$ sudo apt-get install libavcodec-dev libavformat-dev libswscale-dev libv4l-dev
$ sudo apt-get install libxvidcore-dev libx264-dev
From there, let’s install GTK, our GUI backend:
$ sudo apt-get install libgtk-3-dev
And now let’s install a package which may help to reduce GTK warnings:
$ sudo apt-get install libcanberra-gtk*
The asterisk is a shell wildcard that pulls in every matching libcanberra GTK package.
Now we need two packages which contain numerical optimizations for OpenCV:
$ sudo apt-get install libatlas-base-dev gfortran
And finally, let’s install the Python 3 development headers:
$ sudo apt-get install python3-dev
Once you have all of these prerequisites installed you can move on to the next step.
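Before moving on, a quick way to confirm the toolchain is in place (a sanity check, not part of the original steps):
$ cmake --version      # any recent 3.x release is fine
$ pkg-config --version
$ python3 --version    # the python3-dev headers installed above match this interpreter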
Download the Intel® Distribution of OpenVINO™ toolkit package file from Intel® Distribution of OpenVINO™ toolkit for Linux*. Select the Intel® Distribution of OpenVINO™ toolkit for Linux package from the dropdown menu.
Open a terminal and change to the directory where you downloaded the package; in this walkthrough that is the ~/software directory:
cd ~/software/
By default, the file is saved as l_openvino_toolkit_p_<version>.tgz. Unpack the .tgz file:
tar -xvzf l_openvino_toolkit_p_<version>.tgz
The files are unpacked to the l_openvino_toolkit_p_<version> directory. Go to that directory:
cd l_openvino_toolkit_p_<version>
If you have a previous version of the Intel Distribution of OpenVINO toolkit installed, rename or delete these two directories:
~/inference_engine_samples_build
~/openvino_models
Choose one of the installation options below and run the related script.
Option 1, GUI installation wizard:
sudo ./install_GUI.sh
Option 2, command-line installation:
sudo ./install.sh
Option 3, command-line silent installation (first accept the EULA in silent.cfg, then run the installer):
sudo sed -i 's/decline/accept/g' silent.cfg
sudo ./install.sh -s silent.cfg
You can select which OpenVINO components will be installed by modifying the COMPONENTS parameter in the silent.cfg file. For example, to install only the CPU runtime for the Inference Engine, set COMPONENTS=intel-openvino-ie-rt-cpu__x86_64 in silent.cfg. To get a full list of available components, run ./install.sh --list_components from the unpacked OpenVINO™ toolkit package.
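For reference, the relevant lines of silent.cfg after those edits would look something like this (a sketch; the real file contains more settings):
# EULA flipped from decline to accept by the sed command above
ACCEPT_EULA=accept
# Install only the CPU runtime for the Inference Engine
COMPONENTS=intel-openvino-ie-rt-cpu__x86_64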
The default installation directory is:
For root or administrator: /opt/intel/openvino_<version>/
For regular users: /home/<user>/intel/openvino_<version>/
For simplicity, a symbolic link to the latest installation is also created: /opt/intel/openvino/ (pointing here at /opt/intel/openvino_2020.4.287/).
NOTE: If there is an OpenVINO™ toolkit version previously installed on your system, the installer will use the same destination directory for subsequent installations. If you want to install a newer version to a different directory, you need to uninstall the previously installed version first.
NOTE: The Intel® Media SDK component is always installed in the /opt/intel/mediasdk directory regardless of the OpenVINO installation path chosen.
The first core components are installed. Continue to the next section to install additional dependencies.
NOTE: If you installed the Intel® Distribution of OpenVINO™ to the non-default install directory, replace /opt/intel with the directory in which you installed the software.
These dependencies are required for:
the Intel-optimized build of the OpenCV library
the Deep Learning Inference Engine
the Deep Learning Model Optimizer tools
Change to the install_dependencies directory:
cd /opt/intel/openvino_2020.4.287/install_dependencies
Run the script to download and install the external software dependencies:
sudo -E ./install_openvino_dependencies.sh
The dependencies are installed. Continue to the next section to set your environment variables.
PS: If this error appears at this point:
ModuleNotFoundError: No module named 'apt_pkg'
Fix:
sudo apt-get remove --purge python3-apt
sudo apt-get install python3-apt -f
cd /usr/lib/python3/dist-packages/
sudo cp apt_pkg.cpython-36m-x86_64-linux-gnu.so apt_pkg.cpython-37m-x86_64-linux-gnu.so
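Note that the cp command above assumes the module was built for Python 3.6 and is being aliased for Python 3.7. Before copying, it is worth confirming which ABI build actually exists (a quick check, not part of the original fix):
ls /usr/lib/python3/dist-packages/apt_pkg.cpython-*.so
python3 -c "import apt_pkg"    # no output means the import now succeeds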
You must update several environment variables before you can compile and run OpenVINO™ applications. Run the following script to temporarily set your environment variables:
source /opt/intel/openvino_2020.4.287/bin/setupvars.sh
To make it easier the next time you start a shell, it is worth doing this step as well:
Optional: the OpenVINO environment variables are removed when you close the shell. You can permanently set them as follows:
Open the .bashrc file in your home directory:
vi ~/.bashrc
Add this line to the end of the file:
source /opt/intel/openvino_2020.4.287/bin/setupvars.sh
Save and close the file: press the Esc key and type :wq.
To test your change, open a new terminal. You will see:
[setupvars.sh] OpenVINO environment initialized.
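A quick way to confirm the variables are set in the current shell (a sanity check, not from the official guide; the exact value depends on your install path):
echo $INTEL_OPENVINO_DIR             # should print /opt/intel/openvino_2020.4.287
echo $LD_LIBRARY_PATH | tr ':' '\n' | grep -i openvino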
The environment variables are set. Continue to the next section to configure the Model Optimizer.
Configure the Model Optimizer
The Model Optimizer is a Python*-based command line tool for importing trained models from popular deep learning frameworks such as Caffe*, TensorFlow*, Apache MXNet*, ONNX* and Kaldi*.
The Model Optimizer is a key component of the Intel Distribution of OpenVINO toolkit. You cannot perform inference on your trained model without running the model through the Model Optimizer. When you run a pre-trained model through the Model Optimizer, your output is an Intermediate Representation (IR) of the network. The Intermediate Representation is a pair of files that describe the whole model:
.xml: Describes the network topology
.bin: Contains the weights and biases binary data
You can choose to either configure all supported frameworks at once OR configure one framework at a time. Choose the option that best suits your needs. If you see error messages, make sure you installed all dependencies.
NOTE: Since the TensorFlow framework is not officially supported on CentOS*, the Model Optimizer for TensorFlow cannot be configured and run on those systems.
IMPORTANT: Internet access is required to execute the following steps successfully. If you can access the Internet only through a proxy server, make sure the proxy is configured in your OS environment.
For simplicity, I went with the first option.
Option 1: Configure all supported frameworks at the same time
cd /opt/intel/openvino_2020.4.287/deployment_tools/model_optimizer/install_prerequisites
sudo ./install_prerequisites.sh
PS: If this error shows up here:
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;
__file__='/tmp/pip-build-u18wjxip/onnx/setup.py';
f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-cigm8pw8-record/install-record.txt --single-version-externally-managed --compile"
failed with error code 1 in /tmp/pip-build-u18wjxip/onnx/
Fix:
sudo apt-get remove protobuf-compiler
sudo apt-get install protobuf-compiler
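After reinstalling protobuf-compiler, rerun install_prerequisites.sh. A quick way to confirm the onnx package built this time (a sanity check, not part of the official steps):
protoc --version
python3 -c "import onnx; print(onnx.__version__)"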
Option 2: Configure each framework separately
Configure individual frameworks separately ONLY if you did not select Option 1 above.
cd /opt/intel/openvino_2020.4.287/deployment_tools/model_optimizer/install_prerequisites
For Caffe:
sudo ./install_prerequisites_caffe.sh
For TensorFlow 1.x:
sudo ./install_prerequisites_tf.sh
For TensorFlow 2.x:
sudo ./install_prerequisites_tf2.sh
For MXNet:
sudo ./install_prerequisites_mxnet.sh
For ONNX:
sudo ./install_prerequisites_onnx.sh
For Kaldi:
sudo ./install_prerequisites_kaldi.sh
The Model Optimizer is configured for one or more frameworks.
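As a concrete example of what the configured Model Optimizer does, here is a hypothetical conversion of an ONNX model to IR (~/models/model.onnx and the output directory are placeholders, not files from this guide):
cd /opt/intel/openvino_2020.4.287/deployment_tools/model_optimizer
python3 mo.py --input_model ~/models/model.onnx --output_dir ~/models/ir/
# On success this writes model.xml (topology) and model.bin (weights) into ~/models/ir/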
You have completed all required installation, configuration and build steps in this guide to use your CPU to work with your trained models.
To enable inference on other hardware, see the corresponding sections of the original guide, or proceed to Get Started to run the code samples and demo applications.
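Assuming the default 2020.4 layout used throughout this guide, the bundled demo script makes a convenient smoke test of the CPU path (a sketch; check the script name in your deployment_tools/demo directory):
cd /opt/intel/openvino_2020.4.287/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh    # downloads SqueezeNet, converts it to IR, runs a classification sample on CPU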
Since I bought a Neural Compute Stick 2, only the NCS2 part of the tutorial is excerpted below. If you need the instructions for other hardware, go straight to the original page (the second link in the resources at the end).
These steps are only required if you want to perform inference on an Intel® Movidius™ NCS powered by the Intel® Movidius™ Myriad™ 2 VPU or an Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X VPU. See also the Get Started page for Intel® Neural Compute Stick 2.
Add the current Linux user to the users group:
sudo usermod -a -G users "$(whoami)"
Log out and log in for it to take effect.
Install the USB rules:
sudo cp /opt/intel/openvino_2020.4.287/inference_engine/external/97-myriad-usbboot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules
sudo udevadm trigger
sudo ldconfig
NOTE: You may need to reboot your machine for this to take effect.
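Before running anything on the stick, you can verify that it enumerates on the USB bus (a quick check, not from the official guide; Movidius devices normally report vendor ID 03e7):
lsusb | grep -i 03e7    # expect one entry for the Myriad X VPU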
You've completed all required configuration steps to perform inference on the Intel® Neural Compute Stick 2. Proceed to Get Started to run code samples and demo applications.
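Assuming the demo script mentioned in the CPU section above, the same smoke test can target the stick by passing a device flag (a sketch):
cd /opt/intel/openvino_2020.4.287/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh -d MYRIAD    # runs the same demo on the Neural Compute Stick 2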
Now you are ready to get started. For the uninstallation steps and the remaining resources, go straight to the original page (the second link in the resources at the end).