Installing Caffe on Ubuntu 16.04 in a Conda Sandbox Environment (Python 2.7.15 + Protobuf 2.6.1 + GPU)

My current research project requires reading the Caffe source code, so I took some time to set it up. The official tutorial is based on Python 2.7; I made one stubborn attempt with Python 3.6 that failed (something to revisit later), so for now I configured Caffe with Python 2.7.15 + Protobuf 2.6.1. To keep Caffe from interfering with the other environments in Miniconda3, I create a dedicated sandbox environment for it.

The prerequisite is a working GPU (or CPU) setup. I am using Miniconda3 + CUDA 9.0 + cuDNN 7.1.4; see my earlier post for the details.
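
If you want to double-check the GPU side before starting, the following works on a typical setup (this assumes the usual CUDA location /usr/local/cuda and that the cuDNN headers were copied there; adjust the paths if your install differs):

nvcc --version                                          # should report CUDA 9.0
nvidia-smi                                              # driver and GPU should be visible
grep -A 2 CUDNN_MAJOR /usr/local/cuda/include/cudnn.h   # cuDNN version macros, e.g. 7.1.4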

1. Create the Caffe sandbox

The reason I use Protobuf 2.6.1 here is that OpenPose is not compatible with the newer protobuf compiler that ships with Miniconda3.

conda create --name caffe-py2.7.15-pr2.6.1 python=2.7.15
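
Once it finishes, a quick sanity check that the sandbox exists and carries the right interpreter:

conda env list                              # caffe-py2.7.15-pr2.6.1 should be listed
source activate caffe-py2.7.15-pr2.6.1
python --version                            # should print Python 2.7.15
source deactivate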

2. Download Caffe

Go into miniconda3/envs/caffe-py2.7.15-pr2.6.1 and run the following commands:

cd ~/miniconda3/envs/caffe-py2.7.15-pr2.6.1/
git clone https://github.com/BVLC/caffe.git

If you need Caffe at a specific commit, run:

cd caffe/
git checkout <commit-hash>

If you need to build OpenPose (as of today, 2018-11-19, the OpenPose maintainers have not yet fixed the incompatibility caused by the layers added in the latest Caffe; once that is resolved this step will no longer be necessary), the commit to use is f019d0dfe86f49d1140961f8c7dec22130c83154.

Members of our group need to run:

cd caffe/
git checkout f019d0dfe86f49d1140961f8c7dec22130c83154
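
To confirm the checkout landed on the right commit:

git log -1 --format=%H    # should print f019d0dfe86f49d1140961f8c7dec22130c83154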

3. Install dependencies

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install -y build-essential cmake git pkg-config
sudo apt-get install -y libprotobuf-dev libleveldb-dev libsnappy-dev protobuf-compiler
sudo apt-get install -y libatlas-base-dev 
sudo apt-get install -y --no-install-recommends libboost-all-dev
sudo apt-get install -y libgflags-dev libgoogle-glog-dev liblmdb-dev
sudo apt-get install -y libopencv-dev libhdf5-serial-dev
sudo apt-get install -y build-essential cmake git libgtk2.0-dev pkg-config python-dev python-numpy \
    libdc1394-22 libdc1394-22-dev libjpeg-dev libpng12-dev libtiff5-dev libjasper-dev \
    libavcodec-dev libavformat-dev libswscale-dev libxine2-dev \
    libgstreamer0.10-dev libgstreamer-plugins-base0.10-dev libv4l-dev libtbb-dev libqt4-dev \
    libfaac-dev libmp3lame-dev libopencore-amrnb-dev libopencore-amrwb-dev \
    libtheora-dev libvorbis-dev libxvidcore-dev x264 v4l-utils unzip

4. Activate the caffe environment and install opencv

source activate caffe-py2.7.15-pr2.6.1
conda install opencv
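
A quick check that the Conda OpenCV is importable in this environment:

python -c "import cv2; print cv2.__version__"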

5. Choosing the Protobuf version

Note: this step is very, very important!
The Protobuf version you build Caffe against matters (if you need a specific version of some other library, the procedure is the same). Using my machine as an example: Miniconda3 ships Protobuf 3.5.2, while the Protobuf installed via apt-get is 2.6.1.

Some open-source projects on GitHub use an older Protobuf (e.g. 2.6.1); a Caffe built against a newer Protobuf (e.g. 3.5.2) will certainly fail when you set up those projects with it. The dependency versions actually used to build Caffe are printed in the terminal after step 7.1.
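
If you want to see which protoc each source provides before deciding, the following works on my setup (adjust the Miniconda3 path if yours differs; the conda protoc only exists where the protobuf package is installed):

/usr/bin/protoc --version           # apt-get version, e.g. libprotoc 2.6.1
~/miniconda3/bin/protoc --version   # conda version, e.g. libprotoc 3.5.2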

Note: only one of 5.1 and 5.2 needs to be done. 5.3 just shows how to install and remove Protobuf, so skip it if you don't need it. Group members should follow 5.2.

5.1 Using Conda's Protobuf

As long as the Miniconda3 path is in your bashrc and you have installed the libprotobuf and protobuf packages via Conda, you can build Caffe directly. On my machine this means Caffe is built against Protobuf 3.5.2.

source activate caffe-py2.7.15-pr2.6.1
conda install protobuf
5.2 Using your own version

First make sure the current sandbox does not contain the libprotobuf and protobuf packages (activate caffe-py2.7.15-pr2.6.1, run conda list to see whether they are present, and remove them if they are). Then comment out the Miniconda3 entries in your bashrc, and you can build Caffe directly against the system libraries. If instead you want to build against some specific (non-system) version of a library, add its path to bashrc before building.
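
For the first part, a quick check of the sandbox (remove the packages only if they show up):

source activate caffe-py2.7.15-pr2.6.1
conda list | grep protobuf               # should print nothing
conda uninstall libprotobuf protobuf     # only if the previous command listed them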

vim ~/.bashrc
# comment out the Miniconda3 environment variables inside the file
source ~/.bashrc

After making the change, restart the terminal and go into the Caffe directory:

cd ~/miniconda3/envs/caffe-py2.7.15-pr2.6.1/caffe/
5.3 Installing and uninstalling Protobuf

libprotobuf and protobuf are installed and uninstalled together. Don't forget to activate the sandbox environment first, otherwise you will be operating on the base environment.

conda install protobuf
conda uninstall protobuf

6. Modify the build configuration file

cp Makefile.config.example Makefile.config

My Makefile.config is as follows:

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0
# This code is taken from https://github.com/sh1r0/caffe-android-lib
# USE_HDF5 := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#	You should not set this flag if you will be reading LMDBs with any
#	possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
# For CUDA >= 9.0, comment the *_20 and *_21 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
		-gencode arch=compute_35,code=sm_35 \
		-gencode arch=compute_50,code=sm_50 \
		-gencode arch=compute_52,code=sm_52 \
		-gencode arch=compute_60,code=sm_60 \
		-gencode arch=compute_61,code=sm_61 \
		-gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
		# /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/miniconda3/envs/caffe-py2.7.15-pr2.6.1
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
		$(ANACONDA_HOME)/include/python2.7 \
		$(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                 /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
# PYTHON_LIB := /usr/lib
PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include 
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

7. Build Caffe

7.1 cmake

Don't forget to select the Protobuf version you need (and the versions of any other libraries) correctly in step 5.

mkdir build
cd build/
cmake ..

If this step produces a warning like the following:

CMake Warning (dev) in src/caffe/CMakeLists.txt:
Policy CMP0022 is not set: INTERFACE_LINK_LIBRARIES defines the link
interface.  Run "cmake --help-policy CMP0022" for policy details.  Use the
cmake_policy command to set the policy and suppress this warning.

add the following at the top of CMakeLists.txt and run the build again:

if(POLICY CMP0022)
  cmake_policy(SET CMP0022 NEW)
endif()

Here is my dependency summary; you can see that Protobuf 2.6.1 is being used. If this step fails, the most likely cause is a missing library; the error message will tell you which one (for example, a failed import of some module), and installing that library fixes it.

Dependencies:
--   BLAS              :   Yes (Atlas)
--   Boost             :   Yes (ver. 1.58)
--   glog              :   Yes
--   gflags            :   Yes
--   protobuf          :   Yes (ver. 2.6.1)
--   lmdb              :   Yes (ver. 0.9.17)
--   LevelDB           :   Yes (ver. 1.18)
--   Snappy            :   Yes (ver. 1.1.3)
--   OpenCV            :   Yes (ver. 2.4.9.1)
--   CUDA              :   Yes (ver. 9.0)
-- 
-- NVIDIA CUDA:
--   Target GPU(s)     :   Auto
--   GPU arch(s)       :   sm_61
--   cuDNN             :   Yes (ver. 7.1.4)
-- 
-- Python:
--   Interpreter       :   /usr/bin/python2.7 (ver. 2.7.12)
--   Libraries         :   /usr/lib/x86_64-linux-gnu/libpython2.7.so (ver 2.7.12)
--   NumPy             :   /usr/lib/python2.7/dist-packages/numpy/core/include (ver 1.11.0)
7.2 make
make all -j8
make install -j8
make runtest -j8

If make runtest reports an hdf5 version mismatch at this step, for example:

Headers are 1.10.2, library is 1.8.17

then you need to install hdf5 1.10.2 (pick the version that matches your own error message):

conda install hdf5=1.10.2

Note that installing the newer hdf5 may remove opencv; if that happens, reinstall opencv afterwards.
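
For example, after the hdf5 change:

conda list opencv          # check whether opencv survived the hdf5 change
conda install opencv       # reinstall only if it was removed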

8. Add environment variables

8.1 Open bashrc
vim ~/.bashrc
8.2 Add the environment variable
export PYTHONPATH=/home/czy/miniconda3/envs/caffe-py2.7.15-pr2.6.1/caffe/python:$PYTHONPATH

Don't forget to uncomment the Miniconda3 environment variables, and comment out the environment variables of any specific library versions you added in step 5.2.

8.3 Save and exit, then reload
source ~/.bashrc
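
A quick way to confirm the path was picked up:

echo $PYTHONPATH           # the caffe/python directory should appear here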

9. Verify

9.1 Open a new terminal and activate the caffe environment
source activate caffe-py2.7.15-pr2.6.1
9.2 Install some dependency libraries
conda install cython scikit-image protobuf scikit-learn ipython pandas jupyter tqdm lxml pillow
9.3 Check the caffe version
python -c "import caffe;print caffe.__version__"

If a version number is printed, Caffe was installed successfully.
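
If you also want a quick GPU smoke test (assuming the GPU build above and GPU 0), something like this should work:

python -c "import caffe; caffe.set_device(0); caffe.set_mode_gpu(); print caffe.__version__"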

Reference

https://gist.github.com/FrancoisPl/e7375c3a08c1b73d5547709e97405253
http://caffe.berkeleyvision.org/installation.html
