caffe-MobileNet-SSD environment setup and training a model on your own dataset

caffe-MobileNet-SSD environment setup and training a model on your own dataset, reposted from https://blog.csdn.net/cs_fang_dn/article/details/78790790


**************************************************************************************************************
1. Ubuntu 16.04 environment setup
① In Ubuntu, first configure the update source and choose the aliyun mirror from the China servers.

② Download the Linux version of Anaconda2 from the official site: https://www.anaconda.com/download/#linux
Then install Anaconda2; there is not much to say about this. The last step of the installer asks whether to add it to your environment variables: answer yes. If you missed that step,
add it to the PATH yourself (edit ~/.bashrc and add export PATH="/home/gdu/anaconda2/bin:$PATH").
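A quick way to confirm that the Anaconda interpreter is the one actually being picked up (my own check, not from the original post): run python and enter

import sys
print(sys.executable)   # should print something like /home/gdu/anaconda2/bin/python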
③ Download the OpenCV source package. Baidu Cloud link: https://pan.baidu.com/s/1hsOqVGC password: sinx

④ Install CUDA; see these tutorials for details:
http://www.jianshu.com/p/5b708817f5d8?open_source=weibo_search
http://keras-cn.readthedocs.io/en/latest/for_beginners/keras_linux/
or any guide you have collected yourself. Once CUDA is installed, run nvidia-smi in a terminal; if your GPU information is shown, the installation succeeded.

**************************************************************************************************************

2. Building and installing OpenCV

sudo apt-get install build-essential
sudo apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev
sudo apt-get install python-dev python-numpy libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libjasper-dev libdc1394-22-dev

Extract opencv and opencv_contrib and put them in your home directory.

cd opencv
mkdir build
cd build
cmake -D CMAKE_BUILD_TYPE=Release -D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib/modules/ -D CMAKE_INSTALL_PREFIX=/usr/local ..
make -j8
-j8 is the number of CPU cores available for the build; here 8 cores are used.

If you hit the error Downloading ippicv_linux_20151201.tgz…
then cd ~/opencv/3rdparty/ippicv/downloads/linux-808…../
and replace ippicv_linux_20151201.tgz with the file from this link: https://pan.baidu.com/s/1c3vIgU0 password: di07

Rebuild:

make -j8
sudo make install

Edit the file /etc/ld.so.conf and add the library path to it (nano is just an editor: Ctrl+X to exit, then y and Enter to save; use whichever editor you are familiar with):
sudo nano /etc/ld.so.conf

Add:
/usr/local/lib

Update the linker cache:
sudo ldconfig

Test whether OpenCV was installed correctly:
① Test image (lena.jpg)

② Test code

#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char* argv[]) {
    const std::string window_name = "lena";
    const std::string input_pic = "lena.jpg";
    cv::Mat test_pic = cv::imread(input_pic);
    if (test_pic.empty()) {
        std::cout << "no input image" << std::endl;
        return 1;
    }
    cv::namedWindow(window_name);
    cv::imshow(window_name, test_pic);
    cv::waitKey(0);
    return 0;
}


③ Compile:
g++ -L/usr/local/lib -o test_opencv test_opencv.cpp -lopencv_core -lopencv_highgui -lopencv_imgcodecs

④ Run:
./test_opencv
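If you also plan to call OpenCV from Python (the test script in Part 4 does), a quick check from the interpreter does not hurt. A minimal sketch, assuming the cv2 Python bindings built above are importable in your environment and lena.jpg is in the current directory:

import cv2

print(cv2.__version__)         # should print the version you just built
img = cv2.imread("lena.jpg")   # same test image as the C++ example
if img is None:
    print("no input image")
else:
    print("image size:", img.shape)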

**************************************************************************************************************
3. Installing caffe-ssd
See the official installation requirements: http://caffe.berkeleyvision.org/install_apt.html
① Install the caffe dependency packages on Ubuntu 16.04:
sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev protobuf-compiler

sudo apt-get install --no-install-recommends libboost-all-dev

sudo apt-get install libatlas-base-dev

sudo apt-get install libopenblas-dev

sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev

If installing the packages above fails, it may be a DNS problem; edit /etc/resolv.conf and add nameserver 8.8.8.8

② Build the SSD branch
Assume your user directory is /home/gdu/.
In your user directory:
git clone https://github.com/weiliu89/caffe.git
cd caffe
git checkout ssd

Assume your caffe root directory ($caffe_root) is /home/gdu/caffe; keep your $caffe_root directory in mind, it is used below.

In the caffe root directory, replace the following two files (Makefile.config and Makefile).
To avoid errors, you may need to add this line to Makefile.config:
LINKFLAGS := -Wl,-rpath,$(HOME)/anaconda2/lib



Makefile.config:

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the lines after *_35 for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
		-gencode arch=compute_20,code=sm_21 \
		-gencode arch=compute_30,code=sm_30 \
		-gencode arch=compute_35,code=sm_35 \
		-gencode arch=compute_50,code=sm_50 \
		-gencode arch=compute_52,code=sm_52 \
		-gencode arch=compute_61,code=sm_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
# BLAS := atlas
BLAS := open
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#		/usr/lib/python2.7/dist-packages/numpy/core/include

LINKFLAGS := -Wl,-rpath,$(HOME)/anaconda2/lib

# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda2
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
		$(ANACONDA_HOME)/include/python2.7 \
		$(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# ANACONDA_HOME := $(HOME)/anaconda3
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#		$(ANACONDA_HOME)/include/python3.6m \
#		$(ANACONDA_HOME)/lib/python3.6/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#		/usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
# INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
# LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @


Makefile:

PROJECT := caffe

CONFIG_FILE := Makefile.config
# Explicitly check for the config file, otherwise make -k will proceed anyway.
ifeq ($(wildcard $(CONFIG_FILE)),)
$(error $(CONFIG_FILE) not found. See $(CONFIG_FILE).example.)
endif
include $(CONFIG_FILE)

BUILD_DIR_LINK := $(BUILD_DIR)
ifeq ($(RELEASE_BUILD_DIR),)
	RELEASE_BUILD_DIR := .$(BUILD_DIR)_release
endif
ifeq ($(DEBUG_BUILD_DIR),)
	DEBUG_BUILD_DIR := .$(BUILD_DIR)_debug
endif

DEBUG ?= 0
ifeq ($(DEBUG), 1)
	BUILD_DIR := $(DEBUG_BUILD_DIR)
	OTHER_BUILD_DIR := $(RELEASE_BUILD_DIR)
else
	BUILD_DIR := $(RELEASE_BUILD_DIR)
	OTHER_BUILD_DIR := $(DEBUG_BUILD_DIR)
endif

# All of the directories containing code.
SRC_DIRS := $(shell find * -type d -exec bash -c "find {} -maxdepth 1 \
	\( -name '*.cpp' -o -name '*.proto' \) | grep -q ." \; -print)

# The target shared library name
LIBRARY_NAME := $(PROJECT)
LIB_BUILD_DIR := $(BUILD_DIR)/lib
STATIC_NAME := $(LIB_BUILD_DIR)/lib$(LIBRARY_NAME).a
DYNAMIC_VERSION_MAJOR := 1
DYNAMIC_VERSION_MINOR := 0
DYNAMIC_VERSION_REVISION := 0-rc3
DYNAMIC_NAME_SHORT := lib$(LIBRARY_NAME).so
#DYNAMIC_SONAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR)
DYNAMIC_VERSIONED_NAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)
DYNAMIC_NAME := $(LIB_BUILD_DIR)/$(DYNAMIC_VERSIONED_NAME_SHORT)
COMMON_FLAGS += -DCAFFE_VERSION=$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)

##############################
# Get all source files
##############################
# CXX_SRCS are the source files excluding the test ones.
CXX_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cpp" -name "*.cpp")
# CU_SRCS are the cuda source files
CU_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cu" -name "*.cu")
# TEST_SRCS are the test source files
TEST_MAIN_SRC := src/$(PROJECT)/test/test_caffe_main.cpp
TEST_SRCS := $(shell find src/$(PROJECT) -name "test_*.cpp")
TEST_SRCS := $(filter-out $(TEST_MAIN_SRC), $(TEST_SRCS))
TEST_CU_SRCS := $(shell find src/$(PROJECT) -name "test_*.cu")
GTEST_SRC := src/gtest/gtest-all.cpp
# TOOL_SRCS are the source files for the tool binaries
TOOL_SRCS := $(shell find tools -name "*.cpp")
# EXAMPLE_SRCS are the source files for the example binaries
EXAMPLE_SRCS := $(shell find examples -name "*.cpp")
# BUILD_INCLUDE_DIR contains any generated header files we want to include.
BUILD_INCLUDE_DIR := $(BUILD_DIR)/src
# PROTO_SRCS are the protocol buffer definitions
PROTO_SRC_DIR := src/$(PROJECT)/proto
PROTO_SRCS := $(wildcard $(PROTO_SRC_DIR)/*.proto)
# PROTO_BUILD_DIR will contain the .cc and obj files generated from
# PROTO_SRCS; PROTO_BUILD_INCLUDE_DIR will contain the .h header files
PROTO_BUILD_DIR := $(BUILD_DIR)/$(PROTO_SRC_DIR)
PROTO_BUILD_INCLUDE_DIR := $(BUILD_INCLUDE_DIR)/$(PROJECT)/proto
# NONGEN_CXX_SRCS includes all source/header files except those generated
# automatically (e.g., by proto).
NONGEN_CXX_SRCS := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/$(PROJECT) \
	matlab/+$(PROJECT)/private \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh")
LINT_SCRIPT := scripts/cpp_lint.py
LINT_OUTPUT_DIR := $(BUILD_DIR)/.lint
LINT_EXT := lint.txt
LINT_OUTPUTS := $(addsuffix .$(LINT_EXT), $(addprefix $(LINT_OUTPUT_DIR)/, $(NONGEN_CXX_SRCS)))
EMPTY_LINT_REPORT := $(BUILD_DIR)/.$(LINT_EXT)
NONEMPTY_LINT_REPORT := $(BUILD_DIR)/$(LINT_EXT)
# PY$(PROJECT)_SRC is the python wrapper for $(PROJECT)
PY$(PROJECT)_SRC := python/$(PROJECT)/_$(PROJECT).cpp
PY$(PROJECT)_SO := python/$(PROJECT)/_$(PROJECT).so
PY$(PROJECT)_HXX := include/$(PROJECT)/layers/python_layer.hpp
# MAT$(PROJECT)_SRC is the mex entrance point of matlab package for $(PROJECT)
MAT$(PROJECT)_SRC := matlab/+$(PROJECT)/private/$(PROJECT)_.cpp
ifneq ($(MATLAB_DIR),)
	MAT_SO_EXT := $(shell $(MATLAB_DIR)/bin/mexext)
endif
MAT$(PROJECT)_SO := matlab/+$(PROJECT)/private/$(PROJECT)_.$(MAT_SO_EXT)

##############################
# Derive generated files
##############################
# The generated files for protocol buffers
PROTO_GEN_HEADER_SRCS := $(addprefix $(PROTO_BUILD_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_HEADER := $(addprefix $(PROTO_BUILD_INCLUDE_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_CC := $(addprefix $(BUILD_DIR)/, ${PROTO_SRCS:.proto=.pb.cc})
PY_PROTO_BUILD_DIR := python/$(PROJECT)/proto
PY_PROTO_INIT := python/$(PROJECT)/proto/__init__.py
PROTO_GEN_PY := $(foreach file,${PROTO_SRCS:.proto=_pb2.py}, \
		$(PY_PROTO_BUILD_DIR)/$(notdir $(file)))
# The objects corresponding to the source files
# These objects will be linked into the final shared library, so we
# exclude the tool, example, and test objects.
CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o})
CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o})
PROTO_OBJS := ${PROTO_GEN_CC:.cc=.o}
OBJS := $(PROTO_OBJS) $(CXX_OBJS) $(CU_OBJS)
# tool, example, and test objects
TOOL_OBJS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o})
TOOL_BUILD_DIR := $(BUILD_DIR)/tools
TEST_CXX_BUILD_DIR := $(BUILD_DIR)/src/$(PROJECT)/test
TEST_CU_BUILD_DIR := $(BUILD_DIR)/cuda/src/$(PROJECT)/test
TEST_CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o})
TEST_CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o})
TEST_OBJS := $(TEST_CXX_OBJS) $(TEST_CU_OBJS)
GTEST_OBJ := $(addprefix $(BUILD_DIR)/, ${GTEST_SRC:.cpp=.o})
EXAMPLE_OBJS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o})
# Output files for automatic dependency generation
DEPS := ${CXX_OBJS:.o=.d} ${CU_OBJS:.o=.d} ${TEST_CXX_OBJS:.o=.d} \
	${TEST_CU_OBJS:.o=.d} $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}
# tool, example, and test bins
TOOL_BINS := ${TOOL_OBJS:.o=.bin}
EXAMPLE_BINS := ${EXAMPLE_OBJS:.o=.bin}
# symlinks to tool bins without the ".bin" extension
TOOL_BIN_LINKS := ${TOOL_BINS:.bin=}
# Put the test binaries in build/test for convenience.
TEST_BIN_DIR := $(BUILD_DIR)/test
TEST_CU_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CU_OBJS),$(basename $(notdir $(obj))))))
TEST_CXX_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CXX_OBJS),$(basename $(notdir $(obj))))))
TEST_BINS := $(TEST_CXX_BINS) $(TEST_CU_BINS)
# TEST_ALL_BIN is the test binary that links caffe dynamically.
TEST_ALL_BIN := $(TEST_BIN_DIR)/test_all.testbin

##############################
# Derive compiler warning dump locations
##############################
WARNS_EXT := warnings.txt
CXX_WARNS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o.$(WARNS_EXT)})
CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o.$(WARNS_EXT)})
TOOL_WARNS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o.$(WARNS_EXT)})
EXAMPLE_WARNS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_WARNS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o.$(WARNS_EXT)})
ALL_CXX_WARNS := $(CXX_WARNS) $(TOOL_WARNS) $(EXAMPLE_WARNS) $(TEST_WARNS)
ALL_CU_WARNS := $(CU_WARNS) $(TEST_CU_WARNS)
ALL_WARNS := $(ALL_CXX_WARNS) $(ALL_CU_WARNS)
EMPTY_WARN_REPORT := $(BUILD_DIR)/.$(WARNS_EXT)
NONEMPTY_WARN_REPORT := $(BUILD_DIR)/$(WARNS_EXT)

##############################
# Derive include and lib directories
##############################
CUDA_INCLUDE_DIR := $(CUDA_DIR)/include
CUDA_LIB_DIR :=
# add /lib64 only if it exists
ifneq ("$(wildcard $(CUDA_DIR)/lib64)","")
	CUDA_LIB_DIR += $(CUDA_DIR)/lib64
endif
CUDA_LIB_DIR += $(CUDA_DIR)/lib

INCLUDE_DIRS += $(BUILD_INCLUDE_DIR) ./src ./include
ifneq ($(CPU_ONLY), 1)
	INCLUDE_DIRS += $(CUDA_INCLUDE_DIR)
	LIBRARY_DIRS += $(CUDA_LIB_DIR)
	LIBRARIES := cudart cublas curand
endif

#LIBRARIES += glog gflags protobuf boost_system boost_filesystem boost_regex m hdf5_hl hdf5
LIBRARIES += glog gflags protobuf boost_system boost_filesystem boost_regex m hdf5_serial_hl hdf5_serial opencv_core opencv_highgui opencv_imgproc opencv_imgcodecs opencv_videoio

# handle IO dependencies
USE_LEVELDB ?= 1
USE_LMDB ?= 1
USE_OPENCV ?= 1

ifeq ($(USE_LEVELDB), 1)
	LIBRARIES += leveldb snappy
endif
ifeq ($(USE_LMDB), 1)
	LIBRARIES += lmdb
endif
ifeq ($(USE_OPENCV), 1)
	LIBRARIES += opencv_core opencv_highgui opencv_imgproc

	ifeq ($(OPENCV_VERSION), 3)
		LIBRARIES += opencv_imgcodecs opencv_videoio
	endif

endif
PYTHON_LIBRARIES ?= boost_python python2.7
WARNINGS := -Wall -Wno-sign-compare

##############################
# Set build directories
##############################
DISTRIBUTE_DIR ?= distribute
DISTRIBUTE_SUBDIRS := $(DISTRIBUTE_DIR)/bin $(DISTRIBUTE_DIR)/lib
DIST_ALIASES := dist
ifneq ($(strip $(DISTRIBUTE_DIR)),distribute)
	DIST_ALIASES += distribute
endif

ALL_BUILD_DIRS := $(sort $(BUILD_DIR) $(addprefix $(BUILD_DIR)/, $(SRC_DIRS)) \
	$(addprefix $(BUILD_DIR)/cuda/, $(SRC_DIRS)) \
	$(LIB_BUILD_DIR) $(TEST_BIN_DIR) $(PY_PROTO_BUILD_DIR) $(LINT_OUTPUT_DIR) \
	$(DISTRIBUTE_SUBDIRS) $(PROTO_BUILD_INCLUDE_DIR))

##############################
# Set directory for Doxygen-generated documentation
##############################
DOXYGEN_CONFIG_FILE ?= ./.Doxyfile
# should be the same as OUTPUT_DIRECTORY in the .Doxyfile
DOXYGEN_OUTPUT_DIR ?= ./doxygen
DOXYGEN_COMMAND ?= doxygen
# All the files that might have Doxygen documentation.
DOXYGEN_SOURCES := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/ \
	matlab/ \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh" -or \
	-name "*.py" -or -name "*.m")
DOXYGEN_SOURCES += $(DOXYGEN_CONFIG_FILE)

##############################
# Configure build
##############################

# Determine platform
UNAME := $(shell uname -s)
ifeq ($(UNAME), Linux)
	LINUX := 1
else ifeq ($(UNAME), Darwin)
	OSX := 1
	OSX_MAJOR_VERSION := $(shell sw_vers -productVersion | cut -f 1 -d .)
	OSX_MINOR_VERSION := $(shell sw_vers -productVersion | cut -f 2 -d .)
endif

# Linux
ifeq ($(LINUX), 1)
	CXX ?= /usr/bin/g++
	GCCVERSION := $(shell $(CXX) -dumpversion | cut -f1,2 -d.)
	# older versions of gcc are too dumb to build boost with -Wuninitalized
	ifeq ($(shell echo | awk '{exit $(GCCVERSION) < 4.6;}'), 1)
		WARNINGS += -Wno-uninitialized
	endif
	# boost::thread is reasonably called boost_thread (compare OS X)
	# We will also explicitly add stdc++ to the link target.
	LIBRARIES += boost_thread stdc++
	VERSIONFLAGS += -Wl,-soname,$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../lib
endif

# OS X:
# clang++ instead of g++
# libstdc++ for NVCC compatibility on OS X >= 10.9 with CUDA < 7.0
ifeq ($(OSX), 1)
	CXX := /usr/bin/clang++
	ifneq ($(CPU_ONLY), 1)
		CUDA_VERSION := $(shell $(CUDA_DIR)/bin/nvcc -V | grep -o 'release [0-9.]*' | tr -d '[a-z ]')
		ifeq ($(shell echo | awk '{exit $(CUDA_VERSION) < 7.0;}'), 1)
			CXXFLAGS += -stdlib=libstdc++
			LINKFLAGS += -stdlib=libstdc++
		endif
		# clang throws this warning for cuda headers
		WARNINGS += -Wno-unneeded-internal-declaration
		# 10.11 strips DYLD_* env vars so link CUDA (rpath is available on 10.5+)
		OSX_10_OR_LATER := $(shell [ $(OSX_MAJOR_VERSION) -ge 10 ] && echo true)
		OSX_10_5_OR_LATER := $(shell [ $(OSX_MINOR_VERSION) -ge 5 ] && echo true)
		ifeq ($(OSX_10_OR_LATER),true)
			ifeq ($(OSX_10_5_OR_LATER),true)
				LDFLAGS += -Wl,-rpath,$(CUDA_LIB_DIR)
			endif
		endif
	endif
	# gtest needs to use its own tuple to not conflict with clang
	COMMON_FLAGS += -DGTEST_USE_OWN_TR1_TUPLE=1
	# boost::thread is called boost_thread-mt to mark multithreading on OS X
	LIBRARIES += boost_thread-mt
	# we need to explicitly ask for the rpath to be obeyed
	ORIGIN := @loader_path
	VERSIONFLAGS += -Wl,-install_name,@rpath/$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../../build/lib
else
	ORIGIN := \$$ORIGIN
endif

# Custom compiler
ifdef CUSTOM_CXX
	CXX := $(CUSTOM_CXX)
endif

# Static linking
ifneq (,$(findstring clang++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,-force_load $(STATIC_NAME)
else ifneq (,$(findstring g++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,--whole-archive $(STATIC_NAME) -Wl,--no-whole-archive
else
  # The following line must not be indented with a tab, since we are not inside a target
  $(error Cannot static link with the $(CXX) compiler)
endif

# Debugging
ifeq ($(DEBUG), 1)
	COMMON_FLAGS += -DDEBUG -g -O0
	NVCCFLAGS += -G
else
	COMMON_FLAGS += -DNDEBUG -O2
endif

# cuDNN acceleration configuration.
ifeq ($(USE_CUDNN), 1)
	LIBRARIES += cudnn
	COMMON_FLAGS += -DUSE_CUDNN
endif

# configure IO libraries
ifeq ($(USE_OPENCV), 1)
	COMMON_FLAGS += -DUSE_OPENCV
endif
ifeq ($(USE_LEVELDB), 1)
	COMMON_FLAGS += -DUSE_LEVELDB
endif
ifeq ($(USE_LMDB), 1)
	COMMON_FLAGS += -DUSE_LMDB
ifeq ($(ALLOW_LMDB_NOLOCK), 1)
	COMMON_FLAGS += -DALLOW_LMDB_NOLOCK
endif
endif

# CPU-only configuration
ifeq ($(CPU_ONLY), 1)
	OBJS := $(PROTO_OBJS) $(CXX_OBJS)
	TEST_OBJS := $(TEST_CXX_OBJS)
	TEST_BINS := $(TEST_CXX_BINS)
	ALL_WARNS := $(ALL_CXX_WARNS)
	TEST_FILTER := --gtest_filter="-*GPU*"
	COMMON_FLAGS += -DCPU_ONLY
endif

# Python layer support
ifeq ($(WITH_PYTHON_LAYER), 1)
	COMMON_FLAGS += -DWITH_PYTHON_LAYER
	LIBRARIES += $(PYTHON_LIBRARIES)
endif

# BLAS configuration (default = ATLAS)
BLAS ?= atlas
ifeq ($(BLAS), mkl)
	# MKL
	LIBRARIES += mkl_rt
	COMMON_FLAGS += -DUSE_MKL
	MKLROOT ?= /opt/intel/mkl
	BLAS_INCLUDE ?= $(MKLROOT)/include
	BLAS_LIB ?= $(MKLROOT)/lib $(MKLROOT)/lib/intel64
else ifeq ($(BLAS), open)
	# OpenBLAS
	LIBRARIES += openblas
else
	# ATLAS
	ifeq ($(LINUX), 1)
		ifeq ($(BLAS), atlas)
			# Linux simply has cblas and atlas
			LIBRARIES += cblas atlas
		endif
	else ifeq ($(OSX), 1)
		# OS X packages atlas as the vecLib framework
		LIBRARIES += cblas
		# 10.10 has accelerate while 10.9 has veclib
		XCODE_CLT_VER := $(shell pkgutil --pkg-info=com.apple.pkg.CLTools_Executables | grep 'version' | sed 's/[^0-9]*\([0-9]\).*/\1/')
		XCODE_CLT_GEQ_7 := $(shell [ $(XCODE_CLT_VER) -gt 6 ] && echo 1)
		XCODE_CLT_GEQ_6 := $(shell [ $(XCODE_CLT_VER) -gt 5 ] && echo 1)
		ifeq ($(XCODE_CLT_GEQ_7), 1)
			BLAS_INCLUDE ?= /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/$(shell ls /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/ | sort | tail -1)/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/Headers
		else ifeq ($(XCODE_CLT_GEQ_6), 1)
			BLAS_INCLUDE ?= /System/Library/Frameworks/Accelerate.framework/Versions/Current/Frameworks/vecLib.framework/Headers/
			LDFLAGS += -framework Accelerate
		else
			BLAS_INCLUDE ?= /System/Library/Frameworks/vecLib.framework/Versions/Current/Headers/
			LDFLAGS += -framework vecLib
		endif
	endif
endif
INCLUDE_DIRS += $(BLAS_INCLUDE)
LIBRARY_DIRS += $(BLAS_LIB)

LIBRARY_DIRS += $(LIB_BUILD_DIR)

# Automatic dependency generation (nvcc is handled separately)
CXXFLAGS += -MMD -MP

# Complete build flags.
COMMON_FLAGS += $(foreach includedir,$(INCLUDE_DIRS),-isystem $(includedir))
CXXFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)
#NVCCFLAGS += -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
NVCCFLAGS += -D_FORCE_INLINES -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
# mex may invoke an older gcc that is too liberal with -Wuninitalized
MATLAB_CXXFLAGS := $(CXXFLAGS) -Wno-uninitialized
LINKFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)

USE_PKG_CONFIG ?= 0
ifeq ($(USE_PKG_CONFIG), 1)
	PKG_CONFIG := $(shell pkg-config opencv --libs)
else
	PKG_CONFIG :=
endif
LDFLAGS += $(foreach librarydir,$(LIBRARY_DIRS),-L$(librarydir)) $(PKG_CONFIG) \
		$(foreach library,$(LIBRARIES),-l$(library))
PYTHON_LDFLAGS := $(LDFLAGS) $(foreach library,$(PYTHON_LIBRARIES),-l$(library))

# 'superclean' target recursively* deletes all files ending with an extension
# in $(SUPERCLEAN_EXTS) below. This may be useful if you've built older
# versions of Caffe that do not place all generated files in a location known
# to the 'clean' target.
#
# 'supercleanlist' will list the files to be deleted by make superclean.
#
# * Recursive with the exception that symbolic links are never followed, per the
# default behavior of 'find'.
SUPERCLEAN_EXTS := .so .a .o .bin .testbin .pb.cc .pb.h _pb2.py .cuo

# Set the sub-targets of the 'everything' target.
EVERYTHING_TARGETS := all py$(PROJECT) test warn lint
# Only build matcaffe as part of "everything" if MATLAB_DIR is specified.
ifneq ($(MATLAB_DIR),)
	EVERYTHING_TARGETS += mat$(PROJECT)
endif

##############################
# Define build targets
##############################
.PHONY: all lib test clean docs linecount lint lintclean tools examples $(DIST_ALIASES) \
	py mat py$(PROJECT) mat$(PROJECT) proto runtest \
	superclean supercleanlist supercleanfiles warn everything

all: lib tools examples

lib: $(STATIC_NAME) $(DYNAMIC_NAME)

everything: $(EVERYTHING_TARGETS)

linecount:
	cloc --read-lang-def=$(PROJECT).cloc \
		src/$(PROJECT) include/$(PROJECT) tools examples \
		python matlab

lint: $(EMPTY_LINT_REPORT)

lintclean:
	@ $(RM) -r $(LINT_OUTPUT_DIR) $(EMPTY_LINT_REPORT) $(NONEMPTY_LINT_REPORT)

docs: $(DOXYGEN_OUTPUT_DIR)
	@ cd ./docs ; ln -sfn ../$(DOXYGEN_OUTPUT_DIR)/html doxygen

$(DOXYGEN_OUTPUT_DIR): $(DOXYGEN_CONFIG_FILE) $(DOXYGEN_SOURCES)
	$(DOXYGEN_COMMAND) $(DOXYGEN_CONFIG_FILE)

$(EMPTY_LINT_REPORT): $(LINT_OUTPUTS) | $(BUILD_DIR)
	@ cat $(LINT_OUTPUTS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_LINT_REPORT); \
		echo "Found one or more lint errors."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_LINT_REPORT); \
	  echo "No lint errors!";

$(LINT_OUTPUTS): $(LINT_OUTPUT_DIR)/%.lint.txt : % $(LINT_SCRIPT) | $(LINT_OUTPUT_DIR)
	@ mkdir -p $(dir $@)
	@ python $(LINT_SCRIPT) $< 2>&1 \
		| grep -v "^Done processing " \
		| grep -v "^Total errors found: 0" \
		> $@ \
		|| true

test: $(TEST_ALL_BIN) $(TEST_ALL_DYNLINK_BIN) $(TEST_BINS)

tools: $(TOOL_BINS) $(TOOL_BIN_LINKS)

examples: $(EXAMPLE_BINS)

py$(PROJECT): py

py: $(PY$(PROJECT)_SO) $(PROTO_GEN_PY)

$(PY$(PROJECT)_SO): $(PY$(PROJECT)_SRC) $(PY$(PROJECT)_HXX) | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) -shared -o $@ $(PY$(PROJECT)_SRC) \
		-o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(PYTHON_LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../build/lib

mat$(PROJECT): mat

mat: $(MAT$(PROJECT)_SO)

$(MAT$(PROJECT)_SO): $(MAT$(PROJECT)_SRC) $(STATIC_NAME)
	@ if [ -z "$(MATLAB_DIR)" ]; then \
		echo "MATLAB_DIR must be specified in $(CONFIG_FILE)" \
			"to build mat$(PROJECT)."; \
		exit 1; \
	fi
	@ echo MEX $<
	$(Q)$(MATLAB_DIR)/bin/mex $(MAT$(PROJECT)_SRC) \
			CXX="$(CXX)" \
			CXXFLAGS="\$$CXXFLAGS $(MATLAB_CXXFLAGS)" \
			CXXLIBS="\$$CXXLIBS $(STATIC_LINK_COMMAND) $(LDFLAGS)" -output $@
	@ if [ -f "$(PROJECT)_.d" ]; then \
		mv -f $(PROJECT)_.d $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}; \
	fi

runtest: $(TEST_ALL_BIN)
	$(TOOL_BUILD_DIR)/caffe
	$(TEST_ALL_BIN) $(TEST_GPUID) --gtest_shuffle $(TEST_FILTER)

pytest: py
	cd python; python -m unittest discover -s caffe/test

mattest: mat
	cd matlab; $(MATLAB_DIR)/bin/matlab -nodisplay -r 'caffe.run_tests(), exit()'

warn: $(EMPTY_WARN_REPORT)

$(EMPTY_WARN_REPORT): $(ALL_WARNS) | $(BUILD_DIR)
	@ cat $(ALL_WARNS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_WARN_REPORT); \
		echo "Compiler produced one or more warnings."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_WARN_REPORT); \
	  echo "No compiler warnings!";

$(ALL_WARNS): %.o.$(WARNS_EXT) : %.o

$(BUILD_DIR_LINK): $(BUILD_DIR)/.linked

# Create a target ".linked" in this BUILD_DIR to tell Make that the "build" link
# is currently correct, then delete the one in the OTHER_BUILD_DIR in case it
# exists and $(DEBUG) is toggled later.
$(BUILD_DIR)/.linked:
	@ mkdir -p $(BUILD_DIR)
	@ $(RM) $(OTHER_BUILD_DIR)/.linked
	@ $(RM) -r $(BUILD_DIR_LINK)
	@ ln -s $(BUILD_DIR) $(BUILD_DIR_LINK)
	@ touch $@

$(ALL_BUILD_DIRS): | $(BUILD_DIR_LINK)
	@ mkdir -p $@

$(DYNAMIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo LD -o $@
	$(Q)$(CXX) -shared -o $@ $(OBJS) $(VERSIONFLAGS) $(LINKFLAGS) $(LDFLAGS)
	@ cd $(BUILD_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT); ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)

$(STATIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo AR -o $@
	$(Q)ar rcs $@ $(OBJS)

$(BUILD_DIR)/%.o: %.cpp | $(ALL_BUILD_DIRS)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(PROTO_BUILD_DIR)/%.pb.o: $(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_GEN_HEADER) \
		| $(PROTO_BUILD_DIR)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(BUILD_DIR)/cuda/%.o: %.cu | $(ALL_BUILD_DIRS)
	@ echo NVCC $<
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -M $< -o ${@:.o=.d} \
		-odir $(@D)
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -c $< -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(TEST_ALL_BIN): $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		| $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CU_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CU_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CXX_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CXX_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

# Target for extension-less symlinks to tool binaries with extension '*.bin'.
$(TOOL_BUILD_DIR)/%: $(TOOL_BUILD_DIR)/%.bin | $(TOOL_BUILD_DIR)
	@ $(RM) $@
	@ ln -s $(notdir $<) $@

$(TOOL_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../lib

$(EXAMPLE_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../lib

proto: $(PROTO_GEN_CC) $(PROTO_GEN_HEADER)

$(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_BUILD_DIR)/%.pb.h : \
		$(PROTO_SRC_DIR)/%.proto | $(PROTO_BUILD_DIR)
	@ echo PROTOC $<
	$(Q)protoc --proto_path=$(PROTO_SRC_DIR) --cpp_out=$(PROTO_BUILD_DIR) $<

$(PY_PROTO_BUILD_DIR)/%_pb2.py : $(PROTO_SRC_DIR)/%.proto \
		$(PY_PROTO_INIT) | $(PY_PROTO_BUILD_DIR)
	@ echo PROTOC \(python\) $<
	$(Q)protoc --proto_path=$(PROTO_SRC_DIR) --python_out=$(PY_PROTO_BUILD_DIR) $<

$(PY_PROTO_INIT): | $(PY_PROTO_BUILD_DIR)
	touch $(PY_PROTO_INIT)

clean:
	@- $(RM) -rf $(ALL_BUILD_DIRS)
	@- $(RM) -rf $(OTHER_BUILD_DIR)
	@- $(RM) -rf $(BUILD_DIR_LINK)
	@- $(RM) -rf $(DISTRIBUTE_DIR)
	@- $(RM) $(PY$(PROJECT)_SO)
	@- $(RM) $(MAT$(PROJECT)_SO)

supercleanfiles:
	$(eval SUPERCLEAN_FILES := $(strip \
			$(foreach ext,$(SUPERCLEAN_EXTS), $(shell find . -name '*$(ext)' \
			-not -path './data/*'))))

supercleanlist: supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
	fi

superclean: clean supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo "Deleting the following generated files:"; \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
		$(RM) $(SUPERCLEAN_FILES); \
	fi

$(DIST_ALIASES): $(DISTRIBUTE_DIR)

$(DISTRIBUTE_DIR): all py | $(DISTRIBUTE_SUBDIRS)
	# add proto
	cp -r src/caffe/proto $(DISTRIBUTE_DIR)/
	# add include
	cp -r include $(DISTRIBUTE_DIR)/
	mkdir -p $(DISTRIBUTE_DIR)/include/caffe/proto
	cp $(PROTO_GEN_HEADER_SRCS) $(DISTRIBUTE_DIR)/include/caffe/proto
	# add tool and example binaries
	cp $(TOOL_BINS) $(DISTRIBUTE_DIR)/bin
	cp $(EXAMPLE_BINS) $(DISTRIBUTE_DIR)/bin
	# add libraries
	cp $(STATIC_NAME) $(DISTRIBUTE_DIR)/lib
	install -m 644 $(DYNAMIC_NAME) $(DISTRIBUTE_DIR)/lib
	cd $(DISTRIBUTE_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT); ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)
	# add python - it's not the standard way, indeed...
	cp -r python $(DISTRIBUTE_DIR)/python

-include $(DEPS)




In Makefile.config, ANACONDA_HOME := $(HOME)/anaconda2
is the path where Anaconda was installed.

Open a terminal in the $caffe_root directory and run the following commands:
make -j8
make py
make test -j8

If no errors are reported, you are almost there.
PS: -j8 is the number of CPU cores your machine can use for the build.

Edit the ~/.bashrc file and add the following environment variable:
export PYTHONPATH=/home/gdu/caffe/python:$PYTHONPATH
where /home/gdu/caffe/python is the python directory under your $caffe_root.
Then reload the environment:
source ~/.bashrc

Open a terminal and enter
python

Once inside the Python interpreter, enter
import caffe
If no error is reported, then congratulations: your caffe-ssd setup is complete.
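As a slightly fuller sanity check than a bare import, the short sketch below (mine, not from the original post) prints where the caffe module was loaded from and switches to GPU mode, which helps catch both a wrong PYTHONPATH and a broken CUDA build:

import caffe

print("caffe loaded from:", caffe.__file__)  # should point into /home/gdu/caffe/python
caffe.set_device(0)    # GPU id 0
caffe.set_mode_gpu()   # will fail around here if the GPU build or driver is broken
print("GPU mode OK")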

**************************************************************************************************************
4. Configuring and running MobileNet-SSD
If you want to train MobileNet-SSD on your own dataset, you may also need to read this additional reference: http://www.cnblogs.com/EstherLjy/p/6863890.html ; skip the steps you have already done.

MobileNet-SSD repository: https://github.com/chuanqi305/MobileNet-SSD
The steps from the repository are as follows:
Run
1. Download SSD source code and compile (follow the SSD README).
2. Download the pretrained deploy weights from the link above.
3. Put all the files in SSD_HOME/examples/
4. Run demo.py to show the detection result.
To explain:
Step 1 means configuring SSD, which was done above.
Step 2 means downloading the pretrained model; the download links are attached at the end of this post.
Step 3 means putting the MobileNet-SSD code under SSD's examples directory, i.e. $caffe_root/examples/.
Step 4 is simply running demo.py.

Train your own dataset
The repository README tells you to create the lmdb data first and link it into the current directory with symlinks; below I describe the way I train on my own data.
1) Convert your own dataset to lmdb database (follow the SSD README), and create symlinks to current directory.
ln -s PATH_TO_YOUR_TRAIN_LMDB trainval_lmdb
ln -s PATH_TO_YOUR_TEST_LMDB test_lmdb
That is the brief official instruction; in fact you do not have to do it this way. Just use absolute paths instead, which are set in step 3) below.
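Before training, it saves time to confirm that the two LMDBs exist and are not empty. A small sketch of mine, assuming the lmdb Python package is installed (pip install lmdb) and the paths match the ones written into the templates in step 3):

import lmdb

for db_path in ("/home/gdu/caffe/examples/my_test/my_test_trainval_lmdb",
                "/home/gdu/caffe/examples/my_test/my_test_test_lmdb"):
    env = lmdb.open(db_path, readonly=True, lock=False)   # open read-only, no lock file needed
    with env.begin() as txn:
        print(db_path, "->", txn.stat()["entries"], "entries")
    env.close()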
2) Create the labelmap.prototxt file and put it into current directory.
This file was generated when setting up SSD; mine is named labelmap_voc_my_test.prototxt and lives in /home/gdu/caffe/data/my_test/.

3) Use gen_model.sh to generate your own training prototxt.
Run (note: this should be executed from the MobileNet-SSD directory, i.e. $caffe_root/examples/MobileNet-SSD):
./gen_model.sh 13
Here 13 is the number of classes, including one background class; if you have 20 classes of your own, pass 21.
This generates MobileNetSSD_deploy.prototxt, MobileNetSSD_test.prototxt and MobileNetSSD_train.prototxt under example/.
Three template files need editing: MobileNetSSD_train_template.prototxt, MobileNetSSD_test_template.prototxt and MobileNetSSD_deploy_template.prototxt.
They live in /home/gdu/caffe/examples/MobileNet-SSD/template.
In MobileNetSSD_train_template.prototxt, change (approximately):
line 49: source: "/home/gdu/caffe/examples/my_test/my_test_trainval_lmdb/"
line 136: label_map_file: "/home/gdu/caffe/data/my_test/labelmap_voc_my_test.prototxt"

In MobileNetSSD_test_template.prototxt, change (approximately):
line 24: source: "/home/gdu/caffe/examples/my_test/my_test_test_lmdb"
line 31: label_map_file: "/home/gdu/caffe/data/my_test/labelmap_voc_my_test.prototxt"

MobileNetSSD_deploy_template.prototxt:
no changes needed for now.
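To double-check the number you pass to ./gen_model.sh above, you can count the entries in your labelmap (the background class is one of them). A small sketch of mine, assuming the labelmap path used in the templates:

# Count the classes (background included) for ./gen_model.sh <num_classes>
labelmap = "/home/gdu/caffe/data/my_test/labelmap_voc_my_test.prototxt"

with open(labelmap) as f:
    num_classes = sum(1 for line in f if line.strip().startswith("label:"))

print("pass this number to gen_model.sh:", num_classes)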


4) Download the training weights from the link above, and run train.sh; after about 30000 iterations, the loss should be 1.5 - 2.5.
Download the weights file mobilenet_iter_73000.caffemodel, link: https://pan.baidu.com/s/1gfIoVi7 password: 7yu5
You can adjust the parameters in solver_train.prototxt, e.g. max_iter is the maximum number of iterations (originally 120000) and snapshot is how many iterations between saved snapshots (originally 8000). A programmatic alternative is sketched below.
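If you would rather not edit solver_train.prototxt by hand, the same parameters can be changed with caffe's protobuf classes. A sketch under the assumption that PYTHONPATH already contains $caffe_root/python and that it is run from the MobileNet-SSD directory:

from caffe.proto import caffe_pb2
from google.protobuf import text_format

# Parse the existing solver definition.
solver = caffe_pb2.SolverParameter()
with open("solver_train.prototxt") as f:
    text_format.Merge(f.read(), solver)

solver.max_iter = 30000   # originally 120000
solver.snapshot = 2000    # originally 8000: save a snapshot every 2000 iterations

# Write the modified definition back.
with open("solver_train.prototxt", "w") as f:
    f.write(text_format.MessageToString(solver))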
Run (again, from the MobileNet-SSD directory, i.e. $caffe_root/examples/MobileNet-SSD):
./train.sh

5) Run test.sh to evaluate the result.
This evaluation step can be skipped. If you do run it, a few things need changing:
line 3: latest=$(ls -t snapshot/mobilenet_iter_2000.caffemodel | head -n 1)
where mobilenet_iter_2000.caffemodel is the snapshot saved by the training in step 4);

line 7: /home/gdu/caffe/build/tools/caffe train -solver="solver_test.prototxt"
it must use the solver_test.prototxt file.

6) Run merge_bn.py to generate your own deploy caffemodel.
Run this in the $caffe_root/examples/MobileNet-SSD directory:
python merge_bn.py
In merge_bn.py:
train_proto = 'example/MobileNetSSD_train.prototxt'  # generated in step 3)
train_model = 'snapshot/mobilenet_iter_2000.caffemodel'  # should be your snapshot caffemodel, saved by training in step 4)
deploy_proto = 'MobileNetSSD_deploy.prototxt'  # generated in step 3)
save_model = 'MobileNetSSD_deploy_my_test_2000.caffemodel'  # the merged model file

In the end this produces the model for your own dataset: MobileNetSSD_deploy.prototxt is the network definition and MobileNetSSD_deploy_my_test_2000.caffemodel is the weights file.

Testing the trained model
The Python code is as follows:

import numpy as np
import sys, os
import cv2

caffe_root = "/home/gdu/caffe/"
import sys
# sys.path.insert(0, caffe_root + 'python')
sys.path.append(caffe_root + 'python')
import caffe

# net_file = "model/MobileNetSSD_deploy.prototxt"
# caffe_model = "model/MobileNetSSD_deploy.caffemodel"
# net_file = "example/MobileNetSSD_deploy100*100.prototxt"
# caffe_model = "result_model/MobileNetSSD_deploy_my_test_100*100_2000.caffemodel"
net_file = "model/MobileNetSSD_deploy.prototxt"
caffe_model = "model/MobileNetSSD_deploy.caffemodel"
test_dir = "/home/gdu/caffe/examples/MobileNet-SSD/images"

if not os.path.exists(caffe_model):
    print("MobileNetSSD_deploy.caffemodel does not exist,")
    print("use merge_bn.py to generate it.")
    exit()

net = caffe.Net(net_file, caffe_model, caffe.TEST)

CLASSES = ('background',
           'aeroplane', 'bicycle', 'bird', 'boat',
           'bottle', 'bus', 'car', 'cat', 'chair',
           'cow', 'diningtable', 'dog', 'horse',
           'motorbike', 'person', 'pottedplant',
           'sheep', 'sofa', 'train', 'tvmonitor')
# CLASSES = ('background',
#            'bicycle', 'boat',
#            'bus', 'car', 'cat',
#            'cow', 'dog', 'horse',
#            'motorbike', 'person',
#            'sheep', 'train')
# CLASSES = ('background',
#            'person_v', 'person_p')

def preprocess(src):
    img = cv2.resize(src, (300, 300))
    img = img - 127.5
    img = img * 0.007843
    return img

def postprocess(img, out):
    h = img.shape[0]
    w = img.shape[1]
    box = out['detection_out'][0, 0, :, 3:7] * np.array([w, h, w, h])
    cls = out['detection_out'][0, 0, :, 1]
    conf = out['detection_out'][0, 0, :, 2]
    return (box.astype(np.int32), conf, cls)

def detect(imgfile):
    origimg = cv2.imread(imgfile)
    img = preprocess(origimg)
    img = img.astype(np.float32)
    img = img.transpose((2, 0, 1))

    net.blobs['data'].data[...] = img
    out = net.forward()
    box, conf, cls = postprocess(origimg, out)

    for i in range(len(box)):
        p1 = (box[i][0], box[i][1])
        p2 = (box[i][2], box[i][3])
        cv2.rectangle(origimg, p1, p2, (0, 255, 0))
        p3 = (max(p1[0], 15), max(p1[1], 15))
        title = "%s:%.2f" % (CLASSES[int(cls[i])], conf[i])
        cv2.putText(origimg, title, p3, cv2.FONT_ITALIC, 0.6, (0, 255, 0), 1)
    cv2.imshow("SSD", origimg)

    k = cv2.waitKey(0) & 0xff
    # Exit if ESC pressed
    if k == 27:
        return False
    return True

for f in os.listdir(test_dir):
    print(test_dir + "/" + f + "\n")
    if detect(test_dir + "/" + f) == False:
        break



**************************************************************************************************************
5. Errors
1. With OpenCV 3.3, caffe's make may fail with this error:
collect2: error: ld returned 1 exit status
Makefile:560: recipe for target '.build_release/tools/upgrade_net_proto_text.bin' failed
make: *** [.build_release/tools/upgrade_net_proto_text.bin] Error 1
Fix it as described here: https://stackoverflow.com/questions/31962975/caffe-install-on-ubuntu-for-anaconda-with-python-2-7-fails-with-libpng16-so-16-n

2. If you hit the Python caffe error: No module named google.protobuf.internal
follow this guide:
http://www.jianshu.com/p/1e405b9fe973

3. If you previously installed caffe via cmake and make install
that installation method left files in system directories while caffe itself lives elsewhere, so remove them:

http://www.cnblogs.com/darkknightzh/p/5864715.html
delete /usr/include/caffe and the caffe libraries under /usr/lib

https://github.com/BVLC/caffe/issues/3396
sudo rm -rf /usr/local/lib/libcaffe*

4. src/caffe/layers/hdf5_output_layer.cpp:3:18: fatal error: hdf5.h: No such file or directory. Compilation terminated.
Makefile:572: recipe for target '.build_release/src/caffe/layers/hdf5_output_layer.o' failed
make: *** [.build_release/src/caffe/layers/hdf5_output_layer.o] Error 1
make: *** Waiting for unfinished jobs....
In file included from src/caffe/util/hdf5.cpp:1:0:
./include/caffe/util/hdf5.hpp:7:18: fatal error: hdf5.h: No such file or directory. Compilation terminated.
Makefile:572: recipe for target '.build_release/src/caffe/util/hdf5.o' failed
make: *** [.build_release/src/caffe/util/hdf5.o] Error 1
src/caffe/net.cpp:8:18: fatal error: hdf5.h: No such file or directory

After installing libhdf5-serial-dev you need to add its location to your config file so the build can find it. I am on Ubuntu 16, so I modified Makefile.config as follows:
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial

LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

5. /home/yali/anaconda2/lib/libpng16.so.16: undefined reference to 'inflateValidate@ZLIB_1.2.9'
sudo ln -s /home/yali/anaconda2/lib/libpng16.so.16 libpng16.so.16 (this does not work)
The correct fix:
add the following line to Makefile.config:
LINKFLAGS := -Wl,-rpath,$(HOME)/anaconda2/lib
Reference: http://blog.csdn.net/ruotianxia/article/details/78437464

**************************************************************************************************************
Related files:
VGG_ILSVRC_16_layers_fc_reduced.caffemodel: link https://pan.baidu.com/s/1kVEb5H1 password: 2vet
MobileNetSSD_deploy.prototxt: link https://pan.baidu.com/s/1dE3OghV password: pc9w
MobileNetSSD_deploy.caffemodel: link https://pan.baidu.com/s/1kV3mhwj password: 728b
mobilenet_iter_73000.caffemodel: link https://pan.baidu.com/s/1gfIoVi7 password: 7yu5





