Setting up the caffe-MobileNet-SSD environment and training a model on your own dataset. Reposted from https://blog.csdn.net/cs_fang_dn/article/details/78790790
#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char* argv[]) {
    const std::string window_name = "lena";
    const std::string input_pic = "lena.jpg";
    cv::Mat test_pic = cv::imread(input_pic);
    if (test_pic.empty()) {
        std::cout << "no input image" << std::endl;
        return 1;
    }
    cv::namedWindow(window_name);
    cv::imshow(window_name, test_pic);
    cv::waitKey(0);
    return 0;
}
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the lines after *_35 for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
		-gencode arch=compute_20,code=sm_21 \
		-gencode arch=compute_30,code=sm_30 \
		-gencode arch=compute_35,code=sm_35 \
		-gencode arch=compute_50,code=sm_50 \
		-gencode arch=compute_52,code=sm_52 \
		-gencode arch=compute_61,code=sm_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
# BLAS := atlas
BLAS := open
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#		/usr/lib/python2.7/dist-packages/numpy/core/include

LINKFLAGS := -Wl,-rpath,$(HOME)/anaconda2/lib
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda2
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
		$(ANACONDA_HOME)/include/python2.7 \
		$(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# ANACONDA_HOME := $(HOME)/anaconda3
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#		$(ANACONDA_HOME)/include/python3.6m \
#		$(ANACONDA_HOME)/lib/python3.6/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#		/usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
# INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
# LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

PROJECT := caffe

CONFIG_FILE := Makefile.config
# Explicitly check for the config file, otherwise make -k will proceed anyway.
ifeq ($(wildcard $(CONFIG_FILE)),)
$(error $(CONFIG_FILE) not found. See $(CONFIG_FILE).example.)
endif
include $(CONFIG_FILE)

BUILD_DIR_LINK := $(BUILD_DIR)
ifeq ($(RELEASE_BUILD_DIR),)
	RELEASE_BUILD_DIR := .$(BUILD_DIR)_release
endif
ifeq ($(DEBUG_BUILD_DIR),)
	DEBUG_BUILD_DIR := .$(BUILD_DIR)_debug
endif

DEBUG ?= 0
ifeq ($(DEBUG), 1)
	BUILD_DIR := $(DEBUG_BUILD_DIR)
	OTHER_BUILD_DIR := $(RELEASE_BUILD_DIR)
else
	BUILD_DIR := $(RELEASE_BUILD_DIR)
	OTHER_BUILD_DIR := $(DEBUG_BUILD_DIR)
endif

# All of the directories containing code.
SRC_DIRS := $(shell find * -type d -exec bash -c "find {} -maxdepth 1 \
	\( -name '*.cpp' -o -name '*.proto' \) | grep -q ." \; -print)

# The target shared library name
LIBRARY_NAME := $(PROJECT)
LIB_BUILD_DIR := $(BUILD_DIR)/lib
STATIC_NAME := $(LIB_BUILD_DIR)/lib$(LIBRARY_NAME).a
DYNAMIC_VERSION_MAJOR := 1
DYNAMIC_VERSION_MINOR := 0
DYNAMIC_VERSION_REVISION := 0-rc3
DYNAMIC_NAME_SHORT := lib$(LIBRARY_NAME).so
#DYNAMIC_SONAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR)
DYNAMIC_VERSIONED_NAME_SHORT := $(DYNAMIC_NAME_SHORT).$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)
DYNAMIC_NAME := $(LIB_BUILD_DIR)/$(DYNAMIC_VERSIONED_NAME_SHORT)
COMMON_FLAGS += -DCAFFE_VERSION=$(DYNAMIC_VERSION_MAJOR).$(DYNAMIC_VERSION_MINOR).$(DYNAMIC_VERSION_REVISION)

##############################
# Get all source files
##############################
# CXX_SRCS are the source files excluding the test ones.
CXX_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cpp" -name "*.cpp")
# CU_SRCS are the cuda source files
CU_SRCS := $(shell find src/$(PROJECT) ! -name "test_*.cu" -name "*.cu")
# TEST_SRCS are the test source files
TEST_MAIN_SRC := src/$(PROJECT)/test/test_caffe_main.cpp
TEST_SRCS := $(shell find src/$(PROJECT) -name "test_*.cpp")
TEST_SRCS := $(filter-out $(TEST_MAIN_SRC), $(TEST_SRCS))
TEST_CU_SRCS := $(shell find src/$(PROJECT) -name "test_*.cu")
GTEST_SRC := src/gtest/gtest-all.cpp
# TOOL_SRCS are the source files for the tool binaries
TOOL_SRCS := $(shell find tools -name "*.cpp")
# EXAMPLE_SRCS are the source files for the example binaries
EXAMPLE_SRCS := $(shell find examples -name "*.cpp")
# BUILD_INCLUDE_DIR contains any generated header files we want to include.
BUILD_INCLUDE_DIR := $(BUILD_DIR)/src
# PROTO_SRCS are the protocol buffer definitions
PROTO_SRC_DIR := src/$(PROJECT)/proto
PROTO_SRCS := $(wildcard $(PROTO_SRC_DIR)/*.proto)
# PROTO_BUILD_DIR will contain the .cc and obj files generated from
# PROTO_SRCS; PROTO_BUILD_INCLUDE_DIR will contain the .h header files
PROTO_BUILD_DIR := $(BUILD_DIR)/$(PROTO_SRC_DIR)
PROTO_BUILD_INCLUDE_DIR := $(BUILD_INCLUDE_DIR)/$(PROJECT)/proto
# NONGEN_CXX_SRCS includes all source/header files except those generated
# automatically (e.g., by proto).
NONGEN_CXX_SRCS := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/$(PROJECT) \
	matlab/+$(PROJECT)/private \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh")
LINT_SCRIPT := scripts/cpp_lint.py
LINT_OUTPUT_DIR := $(BUILD_DIR)/.lint
LINT_EXT := lint.txt
LINT_OUTPUTS := $(addsuffix .$(LINT_EXT), $(addprefix $(LINT_OUTPUT_DIR)/, $(NONGEN_CXX_SRCS)))
EMPTY_LINT_REPORT := $(BUILD_DIR)/.$(LINT_EXT)
NONEMPTY_LINT_REPORT := $(BUILD_DIR)/$(LINT_EXT)
# PY$(PROJECT)_SRC is the python wrapper for $(PROJECT)
PY$(PROJECT)_SRC := python/$(PROJECT)/_$(PROJECT).cpp
PY$(PROJECT)_SO := python/$(PROJECT)/_$(PROJECT).so
PY$(PROJECT)_HXX := include/$(PROJECT)/layers/python_layer.hpp
# MAT$(PROJECT)_SRC is the mex entrance point of matlab package for $(PROJECT)
MAT$(PROJECT)_SRC := matlab/+$(PROJECT)/private/$(PROJECT)_.cpp
ifneq ($(MATLAB_DIR),)
	MAT_SO_EXT := $(shell $(MATLAB_DIR)/bin/mexext)
endif
MAT$(PROJECT)_SO := matlab/+$(PROJECT)/private/$(PROJECT)_.$(MAT_SO_EXT)

##############################
# Derive generated files
##############################
# The generated files for protocol buffers
PROTO_GEN_HEADER_SRCS := $(addprefix $(PROTO_BUILD_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_HEADER := $(addprefix $(PROTO_BUILD_INCLUDE_DIR)/, \
		$(notdir ${PROTO_SRCS:.proto=.pb.h}))
PROTO_GEN_CC := $(addprefix $(BUILD_DIR)/, ${PROTO_SRCS:.proto=.pb.cc})
PY_PROTO_BUILD_DIR := python/$(PROJECT)/proto
PY_PROTO_INIT := python/$(PROJECT)/proto/__init__.py
PROTO_GEN_PY := $(foreach file,${PROTO_SRCS:.proto=_pb2.py}, \
		$(PY_PROTO_BUILD_DIR)/$(notdir $(file)))
# The objects corresponding to the source files
# These objects will be linked into the final shared library, so we
# exclude the tool, example, and test objects.
CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o})
CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o})
PROTO_OBJS := ${PROTO_GEN_CC:.cc=.o}
OBJS := $(PROTO_OBJS) $(CXX_OBJS) $(CU_OBJS)
# tool, example, and test objects
TOOL_OBJS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o})
TOOL_BUILD_DIR := $(BUILD_DIR)/tools
TEST_CXX_BUILD_DIR := $(BUILD_DIR)/src/$(PROJECT)/test
TEST_CU_BUILD_DIR := $(BUILD_DIR)/cuda/src/$(PROJECT)/test
TEST_CXX_OBJS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o})
TEST_CU_OBJS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o})
TEST_OBJS := $(TEST_CXX_OBJS) $(TEST_CU_OBJS)
GTEST_OBJ := $(addprefix $(BUILD_DIR)/, ${GTEST_SRC:.cpp=.o})
EXAMPLE_OBJS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o})
# Output files for automatic dependency generation
DEPS := ${CXX_OBJS:.o=.d} ${CU_OBJS:.o=.d} ${TEST_CXX_OBJS:.o=.d} \
	${TEST_CU_OBJS:.o=.d} $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}
# tool, example, and test bins
TOOL_BINS := ${TOOL_OBJS:.o=.bin}
EXAMPLE_BINS := ${EXAMPLE_OBJS:.o=.bin}
# symlinks to tool bins without the ".bin" extension
TOOL_BIN_LINKS := ${TOOL_BINS:.bin=}
# Put the test binaries in build/test for convenience.
TEST_BIN_DIR := $(BUILD_DIR)/test
TEST_CU_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CU_OBJS),$(basename $(notdir $(obj))))))
TEST_CXX_BINS := $(addsuffix .testbin,$(addprefix $(TEST_BIN_DIR)/, \
		$(foreach obj,$(TEST_CXX_OBJS),$(basename $(notdir $(obj))))))
TEST_BINS := $(TEST_CXX_BINS) $(TEST_CU_BINS)
# TEST_ALL_BIN is the test binary that links caffe dynamically.
TEST_ALL_BIN := $(TEST_BIN_DIR)/test_all.testbin

##############################
# Derive compiler warning dump locations
##############################
WARNS_EXT := warnings.txt
CXX_WARNS := $(addprefix $(BUILD_DIR)/, ${CXX_SRCS:.cpp=.o.$(WARNS_EXT)})
CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${CU_SRCS:.cu=.o.$(WARNS_EXT)})
TOOL_WARNS := $(addprefix $(BUILD_DIR)/, ${TOOL_SRCS:.cpp=.o.$(WARNS_EXT)})
EXAMPLE_WARNS := $(addprefix $(BUILD_DIR)/, ${EXAMPLE_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_WARNS := $(addprefix $(BUILD_DIR)/, ${TEST_SRCS:.cpp=.o.$(WARNS_EXT)})
TEST_CU_WARNS := $(addprefix $(BUILD_DIR)/cuda/, ${TEST_CU_SRCS:.cu=.o.$(WARNS_EXT)})
ALL_CXX_WARNS := $(CXX_WARNS) $(TOOL_WARNS) $(EXAMPLE_WARNS) $(TEST_WARNS)
ALL_CU_WARNS := $(CU_WARNS) $(TEST_CU_WARNS)
ALL_WARNS := $(ALL_CXX_WARNS) $(ALL_CU_WARNS)

EMPTY_WARN_REPORT := $(BUILD_DIR)/.$(WARNS_EXT)
NONEMPTY_WARN_REPORT := $(BUILD_DIR)/$(WARNS_EXT)

##############################
# Derive include and lib directories
##############################
CUDA_INCLUDE_DIR := $(CUDA_DIR)/include

CUDA_LIB_DIR :=
# add <cuda>/lib64 only if it exists
ifneq ("$(wildcard $(CUDA_DIR)/lib64)","")
	CUDA_LIB_DIR += $(CUDA_DIR)/lib64
endif
CUDA_LIB_DIR += $(CUDA_DIR)/lib

INCLUDE_DIRS += $(BUILD_INCLUDE_DIR) ./src ./include
ifneq ($(CPU_ONLY), 1)
	INCLUDE_DIRS += $(CUDA_INCLUDE_DIR)
	LIBRARY_DIRS += $(CUDA_LIB_DIR)
	LIBRARIES := cudart cublas curand
endif

#LIBRARIES += glog gflags protobuf boost_system boost_filesystem boost_regex m hdf5_hl hdf5
LIBRARIES += glog gflags protobuf boost_system boost_filesystem boost_regex m hdf5_serial_hl hdf5_serial opencv_core opencv_highgui opencv_imgproc opencv_imgcodecs opencv_videoio
# handle IO dependencies
USE_LEVELDB ?= 1
USE_LMDB ?= 1
USE_OPENCV ?= 1

ifeq ($(USE_LEVELDB), 1)
	LIBRARIES += leveldb snappy
endif
ifeq ($(USE_LMDB), 1)
	LIBRARIES += lmdb
endif
ifeq ($(USE_OPENCV), 1)
	LIBRARIES += opencv_core opencv_highgui opencv_imgproc

	ifeq ($(OPENCV_VERSION), 3)
		LIBRARIES += opencv_imgcodecs opencv_videoio
	endif

endif
PYTHON_LIBRARIES ?= boost_python python2.7
WARNINGS := -Wall -Wno-sign-compare

##############################
# Set build directories
##############################

DISTRIBUTE_DIR ?= distribute
DISTRIBUTE_SUBDIRS := $(DISTRIBUTE_DIR)/bin $(DISTRIBUTE_DIR)/lib
DIST_ALIASES := dist
ifneq ($(strip $(DISTRIBUTE_DIR)),distribute)
	DIST_ALIASES += distribute
endif

ALL_BUILD_DIRS := $(sort $(BUILD_DIR) $(addprefix $(BUILD_DIR)/, $(SRC_DIRS)) \
	$(addprefix $(BUILD_DIR)/cuda/, $(SRC_DIRS)) \
	$(LIB_BUILD_DIR) $(TEST_BIN_DIR) $(PY_PROTO_BUILD_DIR) $(LINT_OUTPUT_DIR) \
	$(DISTRIBUTE_SUBDIRS) $(PROTO_BUILD_INCLUDE_DIR))

##############################
# Set directory for Doxygen-generated documentation
##############################
DOXYGEN_CONFIG_FILE ?= ./.Doxyfile
# should be the same as OUTPUT_DIRECTORY in the .Doxyfile
DOXYGEN_OUTPUT_DIR ?= ./doxygen
DOXYGEN_COMMAND ?= doxygen
# All the files that might have Doxygen documentation.
DOXYGEN_SOURCES := $(shell find \
	src/$(PROJECT) \
	include/$(PROJECT) \
	python/ \
	matlab/ \
	examples \
	tools \
	-name "*.cpp" -or -name "*.hpp" -or -name "*.cu" -or -name "*.cuh" -or \
	-name "*.py" -or -name "*.m")
DOXYGEN_SOURCES += $(DOXYGEN_CONFIG_FILE)

##############################
# Configure build
##############################

# Determine platform
UNAME := $(shell uname -s)
ifeq ($(UNAME), Linux)
	LINUX := 1
else ifeq ($(UNAME), Darwin)
	OSX := 1
	OSX_MAJOR_VERSION := $(shell sw_vers -productVersion | cut -f 1 -d .)
	OSX_MINOR_VERSION := $(shell sw_vers -productVersion | cut -f 2 -d .)
endif

# Linux
ifeq ($(LINUX), 1)
	CXX ?= /usr/bin/g++
	GCCVERSION := $(shell $(CXX) -dumpversion | cut -f1,2 -d.)
	# older versions of gcc are too dumb to build boost with -Wuninitalized
	ifeq ($(shell echo | awk '{exit $(GCCVERSION) < 4.6;}'), 1)
		WARNINGS += -Wno-uninitialized
	endif
	# boost::thread is reasonably called boost_thread (compare OS X)
	# We will also explicitly add stdc++ to the link target.
	LIBRARIES += boost_thread stdc++
	VERSIONFLAGS += -Wl,-soname,$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../lib
endif

# OS X:
# clang++ instead of g++
# libstdc++ for NVCC compatibility on OS X >= 10.9 with CUDA < 7.0
ifeq ($(OSX), 1)
	CXX := /usr/bin/clang++
	ifneq ($(CPU_ONLY), 1)
		CUDA_VERSION := $(shell $(CUDA_DIR)/bin/nvcc -V | grep -o 'release [0-9.]*' | tr -d '[a-z ]')
		ifeq ($(shell echo | awk '{exit $(CUDA_VERSION) < 7.0;}'), 1)
			CXXFLAGS += -stdlib=libstdc++
			LINKFLAGS += -stdlib=libstdc++
		endif
		# clang throws this warning for cuda headers
		WARNINGS += -Wno-unneeded-internal-declaration
		# 10.11 strips DYLD_* env vars so link CUDA (rpath is available on 10.5+)
		OSX_10_OR_LATER := $(shell [ $(OSX_MAJOR_VERSION) -ge 10 ] && echo true)
		OSX_10_5_OR_LATER := $(shell [ $(OSX_MINOR_VERSION) -ge 5 ] && echo true)
		ifeq ($(OSX_10_OR_LATER),true)
			ifeq ($(OSX_10_5_OR_LATER),true)
				LDFLAGS += -Wl,-rpath,$(CUDA_LIB_DIR)
			endif
		endif
	endif
	# gtest needs to use its own tuple to not conflict with clang
	COMMON_FLAGS += -DGTEST_USE_OWN_TR1_TUPLE=1
	# boost::thread is called boost_thread-mt to mark multithreading on OS X
	LIBRARIES += boost_thread-mt
	# we need to explicitly ask for the rpath to be obeyed
	ORIGIN := @loader_path
	VERSIONFLAGS += -Wl,-install_name,@rpath/$(DYNAMIC_VERSIONED_NAME_SHORT) -Wl,-rpath,$(ORIGIN)/../../build/lib
else
	ORIGIN := \$$ORIGIN
endif

# Custom compiler
ifdef CUSTOM_CXX
	CXX := $(CUSTOM_CXX)
endif

# Static linking
ifneq (,$(findstring clang++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,-force_load $(STATIC_NAME)
else ifneq (,$(findstring g++,$(CXX)))
	STATIC_LINK_COMMAND := -Wl,--whole-archive $(STATIC_NAME) -Wl,--no-whole-archive
else
  # The following line must not be indented with a tab, since we are not inside a target
  $(error Cannot static link with the $(CXX) compiler)
endif

# Debugging
ifeq ($(DEBUG), 1)
	COMMON_FLAGS += -DDEBUG -g -O0
	NVCCFLAGS += -G
else
	COMMON_FLAGS += -DNDEBUG -O2
endif

# cuDNN acceleration configuration.
ifeq ($(USE_CUDNN), 1)
	LIBRARIES += cudnn
	COMMON_FLAGS += -DUSE_CUDNN
endif

# configure IO libraries
ifeq ($(USE_OPENCV), 1)
	COMMON_FLAGS += -DUSE_OPENCV
endif
ifeq ($(USE_LEVELDB), 1)
	COMMON_FLAGS += -DUSE_LEVELDB
endif
ifeq ($(USE_LMDB), 1)
	COMMON_FLAGS += -DUSE_LMDB
	ifeq ($(ALLOW_LMDB_NOLOCK), 1)
		COMMON_FLAGS += -DALLOW_LMDB_NOLOCK
	endif
endif

# CPU-only configuration
ifeq ($(CPU_ONLY), 1)
	OBJS := $(PROTO_OBJS) $(CXX_OBJS)
	TEST_OBJS := $(TEST_CXX_OBJS)
	TEST_BINS := $(TEST_CXX_BINS)
	ALL_WARNS := $(ALL_CXX_WARNS)
	TEST_FILTER := --gtest_filter="-*GPU*"
	COMMON_FLAGS += -DCPU_ONLY
endif

# Python layer support
ifeq ($(WITH_PYTHON_LAYER), 1)
	COMMON_FLAGS += -DWITH_PYTHON_LAYER
	LIBRARIES += $(PYTHON_LIBRARIES)
endif

# BLAS configuration (default = ATLAS)
BLAS ?= atlas
ifeq ($(BLAS), mkl)
	# MKL
	LIBRARIES += mkl_rt
	COMMON_FLAGS += -DUSE_MKL
	MKLROOT ?= /opt/intel/mkl
	BLAS_INCLUDE ?= $(MKLROOT)/include
	BLAS_LIB ?= $(MKLROOT)/lib $(MKLROOT)/lib/intel64
else ifeq ($(BLAS), open)
	# OpenBLAS
	LIBRARIES += openblas
else
	# ATLAS
	ifeq ($(LINUX), 1)
		ifeq ($(BLAS), atlas)
			# Linux simply has cblas and atlas
			LIBRARIES += cblas atlas
		endif
	else ifeq ($(OSX), 1)
		# OS X packages atlas as the vecLib framework
		LIBRARIES += cblas
		# 10.10 has accelerate while 10.9 has veclib
		XCODE_CLT_VER := $(shell pkgutil --pkg-info=com.apple.pkg.CLTools_Executables | grep 'version' | sed 's/[^0-9]*\([0-9]\).*/\1/')
		XCODE_CLT_GEQ_7 := $(shell [ $(XCODE_CLT_VER) -gt 6 ] && echo 1)
		XCODE_CLT_GEQ_6 := $(shell [ $(XCODE_CLT_VER) -gt 5 ] && echo 1)
		ifeq ($(XCODE_CLT_GEQ_7), 1)
			BLAS_INCLUDE ?= /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/$(shell ls /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/ | sort | tail -1)/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/Headers
		else ifeq ($(XCODE_CLT_GEQ_6), 1)
			BLAS_INCLUDE ?= /System/Library/Frameworks/Accelerate.framework/Versions/Current/Frameworks/vecLib.framework/Headers/
			LDFLAGS += -framework Accelerate
		else
			BLAS_INCLUDE ?= /System/Library/Frameworks/vecLib.framework/Versions/Current/Headers/
			LDFLAGS += -framework vecLib
		endif
	endif
endif
INCLUDE_DIRS += $(BLAS_INCLUDE)
LIBRARY_DIRS += $(BLAS_LIB)

LIBRARY_DIRS += $(LIB_BUILD_DIR)

# Automatic dependency generation (nvcc is handled separately)
CXXFLAGS += -MMD -MP

# Complete build flags.
COMMON_FLAGS += $(foreach includedir,$(INCLUDE_DIRS),-isystem $(includedir))
CXXFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)
#NVCCFLAGS += -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
NVCCFLAGS += -D_FORCE_INLINES -ccbin=$(CXX) -Xcompiler -fPIC $(COMMON_FLAGS)
# mex may invoke an older gcc that is too liberal with -Wuninitalized
MATLAB_CXXFLAGS := $(CXXFLAGS) -Wno-uninitialized
LINKFLAGS += -pthread -fPIC $(COMMON_FLAGS) $(WARNINGS)

USE_PKG_CONFIG ?= 0
ifeq ($(USE_PKG_CONFIG), 1)
	PKG_CONFIG := $(shell pkg-config opencv --libs)
else
	PKG_CONFIG :=
endif
LDFLAGS += $(foreach librarydir,$(LIBRARY_DIRS),-L$(librarydir)) $(PKG_CONFIG) \
		$(foreach library,$(LIBRARIES),-l$(library))
PYTHON_LDFLAGS := $(LDFLAGS) $(foreach library,$(PYTHON_LIBRARIES),-l$(library))

# 'superclean' target recursively* deletes all files ending with an extension
# in $(SUPERCLEAN_EXTS) below. This may be useful if you've built older
# versions of Caffe that do not place all generated files in a location known
# to the 'clean' target.
#
# 'supercleanlist' will list the files to be deleted by make superclean.
#
# * Recursive with the exception that symbolic links are never followed, per the
# default behavior of 'find'.
SUPERCLEAN_EXTS := .so .a .o .bin .testbin .pb.cc .pb.h _pb2.py .cuo

# Set the sub-targets of the 'everything' target.
EVERYTHING_TARGETS := all py$(PROJECT) test warn lint
# Only build matcaffe as part of "everything" if MATLAB_DIR is specified.
ifneq ($(MATLAB_DIR),)
	EVERYTHING_TARGETS += mat$(PROJECT)
endif

##############################
# Define build targets
##############################
.PHONY: all lib test clean docs linecount lint lintclean tools examples $(DIST_ALIASES) \
	py mat py$(PROJECT) mat$(PROJECT) proto runtest \
	superclean supercleanlist supercleanfiles warn everything

all: lib tools examples

lib: $(STATIC_NAME) $(DYNAMIC_NAME)

everything: $(EVERYTHING_TARGETS)

linecount:
	cloc --read-lang-def=$(PROJECT).cloc \
		src/$(PROJECT) include/$(PROJECT) tools examples \
		python matlab

lint: $(EMPTY_LINT_REPORT)

lintclean:
	@ $(RM) -r $(LINT_OUTPUT_DIR) $(EMPTY_LINT_REPORT) $(NONEMPTY_LINT_REPORT)

docs: $(DOXYGEN_OUTPUT_DIR)
	@ cd ./docs ; ln -sfn ../$(DOXYGEN_OUTPUT_DIR)/html doxygen

$(DOXYGEN_OUTPUT_DIR): $(DOXYGEN_CONFIG_FILE) $(DOXYGEN_SOURCES)
	$(DOXYGEN_COMMAND) $(DOXYGEN_CONFIG_FILE)

$(EMPTY_LINT_REPORT): $(LINT_OUTPUTS) | $(BUILD_DIR)
	@ cat $(LINT_OUTPUTS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_LINT_REPORT); \
		echo "Found one or more lint errors."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_LINT_REPORT); \
	  echo "No lint errors!";

$(LINT_OUTPUTS): $(LINT_OUTPUT_DIR)/%.lint.txt : % $(LINT_SCRIPT) | $(LINT_OUTPUT_DIR)
	@ mkdir -p $(dir $@)
	@ python $(LINT_SCRIPT) $< 2>&1 \
		| grep -v "^Done processing " \
		| grep -v "^Total errors found: 0" \
		> $@ \
		|| true

test: $(TEST_ALL_BIN) $(TEST_ALL_DYNLINK_BIN) $(TEST_BINS)

tools: $(TOOL_BINS) $(TOOL_BIN_LINKS)

examples: $(EXAMPLE_BINS)

py$(PROJECT): py

py: $(PY$(PROJECT)_SO) $(PROTO_GEN_PY)

$(PY$(PROJECT)_SO): $(PY$(PROJECT)_SRC) $(PY$(PROJECT)_HXX) | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) -shared -o $@ $(PY$(PROJECT)_SRC) \
		-o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(PYTHON_LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../build/lib

mat$(PROJECT): mat

mat: $(MAT$(PROJECT)_SO)

$(MAT$(PROJECT)_SO): $(MAT$(PROJECT)_SRC) $(STATIC_NAME)
	@ if [ -z "$(MATLAB_DIR)" ]; then \
		echo "MATLAB_DIR must be specified in $(CONFIG_FILE)" \
			"to build mat$(PROJECT)."; \
		exit 1; \
	fi
	@ echo MEX $<
	$(Q)$(MATLAB_DIR)/bin/mex $(MAT$(PROJECT)_SRC) \
			CXX="$(CXX)" \
			CXXFLAGS="\$$CXXFLAGS $(MATLAB_CXXFLAGS)" \
			CXXLIBS="\$$CXXLIBS $(STATIC_LINK_COMMAND) $(LDFLAGS)" -output $@
	@ if [ -f "$(PROJECT)_.d" ]; then \
		mv -f $(PROJECT)_.d $(BUILD_DIR)/${MAT$(PROJECT)_SO:.$(MAT_SO_EXT)=.d}; \
	fi

runtest: $(TEST_ALL_BIN)
	$(TOOL_BUILD_DIR)/caffe
	$(TEST_ALL_BIN) $(TEST_GPUID) --gtest_shuffle $(TEST_FILTER)

pytest: py
	cd python; python -m unittest discover -s caffe/test

mattest: mat
	cd matlab; $(MATLAB_DIR)/bin/matlab -nodisplay -r 'caffe.run_tests(), exit()'

warn: $(EMPTY_WARN_REPORT)

$(EMPTY_WARN_REPORT): $(ALL_WARNS) | $(BUILD_DIR)
	@ cat $(ALL_WARNS) > $@
	@ if [ -s "$@" ]; then \
		cat $@; \
		mv $@ $(NONEMPTY_WARN_REPORT); \
		echo "Compiler produced one or more warnings."; \
		exit 1; \
	  fi; \
	  $(RM) $(NONEMPTY_WARN_REPORT); \
	  echo "No compiler warnings!";

$(ALL_WARNS): %.o.$(WARNS_EXT) : %.o

$(BUILD_DIR_LINK): $(BUILD_DIR)/.linked

# Create a target ".linked" in this BUILD_DIR to tell Make that the "build" link
# is currently correct, then delete the one in the OTHER_BUILD_DIR in case it
# exists and $(DEBUG) is toggled later.
$(BUILD_DIR)/.linked:
	@ mkdir -p $(BUILD_DIR)
	@ $(RM) $(OTHER_BUILD_DIR)/.linked
	@ $(RM) -r $(BUILD_DIR_LINK)
	@ ln -s $(BUILD_DIR) $(BUILD_DIR_LINK)
	@ touch $@

$(ALL_BUILD_DIRS): | $(BUILD_DIR_LINK)
	@ mkdir -p $@

$(DYNAMIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo LD -o $@
	$(Q)$(CXX) -shared -o $@ $(OBJS) $(VERSIONFLAGS) $(LINKFLAGS) $(LDFLAGS)
	@ cd $(BUILD_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT); ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)

$(STATIC_NAME): $(OBJS) | $(LIB_BUILD_DIR)
	@ echo AR -o $@
	$(Q)ar rcs $@ $(OBJS)

$(BUILD_DIR)/%.o: %.cpp | $(ALL_BUILD_DIRS)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(PROTO_BUILD_DIR)/%.pb.o: $(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_GEN_HEADER) \
		| $(PROTO_BUILD_DIR)
	@ echo CXX $<
	$(Q)$(CXX) $< $(CXXFLAGS) -c -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(BUILD_DIR)/cuda/%.o: %.cu | $(ALL_BUILD_DIRS)
	@ echo NVCC $<
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -M $< -o ${@:.o=.d} \
		-odir $(@D)
	$(Q)$(CUDA_DIR)/bin/nvcc $(NVCCFLAGS) $(CUDA_ARCH) -c $< -o $@ 2> $@.$(WARNS_EXT) \
		|| (cat $@.$(WARNS_EXT); exit 1)
	@ cat $@.$(WARNS_EXT)

$(TEST_ALL_BIN): $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		| $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo CXX/LD -o $@ $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $(TEST_OBJS) $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CU_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CU_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

$(TEST_CXX_BINS): $(TEST_BIN_DIR)/%.testbin: $(TEST_CXX_BUILD_DIR)/%.o \
	$(GTEST_OBJ) | $(DYNAMIC_NAME) $(TEST_BIN_DIR)
	@ echo LD $<
	$(Q)$(CXX) $(TEST_MAIN_SRC) $< $(GTEST_OBJ) \
		-o $@ $(LINKFLAGS) $(LDFLAGS) -l$(LIBRARY_NAME) -Wl,-rpath,$(ORIGIN)/../lib

# Target for extension-less symlinks to tool binaries with extension '*.bin'.
$(TOOL_BUILD_DIR)/%: $(TOOL_BUILD_DIR)/%.bin | $(TOOL_BUILD_DIR)
	@ $(RM) $@
	@ ln -s $(notdir $<) $@

$(TOOL_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../lib

$(EXAMPLE_BINS): %.bin : %.o | $(DYNAMIC_NAME)
	@ echo CXX/LD -o $@
	$(Q)$(CXX) $< -o $@ $(LINKFLAGS) -l$(LIBRARY_NAME) $(LDFLAGS) \
		-Wl,-rpath,$(ORIGIN)/../../lib

proto: $(PROTO_GEN_CC) $(PROTO_GEN_HEADER)

$(PROTO_BUILD_DIR)/%.pb.cc $(PROTO_BUILD_DIR)/%.pb.h : \
		$(PROTO_SRC_DIR)/%.proto | $(PROTO_BUILD_DIR)
	@ echo PROTOC $<
	$(Q)protoc --proto_path=$(PROTO_SRC_DIR) --cpp_out=$(PROTO_BUILD_DIR) $<

$(PY_PROTO_BUILD_DIR)/%_pb2.py : $(PROTO_SRC_DIR)/%.proto \
		$(PY_PROTO_INIT) | $(PY_PROTO_BUILD_DIR)
	@ echo PROTOC \(python\) $<
	$(Q)protoc --proto_path=$(PROTO_SRC_DIR) --python_out=$(PY_PROTO_BUILD_DIR) $<

$(PY_PROTO_INIT): | $(PY_PROTO_BUILD_DIR)
	touch $(PY_PROTO_INIT)

clean:
	@- $(RM) -rf $(ALL_BUILD_DIRS)
	@- $(RM) -rf $(OTHER_BUILD_DIR)
	@- $(RM) -rf $(BUILD_DIR_LINK)
	@- $(RM) -rf $(DISTRIBUTE_DIR)
	@- $(RM) $(PY$(PROJECT)_SO)
	@- $(RM) $(MAT$(PROJECT)_SO)

supercleanfiles:
	$(eval SUPERCLEAN_FILES := $(strip \
			$(foreach ext,$(SUPERCLEAN_EXTS), $(shell find . -name '*$(ext)' \
			-not -path './data/*'))))

supercleanlist: supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
	fi

superclean: clean supercleanfiles
	@ \
	if [ -z "$(SUPERCLEAN_FILES)" ]; then \
		echo "No generated files found."; \
	else \
		echo "Deleting the following generated files:"; \
		echo $(SUPERCLEAN_FILES) | tr ' ' '\n'; \
		$(RM) $(SUPERCLEAN_FILES); \
	fi

$(DIST_ALIASES): $(DISTRIBUTE_DIR)

$(DISTRIBUTE_DIR): all py | $(DISTRIBUTE_SUBDIRS)
	# add proto
	cp -r src/caffe/proto $(DISTRIBUTE_DIR)/
	# add include
	cp -r include $(DISTRIBUTE_DIR)/
	mkdir -p $(DISTRIBUTE_DIR)/include/caffe/proto
	cp $(PROTO_GEN_HEADER_SRCS) $(DISTRIBUTE_DIR)/include/caffe/proto
	# add tool and example binaries
	cp $(TOOL_BINS) $(DISTRIBUTE_DIR)/bin
	cp $(EXAMPLE_BINS) $(DISTRIBUTE_DIR)/bin
	# add libraries
	cp $(STATIC_NAME) $(DISTRIBUTE_DIR)/lib
	install -m 644 $(DYNAMIC_NAME) $(DISTRIBUTE_DIR)/lib
	cd $(DISTRIBUTE_DIR)/lib; rm -f $(DYNAMIC_NAME_SHORT); ln -s $(DYNAMIC_VERSIONED_NAME_SHORT) $(DYNAMIC_NAME_SHORT)
	# add python - it's not the standard way, indeed...
	cp -r python $(DISTRIBUTE_DIR)/python

-include $(DEPS)
ANACONDA_HOME := $(HOME)/anaconda2 in Makefile.config is the path where Anaconda is installed.
Open a terminal in the $caffe_root directory and run the following commands:
make -j8
make py
make test -j8
If nothing errors out, you are almost done.
ps: the number in -j8 is the number of CPU cores your machine has, i.e. how many build jobs to run in parallel.
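If you are not sure how many cores to pass to -j, the count can be read from Python's standard library (a small sketch; nothing here depends on Caffe):

```python
import multiprocessing

# Use one make job per available CPU core, e.g. "make -j8" on an 8-core machine.
jobs = multiprocessing.cpu_count()
print(f"make -j{jobs}")
```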
Edit ~/.bashrc and add the following environment variable:
export PYTHONPATH=/home/gdu/caffe/python:$PYTHONPATH
where /home/gdu/caffe/python is the python directory under your $caffe_root.
Then reload the environment:
source ~/.bashrc
Open a terminal and run
python
and, once inside the Python interpreter, run
import caffe
If no error is reported then, congratulations, your caffe-ssd setup succeeded.
**************************************************************************************************************
IV. Configuring and running MobileNetSSD
If you want to train MobileNetSSD on your own dataset, you may additionally need to read this reference: http://www.cnblogs.com/EstherLjy/p/6863890.html (skip any steps you have already completed).
MobileNet-SSD repository: https://github.com/chuanqi305/MobileNet-SSD
The steps from the repository are as follows:
Run
1. Download SSD source code and compile (follow the SSD README).
2. Download the pretrained deploy weights from the link above.
3. Put all the files in SSD_HOME/examples/
4. Run demo.py to show the detection result.
To explain:
Step 1 is configuring SSD, which was done above.
Step 2 is downloading the pretrained model; the relevant download links are given later.
Step 3 means putting the MobileNet-SSD code under SSD's examples directory, i.e. $caffe_root/examples/
Step 4 is simply running demo.py.
Train your own dataset
The official README says to create the lmdb data first and symlink it into the current directory; below I describe my own way of training on your own data.
1). Convert your own dataset to lmdb database (follow the SSD README), and create symlinks to current directory.
ln -s PATH_TO_YOUR_TRAIN_LMDB trainval_lmdb
ln -s PATH_TO_YOUR_TEST_LMDB test_lmdb
That is the official approach, but the symlinks are not actually necessary: absolute paths work just as well, and you can set them in step 3) below.
2). Create the labelmap.prototxt file and put it into current directory.
This file was already generated when configuring SSD; mine is named labelmap_voc_my_test.prototxt and lives in the /home/caffe/data/my_test/ directory.
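If you need to write the labelmap file by hand, it is a list of item { ... } entries in which label 0 is reserved for the background class. A hedged sketch that generates such a file from a class list (the class names here are placeholders, not from this post):

```python
def make_labelmap(class_names):
    """Build an SSD labelmap.prototxt string; label 0 is always background."""
    entries = ['item {\n  name: "none_of_the_above"\n  label: 0\n'
               '  display_name: "background"\n}']
    for i, name in enumerate(class_names, start=1):
        entries.append('item {\n  name: "%s"\n  label: %d\n'
                       '  display_name: "%s"\n}' % (name, i, name))
    return "\n".join(entries)

# Example with two placeholder classes -> 3 classes total incl. background,
# which is the number you would pass to gen_model.sh in step 3).
print(make_labelmap(["cat", "dog"]))
```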
3). Use gen_model.sh to generate your own training prototxt.
Run (note: this should be executed in the MobileNet-SSD directory, i.e. $caffe_root/examples/MobileNet-SSD):
./gen_model.sh 13
Here 13 is the number of classes including one background class; if you have 20 classes of your own, pass 21.
This generates MobileNetSSD_deploy.prototxt, MobileNetSSD_test.prototxt and MobileNetSSD_train.prototxt under example/.
Three files need to be modified: MobileNetSSD_train_template.prototxt, MobileNetSSD_test_template.prototxt and MobileNetSSD_deploy_template.prototxt, located in /home/gdu/caffe/examples/MobileNetSSD/template.
In MobileNetSSD_train_template.prototxt, change (approximately):
line 49: source: "/home/gdu/caffe/examples/my_test/my_test_trainval_lmdb/"
line 136: label_map_file: "/home/gdu/caffe/data/my_test/labelmap_voc_my_test.prototxt"
In MobileNetSSD_test_template.prototxt, change (approximately):
line 24: source: "/home/gdu/caffe/examples/my_test/my_test_test_lmdb"
line 31: label_map_file: "/home/gdu/caffe/data/my_test/labelmap_voc_my_test.prototxt"
MobileNetSSD_deploy_template.prototxt needs no changes for now.
4). Download the training weights from the link above, and run train.sh; after about 30000 iterations, the loss should be 1.5 - 2.5.
Download the weights file mobilenet_iter_73000.caffemodel:
Link: https://pan.baidu.com/s/1gfIoVi7 password: 7yu5
You can tune the parameters in solver_train.prototxt, e.g. max_iter is the maximum number of iterations (originally 120000) and snapshot is how often a snapshot is saved (originally 8000).
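For reference, the relevant part of solver_train.prototxt looks roughly like this (the two values shown are the defaults mentioned above; the snapshot_prefix is inferred from the snapshot filenames used later in this post, and other fields are omitted):

```prototxt
# solver_train.prototxt (fragment)
max_iter: 120000    # maximum number of training iterations
snapshot: 8000      # save a snapshot every 8000 iterations
snapshot_prefix: "snapshot/mobilenet"
```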
Run (again, from the MobileNet-SSD directory, i.e. $caffe_root/examples/MobileNet-SSD):
./train.sh
5). Run test.sh to evaluate the result.
This evaluation step can be skipped. If you do run it, a few things in test.sh need changing:
line 3: latest=$(ls -t snapshot/mobilenet_iter_2000.caffemodel | head -n 1), where mobilenet_iter_2000.caffemodel is the snapshot saved during training in step 4);
line 7: /home/gdu/caffe/build/tools/caffe train -solver="solver_test.prototxt", which must use the solver_test.prototxt file.
6). Run merge_bn.py to generate your own deploy caffemodel.
In the $caffe_root/examples/MobileNet-SSD directory run:
python merge_bn.py
Inside merge_bn.py, set:
train_proto = 'example/MobileNetSSD_train.prototxt'  # generated in step 3)
train_model = 'snapshot/mobilenet_iter_2000.caffemodel'  # should be your snapshot caffemodel, saved during training in step 4)
deploy_proto = 'MobileNetSSD_deploy.prototxt'  # generated in step 3)
save_model = 'MobileNetSSD_deploy_my_test_2000.caffemodel'  # the merged model file
This finally produces your own dataset's model: MobileNetSSD_deploy.prototxt is the network definition and MobileNetSSD_deploy_my_test_2000.caffemodel is the model file.
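What merge_bn.py does, mathematically, is fold each BatchNorm/Scale pair into the preceding convolution, so the deploy network needs no BN layers at inference time. A minimal numpy sketch of that folding (a small linear layer stands in for a convolution; this is an illustration, not the script's actual code):

```python
import numpy as np

def merge_bn_params(w, b, mean, var, gamma, beta, eps=1e-5):
    # Per-output-channel factor contributed by BatchNorm + Scale
    scale = gamma / np.sqrt(var + eps)
    w_merged = w * scale[:, None]          # scale each output channel's weights
    b_merged = (b - mean) * scale + beta   # fold mean shift and beta into the bias
    return w_merged, b_merged

# Sanity check: conv -> BN -> Scale equals the single merged conv on random data
rng = np.random.RandomState(0)
w, b = rng.randn(4, 3), rng.randn(4)
mean, var = rng.randn(4), rng.rand(4) + 0.5
gamma, beta = rng.randn(4), rng.randn(4)
x = rng.randn(3)

y_ref = ((w @ x + b) - mean) / np.sqrt(var + 1e-5) * gamma + beta
wm, bm = merge_bn_params(w, b, mean, var, gamma, beta)
y_merged = wm @ x + bm
print(np.allclose(y_ref, y_merged))  # True
```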
3. Testing the trained model
The Python code is as follows:
import numpy as np
import sys, os
import cv2
caffe_root = "/home/gdu/caffe/"
import sys
#sys.path.insert(0, caffe_root + 'python')
sys.path.append(caffe_root + 'python')
import caffe

#net_file = "model/MobileNetSSD_deploy.prototxt"
#caffe_model = "model/MobileNetSSD_deploy.caffemodel"

# net_file = "example/MobileNetSSD_deploy100*100.prototxt"
# caffe_model = "result_model/MobileNetSSD_deploy_my_test_100*100_2000.caffemodel"

net_file = "model/MobileNetSSD_deploy.prototxt"
caffe_model = "model/MobileNetSSD_deploy.caffemodel"

test_dir = "/home/gdu/caffe/examples/MobileNet-SSD/images"

if not os.path.exists(caffe_model):
    print("MobileNetSSD_deploy.caffemodel does not exist,")
    print("use merge_bn.py to generate it.")
    exit()
net = caffe.Net(net_file, caffe_model, caffe.TEST)

CLASSES = ('background',
           'aeroplane', 'bicycle', 'bird', 'boat',
           'bottle', 'bus', 'car', 'cat', 'chair',
           'cow', 'diningtable', 'dog', 'horse',
           'motorbike', 'person', 'pottedplant',
           'sheep', 'sofa', 'train', 'tvmonitor')
# CLASSES = ('background',
#            'bicycle', 'boat',
#            'bus', 'car', 'cat',
#            'cow', 'dog', 'horse',
#            'motorbike', 'person',
#            'sheep', 'train')
# CLASSES = ('background',
#            'person_v', 'person_p')


def preprocess(src):
    img = cv2.resize(src, (300, 300))
    img = img - 127.5
    img = img * 0.007843
    return img


def postprocess(img, out):
    h = img.shape[0]
    w = img.shape[1]
    box = out['detection_out'][0, 0, :, 3:7] * np.array([w, h, w, h])

    cls = out['detection_out'][0, 0, :, 1]
    conf = out['detection_out'][0, 0, :, 2]
    return (box.astype(np.int32), conf, cls)


def detect(imgfile):
    origimg = cv2.imread(imgfile)
    img = preprocess(origimg)

    img = img.astype(np.float32)
    img = img.transpose((2, 0, 1))

    net.blobs['data'].data[...] = img
    out = net.forward()
    box, conf, cls = postprocess(origimg, out)

    for i in range(len(box)):
        p1 = (box[i][0], box[i][1])
        p2 = (box[i][2], box[i][3])
        cv2.rectangle(origimg, p1, p2, (0, 255, 0))
        p3 = (max(p1[0], 15), max(p1[1], 15))
        title = "%s:%.2f" % (CLASSES[int(cls[i])], conf[i])
        cv2.putText(origimg, title, p3, cv2.FONT_ITALIC, 0.6, (0, 255, 0), 1)
    cv2.imshow("SSD", origimg)

    k = cv2.waitKey(0) & 0xff
    # Exit if ESC pressed
    if k == 27:
        return False
    return True


for f in os.listdir(test_dir):
    print(test_dir + "/" + f + "\n")
    if detect(test_dir + "/" + f) == False:
        break
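A note on the preprocessing constants above: 127.5 and 0.007843 are the standard MobileNet input normalization, since 0.007843 is approximately 1/127.5, so (x - 127.5) * 0.007843 maps pixel values from [0, 255] to roughly [-1, 1]. A quick check:

```python
scale = 0.007843

# 0.007843 is (approximately) the reciprocal of 127.5
print(abs(scale - 1.0 / 127.5))  # tiny

# the mapping sends 0 -> about -1, 127.5 -> 0, 255 -> about +1
for pixel in (0.0, 127.5, 255.0):
    print(pixel, (pixel - 127.5) * scale)
```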
**************************************************************************************************************
V. Errors
1. With opencv3.3, make may fail with this error:
collect2: error: ld returned 1 exit status
Makefile:560: recipe for target '.build_release/tools/upgrade_net_proto_text.bin' failed
make: *** [.build_release/tools/upgrade_net_proto_text.bin] Error 1
In that case follow the configuration described here:
https://stackoverflow.com/questions/31962975/caffe-install-on-ubuntu-for-anaconda-with-python-2-7-fails-with-libpng16-so-16-n
2. If Python's caffe reports this error:
No module named google.protobuf.internal
follow this tutorial:
http://www.jianshu.com/p/1e405b9fe973
3. If you previously installed caffe via cmake and make install, that installation method leaves files in system directories, with caffe installed under other paths:
http://www.cnblogs.com/darkknightzh/p/5864715.html
Delete /usr/include/caffe and the caffe files under /usr/lib:
https://github.com/BVLC/caffe/issues/3396
sudo rm -rf /usr/local/lib/libcaffe*
4. hdf5-related build failures:
src/caffe/layers/hdf5_output_layer.cpp:3:18: fatal error: hdf5.h: No such file or directory; compilation terminated.
Makefile:572: recipe for target '.build_release/src/caffe/layers/hdf5_output_layer.o' failed
make: *** [.build_release/src/caffe/layers/hdf5_output_layer.o] Error 1
make: *** Waiting for unfinished jobs....
In file included from src/caffe/util/hdf5.cpp:1:0:
./include/caffe/util/hdf5.hpp:7:18: fatal error: hdf5.h: No such file or directory; compilation terminated.
Makefile:572: recipe for target '.build_release/src/caffe/util/hdf5.o' failed
make: *** [.build_release/src/caffe/util/hdf5.o] Error 1
src/caffe/net.cpp:8:18: fatal error: hdf5.h: No such file or directory
After installing libhdf5-serial-dev you need to add its location to your config file so the compiler can find it. My system is Ubuntu 16, so I changed these lines in Makefile.config:
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial
5. Linker error:
/home/yali/anaconda2/lib/libpng16.so.16: undefined reference to 'inflateValidate@ZLIB_1.2.9'
sudo ln -s /home/yali/anaconda2/lib/libpng16.so.16 libpng16.so.16 (this does not work)
The correct fix is to add the following line to Makefile.config:
LINKFLAGS := -Wl,-rpath,$(HOME)/anaconda2/lib
Reference: http://blog.csdn.net/ruotianxia/article/details/78437464
**************************************************************************************************************
Related files:
VGG_ILSVRC_16_layers_fc_reduced.caffemodel: Link: https://pan.baidu.com/s/1kVEb5H1 password: 2vet
MobileNetSSD_deploy.prototxt: Link: https://pan.baidu.com/s/1dE3OghV password: pc9w
MobileNetSSD_deploy.caffemodel: Link: https://pan.baidu.com/s/1kV3mhwj password: 728b
mobilenet_iter_73000.caffemodel: Link: https://pan.baidu.com/s/1gfIoVi7 password: 7yu5