Caffe Installation and Getting Started (CPU-Only)


Caffe installation:

              Most Caffe installation guides target GPU mode, but since we have no GPU hardware for now, we will set up CPU-only mode instead. Only two changes are needed: set the solver_mode field in examples/mnist/lenet_solver.prototxt to CPU, and uncomment the CPU_ONLY := 1 line in Makefile.config, which restricts the build to the CPU. The steps below walk through the CPU-mode setup.

     First, install the dependencies Caffe needs. Caffe uses Google's protobuf for parameter definition and serialization, and it relies on several other libraries as well. Install them with the command below; since many packages are pulled in, this can take a long time.

sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libboost-all-dev libhdf5-serial-dev protobuf-compiler liblmdb-dev libgflags-dev libgoogle-glog-dev

     Next, install a BLAS library. What is BLAS? Put simply: basic linear-algebra primitives such as the vector update y = a*x + y (the axpy routine), as well as matrix operations, are what Caffe ultimately dispatches to the BLAS library. Install ATLAS with the following command.

sudo apt-get install libatlas-base-dev
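The build reads its BLAS choice from Makefile.config; with ATLAS installed as above, the stock default already matches. For reference, the relevant fragment of Makefile.config looks roughly like this (switch the value to open or mkl if you use OpenBLAS or MKL instead):

```
# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
```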

          Next, download the Caffe source and build it. Cloning creates a caffe directory under the current directory.

git clone https://github.com/BVLC/caffe.git

          After entering the caffe directory, run the following commands. Then uncomment the CPU_ONLY := 1 line in Makefile.config so that only the CPU is used.

cp Makefile.config.example Makefile.config
vim Makefile.config
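If you prefer not to edit the file by hand, the uncommenting can also be done with sed. A minimal sketch, demonstrated on a throwaway copy (the /tmp path is only for illustration; point the same sed command at the real Makefile.config):

```shell
# Create a one-line sample standing in for the real Makefile.config.
printf '# CPU_ONLY := 1\n' > /tmp/Makefile.config.demo

# Strip the leading "#" so the CPU-only build flag takes effect.
sed -i 's/^# *CPU_ONLY := 1/CPU_ONLY := 1/' /tmp/Makefile.config.demo

cat /tmp/Makefile.config.demo  # CPU_ONLY := 1
```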
   

        Now comes the build stage. Run the following commands and Caffe is compiled. (You can append -j"$(nproc)" to make to parallelize the build.)

make
make test
make runtest

       Let's try running one of the bundled examples with the commands below.

       First, prepare the dataset: get_mnist.sh downloads the raw MNIST files, and create_mnist.sh converts them into LMDB databases.

cd $CAFFE_ROOT
./data/mnist/get_mnist.sh
./examples/mnist/create_mnist.sh

       Then edit lenet_solver.prototxt and change the solver_mode variable to CPU.
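For reference, the solver mode is set at the bottom of examples/mnist/lenet_solver.prototxt; after the edit the relevant lines should read:

```
# solver mode: CPU or GPU
solver_mode: CPU
```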

       Good. Now start training (the script below wraps a call to build/tools/caffe train with the LeNet solver):

./examples/mnist/train_lenet.sh

       You will see the network scaffold being constructed layer by layer, as in the log below:

        

I0205 01:42:48.034143 23807 layer_factory.hpp:77] Creating layer mnist
I0205 01:42:48.039044 23807 net.cpp:106] Creating Layer mnist
I0205 01:42:48.039551 23810 db_lmdb.cpp:38] Opened lmdb examples/mnist/mnist_train_lmdb
I0205 01:42:48.039721 23807 net.cpp:411] mnist -> data
I0205 01:42:48.039985 23807 net.cpp:411] mnist -> label
I0205 01:42:48.040212 23807 data_layer.cpp:41] output data size: 64,1,28,28
I0205 01:42:48.042438 23807 net.cpp:150] Setting up mnist
I0205 01:42:48.042546 23807 net.cpp:157] Top shape: 64 1 28 28 (50176)
I0205 01:42:48.042567 23807 net.cpp:157] Top shape: 64 (64)
I0205 01:42:48.042579 23807 net.cpp:165] Memory required for data: 200960
I0205 01:42:48.042599 23807 layer_factory.hpp:77] Creating layer conv1
I0205 01:42:48.042639 23807 net.cpp:106] Creating Layer conv1
I0205 01:42:48.042697 23807 net.cpp:454] conv1 <- data
I0205 01:42:48.042732 23807 net.cpp:411] conv1 -> conv1
I0205 01:42:48.042873 23807 net.cpp:150] Setting up conv1
I0205 01:42:48.042935 23807 net.cpp:157] Top shape: 64 20 24 24 (737280)
I0205 01:42:48.042950 23807 net.cpp:165] Memory required for data: 3150080
I0205 01:42:48.042984 23807 layer_factory.hpp:77] Creating layer pool1
I0205 01:42:48.043006 23807 net.cpp:106] Creating Layer pool1
I0205 01:42:48.043021 23807 net.cpp:454] pool1 <- conv1
I0205 01:42:48.043037 23807 net.cpp:411] pool1 -> pool1
I0205 01:42:48.043105 23807 net.cpp:150] Setting up pool1
I0205 01:42:48.043123 23807 net.cpp:157] Top shape: 64 20 12 12 (184320)
I0205 01:42:48.043135 23807 net.cpp:165] Memory required for data: 3887360
I0205 01:42:48.043177 23807 layer_factory.hpp:77] Creating layer conv2
I0205 01:42:48.043205 23807 net.cpp:106] Creating Layer conv2
I0205 01:42:48.043218 23807 net.cpp:454] conv2 <- pool1
I0205 01:42:48.043236 23807 net.cpp:411] conv2 -> conv2
I0205 01:42:48.043581 23807 net.cpp:150] Setting up conv2
I0205 01:42:48.043601 23807 net.cpp:157] Top shape: 64 50 8 8 (204800)
I0205 01:42:48.043617 23807 net.cpp:165] Memory required for data: 4706560
I0205 01:42:48.043637 23807 layer_factory.hpp:77] Creating layer pool2
I0205 01:42:48.043653 23807 net.cpp:106] Creating Layer pool2
I0205 01:42:48.043665 23807 net.cpp:454] pool2 <- conv2
I0205 01:42:48.043679 23807 net.cpp:411] pool2 -> pool2
I0205 01:42:48.043699 23807 net.cpp:150] Setting up pool2
I0205 01:42:48.043714 23807 net.cpp:157] Top shape: 64 50 4 4 (51200)
I0205 01:42:48.043726 23807 net.cpp:165] Memory required for data: 4911360
I0205 01:42:48.043738 23807 layer_factory.hpp:77] Creating layer ip1
I0205 01:42:48.043761 23807 net.cpp:106] Creating Layer ip1
I0205 01:42:48.043776 23807 net.cpp:454] ip1 <- pool2
I0205 01:42:48.043790 23807 net.cpp:411] ip1 -> ip1
I0205 01:42:48.047220 23807 net.cpp:150] Setting up ip1
I0205 01:42:48.047313 23807 net.cpp:157] Top shape: 64 500 (32000)
I0205 01:42:48.047341 23807 net.cpp:165] Memory required for data: 5039360
I0205 01:42:48.047416 23807 layer_factory.hpp:77] Creating layer relu1
I0205 01:42:48.047518 23807 net.cpp:106] Creating Layer relu1
I0205 01:42:48.047566 23807 net.cpp:454] relu1 <- ip1
I0205 01:42:48.047615 23807 net.cpp:397] relu1 -> ip1 (in-place)
I0205 01:42:48.047694 23807 net.cpp:150] Setting up relu1
I0205 01:42:48.047719 23807 net.cpp:157] Top shape: 64 500 (32000)
I0205 01:42:48.047730 23807 net.cpp:165] Memory required for data: 5167360
I0205 01:42:48.047747 23807 layer_factory.hpp:77] Creating layer ip2
I0205 01:42:48.047776 23807 net.cpp:106] Creating Layer ip2
I0205 01:42:48.047788 23807 net.cpp:454] ip2 <- ip1
I0205 01:42:48.047806 23807 net.cpp:411] ip2 -> ip2
I0205 01:42:48.047963 23807 net.cpp:150] Setting up ip2
I0205 01:42:48.047994 23807 net.cpp:157] Top shape: 64 10 (640)
I0205 01:42:48.048007 23807 net.cpp:165] Memory required for data: 5169920
I0205 01:42:48.048023 23807 layer_factory.hpp:77] Creating layer loss
I0205 01:42:48.048058 23807 net.cpp:106] Creating Layer loss
I0205 01:42:48.048074 23807 net.cpp:454] loss <- ip2
I0205 01:42:48.048085 23807 net.cpp:454] loss <- label
I0205 01:42:48.048102 23807 net.cpp:411] loss -> loss
I0205 01:42:48.048254 23807 layer_factory.hpp:77] Creating layer loss
I0205 01:42:48.048310 23807 net.cpp:150] Setting up loss
I0205 01:42:48.048322 23807 net.cpp:157] Top shape: (1)
I0205 01:42:48.048331 23807 net.cpp:160]     with loss weight 1
I0205 01:42:48.048363 23807 net.cpp:165] Memory required for data: 5169924
I0205 01:42:48.048382 23807 net.cpp:226] loss needs backward computation.
I0205 01:42:48.048390 23807 net.cpp:226] ip2 needs backward computation.
I0205 01:42:48.048398 23807 net.cpp:226] relu1 needs backward computation.
I0205 01:42:48.048405 23807 net.cpp:226] ip1 needs backward computation.
I0205 01:42:48.048413 23807 net.cpp:226] pool2 needs backward computation.
I0205 01:42:48.048420 23807 net.cpp:226] conv2 needs backward computation.
I0205 01:42:48.048429 23807 net.cpp:226] pool1 needs backward computation.
I0205 01:42:48.048435 23807 net.cpp:226] conv1 needs backward computation.
I0205 01:42:48.048444 23807 net.cpp:228] mnist does not need backward computation.
I0205 01:42:48.048451 23807 net.cpp:270] This network produces output loss
I0205 01:42:48.048463 23807 net.cpp:283] Network initialization done.
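The "Top shape" lines in the log can be sanity-checked with standard convolution/pooling arithmetic. A minimal sketch, assuming the LeNet settings from lenet_train_test.prototxt (5x5 kernels, stride 1, no padding for the convolutions; 2x2 kernels with stride 2 for the pooling layers):

```shell
# Spatial output size of a valid convolution: (in - kernel) / stride + 1
conv_out() { echo $(( ($1 - $2) / $3 + 1 )); }

c1=$(conv_out 28 5 1)    # conv1: 28 -> 24
p1=$(( c1 / 2 ))         # pool1: 24 -> 12
c2=$(conv_out "$p1" 5 1) # conv2: 12 -> 8
p2=$(( c2 / 2 ))         # pool2: 8 -> 4

echo "$c1 $p1 $c2 $p2"   # 24 12 8 4, matching the Top shapes in the log
```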


     Next post: we will take a detailed look at the paper behind this code.


