Meaning of each field in a Caffe training log

https://blog.csdn.net/dataningwei/article/details/77446841


I0821 09:53:35.929999 10308 solver.cpp:60] Solver scaffolding done.
I0821 09:53:35.929999 10308 caffe.cpp:252] Starting Optimization     ####### Network training starts here
I0821 09:53:35.929999 10308 solver.cpp:279] Solving LeNet
I0821 09:53:35.929999 10308 solver.cpp:280] Learning Rate Policy: multistep
I0821 09:53:35.930999 10308 solver.cpp:337] Iteration 0, Testing net (#0)                                                           #### Test(Iteration 0)
I0821 09:53:35.993999 10308 blocking_queue.cpp:50] Data layer prefetch queue empty
I0821 09:53:36.180999 10308 solver.cpp:404]     Test net output #0: accuracy = 0.1121                                           #### Test(Iteration 0) network output #0, the accuracy value (determined by the net definition)
I0821 09:53:36.180999 10308 solver.cpp:404]     Test net output #1: loss = 2.30972 (* 1 = 2.30972 loss)     #### Test(Iteration 0) network output #1, the loss value (determined by the net definition)
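The two test outputs above (accuracy and loss) are simply the output blobs that the test net definition exposes, which is why the annotations say they are determined by the net definition. The figure in parentheses, "(* 1 = 2.30972 loss)", is the raw output value multiplied by that output's loss_weight (1 here); outputs with a non-zero loss_weight contribute their weighted value to the objective the solver reports. A minimal sketch of that bookkeeping (illustrative Python, not Caffe code):

# Minimal sketch of how a line like "loss = 2.30972 (* 1 = 2.30972 loss)" is formed:
# each loss output's raw value is scaled by its loss_weight, and the weighted terms
# are summed into the objective the solver prints.
def weighted_total_loss(outputs):
    """outputs: list of (raw_value, loss_weight) pairs."""
    return sum(value * weight for value, weight in outputs)

# The test net here has a single loss output with loss_weight = 1:
print(weighted_total_loss([(2.30972, 1.0)]))  # -> 2.30972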

I0821 09:53:36.190999 10308 solver.cpp:228] Iteration 0, loss = 2.2891                                                                      #### Train(Iteration 0) the net's loss value
I0821 09:53:36.190999 10308 solver.cpp:244]     Train net output #0: loss = 2.2891 (* 1 = 2.2891 loss)    #### Train(Iteration 0) only one output value
I0821 09:53:36.190999 10308 sgd_solver.cpp:106] Iteration 0, lr = 0.001                                                                     #### Train(Iteration 0)
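The "lr = 0.001" lines printed by sgd_solver.cpp come from the solver's learning-rate policy, which this run sets to multistep (see the "Learning Rate Policy: multistep" line above): the rate stays at base_lr until the iteration crosses one of the configured stepvalue thresholds, and each crossing multiplies it by gamma. A rough sketch of that schedule follows; base_lr, gamma and the stepvalues are assumed example settings, since they are not shown in this log:

# Rough sketch of the "multistep" policy: lr = base_lr * gamma ** (stepvalues passed).
# base_lr, gamma and stepvalues are assumed example values, not taken from this log.
def multistep_lr(iteration, base_lr=0.001, gamma=0.1, stepvalues=(5000, 8000)):
    passed = sum(1 for s in stepvalues if iteration >= s)
    return base_lr * gamma ** passed

print(multistep_lr(0))     # 0.001  (matches the "lr = 0.001" lines early in training)
print(multistep_lr(6000))  # 0.0001 (after the first assumed stepvalue is crossed)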

I0821 09:53:36.700999 10308 solver.cpp:228] Iteration 100, loss = 2.24716                                                                   #### Train(Iteration 100)
I0821 09:53:36.700999 10308 solver.cpp:244]     Train net output #0: loss = 2.24716 (* 1 = 2.24716 loss)    #### Train(Iteration 100)
I0821 09:53:36.700999 10308 sgd_solver.cpp:106] Iteration 100, lr = 0.001                                                                   #### Train(Iteration 100)
I0821 09:53:37.225999 10308 solver.cpp:228] Iteration 200, loss = 2.08563
I0821 09:53:37.225999 10308 solver.cpp:244]     Train net output #0: loss = 2.08563 (* 1 = 2.08563 loss)
I0821 09:53:37.225999 10308 sgd_solver.cpp:106] Iteration 200, lr = 0.001
I0821 09:53:37.756000 10308 solver.cpp:228] Iteration 300, loss = 2.11631
I0821 09:53:37.756000 10308 solver.cpp:244]     Train net output #0: loss = 2.11631 (* 1 = 2.11631 loss)
I0821 09:53:37.756000 10308 sgd_solver.cpp:106] Iteration 300, lr = 0.001
I0821 09:53:38.286999 10308 solver.cpp:228] Iteration 400, loss = 1.89424
I0821 09:53:38.286999 10308 solver.cpp:244]     Train net output #0: loss = 1.89424 (* 1 = 1.89424 loss)
I0821 09:53:38.286999 10308 sgd_solver.cpp:106] Iteration 400, lr = 0.001
I0821 09:53:38.819999 10308 solver.cpp:337] Iteration 500, Testing net (#0)                                                             #### Test(Iteration 500)
I0821 09:53:39.069999 10308 solver.cpp:404]     Test net output #0: accuracy = 0.3232                                           #### Test(Iteration 500)
I0821 09:53:39.069999 10308 solver.cpp:404]     Test net output #1: loss = 1.87822 (* 1 = 1.87822 loss)     #### Test(Iteration 500)
I0821 09:53:39.072999 10308 solver.cpp:228] Iteration 500, loss = 1.94478
I0821 09:53:39.072999 10308 solver.cpp:244]     Train net output #0: loss = 1.94478 (* 1 = 1.94478 loss)
I0821 09:53:39.072999 10308 sgd_solver.cpp:106] Iteration 500, lr = 0.001
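The overall pattern, a training loss line every 100 iterations and a test pass at iterations 0 and 500, follows the solver's display and test_interval settings. Because each training step prints an "Iteration N, loss = ..." line, the log is easy to scrape for plotting; a small sketch of such a scraper is below (the file name train.log is an assumption for illustration; Caffe's own tools/extra directory also contains log-parsing scripts):

import re

# Hedged sketch: pull (iteration, training loss) pairs out of a Caffe log like the one above.
train_re = re.compile(r"Iteration (\d+), loss = ([\d.]+)")

points = []
with open("train.log") as f:
    for line in f:
        m = train_re.search(line)
        if m:
            points.append((int(m.group(1)), float(m.group(2))))

print(points)  # e.g. [(0, 2.2891), (100, 2.24716), (200, 2.08563), ...]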
