Example: calling Caffe from MATLAB
Environment: Ubuntu 12.04, MATLAB R2013b
1. First, edit the MATLAB_DIR entry in Makefile.config, as shown below:
MATLAB_DIR := /u01/MATLAB/R2013b
2. Build Caffe's MATLAB interface:
make matcaffe
3. Change to the directory /u01/caffe/examples/imagenet and run ./get_caffe_reference_imagenet_model.sh to download the pretrained reference model.
4. Change to the directory /u01/caffe/matlab/caffe and run the demo that calls Caffe from MATLAB (a sketch of the demo's core calls follows the output below):
matlab -nodisplay
>> run('matcaffe_demo.m')
......
layers {
bottom: "conv4"
top: "conv4"
name: "relu4"
type: RELU
}
layers {
bottom: "conv4"
top: "conv5"
name: "conv5"
type: CONVOLUTION
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
}
}
layers {
bottom: "conv5"
top: "conv5"
name: "relu5"
type: RELU
}
layers {
bottom: "conv5"
top: "pool5"
name: "pool5"
type: POOLING
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layers {
bottom: "pool5"
top: "fc6"
name: "fc6"
type: INNER_PRODUCT
inner_product_param {
num_output: 4096
}
}
layers {
bottom: "fc6"
top: "fc6"
name: "relu6"
type: RELU
}
layers {
bottom: "fc6"
top: "fc6"
name: "drop6"
type: DROPOUT
dropout_param {
dropout_ratio: 0.5
}
}
layers {
bottom: "fc6"
top: "fc7"
name: "fc7"
type: INNER_PRODUCT
inner_product_param {
num_output: 4096
}
}
layers {
bottom: "fc7"
top: "fc7"
name: "relu7"
type: RELU
}
layers {
bottom: "fc7"
top: "fc7"
name: "drop7"
type: DROPOUT
dropout_param {
dropout_ratio: 0.5
}
}
layers {
bottom: "fc7"
top: "fc8"
name: "fc8"
type: INNER_PRODUCT
inner_product_param {
num_output: 1000
}
}
layers {
bottom: "fc8"
top: "prob"
name: "prob"
type: SOFTMAX
}
input: "data"
input_dim: 10
input_dim: 3
input_dim: 227
input_dim: 227
I0912 18:22:26.956653 11968 net.cpp:292] Input 0 -> data
I0912 18:22:26.956778 11968 net.cpp:66] Creating Layer conv1
I0912 18:22:26.956809 11968 net.cpp:329] conv1 <- data
I0912 18:22:26.956889 11968 net.cpp:290] conv1 -> conv1
I0912 18:22:26.957068 11968 net.cpp:83] Top shape: 10 96 55 55 (2904000)
I0912 18:22:26.957139 11968 net.cpp:125] conv1 needs backward computation.
I0912 18:22:26.957207 11968 net.cpp:66] Creating Layer relu1
I0912 18:22:26.957243 11968 net.cpp:329] relu1 <- conv1
I0912 18:22:26.957279 11968 net.cpp:280] relu1 -> conv1 (in-place)
I0912 18:22:26.957347 11968 net.cpp:83] Top shape: 10 96 55 55 (2904000)
I0912 18:22:26.957382 11968 net.cpp:125] relu1 needs backward computation.
I0912 18:22:26.957422 11968 net.cpp:66] Creating Layer pool1
I0912 18:22:26.957458 11968 net.cpp:329] pool1 <- conv1
I0912 18:22:26.957496 11968 net.cpp:290] pool1 -> pool1
I0912 18:22:26.957548 11968 net.cpp:83] Top shape: 10 96 27 27 (699840)
I0912 18:22:26.957583 11968 net.cpp:125] pool1 needs backward computation.
I0912 18:22:26.957619 11968 net.cpp:66] Creating Layer norm1
I0912 18:22:26.957681 11968 net.cpp:329] norm1 <- pool1
I0912 18:22:26.957728 11968 net.cpp:290] norm1 -> norm1
I0912 18:22:26.957774 11968 net.cpp:83] Top shape: 10 96 27 27 (699840)
I0912 18:22:26.957809 11968 net.cpp:125] norm1 needs backward computation.
I0912 18:22:26.958052 11968 net.cpp:66] Creating Layer conv2
I0912 18:22:26.958092 11968 net.cpp:329] conv2 <- norm1
I0912 18:22:26.960306 11968 net.cpp:290] conv2 -> conv2
I0912 18:22:26.961231 11968 net.cpp:83] Top shape: 10 256 27 27 (1866240)
I0912 18:22:26.961369 11968 net.cpp:125] conv2 needs backward computation.
I0912 18:22:26.961398 11968 net.cpp:66] Creating Layer relu2
I0912 18:22:26.961436 11968 net.cpp:329] relu2 <- conv2
I0912 18:22:26.961468 11968 net.cpp:280] relu2 -> conv2 (in-place)
I0912 18:22:26.961496 11968 net.cpp:83] Top shape: 10 256 27 27 (1866240)
I0912 18:22:26.961516 11968 net.cpp:125] relu2 needs backward computation.
I0912 18:22:26.961539 11968 net.cpp:66] Creating Layer pool2
I0912 18:22:26.961593 11968 net.cpp:329] pool2 <- conv2
I0912 18:22:26.961629 11968 net.cpp:290] pool2 -> pool2
I0912 18:22:26.961676 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.961710 11968 net.cpp:125] pool2 needs backward computation.
I0912 18:22:26.961805 11968 net.cpp:66] Creating Layer norm2
I0912 18:22:26.961841 11968 net.cpp:329] norm2 <- pool2
I0912 18:22:26.961875 11968 net.cpp:290] norm2 -> norm2
I0912 18:22:26.961913 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.961969 11968 net.cpp:125] norm2 needs backward computation.
I0912 18:22:26.962023 11968 net.cpp:66] Creating Layer conv3
I0912 18:22:26.962059 11968 net.cpp:329] conv3 <- norm2
I0912 18:22:26.962096 11968 net.cpp:290] conv3 -> conv3
I0912 18:22:26.965011 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.965140 11968 net.cpp:125] conv3 needs backward computation.
I0912 18:22:26.965181 11968 net.cpp:66] Creating Layer relu3
I0912 18:22:26.965258 11968 net.cpp:329] relu3 <- conv3
I0912 18:22:26.965299 11968 net.cpp:280] relu3 -> conv3 (in-place)
I0912 18:22:26.965338 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.965479 11968 net.cpp:125] relu3 needs backward computation.
I0912 18:22:26.965520 11968 net.cpp:66] Creating Layer conv4
I0912 18:22:26.965555 11968 net.cpp:329] conv4 <- conv3
I0912 18:22:26.965634 11968 net.cpp:290] conv4 -> conv4
I0912 18:22:26.968613 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.968745 11968 net.cpp:125] conv4 needs backward computation.
I0912 18:22:26.968781 11968 net.cpp:66] Creating Layer relu4
I0912 18:22:26.968819 11968 net.cpp:329] relu4 <- conv4
I0912 18:22:26.968873 11968 net.cpp:280] relu4 -> conv4 (in-place)
I0912 18:22:26.968919 11968 net.cpp:83] Top shape: 10 384 13 13 (648960)
I0912 18:22:26.968992 11968 net.cpp:125] relu4 needs backward computation.
I0912 18:22:26.969028 11968 net.cpp:66] Creating Layer conv5
I0912 18:22:26.969066 11968 net.cpp:329] conv5 <- conv4
I0912 18:22:26.969108 11968 net.cpp:290] conv5 -> conv5
I0912 18:22:26.970634 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.970749 11968 net.cpp:125] conv5 needs backward computation.
I0912 18:22:26.970780 11968 net.cpp:66] Creating Layer relu5
I0912 18:22:26.970803 11968 net.cpp:329] relu5 <- conv5
I0912 18:22:26.970827 11968 net.cpp:280] relu5 -> conv5 (in-place)
I0912 18:22:26.970918 11968 net.cpp:83] Top shape: 10 256 13 13 (432640)
I0912 18:22:26.970952 11968 net.cpp:125] relu5 needs backward computation.
I0912 18:22:26.970988 11968 net.cpp:66] Creating Layer pool5
I0912 18:22:26.971233 11968 net.cpp:329] pool5 <- conv5
I0912 18:22:26.971282 11968 net.cpp:290] pool5 -> pool5
I0912 18:22:26.971361 11968 net.cpp:83] Top shape: 10 256 6 6 (92160)
I0912 18:22:26.971397 11968 net.cpp:125] pool5 needs backward computation.
I0912 18:22:26.971434 11968 net.cpp:66] Creating Layer fc6
I0912 18:22:26.971470 11968 net.cpp:329] fc6 <- pool5
I0912 18:22:26.971559 11968 net.cpp:290] fc6 -> fc6
I0912 18:22:27.069502 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069640 11968 net.cpp:125] fc6 needs backward computation.
I0912 18:22:27.069672 11968 net.cpp:66] Creating Layer relu6
I0912 18:22:27.069694 11968 net.cpp:329] relu6 <- fc6
I0912 18:22:27.069718 11968 net.cpp:280] relu6 -> fc6 (in-place)
I0912 18:22:27.069743 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069763 11968 net.cpp:125] relu6 needs backward computation.
I0912 18:22:27.069792 11968 net.cpp:66] Creating Layer drop6
I0912 18:22:27.069824 11968 net.cpp:329] drop6 <- fc6
I0912 18:22:27.069875 11968 net.cpp:280] drop6 -> fc6 (in-place)
I0912 18:22:27.069954 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.069990 11968 net.cpp:125] drop6 needs backward computation.
I0912 18:22:27.070144 11968 net.cpp:66] Creating Layer fc7
I0912 18:22:27.070173 11968 net.cpp:329] fc7 <- fc6
I0912 18:22:27.070199 11968 net.cpp:290] fc7 -> fc7
I0912 18:22:27.111870 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.111963 11968 net.cpp:125] fc7 needs backward computation.
I0912 18:22:27.111991 11968 net.cpp:66] Creating Layer relu7
I0912 18:22:27.112015 11968 net.cpp:329] relu7 <- fc7
I0912 18:22:27.112040 11968 net.cpp:280] relu7 -> fc7 (in-place)
I0912 18:22:27.112068 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.112139 11968 net.cpp:125] relu7 needs backward computation.
I0912 18:22:27.112164 11968 net.cpp:66] Creating Layer drop7
I0912 18:22:27.112184 11968 net.cpp:329] drop7 <- fc7
I0912 18:22:27.112213 11968 net.cpp:280] drop7 -> fc7 (in-place)
I0912 18:22:27.112242 11968 net.cpp:83] Top shape: 10 4096 1 1 (40960)
I0912 18:22:27.112263 11968 net.cpp:125] drop7 needs backward computation.
I0912 18:22:27.112285 11968 net.cpp:66] Creating Layer fc8
I0912 18:22:27.112305 11968 net.cpp:329] fc8 <- fc7
I0912 18:22:27.112334 11968 net.cpp:290] fc8 -> fc8
I0912 18:22:27.122274 11968 net.cpp:83] Top shape: 10 1000 1 1 (10000)
I0912 18:22:27.122380 11968 net.cpp:125] fc8 needs backward computation.
I0912 18:22:27.122421 11968 net.cpp:66] Creating Layer prob
I0912 18:22:27.122503 11968 net.cpp:329] prob <- fc8
I0912 18:22:27.122547 11968 net.cpp:290] prob -> prob
I0912 18:22:27.122660 11968 net.cpp:83] Top shape: 10 1000 1 1 (10000)
I0912 18:22:27.122688 11968 net.cpp:125] prob needs backward computation.
I0912 18:22:27.122706 11968 net.cpp:156] This network produces output prob
I0912 18:22:27.122745 11968 net.cpp:402] Collecting Learning Rate and Weight Decay.
I0912 18:22:27.122769 11968 net.cpp:167] Network initialization done.
I0912 18:22:27.122788 11968 net.cpp:168] Memory required for data: 6183480
Done with init
Using CPU Mode
Done with set_mode
Done with set_phase_test
Elapsed time is 0.579487 seconds.
Elapsed time is 3.748376 seconds.
ans =
1 1 1000 10
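The output above maps onto a handful of calls into the matcaffe MEX interface. The following is a minimal sketch of what matcaffe_demo.m does, assuming the 2014-era single-entry-point caffe() API and the demo's own prepare_image subfunction; the file paths shown are the demo's defaults and may differ in other checkouts. The final size call is what prints ans = 1 1 1000 10: width x height x channels x batch, i.e. one 1000-way class score vector for each of the 10 crops.
% Minimal sketch of matcaffe_demo.m (old matcaffe API; run from
% /u01/caffe/matlab/caffe). Paths are the demo's defaults.
model_def_file = '../../examples/imagenet/imagenet_deploy.prototxt';
model_file     = '../../examples/imagenet/caffe_reference_imagenet_model';

caffe('init', model_def_file, model_file);  % prints "Done with init"
caffe('set_mode_cpu');                      % "Using CPU Mode", "Done with set_mode"
caffe('set_phase_test');                    % "Done with set_phase_test"

im = imread('../../examples/images/cat.jpg');  % any test image
input_data = {prepare_image(im)};  % demo subfunction: resize, subtract the
                                   % ImageNet mean, take 10 crops of 227x227,
                                   % yielding a 227x227x3x10 single array

tic; scores = caffe('forward', input_data); toc  % the "Elapsed time" lines
size(scores{1})                    % ans = 1 1 1000 10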