Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.1
Test net output #1: loss = 2.30292
This shows that adding a Sigmoid layer is ineffective in this model: an accuracy of 0.1 is chance level for the 10-class problem, so the network failed to learn.
Model 7 (switch the second activation to a ReLU layer)
Layer     Details
conv1     output: 32, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
sigmoid1  Sigmoid
norm1     LRN
conv2     output: 16, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 200
ip2       num_output: 100
ip3       num_output: 10
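For reference, the swapped-in activation corresponds to a Caffe prototxt layer roughly like the one below; the layer name follows the table, while the bottom/top blob names (applied in place on pool2) are assumptions:

layer {
  name: "relu2"
  type: "ReLU"
  bottom: "pool2"   # assumed: the activation follows pool2 in this net
  top: "pool2"      # in-place, so no extra blob is allocated
}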
Experimental results:
Iteration 60000, loss = 0.620338
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.6391
Test net output #1: loss = 1.05354
Model 8 (switch all activations to ReLU layers)
Layer     Details
conv1     output: 32, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
relu1     ReLU
norm1     LRN
conv2     output: 16, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 200
ip2       num_output: 100
ip3       num_output: 10
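Model 8 additionally replaces the remaining Sigmoid after pool1 with a ReLU. A minimal prototxt sketch, with blob names assumed to follow the same in-place convention:

layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"   # assumed blob name
  top: "pool1"      # in-place activation
}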
Experimental results:
Iteration 60000, loss = 0.416507
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.6794
Test net output #1: loss = 1.15119
Adding ReLU layers after both convolutional layers improves recognition considerably, reaching an accuracy of 67.94%.
Model 9 (add a Dropout layer)
Layer     Details
conv1     output: 32, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
relu1     ReLU
norm1     LRN
conv2     output: 16, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 200
dropout   Dropout, dropout_ratio: 0.5
ip2       num_output: 100
ip3       num_output: 10
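The new layer sits between ip1 and ip2. In Caffe prototxt it would look roughly like this (blob names are assumptions; the ratio matches the 0.5 in the table):

layer {
  name: "dropout"
  type: "Dropout"
  bottom: "ip1"     # assumed: dropout is applied to ip1's output
  top: "ip1"        # in-place
  dropout_param {
    dropout_ratio: 0.5   # drop half of the activations during training
  }
}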
Experimental results:
Iteration 60000, loss = 0.563472
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.6728
Test net output #1: loss = 1.03333
The results show that the Dropout layer did not raise the accuracy (67.28% vs. 67.94%), but it did reduce overfitting: the test loss fell from 1.15119 to 1.03333. The next step is therefore to increase the number of outputs of the FC layers.
Model 10 (increase the number of outputs of the FC layers)
Layer     Details
conv1     output: 32, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
relu1     ReLU
norm1     LRN
conv2     output: 16, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 400
dropout   Dropout, dropout_ratio: 0.5
ip2       num_output: 150
ip3       num_output: 10
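Only num_output changes for the FC layers; a sketch of the enlarged ip1 (the bottom blob name is an assumption, and the weight/bias fillers are omitted):

layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool2"   # assumed input blob
  top: "ip1"
  inner_product_param {
    num_output: 400   # raised from 200; ip2 is raised from 100 to 150
  }
}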
Experimental results:
Iteration 60000, loss = 0.446714
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.6903
Test net output #1: loss = 0.990431
Model 11 (add another Dropout layer)
Layer     Details
conv1     output: 32, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
relu1     ReLU
norm1     LRN
conv2     output: 16, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 400
dropout1  Dropout, dropout_ratio: 0.5
ip2       num_output: 200
dropout2  Dropout, dropout_ratio: 0.5
ip3       num_output: 10
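The second Dropout is placed after ip2, mirroring the first one; a sketch with assumed blob names:

layer {
  name: "dropout2"
  type: "Dropout"
  bottom: "ip2"     # assumed: applied to ip2's output
  top: "ip2"        # in-place
  dropout_param {
    dropout_ratio: 0.5
  }
}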
Experimental results:
Iteration 60000, loss = 0.586936
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.7013
Test net output #1: loss = 0.92605
Model 12 (adjust the outputs of the convolutional layers)
Layer     Details
conv1     output: 48, kernel: 5, stride: 1, pad: 2
pool1     pool: MAX, kernel: 3, stride: 2
relu1     ReLU
norm1     LRN
conv2     output: 32, kernel: 5, stride: 1, pad: 2
pool2     pool: MAX, kernel: 3, stride: 2
relu2     ReLU
ip1       num_output: 400
dropout1  Dropout, dropout_ratio: 0.5
ip2       num_output: 200
dropout2  Dropout, dropout_ratio: 0.5
ip3       num_output: 10
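Only the convolution widths change here; for example, conv1 grows from 32 to 48 output channels. A sketch (the bottom blob name is an assumption, fillers omitted):

layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"    # assumed input blob
  top: "conv1"
  convolution_param {
    num_output: 48   # raised from 32; conv2 is raised from 16 to 32 in the same way
    kernel_size: 5
    stride: 1
    pad: 2
  }
}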
Experimental results:
Iteration 60000, loss = 0.273988
Iteration 60000, Testing net (#0)
Test net output #0: accuracy = 0.7088
Test net output #1: loss = 1.1117