Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Record 2)
Continued from the previous post:
Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Record 1)
https://blog.csdn.net/dangqing1988/article/details/105590515
This post continues testing the deep residual network with the adaptively parametric ReLU activation function. The number of residual blocks is increased to 27 (9 + 1 + 8 + 1 + 8 across the three stages in the code below), with everything else unchanged; the convolution kernel counts are still 8, 16, and 32 per stage. As before, the network is evaluated on the CIFAR-10 dataset.
The adaptively parametric ReLU (APReLU) activation function is an improved variant of the Parametric ReLU; its basic principle is shown in the figure below.
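In case the figure does not display, the idea can be stated directly: for an input feature map x, APReLU outputs max(x, 0) + alpha * min(x, 0), where the per-channel coefficients alpha are not fixed learnable constants (as in PReLU) but are predicted from the input itself by a small fully connected sub-network fed with the global average pools of the positive and negative parts. A minimal NumPy sketch of just the forward computation (the helper name aprelu_forward and the precomputed alpha are illustrative, not taken from the original code):

import numpy as np

def aprelu_forward(x, alpha):
    """Forward pass of APReLU for a batch of feature maps.

    x     : array of shape (batch, height, width, channels)
    alpha : per-sample, per-channel coefficients of shape (batch, 1, 1, channels);
            in the real network these come from a small Dense sub-network
            ending in a sigmoid, so each coefficient lies in (0, 1).
    """
    pos = np.maximum(x, 0.0)   # positive part, as in plain ReLU
    neg = np.minimum(x, 0.0)   # negative part
    return pos + alpha * neg   # negative slope scaled by alpha

# toy usage: one 2x2 feature map with 1 channel and alpha = 0.3
x = np.array([[[[1.0], [-2.0]], [[-0.5], [3.0]]]])
alpha = np.full((1, 1, 1, 1), 0.3)
print(aprelu_forward(x, alpha))   # negatives become -0.6 and -0.15

With alpha = 0 this reduces to ReLU, and with a fixed alpha it reduces to PReLU; the Keras code below builds the sub-network that makes alpha input-dependent.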
The full Keras code is as follows:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.10.0 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis,
IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler

K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Scale to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiply by 0.1 every 400 epochs
def scheduler(epoch):
    if epoch % 400 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//4, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1, 1, channels))(scales)
    # apply a parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual block
def residual_block(incoming, nb_blocks, out_channels, downsample=False, downsample_strides=2):
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    for i in range(nb_blocks):
        identity = residual
        if not downsample:
            downsample_strides = 1
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides),
                          padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1, 1), strides=(2, 2))(identity)
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        residual = keras.layers.add([residual, identity])
    return residual

# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(8, 3, padding='same', kernel_initializer='he_normal',
             kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 8, downsample=False)
net = residual_block(net, 1, 16, downsample=True)
net = residual_block(net, 8, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = BatchNormalization()(net)
net = aprelu(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal',
                kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range 0 to 30 degrees
    rotation_range=30,
    # randomly flip images horizontally
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow()
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000,
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score1 = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score1[0])
print('Train accuracy:', DRSN_train_score1[1])
DRSN_test_score1 = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score1[0])
print('Test accuracy:', DRSN_test_score1[1])
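One detail worth calling out before the log: the scheduler multiplies the learning rate by 0.1 whenever the (0-based) epoch index passed by Keras's LearningRateScheduler is a nonzero multiple of 400, which is why the drop appears on the line labeled "Epoch 401/1000" below. A standalone simulation of the schedule (a plain lr variable stands in for K.get_value(model.optimizer.lr); no model is needed):

# Simulate the step schedule used above over 1000 epochs.
lr = 0.1
for epoch in range(1000):          # Keras passes a 0-based epoch index
    if epoch % 400 == 0 and epoch != 0:
        lr = lr * 0.1
        print("displayed epoch {}: lr changed to {}".format(epoch + 1, lr))
# -> lr drops to ~0.01 at displayed epoch 401 and to ~0.001 at 801

So over the planned 1000 epochs the rate is 0.1 for epochs 1-400, 0.01 for 401-800, and 0.001 afterwards.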
Part of the experimental results is shown below (the results of the first 271 epochs had already scrolled out of the Spyder console window):
Epoch 272/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6071 - acc: 0.8711 - val_loss: 0.6295 - val_acc: 0.8667
Epoch 273/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6078 - acc: 0.8705 - val_loss: 0.6373 - val_acc: 0.8678
Epoch 274/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.6043 - acc: 0.8714 - val_loss: 0.6245 - val_acc: 0.8686
Epoch 275/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6056 - acc: 0.8720 - val_loss: 0.6228 - val_acc: 0.8713
Epoch 276/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6059 - acc: 0.8730 - val_loss: 0.6104 - val_acc: 0.8730
Epoch 277/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5980 - acc: 0.8756 - val_loss: 0.6265 - val_acc: 0.8671
Epoch 278/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6093 - acc: 0.8716 - val_loss: 0.6363 - val_acc: 0.8617
Epoch 279/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6051 - acc: 0.8716 - val_loss: 0.6355 - val_acc: 0.8650
Epoch 280/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6062 - acc: 0.8725 - val_loss: 0.6227 - val_acc: 0.8669
Epoch 281/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6025 - acc: 0.8731 - val_loss: 0.6156 - val_acc: 0.8723
Epoch 282/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6031 - acc: 0.8725 - val_loss: 0.6450 - val_acc: 0.8630
Epoch 283/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6030 - acc: 0.8745 - val_loss: 0.6282 - val_acc: 0.8688
Epoch 284/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6049 - acc: 0.8717 - val_loss: 0.6213 - val_acc: 0.8693
Epoch 285/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6005 - acc: 0.8709 - val_loss: 0.6208 - val_acc: 0.8682
Epoch 286/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6049 - acc: 0.8718 - val_loss: 0.6420 - val_acc: 0.8647
Epoch 287/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6040 - acc: 0.8728 - val_loss: 0.6188 - val_acc: 0.8694
Epoch 288/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6011 - acc: 0.8741 - val_loss: 0.6548 - val_acc: 0.8577
Epoch 289/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6060 - acc: 0.8731 - val_loss: 0.6163 - val_acc: 0.8717
Epoch 290/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6047 - acc: 0.8717 - val_loss: 0.6172 - val_acc: 0.8733
Epoch 291/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6029 - acc: 0.8728 - val_loss: 0.6319 - val_acc: 0.8639
Epoch 292/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6011 - acc: 0.8742 - val_loss: 0.6237 - val_acc: 0.8664
Epoch 293/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5998 - acc: 0.8741 - val_loss: 0.6410 - val_acc: 0.8646
Epoch 294/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6001 - acc: 0.8736 - val_loss: 0.6435 - val_acc: 0.8644
Epoch 295/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.6022 - acc: 0.8730 - val_loss: 0.6233 - val_acc: 0.8657
Epoch 296/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.6015 - acc: 0.8746 - val_loss: 0.6224 - val_acc: 0.8665
Epoch 297/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5995 - acc: 0.8750 - val_loss: 0.6471 - val_acc: 0.8613
Epoch 298/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5992 - acc: 0.8735 - val_loss: 0.6436 - val_acc: 0.8635
Epoch 299/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.6040 - acc: 0.8716 - val_loss: 0.6273 - val_acc: 0.8674
Epoch 300/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6008 - acc: 0.8736 - val_loss: 0.6543 - val_acc: 0.8603
Epoch 301/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6023 - acc: 0.8732 - val_loss: 0.6420 - val_acc: 0.8633
Epoch 302/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5992 - acc: 0.8747 - val_loss: 0.6125 - val_acc: 0.8712
Epoch 303/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6016 - acc: 0.8743 - val_loss: 0.6402 - val_acc: 0.8660
Epoch 304/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5998 - acc: 0.8742 - val_loss: 0.6256 - val_acc: 0.8663
Epoch 305/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5998 - acc: 0.8736 - val_loss: 0.6193 - val_acc: 0.8713
Epoch 306/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5977 - acc: 0.8760 - val_loss: 0.6219 - val_acc: 0.8686
Epoch 307/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6000 - acc: 0.8743 - val_loss: 0.6643 - val_acc: 0.8539
Epoch 308/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6022 - acc: 0.8740 - val_loss: 0.6308 - val_acc: 0.8671
Epoch 309/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6083 - acc: 0.8737 - val_loss: 0.6168 - val_acc: 0.8730
Epoch 310/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.6008 - acc: 0.8727 - val_loss: 0.6165 - val_acc: 0.8751
Epoch 311/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6046 - acc: 0.8731 - val_loss: 0.6369 - val_acc: 0.8639
Epoch 312/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5976 - acc: 0.8753 - val_loss: 0.6246 - val_acc: 0.8695
Epoch 313/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6037 - acc: 0.8738 - val_loss: 0.6266 - val_acc: 0.8691
Epoch 314/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6007 - acc: 0.8732 - val_loss: 0.6520 - val_acc: 0.8631
Epoch 315/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5993 - acc: 0.8751 - val_loss: 0.6436 - val_acc: 0.8632
Epoch 316/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5996 - acc: 0.8750 - val_loss: 0.6413 - val_acc: 0.8589
Epoch 317/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5998 - acc: 0.8740 - val_loss: 0.6406 - val_acc: 0.8621
Epoch 318/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5992 - acc: 0.8753 - val_loss: 0.6364 - val_acc: 0.8614
Epoch 319/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5983 - acc: 0.8748 - val_loss: 0.6275 - val_acc: 0.8650
Epoch 320/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5987 - acc: 0.8766 - val_loss: 0.6207 - val_acc: 0.8724
Epoch 321/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5979 - acc: 0.8756 - val_loss: 0.6266 - val_acc: 0.8711
Epoch 322/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5981 - acc: 0.8748 - val_loss: 0.6461 - val_acc: 0.8627
Epoch 323/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5966 - acc: 0.8757 - val_loss: 0.6235 - val_acc: 0.8696
Epoch 324/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5940 - acc: 0.8758 - val_loss: 0.6141 - val_acc: 0.8750
Epoch 325/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6007 - acc: 0.8757 - val_loss: 0.6513 - val_acc: 0.8610
Epoch 326/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5988 - acc: 0.8760 - val_loss: 0.6219 - val_acc: 0.8724
Epoch 327/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6003 - acc: 0.8744 - val_loss: 0.6115 - val_acc: 0.8693
Epoch 328/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5942 - acc: 0.8762 - val_loss: 0.6358 - val_acc: 0.8660
Epoch 329/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5923 - acc: 0.8769 - val_loss: 0.6340 - val_acc: 0.8672
Epoch 330/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5954 - acc: 0.8781 - val_loss: 0.6246 - val_acc: 0.8688
Epoch 331/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6015 - acc: 0.8747 - val_loss: 0.6194 - val_acc: 0.8710
Epoch 332/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5980 - acc: 0.8764 - val_loss: 0.6311 - val_acc: 0.8685
Epoch 333/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.6019 - acc: 0.8748 - val_loss: 0.6095 - val_acc: 0.8733
Epoch 334/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5964 - acc: 0.8760 - val_loss: 0.6515 - val_acc: 0.8623
Epoch 335/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5973 - acc: 0.8765 - val_loss: 0.6300 - val_acc: 0.8702
Epoch 336/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5953 - acc: 0.8776 - val_loss: 0.6297 - val_acc: 0.8656
Epoch 337/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6005 - acc: 0.8752 - val_loss: 0.6252 - val_acc: 0.8711
Epoch 338/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5949 - acc: 0.8778 - val_loss: 0.6175 - val_acc: 0.8693
Epoch 339/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5996 - acc: 0.8749 - val_loss: 0.6215 - val_acc: 0.8688
Epoch 340/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5921 - acc: 0.8777 - val_loss: 0.6239 - val_acc: 0.8713
Epoch 341/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5910 - acc: 0.8776 - val_loss: 0.6327 - val_acc: 0.8684
Epoch 342/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5952 - acc: 0.8778 - val_loss: 0.6083 - val_acc: 0.8767
Epoch 343/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5965 - acc: 0.8763 - val_loss: 0.6312 - val_acc: 0.8696
Epoch 344/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5965 - acc: 0.8771 - val_loss: 0.6204 - val_acc: 0.8707
Epoch 345/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5932 - acc: 0.8764 - val_loss: 0.6211 - val_acc: 0.8709
Epoch 346/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5900 - acc: 0.8785 - val_loss: 0.6422 - val_acc: 0.8663
Epoch 347/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5919 - acc: 0.8775 - val_loss: 0.6437 - val_acc: 0.8646
Epoch 348/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.6001 - acc: 0.8753 - val_loss: 0.6184 - val_acc: 0.8709
Epoch 349/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5952 - acc: 0.8778 - val_loss: 0.6410 - val_acc: 0.8626
Epoch 350/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5946 - acc: 0.8768 - val_loss: 0.6321 - val_acc: 0.8660
Epoch 351/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5931 - acc: 0.8770 - val_loss: 0.6444 - val_acc: 0.8655
Epoch 352/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5969 - acc: 0.8757 - val_loss: 0.6205 - val_acc: 0.8710
Epoch 353/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5978 - acc: 0.8754 - val_loss: 0.6287 - val_acc: 0.8672
Epoch 354/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5925 - acc: 0.8778 - val_loss: 0.6314 - val_acc: 0.8664
Epoch 355/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5942 - acc: 0.8765 - val_loss: 0.6392 - val_acc: 0.8658
Epoch 356/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5961 - acc: 0.8786 - val_loss: 0.6316 - val_acc: 0.8675
Epoch 357/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5945 - acc: 0.8766 - val_loss: 0.6536 - val_acc: 0.8619
Epoch 358/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5957 - acc: 0.8769 - val_loss: 0.6112 - val_acc: 0.8748
Epoch 359/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5992 - acc: 0.8750 - val_loss: 0.6291 - val_acc: 0.8677
Epoch 360/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5935 - acc: 0.8778 - val_loss: 0.6283 - val_acc: 0.8691
Epoch 361/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.5886 - acc: 0.8795 - val_loss: 0.6396 - val_acc: 0.8654
Epoch 362/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5900 - acc: 0.8774 - val_loss: 0.6273 - val_acc: 0.8699
Epoch 363/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5952 - acc: 0.8769 - val_loss: 0.6017 - val_acc: 0.8798
Epoch 364/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5928 - acc: 0.8771 - val_loss: 0.6156 - val_acc: 0.8729
Epoch 365/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5997 - acc: 0.8761 - val_loss: 0.6384 - val_acc: 0.8662
Epoch 366/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5946 - acc: 0.8771 - val_loss: 0.6245 - val_acc: 0.8714
Epoch 367/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5958 - acc: 0.8769 - val_loss: 0.6280 - val_acc: 0.8660
Epoch 368/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5917 - acc: 0.8786 - val_loss: 0.6152 - val_acc: 0.8727
Epoch 369/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5895 - acc: 0.8784 - val_loss: 0.6376 - val_acc: 0.8654
Epoch 370/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5948 - acc: 0.8779 - val_loss: 0.6222 - val_acc: 0.8692
Epoch 371/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5895 - acc: 0.8788 - val_loss: 0.6430 - val_acc: 0.8652
Epoch 372/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5891 - acc: 0.8801 - val_loss: 0.6184 - val_acc: 0.8750
Epoch 373/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5912 - acc: 0.8784 - val_loss: 0.6222 - val_acc: 0.8687
Epoch 374/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5899 - acc: 0.8784 - val_loss: 0.6184 - val_acc: 0.8711
Epoch 375/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5921 - acc: 0.8778 - val_loss: 0.6091 - val_acc: 0.8736
Epoch 376/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5927 - acc: 0.8778 - val_loss: 0.6492 - val_acc: 0.8604
Epoch 377/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5969 - acc: 0.8762 - val_loss: 0.6185 - val_acc: 0.8708
Epoch 378/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5901 - acc: 0.8778 - val_loss: 0.6314 - val_acc: 0.8681
Epoch 379/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5936 - acc: 0.8767 - val_loss: 0.6159 - val_acc: 0.8733
Epoch 380/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5941 - acc: 0.8771 - val_loss: 0.6361 - val_acc: 0.8674
Epoch 381/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5910 - acc: 0.8778 - val_loss: 0.6542 - val_acc: 0.8600
Epoch 382/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5915 - acc: 0.8785 - val_loss: 0.6324 - val_acc: 0.8675
Epoch 383/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5905 - acc: 0.8770 - val_loss: 0.6428 - val_acc: 0.8629
Epoch 384/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5887 - acc: 0.8786 - val_loss: 0.6285 - val_acc: 0.8663
Epoch 385/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5908 - acc: 0.8779 - val_loss: 0.6417 - val_acc: 0.8616
Epoch 386/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5887 - acc: 0.8790 - val_loss: 0.6283 - val_acc: 0.8680
Epoch 387/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5864 - acc: 0.8783 - val_loss: 0.6315 - val_acc: 0.8660
Epoch 388/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5842 - acc: 0.8793 - val_loss: 0.6250 - val_acc: 0.8676
Epoch 389/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5876 - acc: 0.8796 - val_loss: 0.6333 - val_acc: 0.8685
Epoch 390/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5907 - acc: 0.8784 - val_loss: 0.6327 - val_acc: 0.8655
Epoch 391/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5887 - acc: 0.8790 - val_loss: 0.6402 - val_acc: 0.8676
Epoch 392/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5937 - acc: 0.8767 - val_loss: 0.6210 - val_acc: 0.8708
Epoch 393/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5870 - acc: 0.8801 - val_loss: 0.6186 - val_acc: 0.8750
Epoch 394/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5937 - acc: 0.8774 - val_loss: 0.6369 - val_acc: 0.8652
Epoch 395/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5891 - acc: 0.8805 - val_loss: 0.6279 - val_acc: 0.8700
Epoch 396/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5955 - acc: 0.8776 - val_loss: 0.6179 - val_acc: 0.8702
Epoch 397/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.5877 - acc: 0.8793 - val_loss: 0.6340 - val_acc: 0.8660
Epoch 398/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5899 - acc: 0.8787 - val_loss: 0.5990 - val_acc: 0.8802
Epoch 399/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.5899 - acc: 0.8802 - val_loss: 0.6270 - val_acc: 0.8694
Epoch 400/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.5942 - acc: 0.8774 - val_loss: 0.6336 - val_acc: 0.8639
Epoch 401/1000
lr changed to 0.010000000149011612
500/500 [==============================] - 52s 105ms/step - loss: 0.5041 - acc: 0.9091 - val_loss: 0.5454 - val_acc: 0.8967
Epoch 402/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.4483 - acc: 0.9265 - val_loss: 0.5314 - val_acc: 0.8978
Epoch 403/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.4280 - acc: 0.9327 - val_loss: 0.5212 - val_acc: 0.9015
Epoch 404/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.4145 - acc: 0.9351 - val_loss: 0.5156 - val_acc: 0.9033
Epoch 405/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.4053 - acc: 0.9367 - val_loss: 0.5152 - val_acc: 0.9042
Epoch 406/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3948 - acc: 0.9398 - val_loss: 0.5083 - val_acc: 0.9021
Epoch 407/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3880 - acc: 0.9389 - val_loss: 0.5085 - val_acc: 0.9031
Epoch 408/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3771 - acc: 0.9433 - val_loss: 0.5094 - val_acc: 0.8987
Epoch 409/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3694 - acc: 0.9441 - val_loss: 0.5006 - val_acc: 0.9039
Epoch 410/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3669 - acc: 0.9432 - val_loss: 0.4927 - val_acc: 0.9054
Epoch 411/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3564 - acc: 0.9466 - val_loss: 0.4973 - val_acc: 0.9034
Epoch 412/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.3508 - acc: 0.9476 - val_loss: 0.4929 - val_acc: 0.9032
Epoch 413/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.3464 - acc: 0.9468 - val_loss: 0.4919 - val_acc: 0.9024
Epoch 414/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3394 - acc: 0.9487 - val_loss: 0.4842 - val_acc: 0.9032
Epoch 415/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3329 - acc: 0.9498 - val_loss: 0.4827 - val_acc: 0.9059
Epoch 416/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3317 - acc: 0.9494 - val_loss: 0.4873 - val_acc: 0.9024
Epoch 417/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3281 - acc: 0.9485 - val_loss: 0.4812 - val_acc: 0.9074
Epoch 418/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3205 - acc: 0.9514 - val_loss: 0.4796 - val_acc: 0.9038
Epoch 419/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3207 - acc: 0.9497 - val_loss: 0.4775 - val_acc: 0.9039
Epoch 420/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.3140 - acc: 0.9518 - val_loss: 0.4753 - val_acc: 0.9052
Epoch 421/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.3094 - acc: 0.9520 - val_loss: 0.4840 - val_acc: 0.9020
Epoch 422/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.3091 - acc: 0.9513 - val_loss: 0.4684 - val_acc: 0.9064
Epoch 423/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.3055 - acc: 0.9515 - val_loss: 0.4629 - val_acc: 0.9065
Epoch 424/1000
500/500 [==============================] - 52s 104ms/step - loss: 0.2973 - acc: 0.9526 - val_loss: 0.4696 - val_acc: 0.9044
Epoch 425/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2965 - acc: 0.9529 - val_loss: 0.4659 - val_acc: 0.9030
Epoch 426/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2955 - acc: 0.9531 - val_loss: 0.4584 - val_acc: 0.9067
Epoch 427/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2948 - acc: 0.9519 - val_loss: 0.4514 - val_acc: 0.9071
Epoch 428/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2871 - acc: 0.9542 - val_loss: 0.4584 - val_acc: 0.9081
Epoch 429/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2849 - acc: 0.9541 - val_loss: 0.4684 - val_acc: 0.9037
Epoch 430/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2832 - acc: 0.9537 - val_loss: 0.4588 - val_acc: 0.9049
Epoch 431/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2785 - acc: 0.9553 - val_loss: 0.4595 - val_acc: 0.9063
Epoch 432/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2777 - acc: 0.9546 - val_loss: 0.4516 - val_acc: 0.9059
Epoch 433/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2788 - acc: 0.9528 - val_loss: 0.4521 - val_acc: 0.9031
Epoch 434/1000
500/500 [==============================] - 53s 107ms/step - loss: 0.2743 - acc: 0.9555 - val_loss: 0.4679 - val_acc: 0.9015
Epoch 435/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2739 - acc: 0.9540 - val_loss: 0.4512 - val_acc: 0.9053
Epoch 436/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2701 - acc: 0.9555 - val_loss: 0.4622 - val_acc: 0.9034
Epoch 437/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2697 - acc: 0.9547 - val_loss: 0.4585 - val_acc: 0.9015
Epoch 438/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2663 - acc: 0.9552 - val_loss: 0.4556 - val_acc: 0.9027
Epoch 439/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2641 - acc: 0.9553 - val_loss: 0.4538 - val_acc: 0.9023
Epoch 440/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2649 - acc: 0.9548 - val_loss: 0.4458 - val_acc: 0.9047
Epoch 441/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2601 - acc: 0.9561 - val_loss: 0.4499 - val_acc: 0.9032
Epoch 442/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2610 - acc: 0.9549 - val_loss: 0.4533 - val_acc: 0.9042
Epoch 443/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2608 - acc: 0.9543 - val_loss: 0.4542 - val_acc: 0.9054
Epoch 444/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2606 - acc: 0.9547 - val_loss: 0.4585 - val_acc: 0.9003
Epoch 445/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2567 - acc: 0.9554 - val_loss: 0.4549 - val_acc: 0.8993
Epoch 446/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2551 - acc: 0.9555 - val_loss: 0.4653 - val_acc: 0.8983
Epoch 447/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2554 - acc: 0.9558 - val_loss: 0.4561 - val_acc: 0.9000
Epoch 448/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2565 - acc: 0.9540 - val_loss: 0.4562 - val_acc: 0.9002
Epoch 449/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2528 - acc: 0.9551 - val_loss: 0.4515 - val_acc: 0.8996
Epoch 450/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2545 - acc: 0.9545 - val_loss: 0.4475 - val_acc: 0.9015
Epoch 451/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2554 - acc: 0.9543 - val_loss: 0.4460 - val_acc: 0.9059
Epoch 452/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2506 - acc: 0.9545 - val_loss: 0.4526 - val_acc: 0.8997
Epoch 453/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2517 - acc: 0.9542 - val_loss: 0.4442 - val_acc: 0.8999
Epoch 454/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2517 - acc: 0.9543 - val_loss: 0.4523 - val_acc: 0.9001
Epoch 455/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2458 - acc: 0.9560 - val_loss: 0.4329 - val_acc: 0.9029
Epoch 456/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2495 - acc: 0.9546 - val_loss: 0.4407 - val_acc: 0.9026
Epoch 457/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2451 - acc: 0.9553 - val_loss: 0.4378 - val_acc: 0.9025
Epoch 458/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2472 - acc: 0.9543 - val_loss: 0.4403 - val_acc: 0.9026
Epoch 459/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2461 - acc: 0.9550 - val_loss: 0.4359 - val_acc: 0.9041
Epoch 460/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2475 - acc: 0.9531 - val_loss: 0.4423 - val_acc: 0.9021
Epoch 461/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2450 - acc: 0.9537 - val_loss: 0.4392 - val_acc: 0.9019
Epoch 462/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2452 - acc: 0.9543 - val_loss: 0.4408 - val_acc: 0.8996
Epoch 463/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2441 - acc: 0.9545 - val_loss: 0.4495 - val_acc: 0.8999
Epoch 464/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2439 - acc: 0.9539 - val_loss: 0.4413 - val_acc: 0.9029
Epoch 465/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2406 - acc: 0.9555 - val_loss: 0.4503 - val_acc: 0.8977
Epoch 466/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2445 - acc: 0.9541 - val_loss: 0.4388 - val_acc: 0.9025
Epoch 467/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2402 - acc: 0.9547 - val_loss: 0.4306 - val_acc: 0.9027
Epoch 468/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2402 - acc: 0.9565 - val_loss: 0.4391 - val_acc: 0.9040
Epoch 469/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2419 - acc: 0.9546 - val_loss: 0.4442 - val_acc: 0.8987
Epoch 470/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2409 - acc: 0.9537 - val_loss: 0.4414 - val_acc: 0.9007
Epoch 471/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2446 - acc: 0.9527 - val_loss: 0.4478 - val_acc: 0.8971
Epoch 472/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2374 - acc: 0.9553 - val_loss: 0.4522 - val_acc: 0.8967
Epoch 473/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2418 - acc: 0.9528 - val_loss: 0.4440 - val_acc: 0.8983
Epoch 474/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2394 - acc: 0.9552 - val_loss: 0.4418 - val_acc: 0.9000
Epoch 475/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2403 - acc: 0.9539 - val_loss: 0.4379 - val_acc: 0.9031
Epoch 476/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2379 - acc: 0.9543 - val_loss: 0.4358 - val_acc: 0.8999
Epoch 477/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2409 - acc: 0.9529 - val_loss: 0.4433 - val_acc: 0.9006
Epoch 478/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2389 - acc: 0.9533 - val_loss: 0.4410 - val_acc: 0.9009
Epoch 479/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2385 - acc: 0.9544 - val_loss: 0.4427 - val_acc: 0.9007
Epoch 480/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2399 - acc: 0.9538 - val_loss: 0.4248 - val_acc: 0.9018
Epoch 481/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2367 - acc: 0.9540 - val_loss: 0.4425 - val_acc: 0.9005
Epoch 482/1000
500/500 [==============================] - 53s 106ms/step - loss: 0.2376 - acc: 0.9544 - val_loss: 0.4424 - val_acc: 0.9010
Epoch 483/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2400 - acc: 0.9537 - val_loss: 0.4414 - val_acc: 0.8987
Epoch 484/1000
500/500 [==============================] - 52s 105ms/step - loss: 0.2367 - acc: 0.9539 - val_loss: 0.4423 - val_acc: 0.8994
Epoch 485/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2356 - acc: 0.9547 - val_loss: 0.4297 - val_acc: 0.9013
Epoch 486/1000
500/500 [==============================] - 53s 105ms/step - loss: 0.2378 - acc: 0.9543 - val_loss: 0.4286 - val_acc: 0.9039
Epoch 487/1000
500/500 [==============================] - 53s 107ms/step - loss: 0.2347 - acc: 0.9551 - val_loss: 0.4304 - val_acc: 0.9018
Epoch 488/1000
 87/500 [====>.........................] - ETA: 42s - loss: 0.2317 - acc: 0.9568
Traceback (most recent call last):
  File "C:\Users\hitwh\.spyder-py3\temp.py", line 148, in <module>
    verbose=1, callbacks=[reduce_lr], workers=4)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training.py", line 1415, in fit_generator
    initial_epoch=initial_epoch)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training_generator.py", line 213, in fit_generator
    class_weight=class_weight)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training.py", line 1215, in train_on_batch
    outputs = self.train_function(ins)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\backend\tensorflow_backend.py", line 2666, in __call__
    return self._call(inputs)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\backend\tensorflow_backend.py", line 2636, in _call
    fetched = self._callable_fn(*array_vals)
  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\tensorflow\python\client\session.py", line 1382, in __call__
    run_metadata_ptr)
KeyboardInterrupt
I accidentally pressed Ctrl+C and interrupted the program, so it never finished: it was set to run for 1000 epochs but only reached epoch 488. By that point the validation accuracy had already reached about 90%, up from roughly 87% before the learning rate dropped to 0.01 at epoch 401.
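For the record, an interrupted run like this does not have to be restarted from scratch. A minimal sketch, assuming the Python session survived the KeyboardInterrupt so that model, datagen, and reduce_lr are still alive in memory (if instead the weights had been saved and reloaded into a fresh model, the optimizer's learning rate would also need to be set back to 0.01 by hand):

# Resume the interrupted run; initial_epoch is 0-based, so passing 487
# makes Keras continue with the line "Epoch 488/1000". Since `scheduler`
# reads the current optimizer learning rate each epoch, the resumed run
# keeps lr = 0.01 rather than resetting to 0.1.
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000,
                    verbose=1, callbacks=[reduce_lr], workers=4,
                    initial_epoch=487)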
Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458
https://ieeexplore.ieee.org/document/8998530