Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Record 21): CIFAR-10 ~95.12%
Building on Tuning Record 20, this post increases the number of residual blocks from 27 to 60 and continues to test how a deep residual network (ResNet) with the adaptively parametric ReLU (APReLU) activation function performs on the CIFAR-10 dataset.
The APReLU activation is placed after the second convolutional layer of each residual block, which is similar to the placement used in Squeeze-and-Excitation Networks and in deep residual shrinkage networks. Its basic principle is as follows.
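Stated compactly (this is a restatement of what the aprelu function in the code below computes, not an extra construction), for an input feature map x the APReLU output is

    y = max(x, 0) + α ⊙ min(x, 0)

where the channel-wise coefficients α ∈ (0, 1) are produced by a small sub-network: the positive part max(x, 0) and the negative part min(x, 0) are each global-average-pooled, concatenated, passed through a bottleneck fully connected layer (channels/16 units) with batch normalization and ReLU, then a second fully connected layer back to the channel count, batch normalization, and a sigmoid.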
The Keras code is as follows:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.10.0 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis,
IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler

K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Preprocess the data: scale to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 150 epochs
def scheduler(epoch):
    if epoch % 150 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a small network to obtain the channel-wise scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//16, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal',
                   kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1, 1, channels))(scales)
    # apply a parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual block
def residual_block(incoming, nb_blocks, out_channels, downsample=False, downsample_strides=2):
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    for i in range(nb_blocks):
        identity = residual
        if not downsample:
            downsample_strides = 1
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides),
                          padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1, 1), strides=(2, 2))(identity)
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        residual = keras.layers.add([residual, identity])
    return residual

# Define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 20, 32, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 19, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 19, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# Data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # range for random zoom
    zoom_range=0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range=30,
    # randomly flip images horizontally
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# Fit the model on the batches generated by datagen.flow()
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500,
                    verbose=1, callbacks=[reduce_lr], workers=4)

# Get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
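As a quick sanity check (this snippet is not part of the original script), the nb_blocks values passed to residual_block above do add up to the 60 residual blocks mentioned at the beginning of this post:

# Hypothetical helper, separate from the training script above:
# count the residual blocks and main-path weighted layers implied by the stage layout.
stages = [20, 1, 19, 1, 19]   # nb_blocks passed to residual_block, in order
print(sum(stages))            # 60 residual blocks in total
print(2 * sum(stages) + 2)    # 122 main-path weighted layers: 2 convs per block,
                              # plus the stem conv and the final Dense classifier
                              # (the small Dense layers inside each APReLU are not counted)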
The experimental results are as follows:
Using TensorFlow backend. x_train shape: (50000, 32, 32, 3) 50000 train samples 10000 test samples Epoch 1/500 156s 312ms/step - loss: 3.7450 - acc: 0.4151 - val_loss: 3.1432 - val_acc: 0.5763 Epoch 2/500 113s 226ms/step - loss: 2.9954 - acc: 0.5750 - val_loss: 2.5940 - val_acc: 0.6689 Epoch 3/500 113s 226ms/step - loss: 2.5203 - acc: 0.6476 - val_loss: 2.1871 - val_acc: 0.7254 Epoch 4/500 113s 225ms/step - loss: 2.1855 - acc: 0.6865 - val_loss: 1.9171 - val_acc: 0.7488 Epoch 5/500 113s 225ms/step - loss: 1.9224 - acc: 0.7144 - val_loss: 1.6662 - val_acc: 0.7774 Epoch 6/500 113s 225ms/step - loss: 1.7111 - acc: 0.7331 - val_loss: 1.4882 - val_acc: 0.7915 Epoch 7/500 113s 226ms/step - loss: 1.5472 - acc: 0.7483 - val_loss: 1.3414 - val_acc: 0.7994 Epoch 8/500 113s 226ms/step - loss: 1.4095 - acc: 0.7633 - val_loss: 1.2149 - val_acc: 0.8194 Epoch 9/500 113s 226ms/step - loss: 1.3008 - acc: 0.7739 - val_loss: 1.1264 - val_acc: 0.8234 Epoch 10/500 113s 226ms/step - loss: 1.2077 - acc: 0.7824 - val_loss: 1.0474 - val_acc: 0.8322 Epoch 11/500 113s 225ms/step - loss: 1.1382 - acc: 0.7885 - val_loss: 0.9929 - val_acc: 0.8343 Epoch 12/500 113s 225ms/step - loss: 1.0722 - acc: 0.7955 - val_loss: 0.9418 - val_acc: 0.8400 Epoch 13/500 113s 225ms/step - loss: 1.0242 - acc: 0.8032 - val_loss: 0.9018 - val_acc: 0.8421 Epoch 14/500 113s 225ms/step - loss: 0.9843 - acc: 0.8083 - val_loss: 0.8639 - val_acc: 0.8506 Epoch 15/500 113s 225ms/step - loss: 0.9520 - acc: 0.8101 - val_loss: 0.8522 - val_acc: 0.8491 Epoch 16/500 113s 226ms/step - loss: 0.9313 - acc: 0.8130 - val_loss: 0.8124 - val_acc: 0.8541 Epoch 17/500 113s 226ms/step - loss: 0.9033 - acc: 0.8190 - val_loss: 0.8156 - val_acc: 0.8484 Epoch 18/500 113s 226ms/step - loss: 0.8791 - acc: 0.8223 - val_loss: 0.7796 - val_acc: 0.8572 Epoch 19/500 113s 226ms/step - loss: 0.8628 - acc: 0.8289 - val_loss: 0.7842 - val_acc: 0.8559 Epoch 20/500 113s 225ms/step - loss: 0.8528 - acc: 0.8292 - val_loss: 0.7725 - val_acc: 0.8533 Epoch 21/500 113s 225ms/step - loss: 0.8432 - acc: 0.8292 - val_loss: 0.7405 - val_acc: 0.8687 Epoch 22/500 113s 225ms/step - loss: 0.8260 - acc: 0.8347 - val_loss: 0.7425 - val_acc: 0.8648 Epoch 23/500 113s 225ms/step - loss: 0.8180 - acc: 0.8357 - val_loss: 0.7319 - val_acc: 0.8666 Epoch 24/500 113s 226ms/step - loss: 0.8146 - acc: 0.8385 - val_loss: 0.7158 - val_acc: 0.8761 Epoch 25/500 113s 226ms/step - loss: 0.8029 - acc: 0.8387 - val_loss: 0.7228 - val_acc: 0.8705 Epoch 26/500 113s 225ms/step - loss: 0.7968 - acc: 0.8425 - val_loss: 0.7160 - val_acc: 0.8725 Epoch 27/500 113s 225ms/step - loss: 0.7940 - acc: 0.8433 - val_loss: 0.7176 - val_acc: 0.8747 Epoch 28/500 113s 226ms/step - loss: 0.7904 - acc: 0.8439 - val_loss: 0.7080 - val_acc: 0.8747 Epoch 29/500 113s 225ms/step - loss: 0.7810 - acc: 0.8450 - val_loss: 0.7234 - val_acc: 0.8679 Epoch 30/500 113s 225ms/step - loss: 0.7807 - acc: 0.8457 - val_loss: 0.6999 - val_acc: 0.8754 Epoch 31/500 113s 225ms/step - loss: 0.7795 - acc: 0.8487 - val_loss: 0.7116 - val_acc: 0.8745 Epoch 32/500 113s 225ms/step - loss: 0.7722 - acc: 0.8497 - val_loss: 0.7064 - val_acc: 0.8798 Epoch 33/500 113s 226ms/step - loss: 0.7678 - acc: 0.8533 - val_loss: 0.7148 - val_acc: 0.8709 Epoch 34/500 113s 226ms/step - loss: 0.7634 - acc: 0.8528 - val_loss: 0.7095 - val_acc: 0.8741 Epoch 35/500 113s 225ms/step - loss: 0.7684 - acc: 0.8535 - val_loss: 0.7070 - val_acc: 0.8768 Epoch 36/500 113s 225ms/step - loss: 0.7630 - acc: 0.8540 - val_loss: 0.6935 - val_acc: 0.8804 Epoch 37/500 113s 225ms/step - loss: 0.7557 
- acc: 0.8566 - val_loss: 0.6997 - val_acc: 0.8785 Epoch 38/500 113s 225ms/step - loss: 0.7518 - acc: 0.8591 - val_loss: 0.7090 - val_acc: 0.8771 Epoch 39/500 113s 225ms/step - loss: 0.7537 - acc: 0.8581 - val_loss: 0.6784 - val_acc: 0.8879 Epoch 40/500 113s 226ms/step - loss: 0.7537 - acc: 0.8566 - val_loss: 0.6778 - val_acc: 0.8854 Epoch 41/500 113s 226ms/step - loss: 0.7461 - acc: 0.8613 - val_loss: 0.6941 - val_acc: 0.8800 Epoch 42/500 113s 226ms/step - loss: 0.7518 - acc: 0.8586 - val_loss: 0.7230 - val_acc: 0.8731 Epoch 43/500 113s 225ms/step - loss: 0.7562 - acc: 0.8561 - val_loss: 0.6876 - val_acc: 0.8859 Epoch 44/500 113s 225ms/step - loss: 0.7398 - acc: 0.8626 - val_loss: 0.6793 - val_acc: 0.8861 Epoch 45/500 113s 225ms/step - loss: 0.7402 - acc: 0.8638 - val_loss: 0.6860 - val_acc: 0.8857 Epoch 46/500 113s 225ms/step - loss: 0.7430 - acc: 0.8626 - val_loss: 0.6878 - val_acc: 0.8857 Epoch 47/500 113s 225ms/step - loss: 0.7372 - acc: 0.8656 - val_loss: 0.6758 - val_acc: 0.8885 Epoch 48/500 113s 225ms/step - loss: 0.7364 - acc: 0.8649 - val_loss: 0.6837 - val_acc: 0.8849 Epoch 49/500 113s 226ms/step - loss: 0.7374 - acc: 0.8639 - val_loss: 0.6730 - val_acc: 0.8902 Epoch 50/500 113s 226ms/step - loss: 0.7389 - acc: 0.8657 - val_loss: 0.6848 - val_acc: 0.8868 Epoch 51/500 113s 227ms/step - loss: 0.7354 - acc: 0.8654 - val_loss: 0.6788 - val_acc: 0.8892 Epoch 52/500 113s 227ms/step - loss: 0.7286 - acc: 0.8691 - val_loss: 0.6942 - val_acc: 0.8800 Epoch 53/500 113s 225ms/step - loss: 0.7365 - acc: 0.8653 - val_loss: 0.6929 - val_acc: 0.8820 Epoch 54/500 113s 226ms/step - loss: 0.7295 - acc: 0.8685 - val_loss: 0.6761 - val_acc: 0.8892 Epoch 55/500 113s 226ms/step - loss: 0.7319 - acc: 0.8694 - val_loss: 0.6715 - val_acc: 0.8886 Epoch 56/500 113s 226ms/step - loss: 0.7315 - acc: 0.8681 - val_loss: 0.6807 - val_acc: 0.8891 Epoch 57/500 113s 226ms/step - loss: 0.7330 - acc: 0.8679 - val_loss: 0.6705 - val_acc: 0.8943 Epoch 58/500 113s 226ms/step - loss: 0.7269 - acc: 0.8715 - val_loss: 0.7076 - val_acc: 0.8776 Epoch 59/500 113s 226ms/step - loss: 0.7314 - acc: 0.8690 - val_loss: 0.6747 - val_acc: 0.8884 Epoch 60/500 113s 226ms/step - loss: 0.7323 - acc: 0.8699 - val_loss: 0.6775 - val_acc: 0.8867 Epoch 61/500 113s 225ms/step - loss: 0.7289 - acc: 0.8698 - val_loss: 0.6851 - val_acc: 0.8838 Epoch 62/500 112s 225ms/step - loss: 0.7290 - acc: 0.8688 - val_loss: 0.6995 - val_acc: 0.8838 Epoch 63/500 112s 225ms/step - loss: 0.7302 - acc: 0.8696 - val_loss: 0.6758 - val_acc: 0.8913 Epoch 64/500 113s 225ms/step - loss: 0.7264 - acc: 0.8714 - val_loss: 0.6770 - val_acc: 0.8907 Epoch 65/500 113s 225ms/step - loss: 0.7238 - acc: 0.8725 - val_loss: 0.6898 - val_acc: 0.8865 Epoch 66/500 113s 225ms/step - loss: 0.7218 - acc: 0.8728 - val_loss: 0.6712 - val_acc: 0.8936 Epoch 67/500 113s 225ms/step - loss: 0.7235 - acc: 0.8729 - val_loss: 0.6829 - val_acc: 0.8888 Epoch 68/500 112s 225ms/step - loss: 0.7226 - acc: 0.8740 - val_loss: 0.6635 - val_acc: 0.8967 Epoch 69/500 112s 225ms/step - loss: 0.7281 - acc: 0.8713 - val_loss: 0.6750 - val_acc: 0.8912 Epoch 70/500 112s 225ms/step - loss: 0.7218 - acc: 0.8735 - val_loss: 0.6937 - val_acc: 0.8855 Epoch 71/500 113s 225ms/step - loss: 0.7207 - acc: 0.8738 - val_loss: 0.7040 - val_acc: 0.8796 Epoch 72/500 113s 225ms/step - loss: 0.7215 - acc: 0.8748 - val_loss: 0.6944 - val_acc: 0.8890 Epoch 73/500 113s 225ms/step - loss: 0.7206 - acc: 0.8742 - val_loss: 0.6757 - val_acc: 0.8903 Epoch 74/500 113s 225ms/step - loss: 0.7172 - acc: 0.8750 - val_loss: 0.6872 - 
val_acc: 0.8889 Epoch 75/500 113s 225ms/step - loss: 0.7183 - acc: 0.8758 - val_loss: 0.6691 - val_acc: 0.8950 Epoch 76/500 112s 225ms/step - loss: 0.7188 - acc: 0.8749 - val_loss: 0.6823 - val_acc: 0.8872 Epoch 77/500 112s 225ms/step - loss: 0.7165 - acc: 0.8753 - val_loss: 0.6794 - val_acc: 0.8913 Epoch 78/500 113s 225ms/step - loss: 0.7159 - acc: 0.8760 - val_loss: 0.7313 - val_acc: 0.8730 Epoch 79/500 112s 225ms/step - loss: 0.7146 - acc: 0.8772 - val_loss: 0.7072 - val_acc: 0.8798 Epoch 80/500 113s 225ms/step - loss: 0.7196 - acc: 0.8754 - val_loss: 0.6698 - val_acc: 0.8951 Epoch 81/500 113s 225ms/step - loss: 0.7112 - acc: 0.8789 - val_loss: 0.6696 - val_acc: 0.8939 Epoch 82/500 113s 225ms/step - loss: 0.7180 - acc: 0.8757 - val_loss: 0.6697 - val_acc: 0.8944 Epoch 83/500 113s 225ms/step - loss: 0.7126 - acc: 0.8770 - val_loss: 0.6615 - val_acc: 0.8972 Epoch 84/500 112s 225ms/step - loss: 0.7112 - acc: 0.8799 - val_loss: 0.6893 - val_acc: 0.8848 Epoch 85/500 112s 225ms/step - loss: 0.7149 - acc: 0.8766 - val_loss: 0.6679 - val_acc: 0.8963 Epoch 86/500 112s 225ms/step - loss: 0.7109 - acc: 0.8769 - val_loss: 0.6713 - val_acc: 0.8953 Epoch 87/500 112s 225ms/step - loss: 0.7088 - acc: 0.8803 - val_loss: 0.6571 - val_acc: 0.8985 Epoch 88/500 112s 225ms/step - loss: 0.7119 - acc: 0.8789 - val_loss: 0.6786 - val_acc: 0.8919 Epoch 89/500 113s 225ms/step - loss: 0.7111 - acc: 0.8767 - val_loss: 0.6741 - val_acc: 0.8925 Epoch 90/500 113s 225ms/step - loss: 0.7096 - acc: 0.8788 - val_loss: 0.7048 - val_acc: 0.8829 Epoch 91/500 113s 225ms/step - loss: 0.7056 - acc: 0.8787 - val_loss: 0.6714 - val_acc: 0.8933 Epoch 92/500 113s 225ms/step - loss: 0.7121 - acc: 0.8786 - val_loss: 0.6962 - val_acc: 0.8857 Epoch 93/500 112s 225ms/step - loss: 0.7078 - acc: 0.8805 - val_loss: 0.6854 - val_acc: 0.8882 Epoch 94/500 112s 225ms/step - loss: 0.7026 - acc: 0.8830 - val_loss: 0.6821 - val_acc: 0.8894 Epoch 95/500 112s 225ms/step - loss: 0.7063 - acc: 0.8812 - val_loss: 0.6900 - val_acc: 0.8866 Epoch 96/500 113s 225ms/step - loss: 0.7091 - acc: 0.8803 - val_loss: 0.6765 - val_acc: 0.8961 Epoch 97/500 113s 225ms/step - loss: 0.7036 - acc: 0.8810 - val_loss: 0.6744 - val_acc: 0.8946 Epoch 98/500 113s 225ms/step - loss: 0.7081 - acc: 0.8794 - val_loss: 0.6673 - val_acc: 0.8952 Epoch 99/500 113s 225ms/step - loss: 0.7091 - acc: 0.8799 - val_loss: 0.6713 - val_acc: 0.8931 Epoch 100/500 112s 225ms/step - loss: 0.7066 - acc: 0.8814 - val_loss: 0.6701 - val_acc: 0.8938 Epoch 101/500 112s 225ms/step - loss: 0.7114 - acc: 0.8797 - val_loss: 0.6702 - val_acc: 0.8961 Epoch 102/500 112s 225ms/step - loss: 0.7028 - acc: 0.8816 - val_loss: 0.6682 - val_acc: 0.8965 Epoch 103/500 115s 229ms/step - loss: 0.7026 - acc: 0.8826 - val_loss: 0.6839 - val_acc: 0.8905 Epoch 104/500 116s 232ms/step - loss: 0.7047 - acc: 0.8810 - val_loss: 0.6711 - val_acc: 0.8953 Epoch 105/500 113s 227ms/step - loss: 0.7039 - acc: 0.8814 - val_loss: 0.6785 - val_acc: 0.8928 Epoch 106/500 113s 227ms/step - loss: 0.7064 - acc: 0.8824 - val_loss: 0.6767 - val_acc: 0.8928 Epoch 107/500 114s 227ms/step - loss: 0.7069 - acc: 0.8804 - val_loss: 0.6523 - val_acc: 0.9039 Epoch 108/500 113s 226ms/step - loss: 0.7051 - acc: 0.8813 - val_loss: 0.6804 - val_acc: 0.8919 Epoch 109/500 113s 227ms/step - loss: 0.6994 - acc: 0.8833 - val_loss: 0.6735 - val_acc: 0.8955 Epoch 110/500 113s 226ms/step - loss: 0.7034 - acc: 0.8829 - val_loss: 0.6633 - val_acc: 0.8982 Epoch 111/500 113s 226ms/step - loss: 0.7008 - acc: 0.8839 - val_loss: 0.6726 - val_acc: 0.8911 Epoch 
112/500 113s 226ms/step - loss: 0.7010 - acc: 0.8828 - val_loss: 0.6609 - val_acc: 0.8981 Epoch 113/500 113s 226ms/step - loss: 0.7055 - acc: 0.8811 - val_loss: 0.6971 - val_acc: 0.8839 Epoch 114/500 113s 226ms/step - loss: 0.7023 - acc: 0.8834 - val_loss: 0.6695 - val_acc: 0.8949 Epoch 115/500 113s 227ms/step - loss: 0.7028 - acc: 0.8832 - val_loss: 0.6720 - val_acc: 0.8975 Epoch 116/500 113s 226ms/step - loss: 0.7005 - acc: 0.8843 - val_loss: 0.6934 - val_acc: 0.8880 Epoch 117/500 113s 226ms/step - loss: 0.7030 - acc: 0.8842 - val_loss: 0.6827 - val_acc: 0.8932 Epoch 118/500 113s 226ms/step - loss: 0.7016 - acc: 0.8861 - val_loss: 0.6817 - val_acc: 0.8936 Epoch 119/500 112s 225ms/step - loss: 0.7037 - acc: 0.8841 - val_loss: 0.6781 - val_acc: 0.8958 Epoch 120/500 113s 226ms/step - loss: 0.7014 - acc: 0.8837 - val_loss: 0.6793 - val_acc: 0.8936 Epoch 121/500 113s 227ms/step - loss: 0.7016 - acc: 0.8829 - val_loss: 0.6608 - val_acc: 0.9021 Epoch 122/500 113s 227ms/step - loss: 0.6984 - acc: 0.8848 - val_loss: 0.6910 - val_acc: 0.8891 Epoch 123/500 113s 227ms/step - loss: 0.6991 - acc: 0.8846 - val_loss: 0.6739 - val_acc: 0.8955 Epoch 124/500 113s 226ms/step - loss: 0.6990 - acc: 0.8846 - val_loss: 0.6570 - val_acc: 0.9016 Epoch 125/500 113s 226ms/step - loss: 0.6992 - acc: 0.8846 - val_loss: 0.6822 - val_acc: 0.8909 Epoch 126/500 113s 226ms/step - loss: 0.7034 - acc: 0.8824 - val_loss: 0.6745 - val_acc: 0.8981 Epoch 127/500 114s 227ms/step - loss: 0.6946 - acc: 0.8866 - val_loss: 0.6683 - val_acc: 0.8949 Epoch 128/500 113s 227ms/step - loss: 0.6965 - acc: 0.8850 - val_loss: 0.6737 - val_acc: 0.8963 Epoch 129/500 113s 227ms/step - loss: 0.7051 - acc: 0.8827 - val_loss: 0.6649 - val_acc: 0.8981 Epoch 130/500 113s 227ms/step - loss: 0.6976 - acc: 0.8846 - val_loss: 0.6652 - val_acc: 0.8990 Epoch 131/500 113s 227ms/step - loss: 0.7012 - acc: 0.8841 - val_loss: 0.6639 - val_acc: 0.8959 Epoch 132/500 113s 226ms/step - loss: 0.6958 - acc: 0.8850 - val_loss: 0.6691 - val_acc: 0.8946 Epoch 133/500 113s 226ms/step - loss: 0.6963 - acc: 0.8849 - val_loss: 0.6856 - val_acc: 0.8914 Epoch 134/500 113s 225ms/step - loss: 0.6970 - acc: 0.8862 - val_loss: 0.6668 - val_acc: 0.8966 Epoch 135/500 112s 225ms/step - loss: 0.7032 - acc: 0.8821 - val_loss: 0.6686 - val_acc: 0.8974 Epoch 136/500 113s 226ms/step - loss: 0.6983 - acc: 0.8875 - val_loss: 0.6755 - val_acc: 0.8957 Epoch 137/500 113s 225ms/step - loss: 0.6947 - acc: 0.8871 - val_loss: 0.6649 - val_acc: 0.8966 Epoch 138/500 113s 226ms/step - loss: 0.6941 - acc: 0.8877 - val_loss: 0.6825 - val_acc: 0.8892 Epoch 139/500 113s 225ms/step - loss: 0.6954 - acc: 0.8870 - val_loss: 0.6597 - val_acc: 0.9013 Epoch 140/500 113s 225ms/step - loss: 0.6950 - acc: 0.8855 - val_loss: 0.6797 - val_acc: 0.8891 Epoch 141/500 113s 225ms/step - loss: 0.6965 - acc: 0.8854 - val_loss: 0.6886 - val_acc: 0.8924 Epoch 142/500 113s 225ms/step - loss: 0.6912 - acc: 0.8879 - val_loss: 0.6643 - val_acc: 0.8985 Epoch 143/500 113s 225ms/step - loss: 0.6955 - acc: 0.8869 - val_loss: 0.6971 - val_acc: 0.8889 Epoch 144/500 112s 225ms/step - loss: 0.6932 - acc: 0.8870 - val_loss: 0.6666 - val_acc: 0.8969 Epoch 145/500 113s 225ms/step - loss: 0.6914 - acc: 0.8875 - val_loss: 0.6700 - val_acc: 0.8981 Epoch 146/500 113s 225ms/step - loss: 0.6989 - acc: 0.8856 - val_loss: 0.6825 - val_acc: 0.8936 Epoch 147/500 113s 225ms/step - loss: 0.6970 - acc: 0.8861 - val_loss: 0.6667 - val_acc: 0.8995 Epoch 148/500 113s 225ms/step - loss: 0.6911 - acc: 0.8880 - val_loss: 0.6808 - val_acc: 0.8912 Epoch 
149/500 112s 225ms/step - loss: 0.6987 - acc: 0.8853 - val_loss: 0.6893 - val_acc: 0.8885 Epoch 150/500 112s 225ms/step - loss: 0.6952 - acc: 0.8868 - val_loss: 0.6745 - val_acc: 0.8932 Epoch 151/500 lr changed to 0.010000000149011612 113s 225ms/step - loss: 0.5880 - acc: 0.9249 - val_loss: 0.5801 - val_acc: 0.9269 Epoch 152/500 113s 225ms/step - loss: 0.5264 - acc: 0.9440 - val_loss: 0.5680 - val_acc: 0.9276 Epoch 153/500 113s 225ms/step - loss: 0.5067 - acc: 0.9467 - val_loss: 0.5533 - val_acc: 0.9320 Epoch 154/500 113s 225ms/step - loss: 0.4909 - acc: 0.9512 - val_loss: 0.5453 - val_acc: 0.9325 Epoch 155/500 112s 225ms/step - loss: 0.4762 - acc: 0.9550 - val_loss: 0.5348 - val_acc: 0.9330 Epoch 156/500 112s 225ms/step - loss: 0.4647 - acc: 0.9559 - val_loss: 0.5253 - val_acc: 0.9360 Epoch 157/500 112s 225ms/step - loss: 0.4550 - acc: 0.9583 - val_loss: 0.5218 - val_acc: 0.9354 Epoch 158/500 113s 225ms/step - loss: 0.4475 - acc: 0.9579 - val_loss: 0.5165 - val_acc: 0.9351 Epoch 159/500 112s 225ms/step - loss: 0.4348 - acc: 0.9615 - val_loss: 0.5185 - val_acc: 0.9346 Epoch 160/500 112s 225ms/step - loss: 0.4245 - acc: 0.9629 - val_loss: 0.5120 - val_acc: 0.9342 Epoch 161/500 113s 225ms/step - loss: 0.4177 - acc: 0.9638 - val_loss: 0.5018 - val_acc: 0.9365 Epoch 162/500 113s 225ms/step - loss: 0.4123 - acc: 0.9638 - val_loss: 0.5089 - val_acc: 0.9323 Epoch 163/500 113s 225ms/step - loss: 0.4046 - acc: 0.9647 - val_loss: 0.4858 - val_acc: 0.9379 Epoch 164/500 112s 225ms/step - loss: 0.3988 - acc: 0.9654 - val_loss: 0.4954 - val_acc: 0.9334 Epoch 165/500 112s 225ms/step - loss: 0.3880 - acc: 0.9677 - val_loss: 0.4836 - val_acc: 0.9362 Epoch 166/500 112s 225ms/step - loss: 0.3873 - acc: 0.9656 - val_loss: 0.4829 - val_acc: 0.9364 Epoch 167/500 112s 225ms/step - loss: 0.3819 - acc: 0.9661 - val_loss: 0.4774 - val_acc: 0.9362 Epoch 168/500 113s 225ms/step - loss: 0.3697 - acc: 0.9691 - val_loss: 0.4738 - val_acc: 0.9353 Epoch 169/500 113s 225ms/step - loss: 0.3664 - acc: 0.9688 - val_loss: 0.4863 - val_acc: 0.9318 Epoch 170/500 113s 225ms/step - loss: 0.3630 - acc: 0.9687 - val_loss: 0.4720 - val_acc: 0.9349 Epoch 171/500 113s 225ms/step - loss: 0.3587 - acc: 0.9687 - val_loss: 0.4613 - val_acc: 0.9355 Epoch 172/500 112s 225ms/step - loss: 0.3558 - acc: 0.9680 - val_loss: 0.4569 - val_acc: 0.9381 Epoch 173/500 112s 225ms/step - loss: 0.3453 - acc: 0.9714 - val_loss: 0.4611 - val_acc: 0.9359 Epoch 174/500 112s 225ms/step - loss: 0.3427 - acc: 0.9712 - val_loss: 0.4663 - val_acc: 0.9335 Epoch 175/500 112s 225ms/step - loss: 0.3369 - acc: 0.9709 - val_loss: 0.4493 - val_acc: 0.9386 Epoch 176/500 112s 225ms/step - loss: 0.3342 - acc: 0.9709 - val_loss: 0.4462 - val_acc: 0.9390 Epoch 177/500 113s 225ms/step - loss: 0.3293 - acc: 0.9721 - val_loss: 0.4442 - val_acc: 0.9368 Epoch 178/500 113s 225ms/step - loss: 0.3271 - acc: 0.9712 - val_loss: 0.4484 - val_acc: 0.9373 Epoch 179/500 113s 225ms/step - loss: 0.3217 - acc: 0.9730 - val_loss: 0.4435 - val_acc: 0.9335 Epoch 180/500 112s 225ms/step - loss: 0.3189 - acc: 0.9730 - val_loss: 0.4352 - val_acc: 0.9372 Epoch 181/500 112s 225ms/step - loss: 0.3133 - acc: 0.9748 - val_loss: 0.4449 - val_acc: 0.9313 Epoch 182/500 112s 225ms/step - loss: 0.3109 - acc: 0.9737 - val_loss: 0.4395 - val_acc: 0.9365 Epoch 183/500 112s 225ms/step - loss: 0.3092 - acc: 0.9720 - val_loss: 0.4329 - val_acc: 0.9374 Epoch 184/500 113s 225ms/step - loss: 0.3045 - acc: 0.9743 - val_loss: 0.4374 - val_acc: 0.9362 Epoch 185/500 113s 225ms/step - loss: 0.3005 - acc: 0.9741 - 
val_loss: 0.4256 - val_acc: 0.9371 Epoch 186/500 113s 225ms/step - loss: 0.3022 - acc: 0.9728 - val_loss: 0.4335 - val_acc: 0.9344 Epoch 187/500 112s 225ms/step - loss: 0.2969 - acc: 0.9737 - val_loss: 0.4246 - val_acc: 0.9343 Epoch 188/500 113s 225ms/step - loss: 0.2931 - acc: 0.9751 - val_loss: 0.4229 - val_acc: 0.9339 Epoch 189/500 112s 225ms/step - loss: 0.2929 - acc: 0.9734 - val_loss: 0.4216 - val_acc: 0.9362 Epoch 190/500 112s 225ms/step - loss: 0.2892 - acc: 0.9743 - val_loss: 0.4263 - val_acc: 0.9358 Epoch 191/500 112s 225ms/step - loss: 0.2869 - acc: 0.9744 - val_loss: 0.4181 - val_acc: 0.9342 Epoch 192/500 113s 225ms/step - loss: 0.2867 - acc: 0.9743 - val_loss: 0.4099 - val_acc: 0.9367 Epoch 193/500 113s 225ms/step - loss: 0.2848 - acc: 0.9739 - val_loss: 0.4184 - val_acc: 0.9378 Epoch 194/500 113s 226ms/step - loss: 0.2820 - acc: 0.9744 - val_loss: 0.4223 - val_acc: 0.9360 Epoch 195/500 113s 225ms/step - loss: 0.2827 - acc: 0.9726 - val_loss: 0.4049 - val_acc: 0.9375 Epoch 196/500 113s 225ms/step - loss: 0.2778 - acc: 0.9743 - val_loss: 0.4126 - val_acc: 0.9321 Epoch 197/500 113s 225ms/step - loss: 0.2761 - acc: 0.9738 - val_loss: 0.4225 - val_acc: 0.9305 Epoch 198/500 113s 225ms/step - loss: 0.2750 - acc: 0.9743 - val_loss: 0.4122 - val_acc: 0.9330 Epoch 199/500 113s 226ms/step - loss: 0.2713 - acc: 0.9751 - val_loss: 0.4222 - val_acc: 0.9323 Epoch 200/500 113s 226ms/step - loss: 0.2710 - acc: 0.9742 - val_loss: 0.4112 - val_acc: 0.9348 Epoch 201/500 113s 227ms/step - loss: 0.2696 - acc: 0.9743 - val_loss: 0.4100 - val_acc: 0.9359 Epoch 202/500 113s 227ms/step - loss: 0.2694 - acc: 0.9729 - val_loss: 0.4060 - val_acc: 0.9333 Epoch 203/500 113s 226ms/step - loss: 0.2662 - acc: 0.9741 - val_loss: 0.4018 - val_acc: 0.9387 Epoch 204/500 113s 226ms/step - loss: 0.2695 - acc: 0.9731 - val_loss: 0.3977 - val_acc: 0.9361 Epoch 205/500 113s 226ms/step - loss: 0.2605 - acc: 0.9757 - val_loss: 0.3963 - val_acc: 0.9366 Epoch 206/500 113s 226ms/step - loss: 0.2609 - acc: 0.9750 - val_loss: 0.3835 - val_acc: 0.9405 Epoch 207/500 113s 226ms/step - loss: 0.2599 - acc: 0.9744 - val_loss: 0.3933 - val_acc: 0.9370 Epoch 208/500 113s 226ms/step - loss: 0.2628 - acc: 0.9737 - val_loss: 0.4033 - val_acc: 0.9340 Epoch 209/500 113s 226ms/step - loss: 0.2612 - acc: 0.9731 - val_loss: 0.3999 - val_acc: 0.9342 Epoch 210/500 113s 226ms/step - loss: 0.2619 - acc: 0.9736 - val_loss: 0.3882 - val_acc: 0.9348 Epoch 211/500 113s 226ms/step - loss: 0.2550 - acc: 0.9753 - val_loss: 0.3986 - val_acc: 0.9367 Epoch 212/500 113s 226ms/step - loss: 0.2590 - acc: 0.9730 - val_loss: 0.3952 - val_acc: 0.9347 Epoch 213/500 113s 226ms/step - loss: 0.2566 - acc: 0.9742 - val_loss: 0.3871 - val_acc: 0.9378 Epoch 214/500 113s 226ms/step - loss: 0.2521 - acc: 0.9751 - val_loss: 0.3802 - val_acc: 0.9393 Epoch 215/500 113s 226ms/step - loss: 0.2532 - acc: 0.9745 - val_loss: 0.3808 - val_acc: 0.9370 Epoch 216/500 113s 227ms/step - loss: 0.2480 - acc: 0.9764 - val_loss: 0.3828 - val_acc: 0.9356 Epoch 217/500 113s 226ms/step - loss: 0.2516 - acc: 0.9742 - val_loss: 0.3902 - val_acc: 0.9355 Epoch 218/500 113s 227ms/step - loss: 0.2479 - acc: 0.9761 - val_loss: 0.3846 - val_acc: 0.9358 Epoch 219/500 113s 226ms/step - loss: 0.2514 - acc: 0.9745 - val_loss: 0.3882 - val_acc: 0.9344 Epoch 220/500 113s 227ms/step - loss: 0.2563 - acc: 0.9715 - val_loss: 0.3814 - val_acc: 0.9362 Epoch 221/500 113s 226ms/step - loss: 0.2500 - acc: 0.9738 - val_loss: 0.3930 - val_acc: 0.9326 Epoch 222/500 113s 226ms/step - loss: 0.2479 - acc: 0.9739 - 
val_loss: 0.3908 - val_acc: 0.9330 Epoch 223/500 113s 226ms/step - loss: 0.2487 - acc: 0.9733 - val_loss: 0.3893 - val_acc: 0.9334 Epoch 224/500 113s 226ms/step - loss: 0.2468 - acc: 0.9741 - val_loss: 0.3931 - val_acc: 0.9317 Epoch 225/500 113s 227ms/step - loss: 0.2467 - acc: 0.9743 - val_loss: 0.3810 - val_acc: 0.9346 Epoch 226/500 113s 227ms/step - loss: 0.2484 - acc: 0.9735 - val_loss: 0.3867 - val_acc: 0.9356 Epoch 227/500 113s 226ms/step - loss: 0.2420 - acc: 0.9752 - val_loss: 0.3772 - val_acc: 0.9341 Epoch 228/500 112s 225ms/step - loss: 0.2455 - acc: 0.9740 - val_loss: 0.3844 - val_acc: 0.9348 Epoch 229/500 112s 224ms/step - loss: 0.2452 - acc: 0.9729 - val_loss: 0.3765 - val_acc: 0.9355 Epoch 230/500 112s 224ms/step - loss: 0.2447 - acc: 0.9742 - val_loss: 0.3883 - val_acc: 0.9315 Epoch 231/500 112s 224ms/step - loss: 0.2451 - acc: 0.9743 - val_loss: 0.3814 - val_acc: 0.9350 Epoch 232/500 113s 226ms/step - loss: 0.2422 - acc: 0.9745 - val_loss: 0.3960 - val_acc: 0.9344 Epoch 233/500 113s 226ms/step - loss: 0.2392 - acc: 0.9759 - val_loss: 0.3841 - val_acc: 0.9340 Epoch 234/500 113s 226ms/step - loss: 0.2401 - acc: 0.9751 - val_loss: 0.3749 - val_acc: 0.9378 Epoch 235/500 113s 226ms/step - loss: 0.2428 - acc: 0.9733 - val_loss: 0.3801 - val_acc: 0.9339 Epoch 236/500 113s 226ms/step - loss: 0.2423 - acc: 0.9728 - val_loss: 0.3838 - val_acc: 0.9317 Epoch 237/500 113s 226ms/step - loss: 0.2447 - acc: 0.9739 - val_loss: 0.3912 - val_acc: 0.9336 Epoch 238/500 113s 226ms/step - loss: 0.2415 - acc: 0.9734 - val_loss: 0.3828 - val_acc: 0.9316 Epoch 239/500 113s 225ms/step - loss: 0.2422 - acc: 0.9736 - val_loss: 0.3828 - val_acc: 0.9348 Epoch 240/500 113s 225ms/step - loss: 0.2409 - acc: 0.9735 - val_loss: 0.3760 - val_acc: 0.9357 Epoch 241/500 113s 225ms/step - loss: 0.2414 - acc: 0.9738 - val_loss: 0.3782 - val_acc: 0.9333 Epoch 242/500 113s 225ms/step - loss: 0.2379 - acc: 0.9747 - val_loss: 0.3821 - val_acc: 0.9334 Epoch 243/500 113s 225ms/step - loss: 0.2370 - acc: 0.9746 - val_loss: 0.3912 - val_acc: 0.9333 Epoch 244/500 113s 225ms/step - loss: 0.2399 - acc: 0.9730 - val_loss: 0.3748 - val_acc: 0.9351 Epoch 245/500 112s 225ms/step - loss: 0.2402 - acc: 0.9729 - val_loss: 0.3815 - val_acc: 0.9326 Epoch 246/500 112s 225ms/step - loss: 0.2405 - acc: 0.9732 - val_loss: 0.3700 - val_acc: 0.9370 Epoch 247/500 113s 225ms/step - loss: 0.2383 - acc: 0.9743 - val_loss: 0.3789 - val_acc: 0.9350 Epoch 248/500 113s 226ms/step - loss: 0.2354 - acc: 0.9752 - val_loss: 0.3728 - val_acc: 0.9353 Epoch 249/500 113s 226ms/step - loss: 0.2341 - acc: 0.9751 - val_loss: 0.3940 - val_acc: 0.9303 Epoch 250/500 113s 226ms/step - loss: 0.2365 - acc: 0.9742 - val_loss: 0.3741 - val_acc: 0.9354 Epoch 251/500 113s 226ms/step - loss: 0.2384 - acc: 0.9741 - val_loss: 0.3947 - val_acc: 0.9274 Epoch 252/500 113s 226ms/step - loss: 0.2348 - acc: 0.9744 - val_loss: 0.3767 - val_acc: 0.9321 Epoch 253/500 113s 226ms/step - loss: 0.2389 - acc: 0.9733 - val_loss: 0.3813 - val_acc: 0.9313 Epoch 254/500 113s 226ms/step - loss: 0.2364 - acc: 0.9744 - val_loss: 0.3834 - val_acc: 0.9344 Epoch 255/500 113s 226ms/step - loss: 0.2392 - acc: 0.9737 - val_loss: 0.3870 - val_acc: 0.9295 Epoch 256/500 113s 226ms/step - loss: 0.2359 - acc: 0.9737 - val_loss: 0.3754 - val_acc: 0.9334 Epoch 257/500 113s 227ms/step - loss: 0.2395 - acc: 0.9726 - val_loss: 0.3790 - val_acc: 0.9330 Epoch 258/500 113s 226ms/step - loss: 0.2328 - acc: 0.9752 - val_loss: 0.3878 - val_acc: 0.9319 Epoch 259/500 113s 225ms/step - loss: 0.2371 - acc: 0.9728 - 
val_loss: 0.3820 - val_acc: 0.9336 Epoch 260/500 112s 225ms/step - loss: 0.2331 - acc: 0.9749 - val_loss: 0.3849 - val_acc: 0.9307 Epoch 261/500 113s 225ms/step - loss: 0.2357 - acc: 0.9736 - val_loss: 0.3882 - val_acc: 0.9310 Epoch 262/500 113s 225ms/step - loss: 0.2369 - acc: 0.9735 - val_loss: 0.3761 - val_acc: 0.9344 Epoch 263/500 113s 225ms/step - loss: 0.2344 - acc: 0.9741 - val_loss: 0.3788 - val_acc: 0.9324 Epoch 264/500 113s 225ms/step - loss: 0.2360 - acc: 0.9730 - val_loss: 0.3844 - val_acc: 0.9285 Epoch 265/500 113s 225ms/step - loss: 0.2370 - acc: 0.9737 - val_loss: 0.3862 - val_acc: 0.9309 Epoch 266/500 113s 226ms/step - loss: 0.2353 - acc: 0.9735 - val_loss: 0.3754 - val_acc: 0.9333 Epoch 267/500 113s 225ms/step - loss: 0.2355 - acc: 0.9737 - val_loss: 0.3944 - val_acc: 0.9294 Epoch 268/500 113s 225ms/step - loss: 0.2296 - acc: 0.9758 - val_loss: 0.3946 - val_acc: 0.9307 Epoch 269/500 112s 225ms/step - loss: 0.2355 - acc: 0.9732 - val_loss: 0.3855 - val_acc: 0.9322 Epoch 270/500 112s 225ms/step - loss: 0.2351 - acc: 0.9742 - val_loss: 0.3753 - val_acc: 0.9336 Epoch 271/500 113s 225ms/step - loss: 0.2336 - acc: 0.9745 - val_loss: 0.3856 - val_acc: 0.9281 Epoch 272/500 113s 225ms/step - loss: 0.2359 - acc: 0.9736 - val_loss: 0.3606 - val_acc: 0.9368 Epoch 273/500 113s 225ms/step - loss: 0.2301 - acc: 0.9751 - val_loss: 0.3759 - val_acc: 0.9334 Epoch 274/500 113s 225ms/step - loss: 0.2307 - acc: 0.9751 - val_loss: 0.3776 - val_acc: 0.9322 Epoch 275/500 113s 225ms/step - loss: 0.2349 - acc: 0.9742 - val_loss: 0.3715 - val_acc: 0.9376 Epoch 276/500 113s 225ms/step - loss: 0.2393 - acc: 0.9719 - val_loss: 0.3619 - val_acc: 0.9383 Epoch 277/500 113s 225ms/step - loss: 0.2299 - acc: 0.9750 - val_loss: 0.3697 - val_acc: 0.9340 Epoch 278/500 112s 225ms/step - loss: 0.2314 - acc: 0.9743 - val_loss: 0.3743 - val_acc: 0.9303 Epoch 279/500 112s 225ms/step - loss: 0.2325 - acc: 0.9735 - val_loss: 0.3725 - val_acc: 0.9317 Epoch 280/500 112s 225ms/step - loss: 0.2337 - acc: 0.9738 - val_loss: 0.3929 - val_acc: 0.9284 Epoch 281/500 113s 225ms/step - loss: 0.2311 - acc: 0.9751 - val_loss: 0.3826 - val_acc: 0.9303 Epoch 282/500 113s 225ms/step - loss: 0.2316 - acc: 0.9753 - val_loss: 0.3922 - val_acc: 0.9295 Epoch 283/500 113s 225ms/step - loss: 0.2321 - acc: 0.9741 - val_loss: 0.3757 - val_acc: 0.9313 Epoch 284/500 112s 225ms/step - loss: 0.2323 - acc: 0.9744 - val_loss: 0.3874 - val_acc: 0.9296 Epoch 285/500 112s 225ms/step - loss: 0.2318 - acc: 0.9752 - val_loss: 0.4014 - val_acc: 0.9278 Epoch 286/500 112s 225ms/step - loss: 0.2314 - acc: 0.9744 - val_loss: 0.3838 - val_acc: 0.9332 Epoch 287/500 112s 225ms/step - loss: 0.2324 - acc: 0.9741 - val_loss: 0.3912 - val_acc: 0.9284 Epoch 288/500 112s 225ms/step - loss: 0.2325 - acc: 0.9735 - val_loss: 0.3842 - val_acc: 0.9317 Epoch 289/500 113s 225ms/step - loss: 0.2285 - acc: 0.9760 - val_loss: 0.3814 - val_acc: 0.9328 Epoch 290/500 113s 225ms/step - loss: 0.2286 - acc: 0.9759 - val_loss: 0.3796 - val_acc: 0.9326 Epoch 291/500 112s 225ms/step - loss: 0.2306 - acc: 0.9752 - val_loss: 0.3871 - val_acc: 0.9281 Epoch 292/500 112s 225ms/step - loss: 0.2304 - acc: 0.9742 - val_loss: 0.3822 - val_acc: 0.9302 Epoch 293/500 112s 225ms/step - loss: 0.2300 - acc: 0.9742 - val_loss: 0.3958 - val_acc: 0.9304 Epoch 294/500 112s 225ms/step - loss: 0.2308 - acc: 0.9740 - val_loss: 0.3838 - val_acc: 0.9301 Epoch 295/500 113s 225ms/step - loss: 0.2336 - acc: 0.9721 - val_loss: 0.3784 - val_acc: 0.9347 Epoch 296/500 113s 225ms/step - loss: 0.2316 - acc: 0.9743 - 
val_loss: 0.3737 - val_acc: 0.9308 Epoch 297/500 113s 225ms/step - loss: 0.2273 - acc: 0.9759 - val_loss: 0.3791 - val_acc: 0.9345 Epoch 298/500 113s 225ms/step - loss: 0.2303 - acc: 0.9750 - val_loss: 0.3935 - val_acc: 0.9289 Epoch 299/500 112s 225ms/step - loss: 0.2291 - acc: 0.9750 - val_loss: 0.3793 - val_acc: 0.9300 Epoch 300/500 113s 225ms/step - loss: 0.2299 - acc: 0.9746 - val_loss: 0.3846 - val_acc: 0.9306 Epoch 301/500 lr changed to 0.0009999999776482583 112s 225ms/step - loss: 0.2081 - acc: 0.9831 - val_loss: 0.3536 - val_acc: 0.9390 Epoch 302/500 113s 226ms/step - loss: 0.1928 - acc: 0.9890 - val_loss: 0.3467 - val_acc: 0.9398 Epoch 303/500 113s 226ms/step - loss: 0.1888 - acc: 0.9899 - val_loss: 0.3437 - val_acc: 0.9412 Epoch 304/500 113s 226ms/step - loss: 0.1863 - acc: 0.9907 - val_loss: 0.3394 - val_acc: 0.9441 Epoch 305/500 113s 226ms/step - loss: 0.1840 - acc: 0.9912 - val_loss: 0.3429 - val_acc: 0.9433 Epoch 306/500 113s 227ms/step - loss: 0.1829 - acc: 0.9915 - val_loss: 0.3398 - val_acc: 0.9446 Epoch 307/500 113s 226ms/step - loss: 0.1798 - acc: 0.9928 - val_loss: 0.3412 - val_acc: 0.9450 Epoch 308/500 113s 226ms/step - loss: 0.1800 - acc: 0.9920 - val_loss: 0.3410 - val_acc: 0.9457 Epoch 309/500 113s 227ms/step - loss: 0.1785 - acc: 0.9929 - val_loss: 0.3397 - val_acc: 0.9451 Epoch 310/500 113s 226ms/step - loss: 0.1784 - acc: 0.9926 - val_loss: 0.3417 - val_acc: 0.9449 Epoch 311/500 113s 226ms/step - loss: 0.1759 - acc: 0.9935 - val_loss: 0.3421 - val_acc: 0.9452 Epoch 312/500 113s 226ms/step - loss: 0.1747 - acc: 0.9942 - val_loss: 0.3403 - val_acc: 0.9456 Epoch 313/500 113s 227ms/step - loss: 0.1750 - acc: 0.9936 - val_loss: 0.3413 - val_acc: 0.9442 Epoch 314/500 113s 227ms/step - loss: 0.1745 - acc: 0.9941 - val_loss: 0.3404 - val_acc: 0.9459 Epoch 315/500 113s 226ms/step - loss: 0.1714 - acc: 0.9948 - val_loss: 0.3407 - val_acc: 0.9466 Epoch 316/500 113s 227ms/step - loss: 0.1709 - acc: 0.9949 - val_loss: 0.3393 - val_acc: 0.9478 Epoch 317/500 113s 226ms/step - loss: 0.1714 - acc: 0.9944 - val_loss: 0.3402 - val_acc: 0.9464 Epoch 318/500 113s 227ms/step - loss: 0.1709 - acc: 0.9946 - val_loss: 0.3412 - val_acc: 0.9453 Epoch 319/500 113s 227ms/step - loss: 0.1700 - acc: 0.9949 - val_loss: 0.3433 - val_acc: 0.9454 Epoch 320/500 113s 227ms/step - loss: 0.1697 - acc: 0.9948 - val_loss: 0.3413 - val_acc: 0.9452 Epoch 321/500 113s 226ms/step - loss: 0.1689 - acc: 0.9948 - val_loss: 0.3382 - val_acc: 0.9460 Epoch 322/500 113s 226ms/step - loss: 0.1680 - acc: 0.9951 - val_loss: 0.3406 - val_acc: 0.9461 Epoch 323/500 113s 226ms/step - loss: 0.1674 - acc: 0.9953 - val_loss: 0.3395 - val_acc: 0.9467 Epoch 324/500 113s 225ms/step - loss: 0.1683 - acc: 0.9947 - val_loss: 0.3424 - val_acc: 0.9473 Epoch 325/500 112s 225ms/step - loss: 0.1659 - acc: 0.9957 - val_loss: 0.3431 - val_acc: 0.9458 Epoch 326/500 113s 225ms/step - loss: 0.1666 - acc: 0.9951 - val_loss: 0.3427 - val_acc: 0.9461 Epoch 327/500 113s 225ms/step - loss: 0.1655 - acc: 0.9955 - val_loss: 0.3434 - val_acc: 0.9454 Epoch 328/500 113s 225ms/step - loss: 0.1666 - acc: 0.9948 - val_loss: 0.3415 - val_acc: 0.9466 Epoch 329/500 113s 226ms/step - loss: 0.1660 - acc: 0.9955 - val_loss: 0.3420 - val_acc: 0.9461 Epoch 330/500 113s 225ms/step - loss: 0.1655 - acc: 0.9954 - val_loss: 0.3414 - val_acc: 0.9461 Epoch 331/500 113s 225ms/step - loss: 0.1654 - acc: 0.9951 - val_loss: 0.3424 - val_acc: 0.9461 Epoch 332/500 113s 225ms/step - loss: 0.1638 - acc: 0.9959 - val_loss: 0.3433 - val_acc: 0.9455 Epoch 333/500 112s 
225ms/step - loss: 0.1635 - acc: 0.9958 - val_loss: 0.3471 - val_acc: 0.9449 Epoch 334/500 113s 225ms/step - loss: 0.1641 - acc: 0.9955 - val_loss: 0.3459 - val_acc: 0.9453 Epoch 335/500 112s 225ms/step - loss: 0.1625 - acc: 0.9960 - val_loss: 0.3452 - val_acc: 0.9448 Epoch 336/500 113s 225ms/step - loss: 0.1623 - acc: 0.9957 - val_loss: 0.3459 - val_acc: 0.9452 Epoch 337/500 113s 226ms/step - loss: 0.1623 - acc: 0.9958 - val_loss: 0.3450 - val_acc: 0.9455 Epoch 338/500 113s 225ms/step - loss: 0.1608 - acc: 0.9962 - val_loss: 0.3457 - val_acc: 0.9459 Epoch 339/500 113s 225ms/step - loss: 0.1609 - acc: 0.9959 - val_loss: 0.3453 - val_acc: 0.9461 Epoch 340/500 112s 225ms/step - loss: 0.1609 - acc: 0.9958 - val_loss: 0.3462 - val_acc: 0.9444 Epoch 341/500 113s 225ms/step - loss: 0.1601 - acc: 0.9961 - val_loss: 0.3452 - val_acc: 0.9470 Epoch 342/500 113s 225ms/step - loss: 0.1603 - acc: 0.9959 - val_loss: 0.3451 - val_acc: 0.9459 Epoch 343/500 113s 225ms/step - loss: 0.1602 - acc: 0.9961 - val_loss: 0.3421 - val_acc: 0.9462 Epoch 344/500 113s 225ms/step - loss: 0.1607 - acc: 0.9959 - val_loss: 0.3442 - val_acc: 0.9456 Epoch 345/500 113s 226ms/step - loss: 0.1589 - acc: 0.9964 - val_loss: 0.3431 - val_acc: 0.9461 Epoch 346/500 113s 226ms/step - loss: 0.1588 - acc: 0.9962 - val_loss: 0.3445 - val_acc: 0.9461 Epoch 347/500 113s 225ms/step - loss: 0.1585 - acc: 0.9960 - val_loss: 0.3415 - val_acc: 0.9452 Epoch 348/500 113s 225ms/step - loss: 0.1569 - acc: 0.9967 - val_loss: 0.3407 - val_acc: 0.9459 Epoch 349/500 112s 225ms/step - loss: 0.1574 - acc: 0.9966 - val_loss: 0.3378 - val_acc: 0.9473 Epoch 350/500 113s 226ms/step - loss: 0.1580 - acc: 0.9960 - val_loss: 0.3403 - val_acc: 0.9466 Epoch 351/500 113s 226ms/step - loss: 0.1577 - acc: 0.9961 - val_loss: 0.3405 - val_acc: 0.9461 Epoch 352/500 113s 226ms/step - loss: 0.1560 - acc: 0.9968 - val_loss: 0.3381 - val_acc: 0.9478 Epoch 353/500 113s 226ms/step - loss: 0.1569 - acc: 0.9962 - val_loss: 0.3405 - val_acc: 0.9467 Epoch 354/500 113s 226ms/step - loss: 0.1564 - acc: 0.9964 - val_loss: 0.3428 - val_acc: 0.9446 Epoch 355/500 113s 226ms/step - loss: 0.1557 - acc: 0.9967 - val_loss: 0.3414 - val_acc: 0.9453 Epoch 356/500 113s 226ms/step - loss: 0.1552 - acc: 0.9965 - val_loss: 0.3409 - val_acc: 0.9451 Epoch 357/500 113s 226ms/step - loss: 0.1557 - acc: 0.9964 - val_loss: 0.3384 - val_acc: 0.9463 Epoch 358/500 113s 226ms/step - loss: 0.1553 - acc: 0.9965 - val_loss: 0.3404 - val_acc: 0.9476 Epoch 359/500 113s 226ms/step - loss: 0.1545 - acc: 0.9962 - val_loss: 0.3439 - val_acc: 0.9462 Epoch 360/500 113s 226ms/step - loss: 0.1552 - acc: 0.9963 - val_loss: 0.3407 - val_acc: 0.9468 Epoch 361/500 113s 227ms/step - loss: 0.1544 - acc: 0.9966 - val_loss: 0.3405 - val_acc: 0.9462 Epoch 362/500 113s 226ms/step - loss: 0.1538 - acc: 0.9968 - val_loss: 0.3421 - val_acc: 0.9458 Epoch 363/500 113s 227ms/step - loss: 0.1537 - acc: 0.9964 - val_loss: 0.3379 - val_acc: 0.9475 Epoch 364/500 113s 226ms/step - loss: 0.1534 - acc: 0.9964 - val_loss: 0.3379 - val_acc: 0.9464 Epoch 365/500 113s 226ms/step - loss: 0.1518 - acc: 0.9970 - val_loss: 0.3386 - val_acc: 0.9465 Epoch 366/500 113s 226ms/step - loss: 0.1524 - acc: 0.9968 - val_loss: 0.3393 - val_acc: 0.9477 Epoch 367/500 113s 226ms/step - loss: 0.1517 - acc: 0.9969 - val_loss: 0.3394 - val_acc: 0.9469 Epoch 368/500 113s 226ms/step - loss: 0.1513 - acc: 0.9969 - val_loss: 0.3384 - val_acc: 0.9478 Epoch 369/500 113s 227ms/step - loss: 0.1525 - acc: 0.9961 - val_loss: 0.3363 - val_acc: 0.9481 Epoch 370/500 113s 
227ms/step - loss: 0.1518 - acc: 0.9962 - val_loss: 0.3387 - val_acc: 0.9476 Epoch 371/500 113s 226ms/step - loss: 0.1508 - acc: 0.9967 - val_loss: 0.3377 - val_acc: 0.9464 Epoch 372/500 113s 226ms/step - loss: 0.1504 - acc: 0.9968 - val_loss: 0.3354 - val_acc: 0.9480 Epoch 373/500 113s 226ms/step - loss: 0.1501 - acc: 0.9970 - val_loss: 0.3368 - val_acc: 0.9482 Epoch 374/500 113s 226ms/step - loss: 0.1507 - acc: 0.9962 - val_loss: 0.3427 - val_acc: 0.9460 Epoch 375/500 113s 226ms/step - loss: 0.1501 - acc: 0.9966 - val_loss: 0.3393 - val_acc: 0.9467 Epoch 376/500 113s 226ms/step - loss: 0.1502 - acc: 0.9968 - val_loss: 0.3370 - val_acc: 0.9473 Epoch 377/500 113s 226ms/step - loss: 0.1502 - acc: 0.9963 - val_loss: 0.3394 - val_acc: 0.9483 Epoch 378/500 113s 227ms/step - loss: 0.1493 - acc: 0.9968 - val_loss: 0.3388 - val_acc: 0.9462 Epoch 379/500 113s 226ms/step - loss: 0.1488 - acc: 0.9967 - val_loss: 0.3359 - val_acc: 0.9469 Epoch 380/500 113s 226ms/step - loss: 0.1480 - acc: 0.9971 - val_loss: 0.3339 - val_acc: 0.9491 Epoch 381/500 113s 226ms/step - loss: 0.1490 - acc: 0.9969 - val_loss: 0.3339 - val_acc: 0.9491 Epoch 382/500 113s 226ms/step - loss: 0.1480 - acc: 0.9967 - val_loss: 0.3327 - val_acc: 0.9488 Epoch 383/500 113s 226ms/step - loss: 0.1471 - acc: 0.9970 - val_loss: 0.3320 - val_acc: 0.9482 Epoch 384/500 113s 226ms/step - loss: 0.1464 - acc: 0.9972 - val_loss: 0.3324 - val_acc: 0.9473 Epoch 385/500 113s 226ms/step - loss: 0.1476 - acc: 0.9967 - val_loss: 0.3372 - val_acc: 0.9466 Epoch 386/500 113s 226ms/step - loss: 0.1474 - acc: 0.9966 - val_loss: 0.3369 - val_acc: 0.9467 Epoch 387/500 113s 227ms/step - loss: 0.1478 - acc: 0.9964 - val_loss: 0.3360 - val_acc: 0.9486 Epoch 388/500 113s 227ms/step - loss: 0.1474 - acc: 0.9967 - val_loss: 0.3312 - val_acc: 0.9481 Epoch 389/500 113s 226ms/step - loss: 0.1460 - acc: 0.9969 - val_loss: 0.3304 - val_acc: 0.9486 Epoch 390/500 113s 226ms/step - loss: 0.1448 - acc: 0.9974 - val_loss: 0.3322 - val_acc: 0.9502 Epoch 391/500 113s 226ms/step - loss: 0.1456 - acc: 0.9971 - val_loss: 0.3331 - val_acc: 0.9494 Epoch 392/500 113s 227ms/step - loss: 0.1455 - acc: 0.9970 - val_loss: 0.3367 - val_acc: 0.9477 Epoch 393/500 113s 227ms/step - loss: 0.1452 - acc: 0.9968 - val_loss: 0.3359 - val_acc: 0.9479 Epoch 394/500 113s 226ms/step - loss: 0.1446 - acc: 0.9971 - val_loss: 0.3331 - val_acc: 0.9484 Epoch 395/500 113s 226ms/step - loss: 0.1455 - acc: 0.9965 - val_loss: 0.3309 - val_acc: 0.9512 Epoch 396/500 113s 227ms/step - loss: 0.1451 - acc: 0.9966 - val_loss: 0.3285 - val_acc: 0.9498 Epoch 397/500 113s 226ms/step - loss: 0.1439 - acc: 0.9970 - val_loss: 0.3292 - val_acc: 0.9496 Epoch 398/500 113s 226ms/step - loss: 0.1436 - acc: 0.9971 - val_loss: 0.3320 - val_acc: 0.9488 Epoch 399/500 113s 226ms/step - loss: 0.1436 - acc: 0.9969 - val_loss: 0.3312 - val_acc: 0.9491 Epoch 400/500 113s 226ms/step - loss: 0.1447 - acc: 0.9967 - val_loss: 0.3280 - val_acc: 0.9486 Epoch 401/500 113s 225ms/step - loss: 0.1435 - acc: 0.9969 - val_loss: 0.3281 - val_acc: 0.9489 Epoch 402/500 113s 225ms/step - loss: 0.1421 - acc: 0.9973 - val_loss: 0.3280 - val_acc: 0.9483 Epoch 403/500 113s 226ms/step - loss: 0.1426 - acc: 0.9970 - val_loss: 0.3281 - val_acc: 0.9478 Epoch 404/500 113s 227ms/step - loss: 0.1427 - acc: 0.9969 - val_loss: 0.3269 - val_acc: 0.9484 Epoch 405/500 113s 226ms/step - loss: 0.1425 - acc: 0.9969 - val_loss: 0.3267 - val_acc: 0.9495 Epoch 406/500 113s 226ms/step - loss: 0.1417 - acc: 0.9971 - val_loss: 0.3263 - val_acc: 0.9483 Epoch 407/500 113s 
226ms/step - loss: 0.1422 - acc: 0.9971 - val_loss: 0.3268 - val_acc: 0.9496 Epoch 408/500 113s 226ms/step - loss: 0.1413 - acc: 0.9971 - val_loss: 0.3270 - val_acc: 0.9487 Epoch 409/500 113s 226ms/step - loss: 0.1417 - acc: 0.9970 - val_loss: 0.3246 - val_acc: 0.9499 Epoch 410/500 113s 226ms/step - loss: 0.1412 - acc: 0.9969 - val_loss: 0.3243 - val_acc: 0.9488 Epoch 411/500 113s 226ms/step - loss: 0.1405 - acc: 0.9973 - val_loss: 0.3263 - val_acc: 0.9503 Epoch 412/500 113s 226ms/step - loss: 0.1406 - acc: 0.9971 - val_loss: 0.3222 - val_acc: 0.9497 Epoch 413/500 113s 226ms/step - loss: 0.1412 - acc: 0.9968 - val_loss: 0.3249 - val_acc: 0.9497 Epoch 414/500 113s 226ms/step - loss: 0.1401 - acc: 0.9971 - val_loss: 0.3257 - val_acc: 0.9487 Epoch 415/500 113s 226ms/step - loss: 0.1394 - acc: 0.9973 - val_loss: 0.3263 - val_acc: 0.9492 Epoch 416/500 113s 226ms/step - loss: 0.1394 - acc: 0.9973 - val_loss: 0.3279 - val_acc: 0.9470 Epoch 417/500 113s 227ms/step - loss: 0.1393 - acc: 0.9973 - val_loss: 0.3298 - val_acc: 0.9473 Epoch 418/500 113s 227ms/step - loss: 0.1387 - acc: 0.9973 - val_loss: 0.3277 - val_acc: 0.9478 Epoch 419/500 113s 227ms/step - loss: 0.1383 - acc: 0.9970 - val_loss: 0.3247 - val_acc: 0.9482 Epoch 420/500 113s 226ms/step - loss: 0.1390 - acc: 0.9971 - val_loss: 0.3288 - val_acc: 0.9465 Epoch 421/500 113s 226ms/step - loss: 0.1374 - acc: 0.9976 - val_loss: 0.3266 - val_acc: 0.9480 Epoch 422/500 113s 226ms/step - loss: 0.1385 - acc: 0.9972 - val_loss: 0.3261 - val_acc: 0.9489 Epoch 423/500 113s 226ms/step - loss: 0.1382 - acc: 0.9971 - val_loss: 0.3274 - val_acc: 0.9479 Epoch 424/500 113s 226ms/step - loss: 0.1377 - acc: 0.9973 - val_loss: 0.3287 - val_acc: 0.9478 Epoch 425/500 113s 226ms/step - loss: 0.1374 - acc: 0.9973 - val_loss: 0.3291 - val_acc: 0.9484 Epoch 426/500 113s 226ms/step - loss: 0.1367 - acc: 0.9977 - val_loss: 0.3282 - val_acc: 0.9483 Epoch 427/500 113s 226ms/step - loss: 0.1365 - acc: 0.9974 - val_loss: 0.3260 - val_acc: 0.9497 Epoch 428/500 113s 226ms/step - loss: 0.1366 - acc: 0.9973 - val_loss: 0.3257 - val_acc: 0.9498 Epoch 429/500 113s 225ms/step - loss: 0.1353 - acc: 0.9978 - val_loss: 0.3262 - val_acc: 0.9489 Epoch 430/500 112s 225ms/step - loss: 0.1365 - acc: 0.9972 - val_loss: 0.3315 - val_acc: 0.9463 Epoch 431/500 113s 225ms/step - loss: 0.1364 - acc: 0.9976 - val_loss: 0.3292 - val_acc: 0.9476 Epoch 432/500 113s 225ms/step - loss: 0.1356 - acc: 0.9973 - val_loss: 0.3270 - val_acc: 0.9489 Epoch 433/500 113s 225ms/step - loss: 0.1348 - acc: 0.9976 - val_loss: 0.3246 - val_acc: 0.9495 Epoch 434/500 113s 225ms/step - loss: 0.1350 - acc: 0.9975 - val_loss: 0.3265 - val_acc: 0.9479 Epoch 435/500 113s 225ms/step - loss: 0.1360 - acc: 0.9969 - val_loss: 0.3319 - val_acc: 0.9479 Epoch 436/500 113s 225ms/step - loss: 0.1344 - acc: 0.9975 - val_loss: 0.3297 - val_acc: 0.9472 Epoch 437/500 112s 225ms/step - loss: 0.1351 - acc: 0.9969 - val_loss: 0.3296 - val_acc: 0.9484 Epoch 438/500 112s 225ms/step - loss: 0.1349 - acc: 0.9972 - val_loss: 0.3268 - val_acc: 0.9483 Epoch 439/500 113s 225ms/step - loss: 0.1337 - acc: 0.9974 - val_loss: 0.3236 - val_acc: 0.9485 Epoch 440/500 113s 226ms/step - loss: 0.1335 - acc: 0.9978 - val_loss: 0.3239 - val_acc: 0.9473 Epoch 441/500 113s 226ms/step - loss: 0.1337 - acc: 0.9975 - val_loss: 0.3215 - val_acc: 0.9489 Epoch 442/500 113s 226ms/step - loss: 0.1327 - acc: 0.9976 - val_loss: 0.3201 - val_acc: 0.9497 Epoch 443/500 113s 226ms/step - loss: 0.1338 - acc: 0.9973 - val_loss: 0.3210 - val_acc: 0.9501 Epoch 444/500 113s 
227ms/step - loss: 0.1335 - acc: 0.9975 - val_loss: 0.3232 - val_acc: 0.9487 Epoch 445/500 113s 226ms/step - loss: 0.1325 - acc: 0.9974 - val_loss: 0.3232 - val_acc: 0.9487 Epoch 446/500 113s 226ms/step - loss: 0.1344 - acc: 0.9968 - val_loss: 0.3225 - val_acc: 0.9485 Epoch 447/500 113s 226ms/step - loss: 0.1317 - acc: 0.9978 - val_loss: 0.3251 - val_acc: 0.9471 Epoch 448/500 113s 226ms/step - loss: 0.1331 - acc: 0.9969 - val_loss: 0.3241 - val_acc: 0.9493 Epoch 449/500 113s 226ms/step - loss: 0.1322 - acc: 0.9974 - val_loss: 0.3257 - val_acc: 0.9484 Epoch 450/500 113s 226ms/step - loss: 0.1313 - acc: 0.9978 - val_loss: 0.3216 - val_acc: 0.9492 Epoch 451/500 lr changed to 9.999999310821295e-05 113s 226ms/step - loss: 0.1308 - acc: 0.9979 - val_loss: 0.3216 - val_acc: 0.9498 Epoch 452/500 113s 226ms/step - loss: 0.1318 - acc: 0.9971 - val_loss: 0.3211 - val_acc: 0.9492 Epoch 453/500 112s 225ms/step - loss: 0.1308 - acc: 0.9976 - val_loss: 0.3210 - val_acc: 0.9497 Epoch 454/500 113s 225ms/step - loss: 0.1297 - acc: 0.9981 - val_loss: 0.3207 - val_acc: 0.9494 Epoch 455/500 113s 225ms/step - loss: 0.1309 - acc: 0.9978 - val_loss: 0.3204 - val_acc: 0.9493 Epoch 456/500 113s 226ms/step - loss: 0.1312 - acc: 0.9978 - val_loss: 0.3202 - val_acc: 0.9494 Epoch 457/500 113s 225ms/step - loss: 0.1300 - acc: 0.9979 - val_loss: 0.3200 - val_acc: 0.9496 Epoch 458/500 113s 226ms/step - loss: 0.1307 - acc: 0.9979 - val_loss: 0.3196 - val_acc: 0.9497 Epoch 459/500 113s 226ms/step - loss: 0.1303 - acc: 0.9978 - val_loss: 0.3195 - val_acc: 0.9505 Epoch 460/500 112s 225ms/step - loss: 0.1305 - acc: 0.9976 - val_loss: 0.3195 - val_acc: 0.9499 Epoch 461/500 113s 225ms/step - loss: 0.1301 - acc: 0.9979 - val_loss: 0.3194 - val_acc: 0.9501 Epoch 462/500 112s 225ms/step - loss: 0.1303 - acc: 0.9978 - val_loss: 0.3187 - val_acc: 0.9498 Epoch 463/500 113s 226ms/step - loss: 0.1306 - acc: 0.9977 - val_loss: 0.3191 - val_acc: 0.9503 Epoch 464/500 113s 225ms/step - loss: 0.1299 - acc: 0.9978 - val_loss: 0.3188 - val_acc: 0.9506 Epoch 465/500 113s 225ms/step - loss: 0.1302 - acc: 0.9978 - val_loss: 0.3189 - val_acc: 0.9501 Epoch 466/500 113s 227ms/step - loss: 0.1300 - acc: 0.9980 - val_loss: 0.3187 - val_acc: 0.9499 Epoch 467/500 113s 226ms/step - loss: 0.1302 - acc: 0.9980 - val_loss: 0.3187 - val_acc: 0.9502 Epoch 468/500 113s 225ms/step - loss: 0.1299 - acc: 0.9979 - val_loss: 0.3184 - val_acc: 0.9501 Epoch 469/500 113s 225ms/step - loss: 0.1291 - acc: 0.9982 - val_loss: 0.3185 - val_acc: 0.9503 Epoch 470/500 113s 225ms/step - loss: 0.1298 - acc: 0.9980 - val_loss: 0.3182 - val_acc: 0.9501 Epoch 471/500 113s 225ms/step - loss: 0.1297 - acc: 0.9979 - val_loss: 0.3181 - val_acc: 0.9503 Epoch 472/500 113s 225ms/step - loss: 0.1300 - acc: 0.9979 - val_loss: 0.3184 - val_acc: 0.9503 Epoch 473/500 113s 225ms/step - loss: 0.1299 - acc: 0.9980 - val_loss: 0.3184 - val_acc: 0.9505 Epoch 474/500 113s 225ms/step - loss: 0.1306 - acc: 0.9976 - val_loss: 0.3180 - val_acc: 0.9506 Epoch 475/500 112s 225ms/step - loss: 0.1302 - acc: 0.9978 - val_loss: 0.3178 - val_acc: 0.9504 Epoch 476/500 113s 225ms/step - loss: 0.1297 - acc: 0.9977 - val_loss: 0.3177 - val_acc: 0.9503 Epoch 477/500 113s 225ms/step - loss: 0.1295 - acc: 0.9980 - val_loss: 0.3173 - val_acc: 0.9501 Epoch 478/500 112s 225ms/step - loss: 0.1297 - acc: 0.9981 - val_loss: 0.3172 - val_acc: 0.9501 Epoch 479/500 112s 225ms/step - loss: 0.1299 - acc: 0.9978 - val_loss: 0.3171 - val_acc: 0.9508 Epoch 480/500 113s 225ms/step - loss: 0.1291 - acc: 0.9980 - val_loss: 0.3174 - 
val_acc: 0.9506 Epoch 481/500 113s 225ms/step - loss: 0.1297 - acc: 0.9981 - val_loss: 0.3177 - val_acc: 0.9499 Epoch 482/500 113s 226ms/step - loss: 0.1295 - acc: 0.9980 - val_loss: 0.3178 - val_acc: 0.9506 Epoch 483/500 113s 225ms/step - loss: 0.1298 - acc: 0.9977 - val_loss: 0.3176 - val_acc: 0.9508 Epoch 484/500 113s 225ms/step - loss: 0.1295 - acc: 0.9977 - val_loss: 0.3181 - val_acc: 0.9503 Epoch 485/500 113s 225ms/step - loss: 0.1286 - acc: 0.9984 - val_loss: 0.3184 - val_acc: 0.9502 Epoch 486/500 112s 225ms/step - loss: 0.1290 - acc: 0.9981 - val_loss: 0.3175 - val_acc: 0.9508 Epoch 487/500 112s 225ms/step - loss: 0.1292 - acc: 0.9980 - val_loss: 0.3177 - val_acc: 0.9505 Epoch 488/500 113s 225ms/step - loss: 0.1292 - acc: 0.9982 - val_loss: 0.3175 - val_acc: 0.9503 Epoch 489/500 113s 226ms/step - loss: 0.1300 - acc: 0.9978 - val_loss: 0.3176 - val_acc: 0.9503 Epoch 490/500 113s 225ms/step - loss: 0.1293 - acc: 0.9979 - val_loss: 0.3176 - val_acc: 0.9505 Epoch 491/500 113s 225ms/step - loss: 0.1289 - acc: 0.9981 - val_loss: 0.3177 - val_acc: 0.9501 Epoch 492/500 113s 225ms/step - loss: 0.1293 - acc: 0.9982 - val_loss: 0.3174 - val_acc: 0.9504 Epoch 493/500 112s 225ms/step - loss: 0.1285 - acc: 0.9983 - val_loss: 0.3178 - val_acc: 0.9503 Epoch 494/500 112s 225ms/step - loss: 0.1297 - acc: 0.9979 - val_loss: 0.3178 - val_acc: 0.9501 Epoch 495/500 113s 225ms/step - loss: 0.1290 - acc: 0.9979 - val_loss: 0.3174 - val_acc: 0.9505 Epoch 496/500 113s 225ms/step - loss: 0.1292 - acc: 0.9979 - val_loss: 0.3171 - val_acc: 0.9508 Epoch 497/500 113s 225ms/step - loss: 0.1291 - acc: 0.9982 - val_loss: 0.3176 - val_acc: 0.9506 Epoch 498/500 113s 226ms/step - loss: 0.1285 - acc: 0.9982 - val_loss: 0.3180 - val_acc: 0.9505 Epoch 499/500 113s 225ms/step - loss: 0.1298 - acc: 0.9978 - val_loss: 0.3183 - val_acc: 0.9500 Epoch 500/500 113s 225ms/step - loss: 0.1290 - acc: 0.9981 - val_loss: 0.3182 - val_acc: 0.9512 Train loss: 0.1252169744670391 Train accuracy: 0.9990800008773804 Test loss: 0.31817472279071807 Test accuracy: 0.9512000060081482
The accuracy reached 95.12%, so increasing the depth does seem to pay off: this is nearly 1% higher than the 94.17% obtained in Tuning Record 20.
What would happen if the depth were doubled again?
Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458
https://ieeexplore.ieee.org/document/8998530
The author's homepage at Harbin Institute of Technology:
————————————————
Copyright notice: this is an original article by the CSDN blogger "dangqing1988", released under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reposting.
Original link: https://blog.csdn.net/dangqing1988/article/details/106157819