This is my CNN model structure.
import tensorflow as tf
from tensorflow.keras import layers, models

def make_dcnn_model():
    model = models.Sequential()
    # Block 1: 5 filters, stride (1, 2) halves the width dimension
    model.add(layers.Conv2D(5, (5, 5), input_shape=(9, 128, 1), padding='same',
                            strides=(1, 2), activity_regularizer=tf.keras.regularizers.l1(0.001)))
    model.add(layers.LeakyReLU())
    model.add(layers.BatchNormalization())
    model.add(layers.AveragePooling2D((4, 4), strides=(2, 4)))
    # Block 2: 10 filters
    model.add(layers.Conv2D(10, (5, 5), padding='same',
                            activity_regularizer=tf.keras.regularizers.l1(0.001)))
    model.add(layers.LeakyReLU())
    model.add(layers.BatchNormalization())
    model.add(layers.AveragePooling2D((2, 2), strides=(1, 2)))
    # Classifier head: 6-way softmax output
    model.add(layers.Flatten())
    model.add(layers.Dense(50, activity_regularizer=tf.keras.regularizers.l1(0.001)))
    model.add(layers.LeakyReLU())
    model.add(layers.BatchNormalization())
    model.add(layers.Dense(6, activation='softmax'))
    return model
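For context, a minimal sketch of how a model like this could be compiled and trained to produce the log below; the optimizer, loss, batch size, and the x_train/y_train/x_val/y_val names are assumptions, since the actual training call is not shown here.

# Minimal training sketch (assumed settings: Adam optimizer, sparse categorical
# cross-entropy, batch size 64; x_train/y_train/x_val/y_val are placeholder arrays).
model = make_dcnn_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',  # assumes integer labels 0..5
              metrics=['accuracy'])
history = model.fit(x_train, y_train,
                    epochs=3000,
                    batch_size=64,
                    validation_data=(x_val, y_val))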
The results show that the model fits the training data well, but the validation accuracy fluctuates widely from epoch to epoch.
Train on 7352 samples, validate on 2947 samples
Epoch 1/3000
7352/7352 [==============================] - 3s 397us/sample - loss: 0.1016 - accuracy: 0.9698 - val_loss: 4.0896 - val_accuracy: 0.5816
Epoch 2/3000
7352/7352 [==============================] - 2s 214us/sample - loss: 0.0965 - accuracy: 0.9727 - val_loss: 1.2296 - val_accuracy: 0.7384
Epoch 3/3000
7352/7352 [==============================] - 1s 198us/sample - loss: 0.0930 - accuracy: 0.9727 - val_loss: 0.9901 - val_accuracy: 0.7855
Epoch 4/3000
7352/7352 [==============================] - 2s 211us/sample - loss: 0.1013 - accuracy: 0.9701 - val_loss: 0.5319 - val_accuracy: 0.9114
Epoch 5/3000
7352/7352 [==============================] - 1s 201us/sample - loss: 0.0958 - accuracy: 0.9721 - val_loss: 0.6938 - val_accuracy: 0.8388
Epoch 6/3000
7352/7352 [==============================] - 2s 205us/sample - loss: 0.0925 - accuracy: 0.9743 - val_loss: 1.4033 - val_accuracy: 0.7472
Epoch 7/3000
7352/7352 [==============================] - 1s 203us/sample - loss: 0.0948 - accuracy: 0.9740 - val_loss: 0.8375 - val_accuracy: 0.7998
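To see the fluctuation across all epochs rather than scanning the log, the accuracy curves can be plotted from the History object returned by fit (a sketch; it assumes the `history` variable from the training call shown above).

import matplotlib.pyplot as plt

# Plot training vs. validation accuracy per epoch to visualize the fluctuation.
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='val accuracy')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()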