I am new to Keras and deep learning. When I create a basic sample model and fit it, my model's log loss is always the same.
# Keras 1.x imports for the layers and optimizer used below
from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Dropout, Flatten, Dense, Activation
from keras.optimizers import Adam

model = Sequential()
model.add(Convolution2D(32, 3, 3, border_mode='same', init='he_normal',
                        input_shape=(color_type, img_rows, img_cols)))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="th"))
model.add(Dropout(0.5))
model.add(Convolution2D(64, 3, 3, border_mode='same', init='he_normal'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="th")) #this part is wrong
model.add(Dropout(0.5))
model.add(Convolution2D(128, 3, 3, border_mode='same', init='he_normal'))
model.add(MaxPooling2D(pool_size=(2, 2), dim_ordering="th"))
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(10))
model.add(Activation('softmax'))
model.compile(Adam(lr=1e-3), loss='categorical_crossentropy',
              metrics=['accuracy'])  # accuracy is reported in the logs below
model.fit(x_train, y_train, batch_size=64, nb_epoch=200,
          verbose=1, validation_data=(x_valid, y_valid))
Train on 17939 samples, validate on 4485 samples
Epoch 1/200
17939/17939 [==============================] - 8s - loss: 99.8137 - acc: 0.3096 - val_loss: 99.9626 - val_acc: 0.0000e+00
Epoch 2/200
17939/17939 [==============================] - 8s - loss: 99.8135 - acc: 0.2864 - val_loss: 99.9626 - val_acc: 0.0000e+00
Epoch 3/200
17939/17939 [==============================] - 8s - loss: 99.8135 - acc: 0.3120 - val_loss: 99.9626 - val_acc: 1.0000
Epoch 4/200
17939/17939 [==============================] - 10s - loss: 99.8135 - acc: 0.3315 - val_loss: 99.9626 - val_acc: 1.0000
Epoch 5/200
17939/17939 [==============================] - 10s - loss: 99.8138 - acc: 0.3435 - val_loss: 99.9626 - val_acc: 0.4620
...

It keeps going like this. Do you know which part I got wrong?