I am running the tutorial example cifar10_cnn.py from here.
Here is the environment/configuration for the test:
- Windows 10
- Keras 1.2.0
- Theano 0.8.2
- Numpy 1.11.2
- Enthought/Canopy/MKL(2017.0.1-1)
- .theanorc: [blas] ldflags = -L... -lmkl_rt
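
For reference, the relevant part of my .theanorc looks roughly like this (the -L path is elided here; it points at Canopy's MKL library directory):

```
[blas]
ldflags = -L<path-to-Canopy-MKL-libs> -lmkl_rt
```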
The program took about a day and a half to finish 200 epochs. The following output shows the header and the last three epochs.
```
Using Theano backend.
X_train shape: (50000L, 32L, 32L, 3L) ...
Epoch 198/200
50000/50000 [==============================] - 639s - loss: 1.7894 - acc: 0.3497 - val_loss: 1.5930 - val_acc: 0.3968
Epoch 199/200
50000/50000 [==============================] - 617s - loss: 1.8111 - acc: 0.3446 - val_loss: 1.6960 - val_acc: 0.3824
Epoch 200/200
50000/50000 [==============================] - 612s - loss: 1.8005 - acc: 0.3497 - val_loss: 1.6164 - val_acc: 0.4041
```
I have two questions here:
- The accuracy (about 0.35 training / 0.40 validation) is far too low for this example. Has anyone run into similar results?
- The shape of X_train, (50000, 32, 32, 3), seems strange to me, because other Keras/CIFAR examples report an X_train shape of (50000, 3, 32, 32). If I add set_image_dim_ordering('th') to the code, the shape becomes (50000, 3, 32, 32), but the program gives even lower accuracy, e.g. "Epoch 80/200 50000/50000 - 635s - loss: 14.5010 - acc: 0.1003 - val_loss: 14.5063 - val_acc: 0.1000". How should the effect of the dim-ordering setting be explained here? (See the snippet below for how I checked it.)
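
Here is a minimal sketch of how I checked what the dim-ordering setting does (assuming Keras 1.2.0 with the Theano backend; the shapes in the comments are what I saw on my machine):

```python
from keras import backend as K
from keras.datasets import cifar10

# The default ordering comes from ~/.keras/keras.json ("image_dim_ordering").
# It is 'tf' out of the box, even when the backend is Theano.
print(K.image_dim_ordering())

# With 'tf' ordering, load_data() returns channels-last arrays.
(X_train, y_train), (X_test, y_test) = cifar10.load_data()
print(X_train.shape)  # (50000, 32, 32, 3) on my machine

# Switching to Theano-style ordering makes load_data() (and Convolution2D)
# use channels-first arrays instead.
K.set_image_dim_ordering('th')
(X_train, y_train), (X_test, y_test) = cifar10.load_data()
print(X_train.shape)  # (50000, 3, 32, 32)
```

I am guessing the accuracy collapse has to do with the data layout and the layers' dim_ordering disagreeing, but I am not sure.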
Thanks for any comments.