
Here is my code:

# Keras 0.x API: Sequential container with an AutoEncoder layer
from keras.models import Sequential
from keras.layers.core import Dense, AutoEncoder
from keras.optimizers import SGD

AE_0 = Sequential()

# 256 -> 100 sigmoid encoder, 100 -> 256 linear decoder
encoder = Sequential([Dense(output_dim=100, input_dim=256, activation='sigmoid')])
decoder = Sequential([Dense(output_dim=256, input_dim=100, activation='linear')])

AE_0.add(AutoEncoder(encoder=encoder, decoder=decoder, output_reconstruction=True))
AE_0.compile(loss='mse', optimizer=SGD(lr=0.03, momentum=0.9, decay=0.001, nesterov=True))

# Train to reconstruct the input (X is both input and target)
AE_0.fit(X, X, batch_size=21, nb_epoch=500, show_accuracy=True)

X has shape (537621, 256). I'm trying to compress the vectors of size 256 down to 100, then to 70, then to 50, by stacking autoencoders. I have done this in Lasagne, but in Keras it seems easier to work with the AutoEncoder layer.
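For reference, the single 256 → 100 → 256 stage can be sketched framework-free in NumPy, which makes it easy to sanity-check that the loss actually decreases under SGD (the symptom here is that it doesn't). This is a minimal sketch with a toy stand-in for X and plain SGD (no momentum or decay, unlike the Keras snippet); it is not the Keras internals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for X (the real X is (537621, 256)); feature size matches the question.
X = rng.random((64, 256))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random weights, zero biases
W_enc = rng.normal(0.0, 0.05, (256, 100))
b_enc = np.zeros(100)
W_dec = rng.normal(0.0, 0.05, (100, 256))
b_dec = np.zeros(256)

lr = 0.5  # plain full-batch gradient descent
n, d = X.shape
losses = []

for epoch in range(100):
    # Forward: sigmoid encoder, linear decoder (as in the question)
    H = sigmoid(X @ W_enc + b_enc)
    X_hat = H @ W_dec + b_dec
    err = X_hat - X
    losses.append(float(np.mean(err ** 2)))

    # Backward: gradients of the mean-squared reconstruction error
    dXhat = 2.0 * err / (n * d)
    dW_dec = H.T @ dXhat
    db_dec = dXhat.sum(axis=0)
    dH = dXhat @ W_dec.T
    dZ = dH * H * (1.0 - H)   # sigmoid derivative
    dW_enc = X.T @ dZ
    db_enc = dZ.sum(axis=0)

    # SGD update
    W_dec -= lr * dW_dec
    b_dec -= lr * db_dec
    W_enc -= lr * dW_enc
    b_enc -= lr * db_enc
```

To stack down to 70 and then 50, you would train this stage, encode X into the 100-dimensional codes H, then train a 100 → 70 autoencoder on H, and so on.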

Here is the output:

Epoch 1/500 537621/537621 [==============================] - 27s - loss: 0.1339 - acc: 0.0036
Epoch 2/500 537621/537621 [==============================] - 32s - loss: 0.1339 - acc: 0.0036
Epoch 3/500 252336/537621 [=============>................] - ETA: 14s - loss: 0.1339 - acc: 0.0035

And it continues like this indefinitely: the loss never moves from 0.1339.

I have the same problem :/ - Snurka Bill
Haven't you solved it yet? - Snurka Bill

1 Answer


It's now fixed on master :) Opening an issue is sometimes the best choice: https://github.com/fchollet/keras/issues/1604