3
votes

I am practicing with RNNs. I randomly generate 5 integers. If the first integer is odd, the y value is 1, otherwise y is 0 (so only the first x matters). The problem is that when I run this model, it does not 'learn': val_loss and val_accuracy do not change over the epochs. What could be the cause?

from keras.layers import SimpleRNN, LSTM, GRU, Dropout, Dense
from keras.models import Sequential
import numpy as np

data_len = 300
x = []
y = []
for i in range(data_len):
    a = np.random.randint(1,10,5)
    if a[0] % 2 == 0:
        y.append('0')
    else:
        y.append('1')

    a = a.reshape(5, 1)
    x.append(a)
    print(x)

X = np.array(x)
Y = np.array(y)   

model = Sequential()
model.add(GRU(units=24, activation='relu', return_sequences=True, input_shape=[5,1])) 
model.add(Dropout(rate=0.5))
model.add(GRU(units=12, activation='relu'))
model.add(Dropout(rate=0.5))
model.add(Dense(units=1, activation='softmax'))

model.compile(optimizer='adam', loss='mse', metrics=['accuracy'])
model.summary()

history = model.fit(X[:210], Y[:210], epochs=20, validation_split=0.2)

Epoch 1/20
168/168 [==============================] - 1s 6ms/step - loss: 0.4345 - accuracy: 0.5655 - val_loss: 0.5000 - val_accuracy: 0.5000
...
Epoch 20/20
168/168 [==============================] - 0s 315us/step - loss: 0.4345 - accuracy: 0.5655 - val_loss: 0.5000 - val_accuracy: 0.5000

Comment (1 vote): I found a good explanation on this issue: dlology.com/blog/… – tony lee

1 Answer

1
votes

You're using a softmax activation on a layer with a single neuron, which always outputs [1] regardless of the input, so the model cannot express anything and nothing can be learned. Use a sigmoid activation with 1 neuron for binary classification; softmax is for multiclass classification, where it is normalized across multiple neurons.
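To see why, here is a minimal sketch using standalone numpy re-implementations of the two activations (not the Keras ones, but mathematically the same): softmax normalizes the exponentials of a vector so they sum to 1, so over a vector of length 1 it returns exactly 1.0 for any input, while sigmoid of a single value is a genuine probability between 0 and 1.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability, then normalize to sum to 1.
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Softmax over a single logit is always exactly 1, whatever the logit is:
for logit in (-5.0, 0.0, 3.7):
    print(softmax(np.array([logit])))  # -> [1.]

# Sigmoid of a single logit varies with the input, as a classifier output must:
print(sigmoid(np.array([-5.0, 0.0, 3.7])))
```

In your model that means changing the last layer to `model.add(Dense(units=1, activation='sigmoid'))`, so the output can actually move between 0 and 1 during training.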