
I am trying to build an LSTM model in Keras and TensorFlow. My dataset has about 3200 samples with 4 features and 3 labels.

X Shape: (3200, 4)
Y Shape: (3200, 3)

If I want 5 time steps, do I have to reshape like this:

n_time_steps = 5
n_features = 4
X_train = X_train.reshape((-1, n_time_steps, n_features))

That gives me these shapes:

X Shape: (640, 5, 4)
Y Shape: (3200, 3)

I am kind of confused, because 640 != 3200 data points... The model still compiles and fits without any error, but the accuracy and loss are way off. When I also reshape Y_train to Y Shape: (640, 5, 3), it throws:

Incompatible shapes: [10,3] vs. [10,5,3] [[node sub (defined at :12) ]] [Op:__inference_train_function_74818 Function call stack: train_function

Here is my model:

from tensorflow import keras
from tensorflow.keras import layers

opt = 'adam'
model = keras.Sequential()
model.add(layers.LSTM(128, input_shape=(n_time_steps, 4)))
model.add(layers.Dropout(0.2))
model.add(layers.Dense(3, activation="sigmoid"))
# hn_multilabel_loss is my own multilabel loss function
model.compile(optimizer=opt, loss=hn_multilabel_loss, metrics=['accuracy', 'mae'])
model.summary()

history = model.fit(X_train, Y_train, batch_size=10, epochs=10, validation_split=0.1)

Does anyone have an idea how to create an LSTM with 5 time steps and 4 features? What are the right input and output shapes?

Thanks guys!


1 Answer


You can use this function to transform a 2D dataset into one with a customizable number of time steps:

def multivariate_data(dataset, target, start_index, end_index, history_size,
                      target_size, step, single_step=False):
  data = []
  labels = []

  # the first window needs `history_size` rows of history before it
  start_index = start_index + history_size
  if end_index is None:
    end_index = len(dataset) - target_size

  for i in range(start_index, end_index):
    # take the `history_size` rows preceding i as one sample ...
    indices = range(i - history_size, i, step)
    data.append(dataset[indices])

    # ... and the label(s) at/after i as its target
    if single_step:
      labels.append(target[i + target_size])
    else:
      labels.append(target[i:i + target_size])

  return np.array(data), np.array(labels)
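
For example, applied to dummy arrays with the same shapes as yours (the random data here is purely for illustration), every 5-row window keeps its own label, so you end up with about 3200 - 5 = 3195 aligned samples instead of the 640 vs. 3200 mismatch:

import numpy as np

# dummy data with the same shapes as in the question (illustration only)
X = np.random.rand(3200, 4)
y = np.random.randint(0, 2, (3200, 3))

X_w, y_w = multivariate_data(X, y, 0, 3200, history_size=5,
                             target_size=0, step=1, single_step=True)
print(X_w.shape)  # (3195, 5, 4) -> one 5-step window per sample
print(y_w.shape)  # (3195, 3)    -> one label per window, still aligned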

I ran it successfully on your task (simplified a little):

import tensorflow as tf
import numpy as np
from tensorflow.keras import layers

n_time_steps, n_features = 5, 4

X_train = np.random.rand(3200, n_features)
y_train = np.random.randint(0, 2, (3200, 3))

def multivariate_data(dataset, target, start_index, end_index, history_size,
                      target_size, step, single_step=False):
  data, labels = [], []
  start_index = start_index + history_size
  if end_index is None:
    end_index = len(dataset) - target_size
  for i in range(start_index, end_index):
    indices = range(i - history_size, i, step)
    data.append(dataset[indices])
    if single_step:
      labels.append(target[i + target_size])
    else:
      labels.append(target[i:i + target_size])
  return np.array(data), np.array(labels)

X_train, y_train = multivariate_data(X_train, y_train, 0, 3200, n_time_steps, 0, 1, True)

model = tf.keras.Sequential()
model.add(layers.LSTM(128, input_shape=(n_time_steps, n_features)))
model.add(layers.Dense(3))
model.compile(optimizer='adam', loss='mae')

history = model.fit(X_train, y_train, batch_size=10, epochs=1)

Output:

  10/3195 [..............................] - ETA: 16:12 - loss: 0.3244
 120/3195 [>.............................] - ETA: 1:19 - loss: 0.2725 
 230/3195 [=>............................] - ETA: 40s - loss: 0.2536 
 330/3195 [==>...........................] - ETA: 27s - loss: 0.2545
 440/3195 [===>..........................] - ETA: 20s - loss: 0.2597
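
If you want to plug your original classification head back in, here is a sketch on the same reshaped data; I'm using binary_crossentropy as a stand-in because your custom hn_multilabel_loss isn't shown in the question:

model = tf.keras.Sequential()
model.add(layers.LSTM(128, input_shape=(n_time_steps, n_features)))
model.add(layers.Dropout(0.2))
model.add(layers.Dense(3, activation="sigmoid"))
# binary_crossentropy is only a stand-in here; swap your hn_multilabel_loss back in
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy', 'mae'])

history = model.fit(X_train, y_train, batch_size=10, epochs=10, validation_split=0.1)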