I'm trying to figure out how to feed a functional model to LSTM gates in Keras. I've got a time series of tuples (int, float, float). The ints are not ordered and should go through an embedding layer. I then want the tuples (after embedding the int) to go through an LSTM layer.
I've started with
from keras.layers import Input
from keras.layers import Embedding
from keras.layers import LSTM
import keras
inputs = [(42, 0.5, 0.6), (36, 0.4, 0.7), (50, 0.2, 0.9)] # example. The real data is a sequence of millions of tuples
input_id = Input(shape=(1,), dtype='int32', name='input_type') # id is within [0, 99]
embed_id = Embedding(output_dim=3, input_dim=100, input_length=1)(input_id) # input_dim must cover the id range [0, 99]
input_v1 = Input(shape=(1,), dtype='float', name='input_v1')
input_v2 = Input(shape=(1,), dtype='float', name='input_v2')
input_merged = keras.layers.concatenate([embed_id, input_v1, input_v2], axis=-1)
lstm = LSTM(40) # how do I tell it to use input_merged as input ?
The concatenate complains:
ValueError: Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 1, 3), (None, 1), (None, 1)]
I'm pretty sure this could be arranged with reshapes. But what I'm really wondering is: is this the right approach to feeding Keras time sequences of data that need some preprocessing?
I'm also unsure how to feed the LSTM gates the concatenated results. All the recurrent examples I could find use sequential models.
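In case it helps, here is the shape-aligned wiring I have in mind. I'm assuming a fixed window length T (made up for illustration; the real million-tuple series would be cut into windows) and giving the float inputs an explicit feature axis so all three tensors are 3D before concatenation:

```python
from keras.layers import Input, Embedding, LSTM, concatenate
from keras.models import Model

T = 10  # assumed window length, purely for illustration

# ids come in as (batch, T); Embedding adds the feature axis -> (batch, T, 3)
input_id = Input(shape=(T,), dtype='int32', name='input_id')
embed_id = Embedding(output_dim=3, input_dim=100)(input_id)

# declare the floats with a trailing feature axis: (batch, T, 1)
input_v1 = Input(shape=(T, 1), name='input_v1')
input_v2 = Input(shape=(T, 1), name='input_v2')

# now all shapes match except the concat axis -> (batch, T, 5)
merged = concatenate([embed_id, input_v1, input_v2], axis=-1)

# a functional layer is fed by calling it on the tensor: LSTM(40)(merged)
lstm_out = LSTM(40)(merged)

model = Model(inputs=[input_id, input_v1, input_v2], outputs=lstm_out)
```

Is this roughly the idiomatic way, or is there a cleaner pattern for mixed embedded/float sequence features?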