
I'm a bit new to Keras and deep learning. I'm currently trying to replicate this paper, but when I compile the second model (the one with the LSTMs) I get the following error:

"TypeError: unsupported operand type(s) for +: 'NoneType' and 'int'"

The description of the model is this:

  1. Input (length T is appliance specific window size)
  2. Parallel 1D convolution with filter size 3, 5, and 7 respectively, stride=1, number of filters=32, activation type=linear, border mode=same
  3. Merge layer which concatenates the output of parallel 1D convolutions
  4. Bidirectional LSTM consists of a forward LSTM and a backward LSTM, output_dim=128
  5. Bidirectional LSTM consists of a forward LSTM and a backward LSTM, output_dim=128
  6. Dense layer, output_dim=128, activation type=ReLU
  7. Dense layer, output_dim= T , activation type=linear

My code is this:

from keras import layers, Input
from keras.models import Model

def lstm_net(T):
    input_layer = Input(shape=(T,1))
    branch_a = layers.Conv1D(32, 3, activation='linear', padding='same', strides=1)(input_layer)
    branch_b = layers.Conv1D(32, 5, activation='linear', padding='same', strides=1)(input_layer)
    branch_c = layers.Conv1D(32, 7, activation='linear', padding='same', strides=1)(input_layer)

    merge_layer = layers.Concatenate(axis=-1)([branch_a, branch_b, branch_c])
    print(merge_layer.shape)
    BLSTM1 = layers.Bidirectional(layers.LSTM(128, input_shape=(8,40,96)))(merge_layer)
    print(BLSTM1.shape)
    BLSTM2 = layers.Bidirectional(layers.LSTM(128))(BLSTM1)
    dense_layer = layers.Dense(128, activation='relu')(BLSTM2)
    output_dense = layers.Dense(1, activation='linear')(dense_layer)
    model = Model(input_layer, output_dense)
    model.name = "lstm_net"
    return model

model = lstm_net(40)

After that I get the above error. My goal is to feed in a batch of 8 sequences of length 40 and get a batch of 8 sequences of length 40 back. I found this issue on the Keras GitHub, "LSTM layer cannot connect to Dense layer after Flatten #818", where @fchollet suggests specifying `input_shape` in the first layer, which I did, but probably not correctly. I added the two print statements to see how the shape changes, and the output is:

(?, 40, 96)
(?, 256)

The error occurs on the line where BLSTM2 is defined; the full traceback can be seen here.

Please post at which line of the code this error appears. – Daniel Möller
It's on the last line of my post :) – itroulli

1 Answer


Your problem lies in these three lines:

BLSTM1 = layers.Bidirectional(layers.LSTM(128, input_shape=(8,40,96)))(merge_layer)
print(BLSTM1.shape)
BLSTM2 = layers.Bidirectional(layers.LSTM(128))(BLSTM1)

By default, an LSTM returns only the last element of its computation, so your data loses its sequential nature. That's why the following layer raises an error. Change this line to:

BLSTM1 = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(merge_layer)
print(BLSTM1.shape)
BLSTM2 = layers.Bidirectional(layers.LSTM(128))(BLSTM1)

This way the input to the second LSTM also keeps its sequential nature.
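To see the difference concretely, here is a minimal sketch (assuming a standalone Keras install, as in your imports) comparing the output shapes of a Bidirectional LSTM with and without `return_sequences`:

```python
from keras import layers, Input

# Same shape as the merged convolution output: (timesteps=40, features=96)
inp = Input(shape=(40, 96))

# Default: only the last timestep is returned -> shape (None, 256)
last_only = layers.Bidirectional(layers.LSTM(128))(inp)

# With return_sequences=True the full sequence is kept -> shape (None, 40, 256)
full_seq = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(inp)

print(last_only.shape)
print(full_seq.shape)
```

The 256 comes from the Bidirectional wrapper concatenating the forward and backward LSTM outputs (128 + 128).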

Aside from this, I'd rather not use `input_shape` in an intermediate layer of the model, as it's inferred automatically.
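Putting it together, a sketch of the corrected `lstm_net` with only the `return_sequences=True` change (the name is passed to the `Model` constructor instead of assigned afterwards, since `model.name` can't be set directly on recent Keras versions):

```python
from keras import layers, Input
from keras.models import Model

def lstm_net(T):
    input_layer = Input(shape=(T, 1))
    # Three parallel 1D convolutions with kernel sizes 3, 5 and 7
    branch_a = layers.Conv1D(32, 3, activation='linear', padding='same', strides=1)(input_layer)
    branch_b = layers.Conv1D(32, 5, activation='linear', padding='same', strides=1)(input_layer)
    branch_c = layers.Conv1D(32, 7, activation='linear', padding='same', strides=1)(input_layer)

    merge_layer = layers.Concatenate(axis=-1)([branch_a, branch_b, branch_c])
    # Keep the full sequence so the second LSTM still sees (batch, T, 256)
    BLSTM1 = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(merge_layer)
    BLSTM2 = layers.Bidirectional(layers.LSTM(128))(BLSTM1)
    dense_layer = layers.Dense(128, activation='relu')(BLSTM2)
    output_dense = layers.Dense(1, activation='linear')(dense_layer)
    return Model(input_layer, output_dense, name="lstm_net")

model = lstm_net(40)
```

Note that this model outputs one value per window, shape `(batch, 1)`; to get a length-T sequence out, as the paper's step 7 describes, the second LSTM would also need `return_sequences=True`, but that is separate from the error at hand.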