Here is my code in TensorFlow. I have defined a Bi-LSTM, and for a certain task I need to loop over my graph. Although I have set reuse=True in the variable scope, it still produces the error shown below the code.
for run in range(0, 2):
    with tf.variable_scope("LSTM", reuse=True) as scope:
        def LSTM(input_data):
            LSTM_cell_fw = tf.contrib.rnn.BasicLSTMCell(num_units=hidden_size)
            LSTM_cell_bw = tf.contrib.rnn.BasicLSTMCell(num_units=hidden_size)
            output, states = tf.nn.bidirectional_dynamic_rnn(LSTM_cell_fw, LSTM_cell_bw, inputs=input_data, dtype=tf.float32)
            # output is a (forward, backward) tuple of [batch, time, hidden_size] tensors
            output_1 = output[0]
            output_2 = output[1]
            # keep only the last time step of the last batch element from each direction
            output_1 = output_1[-1, -1, :]
            output_1 = tf.reshape(output_1, shape=(1, hidden_size))
            output_2 = output_2[-1, -1, :]
            output_2 = tf.reshape(output_2, shape=(1, hidden_size))
            # concatenate forward and backward features into a single [1, 2*hidden_size] vector
            fin_output = tf.concat((output_1, output_2), axis=1)
            return fin_output
The error is:

ValueError: Variable bidirectional_rnn/fw/basic_lstm_cell/kernel already exists, disallowed. Did you mean to set reuse=True in VarScope? Originally defined at:

  File "alpha-rep.py", line 65, in LSTM
    output, states = tf.nn.bidirectional_dynamic_rnn(LSTM_cell_fw, LSTM_cell_bw, inputs=input_data, dtype=tf.float32)
  File "alpha-rep.py", line 77, in
    out = LSTM(input_data)