I first trained a network N and saved it with a tf.train.Saver into the checkpoint Checkpoint_N. N defines a number of variable scopes internally.
Now I want to build a Siamese network on top of this trained network N, as below:
    import tensorflow as tf

    with tf.variable_scope('siameseN', reuse=False) as scope:
        networkN = N()
        # First branch: this defines the network graph and creates all the variables.
        embedding_1 = networkN.buildN()
        # Restore the pretrained weights into the newly created variables.
        tf.train.Saver().restore(session_variable, Checkpoint_N)
        # Second branch of the Siamese network, reusing the previously restored variables.
        scope.reuse_variables()
        embedding_2 = networkN.buildN()
When I run the above, the restore call throws a KeyError for every variable in N's graph, saying that e.g. siameseN/conv1 was not found in the checkpoint file.
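Inspecting the checkpoint seems to confirm the mismatch: the keys were saved without the siameseN/ prefix that the rebuilt graph now uses. A quick check (assuming tf.train.list_variables is available in your TensorFlow version) looks like this:

    # Print the (name, shape) pairs stored in the checkpoint; in my case the
    # names come back without the 'siameseN/' prefix, e.g. 'conv1/...'.
    for name, shape in tf.train.list_variables(Checkpoint_N):
        print(name, shape)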
Is there a way to do this without changing the code of N? Effectively, all I have done is add a parent scope to every variable and operation in N. Can I restore the weights into the right variables by telling TensorFlow to ignore the parent scope, or something along those lines?
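For instance, would something along these lines be the right approach? This is an untested sketch; the name-stripping mapping is just my guess at what such a workaround would look like:

    # Untested idea: map the old checkpoint names (without the parent scope)
    # to the variables that now live under 'siameseN/'.
    vars_in_scope = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='siameseN')
    name_map = {v.op.name.replace('siameseN/', '', 1): v for v in vars_in_scope}
    restorer = tf.train.Saver(var_list=name_map)
    restorer.restore(session_variable, Checkpoint_N)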