
The code for this problem is fairly complex because I'm trying to implement FractalNet, but with the convolutional base block replaced by a plain dense layer. I'm building two separate fractal networks, one after the other (so I don't think they should be interfering): one for the policy and one for the value function.

There are also a number of issues I've seen so far that may or may not be related. One is that I can't `import numpy as np` and use `np`, which is why I've been forced to call `.numpy()` instead. The other is that my code seems to be working with tensors printed as `tf.Tensor[stuff]` in some sections and as `Tensor[stuff]` in others at the same time: the `_build_model` function below gets `Tensor[stuff]` back from the `Input` call, whereas the neural-network builder code uses `tf.Tensor[stuff]`. I tried, to no avail, to stick to a single type.
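If `layersizes` holds TensorFlow tensors (which would explain the `.numpy()` calls below), one way to sidestep the mixed-tensor-type confusion is to convert everything to plain Python ints once, up front, before any Keras layers are built. A minimal sketch, assuming `layersizes` is a nested list of scalar tensors (the helper name `to_int_sizes` is hypothetical):

```python
import tensorflow as tf

def to_int_sizes(layersizes):
    """Recursively convert tensor-valued layer sizes to plain Python ints."""
    if isinstance(layersizes, (list, tuple)):
        return [to_int_sizes(s) for s in layersizes]
    # int() works for Python ints, NumPy scalars, and eager tf.Tensors alike
    return int(layersizes)

sizes = to_int_sizes([[tf.constant(64)], [tf.constant(32)], [tf.constant(10)]])
```

With sizes normalized this way, the layer definitions can use `Dense(sizes[0][0])` directly, with no `.numpy()` calls inside the model-building code.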

Here is the complete error that keeps killing the code:

/home/ryan/.local/lib/python3.6/site-packages/keras/engine/network.py:190: UserWarning: Model inputs must come from `keras.layers.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to your model was not an Input tensor, it was generated by layer activation_1.
Note that input tensors are instantiated via `tensor = keras.layers.Input(shape)`.
The tensor that caused the issue was: activation_1/Relu:0
  str(x.name))
Traceback (most recent call last):
  File "train.py", line 355, in <module>
    main(**vars(args))
  File "train.py", line 302, in main
    val_func = NNValueFunction(bl,c,layersizes,dropout,deepest,obs_dim) # Initialize the value function
  File "/home/ryan/trpo_fractalNN/trpo/value.py", line 37, in __init__
    self.model = self._build_model()
  File "/home/ryan/trpo_fractalNN/trpo/value.py", line 56, in _build_model
    model = Model(inputs=obs_input, outputs=outputs)
  File "/home/ryan/.local/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/ryan/.local/lib/python3.6/site-packages/keras/engine/network.py", line 94, in __init__
    self._init_graph_network(*args, **kwargs)
  File "/home/ryan/.local/lib/python3.6/site-packages/keras/engine/network.py", line 241, in _init_graph_network
    self.inputs, self.outputs)
  File "/home/ryan/.local/lib/python3.6/site-packages/keras/engine/network.py", line 1511, in _map_graph_network
    str(layers_with_complete_input))
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 29), dtype=float32) at layer "input_1". The following previous layers were accessed without issue: []

So here is the part of the code that I'm suspicious of at the moment, given that it breaks at the very start, on the value function's neural net.

def _build_model(self):
    """ Construct TensorFlow graph, including loss function, init op and train op """
    # hid1 layer size is 10x obs_dim, hid3 size is 10, and hid2 is the geometric mean
    # hid3_units = 5  # 5 chosen empirically on 'Hopper-v1'
    # hid2_units = int(np.sqrt(hid1_units * hid3_units))
    # heuristic to set learning rate based on NN size (tuned on 'Hopper-v1')

    obs = keras.layers.Input(shape=(self.obs_dim,))
    # I'm not sure why it won't work with np
    # Initial fully-connected layer that brings obs up to a length that works
    # with the fractal architecture
    obs_input = Dense(int(self.layersizes[0][0].numpy()))(obs)
    obs_input = Activation('relu')(obs_input)
    self.lr = 1e-2 / np.sqrt(self.layersizes[2][0])  # 1e-2 empirically determined
    print('Value Params -- lr: {:.3g}'.format(self.lr))
    outputs = fractal_net(self, bl=self.bl, c=self.c, layersizes=self.layersizes,
                          drop_path=0.15, dropout=self.dropout,
                          deepest=self.deepest)(obs_input)
    model = Model(inputs=obs_input, outputs=outputs)
    optimizer = Adam(self.lr)
    model.compile(optimizer=optimizer, loss='mse')
    return model
These are what the imports look like for all files, generally:

""" State-Value Function

Written by Patrick Coady (pat-coady.github.io)
"""
import keras
from keras import Model
from keras.layers import Input, Dense
from keras.optimizers import Adam
from keras import backend as K
import numpy as np
from fractalnet_regularNN import *
Also, does anyone know whether there is anything in github.com/snf/keras-fractalnet (which I've been modifying to fit github.com/pat-coady/trpo) that is specifically meant for convolutional networks only, rather than regular neural networks? Some of my other results seemed to indicate that the fractal code somehow requires a convolutional base layer, but I'm not sure.

1 Answer


I found the issue. Because I was trying to combine multiple files, I had a `Dense` call to bring `obs_len` up to the desired size and then plugged its output into the fractalNet code. I didn't realize this would break things. I solved the issue by removing the initial `Dense` call and placing it inside the fractalNet code itself.

So, moral of the story: don't try to split the NN layers for one model across separate files. As a side comment: in the current fractalNN code, `fractal_net` is called first and a `Dense` layer comes afterwards, and apparently that still works, but I think reversing that order breaks things. I hope this helps someone else.
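The repaired wiring can be sketched as follows (using `tf.keras`; a plain `Dense` stack stands in for `fractal_net`, since the real builder isn't shown here, and `first_width` is a hypothetical parameter). The key points are that the widening `Dense` layer lives inside the network body, and that `Model` receives the `keras.Input` tensor itself:

```python
from tensorflow import keras

def build_value_model(obs_dim, first_width=64):
    obs = keras.Input(shape=(obs_dim,))
    # The "initial Dense" that widens the observation now sits inside the
    # network body instead of being passed to Model as its input.
    x = keras.layers.Dense(first_width, activation="relu")(obs)
    # Stand-in for fractal_net(...)(x)
    x = keras.layers.Dense(32, activation="relu")(x)
    outputs = keras.layers.Dense(1)(x)
    # inputs must be the Input tensor, not an intermediate activation
    model = keras.Model(inputs=obs, outputs=outputs)
    model.compile(optimizer=keras.optimizers.Adam(1e-3), loss="mse")
    return model
```

With this ordering the graph is connected from `Input` to output, and `Model(...)` builds without the warning or the `ValueError`.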