I would like to train an InceptionV3 neural network from scratch. I already have a working implementation that uses this TensorFlow Hub module: https://tfhub.dev/google/imagenet/inception_v3/feature_vector/1 and performs fine-tuning starting from the included pre-trained weights.
I would now like to use the same TensorFlow Hub module but discard the provided weights and apply my own kernel initializer (e.g. tf.initializers.truncated_normal, tf.initializers.he_normal, etc.).
How can I make the trainable variables in the TFHub module use a custom initializer? To be clear: I want to replace the pre-trained weights at runtime and keep only the model architecture. Please also let me know if I should really be using TF-Slim or the TensorFlow model zoo for this instead.
Here is what I have so far:
import tensorflow as tf
import tensorflow_hub as hub

tfhub_module_url = 'https://tfhub.dev/google/imagenet/inception_v3/feature_vector/1'
# The initializer I would like to use (tf.truncated_normal is the random op,
# not an initializer, so I reference the initializer here instead):
initializer = tf.initializers.truncated_normal

def main(_):
    _graph = tf.Graph()
    with _graph.as_default():
        module_spec = hub.load_module_spec(tfhub_module_url)
        height, width = hub.get_expected_image_size(module_spec)
        resized_input_tensor = tf.placeholder(
            tf.float32, [None, height, width, 3], name='resized_input_tensor')
        m = hub.Module(module_spec, trainable=True)
        bottleneck_tensor = m(resized_input_tensor)
        trainable_vars = tf.trainable_variables()
        # TODO: This fails, because this probably isn't how it is supposed to be done:
        for trainable_var in trainable_vars:
            trainable_var.initializer = tf.initializers.he_normal
    with tf.Session(graph=_graph) as sess:
        print(trainable_vars)

if __name__ == '__main__':
    tf.logging.set_verbosity(tf.logging.INFO)
    tf.app.run()
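One idea I have considered (but am not sure is idiomatic) is to leave the module's variables alone at graph-construction time and instead overwrite their values inside the session, e.g. by computing fresh initial values outside the graph and loading them with `var.load(values, sess)`. The helper below is a pure-NumPy sketch of the per-variable sampling step; the function name `he_normal_sample` and the fan-in convention (product of all dimensions except the last, matching HWIO conv kernels) are my own assumptions, not anything from TF Hub.

```python
import numpy as np

def he_normal_sample(shape, rng=None):
    """Draw He-scaled (std = sqrt(2 / fan_in)) samples, truncated at 2 stddev.

    Hypothetical helper: mimics tf.initializers.he_normal /
    tf.initializers.truncated_normal behavior in plain NumPy.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Assumed fan-in convention: all dims except the last (HWIO conv kernels).
    fan_in = int(np.prod(shape[:-1])) if len(shape) > 1 else int(shape[0])
    stddev = np.sqrt(2.0 / fan_in)
    values = rng.standard_normal(shape) * stddev
    # Re-sample anything outside +/- 2 stddev, as truncated_normal does.
    mask = np.abs(values) > 2.0 * stddev
    while mask.any():
        values[mask] = rng.standard_normal(int(mask.sum())) * stddev
        mask = np.abs(values) > 2.0 * stddev
    return values

# Inside the TF1 session, one would then (untested) do something like:
#   for var in tf.trainable_variables():
#       var.load(he_normal_sample(tuple(var.shape.as_list())), sess)
```

I have not verified that `var.load` plays well with the hub.Module variables, which is part of why I am asking.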
What is the correct way of doing this?