I am training models with Keras and TensorFlow in Google Cloud Datalab, and I want to run the training jobs on Cloud ML. Is it possible to do this with a %%bash [...]
command?
Let's say my model looks something like this:
import tensorflow as tf
from tensorflow.contrib.layers import fully_connected

# Simple autoencoder: one hidden layer, reconstructing the input
X = tf.placeholder(tf.float32, shape=[None, num_inputs])
hidden = fully_connected(X, num_hidden, activation_fn=None)
outputs = fully_connected(hidden, num_outputs, activation_fn=None)

loss = tf.reduce_mean(tf.square(outputs - X))  # reconstruction error
optimizer = tf.train.AdamOptimizer(learning_rate)
train = optimizer.minimize(loss)

init = tf.global_variables_initializer()
num_steps = 3000

with tf.Session() as sess:
    sess.run(init)
    for iteration in range(num_steps):
        sess.run(train, feed_dict={X: scaled_data})
How can I go about training this in a Cloud ML job?
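From reading the docs, I am guessing the submission would look roughly like the cell below, after packaging the training code as a Python module. The job name, bucket, and trainer package paths are placeholders I made up, not a working setup:

%%bash
# Hypothetical sketch: assumes the training script lives in a package
# at ./trainer with an entry point trainer/task.py. All names below
# (job id, bucket, region) are placeholders.
gcloud ml-engine jobs submit training my_job_001 \
    --module-name trainer.task \
    --package-path ./trainer \
    --region us-central1 \
    --staging-bucket gs://my-bucket

Is this the right general shape, or does Datalab offer a more direct way to hand off a notebook's training code to Cloud ML?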