Please help me. I am using TensorFlow 2.0 (GPU). I train the model and save it in .h5 format:
# assuming imports along these lines (in TF 2.0 CuDNNLSTM is only reachable via compat.v1):
# from tensorflow import keras
# from tensorflow.compat.v1.keras import layers

model = keras.Sequential()
model.add(layers.Bidirectional(layers.CuDNNLSTM(self._window_size, return_sequences=True),
                               input_shape=(self._window_size, x_train.shape[-1])))
model.add(layers.Dropout(rate=self._dropout, seed=self._seed))
model.add(layers.Bidirectional(layers.CuDNNLSTM(self._window_size * 2, return_sequences=True)))
model.add(layers.Dropout(rate=self._dropout, seed=self._seed))
model.add(layers.Bidirectional(layers.CuDNNLSTM(self._window_size, return_sequences=False)))
model.add(layers.Dense(units=1))
model.add(layers.Activation('linear'))
model.summary()
model.compile(
    loss='mean_squared_error',
    optimizer='adam'
)
# train the model
history = model.fit(
    x_train,
    y_train,
    epochs=self._epochs,
    batch_size=self._batch_size,
    shuffle=False,
    validation_split=0.1
)
model.save('rts.h5')
Then I load this model and use it for forecasting, and everything works:
model = keras.models.load_model('rts.h5')
y_hat = model.predict(x_test)
Now I want to use the trained model with TensorFlow Serving, but a model in .h5 format is not accepted. I run:
sudo docker run --gpus 1 -p 8501:8501 --mount type=bind,source=/home/alex/PycharmProjects/TensorflowServingTestData/RtsModel,target=/models/rts_model -e MODEL_NAME=rts_model -t tensorflow/serving:latest-gpu
And I get the error:
tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:267] No versions of servable rts_model found under base path /models/rts_model
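(As far as I understand, Serving looks for a numbered version subdirectory under the base path, roughly like this; the version number 1 is just an example:)
    /home/alex/PycharmProjects/TensorflowServingTestData/RtsModel/
        1/
            saved_model.pb
            variables/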
So I tried to save the trained model in the SavedModel format as described here: https://www.tensorflow.org/guide/saved_model#using_savedmodel_with_estimators
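Roughly like this (the relevant call for a Keras model, as I understand the guide; the export path 'saved_model/1/' is just for illustration):

    import tensorflow as tf

    # export the trained Keras model as a SavedModel for Serving
    tf.saved_model.save(model, 'saved_model/1/')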
And I get the error:
ValueError: Layer has 2 states but was passed 0 initial states.
I also tried to save the model with tf.keras.models.save_model, https://www.tensorflow.org/api_docs/python/tf/keras/models/save_model:
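Roughly like this (the path and save_format='tf' reflect my understanding of the recommended usage):

    from tensorflow import keras

    # save in the TF SavedModel format instead of .h5
    keras.models.save_model(model, 'saved_model/1/', save_format='tf')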
And I get the same error:
ValueError: Layer has 2 states but was passed 0 initial states.
The only thing that works for saving the model in a format TensorFlow Serving accepts is:
keras.experimental.export_saved_model(model, 'saved_model/1/')
The saved model works in Serving, but I get a warning that this method is deprecated and will be removed in a future version:
Instructions for updating:
Please use `model.save(..., save_format="tf")` or `tf.keras.models.save_model(..., save_format="tf")`.
So I am going in circles: when I try the recommended methods, I get an error; when I use the one method that works, it tells me it is deprecated.
Please help: how do I save a trained model in TensorFlow 2.0 so that it can be used with TensorFlow Serving?