I am a newbie to AWS SageMaker. I am trying to set up a model in SageMaker using Keras with GPU support. The Docker base image used for model inference is given below:
FROM tensorflow/tensorflow:1.10.0-gpu-py3
RUN apt-get update && apt-get install -y --no-install-recommends nginx curl
...
This is the Keras code I'm using in the Flask app to check whether a GPU is visible to Keras:
import flask
import keras

app = flask.Flask(__name__)

@app.route('/ping', methods=['GET'])
def ping():
    # Log the GPUs that Keras/TensorFlow can see inside the container
    print(keras.backend.tensorflow_backend._get_available_gpus())
    return flask.Response(response='\n', status=200, mimetype='application/json')
When I spin up a SageMaker notebook instance on a GPU instance type, the Keras code above lists the available GPUs. So, in order to access the GPU during the inference phase (in the model container), do I need to install any additional libraries in the Dockerfile beyond the TensorFlow GPU base image?
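For reference, this is roughly the startup check I was planning to add to see what the container can actually use (a minimal sketch, assuming the TF 1.10 base image above; log_visible_devices is just a name I made up):

from tensorflow.python.client import device_lib

def log_visible_devices():
    # List every device TensorFlow can see inside the container;
    # a working GPU setup should include a '/device:GPU:0' entry.
    for d in device_lib.list_local_devices():
        print(d.name, d.device_type)

if __name__ == '__main__':
    log_visible_devices()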
Thanks in advance.