3
votes

I am new to AWS SageMaker. I am trying to set up a model in SageMaker using Keras with GPU support. The Docker base image used for model inference is shown below:

FROM tensorflow/tensorflow:1.10.0-gpu-py3

RUN apt-get update && apt-get install -y --no-install-recommends nginx curl
...

This is the Keras code I'm using, inside a Flask handler, to check whether Keras can see a GPU:

import flask
import keras
from flask import Flask

app = Flask(__name__)

@app.route('/ping', methods=['GET'])
def ping():
    # Logs the GPUs visible to the TensorFlow backend (a private Keras API)
    keras.backend.tensorflow_backend._get_available_gpus()
    return flask.Response(response='\n', status=200, mimetype='application/json')

When I spin up a SageMaker notebook instance with a GPU, this Keras code shows the available GPUs. So, to access the GPU during the inference phase, do I need to install any additional libraries in the Dockerfile apart from the TensorFlow GPU base image?

Thanks in advance.


1 Answer

4
votes

You shouldn't need to install anything else. Keras relies on TensorFlow for GPU detection and configuration.

The only thing worth noting is how to use multiple GPUs during training. I'd recommend passing gpu_count as a hyperparameter and setting things up like so:

from keras.models import Sequential
from keras.utils import multi_gpu_model

model = Sequential()
model.add(...)
...
# Wrap the model so each training batch is split across the available GPUs
if gpu_count > 1:
    model = multi_gpu_model(model, gpus=gpu_count)
model.compile(...)
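On SageMaker, training hyperparameters are written into the container as strings in /opt/ml/input/config/hyperparameters.json, so gpu_count has to be read and converted there. A minimal sketch, assuming that conventional path and a fallback default of 1 (both the helper name and the default are illustrative):

import json
import os

# SageMaker's conventional location for training hyperparameters;
# every value in this file is passed as a string.
HYPERPARAM_PATH = '/opt/ml/input/config/hyperparameters.json'

def get_gpu_count(path=HYPERPARAM_PATH):
    """Read gpu_count from the hyperparameters file, defaulting to 1."""
    if not os.path.exists(path):
        return 1
    with open(path) as f:
        params = json.load(f)
    return int(params.get('gpu_count', 1))

The int() conversion matters: because SageMaker serializes all hyperparameters as strings, comparing the raw value with > 1 would fail.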