1 vote

Hello Everyone,

I am working on an image classification problem using TensorFlow and a convolutional neural network. My model has the following layers:

  • Input image of size 2456x2058
  • 3 convolution layers {Conv1 shape (10,10,1,32); Conv2 shape (5,5,32,64); Conv3 shape (5,5,64,64)}
  • 3 max-pool 2x2 layers
  • 1 fully connected layer.

I have tried using the nvidia-smi tool, but it only shows me the GPU memory consumption while the model is running.
I would like to know if there is a method or a way to estimate the memory before running the model on the GPU, so that I can design models with the available memory in mind.
I have tried using this method for estimation, but my calculated memory and the observed memory utilisation are nowhere near each other.
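For a rough pre-run estimate, you can count the floats the network has to hold (weights plus per-layer activations) and multiply by 4 bytes for float32. The sketch below does this for the layer shapes listed above; it assumes 'SAME' padding, stride-1 convolutions, and 2x2 pooling, and it omits the fully connected layer since its size isn't given. Treat the result as a lower bound: the actual framework adds workspace buffers, cuDNN scratch space, and gradient storage on top of this.

```python
# Back-of-the-envelope GPU memory estimate for the model described above.
# Assumptions: float32 everywhere, 'SAME' padding, stride 1, 2x2 max pooling,
# batch size 1, fully connected layer omitted (its size is unspecified).

BYTES_PER_FLOAT = 4

def conv_params(kh, kw, c_in, c_out):
    """Number of floats in one conv layer's weights + biases."""
    return kh * kw * c_in * c_out + c_out

def estimate_bytes(h, w, batch=1):
    # (kernel_h, kernel_w, channels_in, channels_out) for the three conv layers
    convs = [(10, 10, 1, 32), (5, 5, 32, 64), (5, 5, 64, 64)]
    params = 0
    activations = h * w * 1          # the input image itself
    for kh, kw, c_in, c_out in convs:
        params += conv_params(kh, kw, c_in, c_out)
        activations += h * w * c_out  # conv output ('SAME' keeps h, w)
        h, w = h // 2, w // 2         # 2x2 max pool halves each dimension
        activations += h * w * c_out  # pooled output
    return (params + batch * activations) * BYTES_PER_FLOAT

print(estimate_bytes(2456, 2058) / 2**20, "MiB")
```

For this model the activations dominate by far (hundreds of MiB for a single image at 2456x2058, versus well under 1 MiB of weights), which is one reason a hand calculation can diverge sharply from what nvidia-smi reports.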

Thank you all for your time.


3 Answers

1 vote

As far as I understand, when you open a session with tensorflow-gpu, it allocates all the memory of the GPUs that are available. So when you look at the nvidia-smi output, you will always see the same amount of used memory, even if the model actually uses only part of it. There are options when opening a session to force TensorFlow to allocate only part of the available memory (see How to prevent tensorflow from allocating the totality of a GPU memory? for instance).
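One such option (in the TF 1.x API used throughout these answers) is allow_growth, which tells TensorFlow to start with a small allocation and grow it on demand, so nvidia-smi then reflects something closer to actual usage. A minimal sketch:

```python
import tensorflow as tf  # TF 1.x API, as in the answers here

config = tf.ConfigProto()
config.gpu_options.allow_growth = True  # allocate on demand instead of grabbing everything
session = tf.Session(config=config)
```

Note that with allow_growth the allocation only grows; memory is not returned to the system while the process lives, so nvidia-smi shows the high-water mark rather than instantaneous usage.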

0 votes

You can control the memory allocation of the GPU in TensorFlow. Once you have calculated the memory requirements of your deep learning model, you can use tf.GPUOptions.

For example, if you want to allocate roughly 40% of the GPU memory (about 3.2 GB out of 8 GB):

config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.4  # cap at ~40% of GPU memory
session = tf.Session(config=config, ...)

Once that is done, pass it to tf.Session via the config parameter.

The per_process_gpu_memory_fraction option bounds the fraction of the total GPU memory that the process is allowed to allocate.

Here's the link to the documentation:

https://www.tensorflow.org/tutorials/using_gpu

0 votes

NVIDIA-SMI ... shows me the GPU memory consumption as the model runs

TF preallocates all available memory when you use it, so nvidia-smi will show nearly 100% memory usage ...

but my calculated memory and observed memory utilisation are nowhere near each other.

... so this is unsurprising.