I have a SavedModel (consisting of saved_model.pbtxt and a variables/ directory) which was pre-trained on a single GPU, from this repo: https://github.com/sthalles/deeplab_v3. I'm trying to serve this SavedModel with tensorflow-serving, but it only utilises GPU:0 on a multi-GPU machine. I learned from https://github.com/tensorflow/serving/issues/311 that tensorflow-serving loads the graph with tensorflow, and this model was trained on a single GPU. I tried re-saving the model with the clear_devices=True flag, but it didn't help; the model still ran only on GPU:0.
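Here is roughly how I re-saved it (a minimal sketch; the export path and the toy graph are placeholders standing in for the repo's actual DeepLab export code):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Placeholder export location, not the repo's actual one.
export_dir = os.path.join(tempfile.mkdtemp(), "deeplab_export")

with tf.Graph().as_default():
    # Stand-in for the real DeepLab graph.
    x = tf.placeholder(tf.float32, shape=[None, 3], name="input")
    w = tf.get_variable("w", shape=[3, 1])
    y = tf.matmul(x, w, name="output")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
        # clear_devices=True is supposed to strip explicit device fields
        # from the exported GraphDef, but in my case the served model
        # still ran only on GPU:0.
        builder.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            clear_devices=True,
        )
        # as_text=True writes saved_model.pbtxt instead of the binary pb.
        builder.save(as_text=True)
```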
Then I tried to read the GraphDef in saved_model.pbtxt. From https://www.tensorflow.org/guide/extend/model_files#device I know that the device assigned to a node/operation is defined in its NodeDef.
My problem is that in this saved_model.pbtxt, a device is explicitly assigned only for some operations/nodes, and it is always the CPU (device: "/device:CPU:0"); no GPU is ever explicitly assigned. All the operations that actually execute on the GPU have no device field in their NodeDef at all. I wonder where the device placement information for the GPU operations is saved in the SavedModel, and whether I can change the device info in a graph. Thanks for your help.
For example, in this saved_model.pbtxt a CPU-pinned op is defined as:
node {
  name: "save/RestoreV2/tensor_names"
  op: "Const"
  device: "/device:CPU:0"
  ...
}
While a computation op (which does run on the GPU) has no device field at all:
node {
  name: "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/kernel/Regularizer/l2_regularizer"
  op: "Mul"
  input: "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/kernel/Regularizer/l2_regularizer/scale"
  input: "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/kernel/Regularizer/l2_regularizer/L2Loss"
  attr {
    key: "T"
    value {
      type: DT_FLOAT
    }
  }
  attr {
    key: "_class"
    value {
      list {
        s: "loc:@resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/weights"
      }
    }
  }
  attr {
    key: "_output_shapes"
    value {
      list {
        shape {
        }
      }
    }
  }
}
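For reference, here is a minimal sketch of what I'm considering trying next: rewriting the device: entries in the pbtxt text directly (rewrite_devices is a hypothetical helper of mine, not a TensorFlow API; the proper route would be to parse the GraphDef proto and set node.device). Since the GPU ops have no device field at all, I suspect this alone won't move them, which is why I'm asking where their placement actually comes from.

```python
import re

def rewrite_devices(pbtxt, old, new):
    """Replace every `device: "<old>"` entry in a saved_model.pbtxt string.

    This only touches nodes that carry an explicit device field; nodes
    without one are placed by TensorFlow's placer at load time, so this
    cannot affect them.
    """
    pattern = r'device:\s*"%s"' % re.escape(old)
    return re.sub(pattern, 'device: "%s"' % new, pbtxt)

# Tiny excerpt mimicking one node from my saved_model.pbtxt.
snippet = '''node {
  name: "save/RestoreV2/tensor_names"
  op: "Const"
  device: "/device:CPU:0"
}'''

rewritten = rewrite_devices(snippet, old="/device:CPU:0", new="/device:GPU:1")
print(rewritten)
```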