What are the steps to use the Prometheus exporter for TensorFlow Serving? According to the 1.11 release notes, TF Serving supports Prometheus metrics: https://github.com/tensorflow/serving/releases/tag/1.11.0
I'm starting a Docker container following the example at https://www.tensorflow.org/serving/docker with the following command:
```
docker run -p 8501:8501 -p 8500:8500 \
  --mount type=bind,\
source=/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,\
target=/models/half_plus_two \
  -e MODEL_NAME=half_plus_two -t tensorflow/serving &
```
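For context, the 1.11 release notes mention a `--monitoring_config_file` flag for `tensorflow_model_server`. A sketch of what such a monitoring config might contain (the file name `monitoring_config.txt` is just an example; the contents are a `MonitoringConfig` message in protobuf text format):

```
# monitoring_config.txt -- enables the Prometheus metrics endpoint
prometheus_config {
  enable: true,
  path: "/monitoring/prometheus/metrics"
}
```

This file would need to be mounted into the container and passed to the server via `--monitoring_config_file=<path inside container>` as an extra argument after the image name in the `docker run` command.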
Prometheus configuration file:

```
global:
  scrape_interval: 10s
  evaluation_interval: 10s
  external_labels:
    monitor: 'tf-serving-monitor'

scrape_configs:
  - job_name: 'tensorflow'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:8501']
```
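One detail worth checking: Prometheus scrapes the `/metrics` path by default, while TF Serving (when metrics are enabled via `--monitoring_config_file`) serves them under a custom path on the REST port. Assuming that path is `/monitoring/prometheus/metrics`, the scrape config would need an explicit `metrics_path`:

```
scrape_configs:
  - job_name: 'tensorflow'
    scrape_interval: 5s
    # Prometheus defaults to /metrics; TF Serving's Prometheus endpoint
    # lives at the path configured in the server's monitoring config
    metrics_path: /monitoring/prometheus/metrics
    static_configs:
      - targets: ['localhost:8501']
```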
But Prometheus fails to find the metrics exposed by TF Serving. Is there a specific port I should expose on the Docker container, or some parameter I should pass to TF Serving?