8
votes

From this link I found that Google Cloud Dataflow uses Docker containers for its workers: Image for Google Cloud Dataflow instances

I see it's possible to find out the image name of the docker container.

But is there a way I can get this Docker container (i.e., which repository do I pull it from?), modify it, and then tell my Dataflow job to use the modified container?

The reason I ask is that we need to install various C++, Fortran, and other libraries in our containers so that the Dataflow jobs can call them, but these installations are very time-consuming, so we don't want to use the "resource" property option in Dataflow.

3
Not technically an answer to your question, but you can probably pull off what you want using Google Cloud Dataproc. Dataproc runs your code with Spark instead of Dataflow, but it accomplishes essentially the same goal of writing a data pipeline. Dataproc also supports custom Docker images. – Jack Edmonds
See issues.apache.org/jira/browse/… for which SDKs allow which kinds of containers. – ron

3 Answers

6
votes

Update for May 2020

Custom containers are only supported within the Beam portability framework.

Pipelines launched within the portability framework currently must pass --experiments=beam_fn_api, either explicitly (a user-provided flag) or implicitly (for example, all Python streaming pipelines pass it).

See the documentation here: https://cloud.google.com/dataflow/docs/guides/using-custom-containers?hl=en#docker

There will be more Dataflow-specific documentation once custom containers are fully supported by the Dataflow runner. For support of custom containers in other Beam runners, see: http://beam.apache.org/documentation/runtime/environments.
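
For reference, launching a Python pipeline against a custom image can look roughly like the sketch below. The project, bucket, and image names are placeholders, and the container-image flag name has varied across SDK versions (newer releases call it --sdk_container_image), so check the documentation linked above for your version.

    # A minimal sketch, assuming the Beam Python SDK and a custom image you have
    # already built and pushed (all project, bucket, and image names below are
    # placeholders, not real values).
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',
        # Opt in to the portability framework.
        '--experiments=beam_fn_api',
        # Flag name used by SDKs of this era; newer SDKs call it --sdk_container_image.
        '--worker_harness_container_image=gcr.io/my-project/my-beam-worker:latest',
    ])

    # Trivial pipeline body just to make the example runnable end to end.
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['hello', 'world']) | beam.Map(print)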


The Docker containers used for the Dataflow workers are currently private and can't be modified or customized.

In fact, they are served from a private Docker repository, so I don't think you're able to pull them onto your machine.

2
votes
1
votes

You can generate a template from your job (see https://cloud.google.com/dataflow/docs/templates/creating-templates for details), then inspect the template file to find the workerHarnessContainerImage used.
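
As a rough sketch with the Python SDK, passing --template_location stages the job as a template file instead of executing it; all project and bucket names below are placeholders.

    # Rough sketch, assuming the Beam Python SDK: --template_location makes the
    # runner stage a template file instead of launching the job.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',
        '--staging_location=gs://my-bucket/staging',
        '--template_location=gs://my-bucket/templates/my-template',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create(['placeholder']) | beam.Map(len)

    # Then download gs://my-bucket/templates/my-template (it is a JSON file)
    # and search it for workerHarnessContainerImage.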

I just created one for a job using the Python SDK, and the image used there is dataflow.gcr.io/v1beta3/python:2.0.0.

Alternatively, you can run a job, then SSH into one of the worker instances and use docker ps to see all running Docker containers. Use docker inspect [container_id] to see more details, such as the volumes bound to the container.