
We developed an application on Google Cloud Platform that uses Cloud Dataflow to write data to BigQuery. I am now trying to set up this application in a new GCP project in another organization.

The problem

I am experiencing this issue:

Workflow failed. Causes: Unable to bring up enough workers: minimum 1, actual 0. Please check your quota and retry later, or please try in a different zone/region.

It happens with two Dataflow templates: 1. one reads from a Pub/Sub topic and writes to another Pub/Sub topic, 2. the other reads from a Pub/Sub topic and writes to BigQuery.

Jobs are created through the Cloud Dataflow API. The templates are pretty standard, with a maximum of 3 workers and the THROUGHPUT_BASED autoscaling mode.
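For context, launching a job from a template through the API boils down to a `projects.locations.templates.launch` call with a `RuntimeEnvironment`. The sketch below is hypothetical (project, bucket, and parameter names are placeholders, and the `inputTopic` parameter is just an example of a template parameter), but `maxWorkers`, `machineType`, and `tempLocation` are real fields of the API's `RuntimeEnvironment`:

```python
# Hypothetical sketch of the request body sent to the Dataflow REST API's
# templates.launch method. All identifiers are placeholders.
def build_launch_request(project_id: str, job_name: str) -> dict:
    """Build a templates.launch request body with a 3-worker cap."""
    return {
        "jobName": job_name,
        "environment": {
            "maxWorkers": 3,                 # cap autoscaling at 3 workers
            "machineType": "n1-standard-1",  # example machine type
            "tempLocation": f"gs://{project_id}-dataflow/temp",
        },
        # Template-specific parameters, e.g. the input Pub/Sub topic.
        "parameters": {
            "inputTopic": f"projects/{project_id}/topics/input",
        },
    }

body = build_launch_request("my-project", "test-job")
print(body["environment"]["maxWorkers"])  # → 3
```

The THROUGHPUT_BASED autoscaling mode itself is typically baked into the template when the pipeline is staged, rather than passed at launch time.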

As suggested in similar questions, I checked the Compute Engine quotas, which are far from exceeded. I also changed the region and the machine type; the problem still happens. The Compute Engine and Dataflow APIs are enabled.

The question

As it works in projects in another organization, I believe the issue comes from specific restrictions in this GCP organization. Is that possible? What other points should I check to make it work?

Is a billing account set on this project? - guillaume blaquiere
@guillaumeblaquiere, yes a billing account is set on the project. - Grégoire G.
If you have already setup your billing and quotas correctly, this is highly unusual. I suggest contacting Google Cloud support who should be able to help you regarding this specific request. - chamikara

1 Answer


After multiple tests, we managed to make it work properly.

It was indeed not a problem with regions or machine types, though most of the related Stack Overflow threads suggest starting there.

It was in fact caused by a restriction on external IP addresses enforced through a GCP organization policy. As pointed out in this question, the standard Dataflow configuration requires workers to have external IP addresses.
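When the organization policy cannot be relaxed, a common workaround is to launch the workers without public IPs on a subnetwork that has Private Google Access enabled. A minimal sketch of the launch environment, assuming placeholder project and subnetwork names; `ipConfiguration` and `subnetwork` are real `RuntimeEnvironment` fields of the Dataflow templates API:

```python
# Sketch of the workaround: run Dataflow workers without external IP addresses
# so an org policy restricting external IPs no longer blocks worker startup.
# The subnetwork referenced here must have Private Google Access enabled.
def private_ip_environment(subnetwork_url: str) -> dict:
    """RuntimeEnvironment for a launch with private worker IPs only."""
    return {
        "maxWorkers": 3,
        "ipConfiguration": "WORKER_IP_PRIVATE",  # no external IPs on workers
        "subnetwork": subnetwork_url,            # must allow Private Google Access
    }

env = private_ip_environment(
    "https://www.googleapis.com/compute/v1/projects/my-project"
    "/regions/europe-west1/subnetworks/my-subnet"
)
print(env["ipConfiguration"])  # → WORKER_IP_PRIVATE
```

Equivalently, the `gcloud dataflow jobs run` command accepts a `--disable-public-ips` flag for the same effect.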