I tried setting up a Dataflow streaming job using the "Pub/Sub topic to BigQuery" template. My org has a trusted image projects constraint (`constraints/compute.trustedImageProjects`) in place. According to the documentation on restricting image access (https://cloud.google.com/compute/docs/images/restricting-image-access#limitations), images used by Google Cloud services should not be affected by this constraint. However, the Dataflow workers fail to launch, citing the image constraint as the reason. What is the correct way to configure the image constraint in this scenario?

This is what the error looked like:
{
  insertId: "qnh47fd17tx"
  labels: {
    dataflow.googleapis.com/job_id: "job_id"
    dataflow.googleapis.com/job_name: "job_name"
    dataflow.googleapis.com/region: "us-central1"
  }
  logName: "projects/app/logs/dataflow.googleapis.com%2Fjob-message"
  receiveTimestamp: ""
  resource: {
    labels: {
      job_id: ""
      job_name: ""
      project_id: ""
      region: "us-central1"
      step_id: ""
    }
    type: "dataflow_step"
  }
  severity: "ERROR"
  textPayload: "Workflow failed. Causes: Step "setup_resource_disks_harness50" failed., Step setup_resource_disks_harness50: Set up of resource disks_harness failed, Unable to create data disk(s)., Unknown error in operation 'operation-1600084247324-5af44a52c2574-7f195f5c-376e0b61': [CONDITION_NOT_MET] 'Constraint constraints/compute.trustedImageProjects violated for project getmega-app. Use of images from project dataflow-service-producer-prod is prohibited.'."
  timestamp: ""
}
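
In case it matters, my assumption based on the error message is that the fix would be to explicitly allow the image project named in the error (`dataflow-service-producer-prod`) in the policy, something along the lines of the sketch below (the project ID placeholder is mine, not from the docs):

```shell
# Sketch, assuming the project-level policy needs an explicit allow entry.
# Replace MY_PROJECT_ID with the project running the Dataflow job.

# Inspect the current effective policy for the constraint
gcloud resource-manager org-policies describe \
    compute.trustedImageProjects \
    --project=MY_PROJECT_ID \
    --effective

# Add the Dataflow worker image project to the allowed list
gcloud resource-manager org-policies allow \
    compute.trustedImageProjects \
    projects/dataflow-service-producer-prod \
    --project=MY_PROJECT_ID
```

But the documentation linked above suggests this should not be necessary for service-owned images, so I'd like to understand whether this is the intended approach or a workaround.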