I have been attempting to run an Apache Beam job on Dataflow, but GCP rejects it with the following error:
The job graph is too large. Please try again with a smaller job graph, or split your job into two or more smaller jobs.
I have run jobs with larger graphs in the past without problems, and this job runs fine locally with the DirectRunner. The graph has about 12 nodes, including a read-from-BigQuery step, a WriteToText step, and a CoGroupByKey step.
Is there a way to increase the graph size Dataflow is willing to accept?