We are using Dataflow to read from a set of Pub/Sub topics and write the data to BigQuery. We currently run one Dataflow job per topic, each writing to its corresponding BigQuery table. Is it possible to do this with a single Dataflow job?
I see documentation about merging multiple sources into one output here: https://cloud.google.com/dataflow/pipelines/design-principles?hl=en#multiple-sources
Is there anything preventing me from simply building multiple independent "basic" pipelines within the same Dataflow job, each following the basic flow described here: https://cloud.google.com/dataflow/pipelines/design-principles?hl=en#a-basic-pipeline
The documentation and my understanding of the code imply this can be done, but I'd like to be sure before I embark on the effort.
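
For concreteness, this is roughly what I have in mind: a single job whose pipeline contains several independent read-transform-write branches, one per topic/table pair. This is just a sketch using the Apache Beam Java SDK (which superseded the Dataflow SDK the linked docs describe); the topic names, table names, and the trivial payload mapping are placeholders for our actual setup:

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class MultiTopicPipeline {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);

    // Placeholder topic -> table routes; in practice these would come from config.
    String[][] routes = {
      {"projects/my-project/topics/topic-a", "my-project:my_dataset.table_a"},
      {"projects/my-project/topics/topic-b", "my-project:my_dataset.table_b"},
    };

    // Each iteration adds an independent branch to the same pipeline graph,
    // so one job runs all topic-to-table flows side by side.
    for (String[] route : routes) {
      p.apply("Read " + route[0], PubsubIO.readStrings().fromTopic(route[0]))
       .apply("ToTableRow " + route[0],
              MapElements.into(TypeDescriptor.of(TableRow.class))
                         // Trivial mapping for illustration; real code would parse the message.
                         .via((String msg) -> new TableRow().set("payload", msg)))
       .apply("Write " + route[1],
              BigQueryIO.writeTableRows()
                        .to(route[1])
                        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
    }

    p.run();
  }
}
```

The key point I'm relying on is that the branches never need to be joined; each read/write pair just hangs off the same Pipeline object.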