I'm new to Google Dataflow. I have two Dataflow pipelines that execute two different jobs: one is an ETL process that loads data into BigQuery, and the other reads from BigQuery and aggregates the data for reports. I want the ETL pipeline to run first, and only after it completes should the report pipeline run, so the report always sees the latest data in BigQuery.
I tried to run both in one pipeline, but that didn't work. Right now I have to run the ETL pipeline manually first, and then run the report pipeline.
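To illustrate, the sequencing I am after looks roughly like the sketch below. The helper names are hypothetical placeholders for my two pipelines; my understanding is that with the Beam Python SDK each one would build a `Pipeline` and block on `p.run().wait_until_finish()` before returning, so the second job only starts after the first finishes:

```python
# Hypothetical orchestration sketch: run the two Dataflow jobs in order,
# so reports only read BigQuery after the ETL load has completed.

def run_etl():
    # Placeholder for the ETL pipeline. In a real Beam program this would
    # build the pipeline and call p.run().wait_until_finish(), which
    # blocks until the Dataflow job completes.
    return "etl-done"

def run_reports():
    # Placeholder for the report/aggregation pipeline, launched only
    # after run_etl() has returned (i.e. BigQuery is up to date).
    return "reports-done"

def main():
    etl_status = run_etl()        # step 1: load BigQuery
    report_status = run_reports() # step 2: aggregate fresh data
    return etl_status, report_status

print(main())
```

This is only a sketch of the ordering I want, not a working Dataflow program.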
Can anybody give me some advice on running these two jobs in one pipeline? Thanks.