I can't seem to find any documentation about this. I have an Apache Beam pipeline that takes some information, formats it into TableRows, and then writes to BigQuery.
The problem:
The rows are not written to BigQuery until the Dataflow job finishes. If the job takes a long time, I'd like to be able to see the rows being inserted into BigQuery while it is still running. Can anybody point me in the right direction?
Thanks in advance
Try switching to streaming mode (the `--streaming` flag, or the corresponding pipeline option for the Dataflow runner) and add triggering (e.g. `GlobalWindow` + trigger every xx seconds). – Anton
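A minimal sketch of that suggestion in the Beam Java SDK (assuming your pipeline already produces a `PCollection<TableRow>`; the table spec, the 60-second interval, and the input placeholder are hypothetical, not from the question):

```java
// Sketch: run in streaming mode and re-window into the global window with a
// repeated processing-time trigger, so BigQueryIO writes panes periodically
// instead of once at job completion. Requires the Beam SDK and the GCP IO
// module on the classpath.
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.GlobalWindows;
import org.apache.beam.sdk.transforms.windowing.Repeatedly;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;
import com.google.api.services.bigquery.model.TableRow;

public class StreamingWriteSketch {
  public static void main(String[] args) {
    StreamingOptions options =
        PipelineOptionsFactory.fromArgs(args).as(StreamingOptions.class);
    options.setStreaming(true); // same effect as passing --streaming

    Pipeline p = Pipeline.create(options);

    // Placeholder: your existing transforms that produce TableRows go here.
    PCollection<TableRow> rows = null;

    rows.apply(
            Window.<TableRow>into(new GlobalWindows())
                .triggering(
                    Repeatedly.forever(
                        AfterProcessingTime.pastFirstElementInPane()
                            .plusDelayOf(Duration.standardSeconds(60))))
                .discardingFiredPanes())
        .apply(
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table") // hypothetical table spec
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));

    p.run();
  }
}
```

The key idea is that in streaming mode BigQueryIO uses streaming inserts, so each fired pane becomes visible in the table shortly after the trigger fires rather than at the end of the job.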