I'm using Google Dataflow with templates: the template is deployed to GCS by the CI (Continuous Integration) server, and later a `gcloud dataflow jobs run` command is used to start a batch job from that template. Now, from within the pipeline itself, I would like to know the start time of this exact job (to use in the names of the output files).
Is this kind of introspection possible in Beam/Dataflow? Is it possible to get the job name and start time of the job from within the job itself (that is, in the code that executes on the Dataflow worker VMs)?
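For context, here is a minimal sketch of why the obvious approach doesn't work for me (the bucket path is a placeholder). Any value computed in the driver program is evaluated once, when the CI server builds the template, so a timestamp taken there gets baked into the template rather than reflecting when the job actually runs:

```python
from datetime import datetime, timezone

# Evaluated at template *build* time on the CI server,
# NOT at job *run* time -- so every job launched from this
# template would reuse the same stale timestamp.
start = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")

# gs://my-bucket is a placeholder for my real output location.
output_prefix = f"gs://my-bucket/output/{start}/part"
```

What I'm after is the equivalent of `start` above, but resolved when the job is launched (or inside the running job), not when the template is built.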
Thank you!