I'm implementing a Cloud Dataflow job on GCP that needs to work with two GCP projects. Both the input and the output are partitioned BigQuery tables. The problem is that the job must read data from project A and write it into project B.
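For context, here is a minimal sketch of the pipeline I have in mind, using the Beam Python SDK; all project, dataset, table, and bucket names are placeholders:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All names below are placeholders for my real resources.
options = PipelineOptions(
    runner="DataflowRunner",
    project="project-b",                 # project the Dataflow job runs in
    region="europe-west1",
    temp_location="gs://my-bucket/tmp",  # staging/temp bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Source: partitioned table in project A.
        | "ReadFromProjectA" >> beam.io.ReadFromBigQuery(
            table="project-a:dataset_a.source_table")
        # Sink: partitioned table in project B.
        | "WriteToProjectB" >> beam.io.WriteToBigQuery(
            "project-b:dataset_b.dest_table",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```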
I haven't found anything about cross-project service accounts, and I can't give Dataflow two different credential keys either, which is a bit annoying. Has anyone else dealt with this kind of architecture, and if so, how did you handle it?
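For reference, the only credential knob I've found so far is the single worker service account on the job's pipeline options (the email below is a placeholder):

```python
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions()
# Dataflow appears to accept exactly one worker service account;
# I can't find any per-source or per-sink credential option.
options.view_as(GoogleCloudOptions).service_account_email = (
    "dataflow-worker@project-b.iam.gserviceaccount.com"  # placeholder
)
```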