I created a service account and assigned it the Dataflow Developer and Compute Viewer roles on the project, plus the Storage Object Admin role on the temporary bucket (role bindings shown in the sketch after the command below). Then, with my admin user account (which has the project Owner role), I created another bucket named gs://outputbucket. Finally, I submitted a Dataflow job with the following command:
export GOOGLE_APPLICATION_CREDENTIALS=<path-to-credential>
TMPBUCKET=temporarybucket
OUTBUCKET=outputbucket
PROJECT=myprojectid
python -m apache_beam.examples.wordcount \
--input gs://dataflow-samples/shakespeare/kinglear.txt \
--output gs://$OUTBUCKET/wordcount/outputs \
--runner DataflowRunner \
--project $PROJECT \
--temp_location gs://$TMPBUCKET/tmp/
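For reference, the role bindings were created roughly like this (the service account email is a placeholder for my actual one):
SA=dataflow-runner@$PROJECT.iam.gserviceaccount.com
# Project-level roles for running Dataflow jobs and viewing Compute resources
gcloud projects add-iam-policy-binding $PROJECT \
  --member="serviceAccount:$SA" --role="roles/dataflow.developer"
gcloud projects add-iam-policy-binding $PROJECT \
  --member="serviceAccount:$SA" --role="roles/compute.viewer"
# Storage Object Admin only on the temporary bucket
gsutil iam ch serviceAccount:$SA:roles/storage.objectAdmin gs://$TMPBUCKET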
Similarly, I created a Dataflow job from the existing Cloud Pub/Sub to BigQuery template (roughly equivalent to the gcloud command below), and it can write to any BigQuery table in the same project even though I never granted the service account any BigQuery permission. Can anyone explain how this is possible?
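This is roughly how the template job was launched (the topic and table names are placeholders for my actual ones):
gcloud dataflow jobs run ps-to-bq-test \
  --gcs-location gs://dataflow-templates/latest/PubSub_to_BigQuery \
  --parameters inputTopic=projects/$PROJECT/topics/mytopic,outputTableSpec=$PROJECT:mydataset.mytable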
Furthermore, is this a potential security issue with respect to Google's Principle of Least Privilege?