2
votes

The goal is to access BigQuery data in a Google Cloud Platform project from a GCP Compute Engine instance. The way I know how to do this locally is with a GCP JSON credentials file, but I'm reluctant to put JSON credentials on GCP instances.

Can someone point me to documentation or pointers on whether this is possible?

Essentially, I want to follow https://cloud.google.com/bigquery/docs/reference/libraries#client-libraries-install-python while on a GCP VM.

1
I would think Secrets Management would be the route you want to go. - zero298
So I learned you can associate service accounts with GCP compute instances. - Jonathan
The answer provided by Felipe Hoffa can be a good option for your use case. If not, I would set up fine-grained access control for the sensitive data in BQ (IAM roles would help here) and secure the JSON credentials file; more information on keeping the file safe can be found in this link. If the information provided so far doesn't cover your use case, please elaborate on the issue. - rsantiago

1 Answer

1
votes

As your comment says, you can associate service accounts with GCP Compute Engine instances. There's no need to distribute secrets; the credentials are part of the environment.
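As a minimal sketch of what that looks like in practice: on a VM with an attached service account, the BigQuery Python client picks up Application Default Credentials from the instance metadata server, so no key file is passed in. The public-dataset query below is only an illustrative example.

```python
# Runs on a GCP VM whose attached service account has BigQuery access.
# No JSON key file needed: the client library resolves Application
# Default Credentials from the instance metadata server automatically.
from google.cloud import bigquery

# Project is inferred from the VM's environment; you can also pass
# bigquery.Client(project="your-project-id") explicitly.
client = bigquery.Client()

query = """
    SELECT name
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    LIMIT 5
"""
for row in client.query(query):
    print(row.name)
```

The same code works locally with `GOOGLE_APPLICATION_CREDENTIALS` pointing at a key file, which is why nothing changes between environments except where the credentials come from.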

When you create a new instance, you can choose to give it BigQuery access or not:

[Screenshot: the "Identity and API access" section of the instance-creation page, where BigQuery access can be enabled.]
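The same choice can be made from the command line when creating the instance. A hedged sketch, assuming `gcloud` is installed and authenticated; the instance name and zone are placeholders:

```shell
# Create a VM whose attached (default) service account is granted the
# BigQuery access scope. "my-vm" and the zone are placeholder values.
gcloud compute instances create my-vm \
    --zone=us-central1-a \
    --scopes=https://www.googleapis.com/auth/bigquery
```

Note that the effective permissions are the intersection of the instance's access scopes and the IAM roles granted to the attached service account, so the service account also needs an appropriate BigQuery IAM role.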