
I am able to export data from Google BigQuery into Google Cloud Storage using the Python client library, but when I try to do the same with the Cloud SDK I get the error below.

"Access Denied: BigQuery BigQuery: Permission denied while writing data"

This is the CLI command I'm using:

bq --location=US extract  --compression GZIP "project_id:dataset_d.table_if" gs://bucket_name/2019-03-31/temporary_table*.csv

In addition, I was not able to get the proper status of my job from the Python client library: it only reports "DONE", whether the job succeeded or failed. The bq tool gives a clearer status response, which is why I'm switching to it.
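For what it's worth, the Python client does expose the failure information, just not in the state field: a finished job is always "DONE", and a failed one additionally carries a non-empty error_result. A minimal sketch of telling the two apart (the project, dataset, table, and bucket names in the commented usage are the placeholders from the question, not real resources):

```python
def job_outcome(job):
    """Map a BigQuery job object to 'SUCCESS', 'FAILURE', or 'RUNNING'.

    job.state is 'DONE' for both successful and failed jobs; only
    job.error_result distinguishes them.
    """
    if job.state != "DONE":
        return "RUNNING"
    return "FAILURE" if job.error_result else "SUCCESS"

# Usage with the client (assumes google-cloud-bigquery is installed):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   job = client.extract_table(
#       "project_id.dataset_d.table_if",
#       "gs://bucket_name/2019-03-31/temporary_table*.csv",
#       location="US",
#   )
#   job.result()             # blocks until the job finishes
#   print(job_outcome(job))  # 'SUCCESS' or 'FAILURE'
```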

Have you checked that you have proper permission to write? - Alan Williams
@Qilliams I have the right set of permissions; in fact, I'm able to perform the same operation with the Python client libraries. - sam
I can reproduce the same error message when I use a "bucket_name" that doesn't exist. - Yun Zhang
@Zhang, the bucket exists in Cloud Storage, and I'm using the same name in the Python code, where it works fine. - sam

1 Answer


This sounds like there may be a difference in the credentials between the two tools.

In the BQ CLI (part of the Cloud SDK), you configure credentials with the gcloud auth commands. For example, gcloud auth list enumerates the credentials you've set up for working with the Cloud SDK. More info on the SDK commands can be found in the gcloud reference docs.

In contrast, Python client library users typically set up Application Default Credentials (ADC), usually by pointing the GOOGLE_APPLICATION_CREDENTIALS environment variable at a credential file. More info on this can be found in the authentication docs.
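One quick way to compare the two sides: gcloud auth list shows what the bq tool will use, while a small check like the one below shows where ADC (and therefore the Python client) will get its credentials from. This is a sketch of the ADC lookup order, not the library's internal logic:

```python
import os

def adc_source():
    """Report where Application Default Credentials would come from.

    If GOOGLE_APPLICATION_CREDENTIALS is set, ADC uses that key file;
    otherwise it falls back to the gcloud-managed well-known file or,
    on GCP, the metadata server.
    """
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if path:
        return f"service-account key file: {path}"
    return "gcloud well-known file or metadata server"

# To inspect the credentials the Python client actually resolves
# (assumes google-auth is installed):
#
#   import google.auth
#   credentials, project = google.auth.default()
```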

Once the credentials between the two are consistent I would not expect to see the kind of difference in behavior that you're describing.