4
votes

This is very similar to this question -- I have basically the same question, and one of the commenters there said he had solved it but didn't flesh out the solution.

  • I have data in a google sheet, which I have set up as a google BigQuery federated data source (external data source).
  • I want to import this data in cloud datalab using pandas_gbq
  • When I import ordinary (non-federated) BigQuery tables from the same project in the datalab instance, everything works fine.
  • When I try import the federated source (from the google sheet) I get the error: "GenericGBQException: Reason: accessDenied, Message: Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found."

I have tried to follow mattm's instructions on the original question post: "I had the exact same issue and I had to enable both the Drive API for the project (in addition to BigQuery API), as well as use the BigQuery+Drive scopes. I also had to then manually permission the sheets to allow access to the [email protected] account my project used to access the Google Sheets with."

  • I have enabled drive API for the project.
  • I added the service account that is used in the datalab instance to the sharing permissions list in the google sheet: [email protected]
  • I am not sure how to "use the BigQuery+Drive scopes"?

Has anyone else managed to get this working? Is it a matter of using BigQuery and Drive scopes, and if so, how do I do that?

1 Answer

3
votes

When building your request, make sure your credentials carry the following two scopes: 'https://www.googleapis.com/auth/drive.readonly' and 'https://www.googleapis.com/auth/cloud-platform'.
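As a concrete illustration, here is a minimal sketch of passing those two scopes to pandas_gbq via a service-account credential. The key-file path, project ID, and table name are placeholders, and the `read_sheet_table` helper is hypothetical — adapt it to your own project (in a Datalab instance you may instead rely on the default application credentials, provided they were created with the Drive scope).

```python
# The two scopes the answer refers to: Drive (to read the backing Sheet)
# and cloud-platform (which covers BigQuery).
SCOPES = [
    "https://www.googleapis.com/auth/drive.readonly",
    "https://www.googleapis.com/auth/cloud-platform",
]


def read_sheet_table(key_path, project_id, query):
    """Hypothetical helper: query a Sheets-backed BigQuery table with pandas_gbq.

    key_path, project_id, and query are placeholders you supply; the service
    account in key_path must also be granted view access on the Sheet itself.
    """
    # Imports kept local so the scope list above can be reused independently.
    from google.oauth2 import service_account
    import pandas_gbq

    # Build credentials that explicitly include the Drive scope; default
    # credentials often carry only the BigQuery/cloud-platform scope, which
    # produces the "No OAuth token with Google Drive scope" error.
    credentials = service_account.Credentials.from_service_account_file(
        key_path, scopes=SCOPES
    )
    return pandas_gbq.read_gbq(
        query, project_id=project_id, credentials=credentials
    )
```

Example call (placeholder names): `read_sheet_table("key.json", "my-project", "SELECT * FROM `my_dataset.my_sheet_table`")`.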