
We're encountering an issue where a dry run (dry_run = True) of a query against a Cloud Bigtable external table errors out when standard SQL is configured.

Is this expected? There's no issue when using legacy SQL, or when dry_run is set to False.

from google.oauth2.service_account import Credentials
from google.cloud.bigquery.client import Client
from google.cloud.bigquery.job import QueryJobConfig

# Authenticate with a service account and create a BigQuery client.
creds = Credentials.from_service_account_file('secrets.json')
client = Client(project='project-id', credentials=creds)

# Dry-run the query using standard SQL.
query = 'select col from table limit 1'
job_config = QueryJobConfig()
job_config.dry_run = True
job_config.use_legacy_sql = False
client.query(query, job_config=job_config)

Error:

BadRequest: 400 POST https://www.googleapis.com/bigquery/v2/projects/project-id/jobs: Error while reading table: table, error message: Error accessing Cloud Bigtable: The API Key and the authentication credential are from different projects.

This code returns a google.cloud.bigquery.job.QueryJob with no error:

job_config = QueryJobConfig()
job_config.dry_run = True
job_config.use_legacy_sql = True
client.query(query, job_config=job_config)

We get the same error when calling the REST endpoint (jobs.insert) directly.
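For reference, this is roughly the request body the client library sends to POST https://www.googleapis.com/bigquery/v2/projects/project-id/jobs for a dry run; a minimal sketch (the helper name is ours, the field names are the BigQuery REST API's):

```python
def build_dry_run_job_body(query, use_legacy_sql):
    """Build a jobs.insert request body for a dry-run query."""
    return {
        'configuration': {
            'dryRun': True,  # maps to job_config.dry_run
            'query': {
                'query': query,
                'useLegacySql': use_legacy_sql,  # maps to job_config.use_legacy_sql
            },
        },
    }

body = build_dry_run_job_body('select col from table limit 1', False)
```

Posting this body with useLegacySql set to False reproduces the same 400 response.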

Hi! I'm with GCP Support, and I see that you had a Support case for this question. If possible, could you share the solution with the community? - amport

1 Answer


Confirmed with GCP Support that this is an internal bug; a fix is estimated to roll out "within a month or two".
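Until the fix lands, one possible stopgap (our own sketch, not from the support case, and it assumes the query is valid in both SQL dialects) is to attempt the standard-SQL dry run and retry with legacy SQL when the 400 comes back:

```python
try:
    from google.api_core.exceptions import BadRequest
except ImportError:  # lets the sketch run without the library installed
    class BadRequest(Exception):
        """Stand-in for google.api_core.exceptions.BadRequest."""

def dry_run_with_fallback(run_dry_query):
    """Attempt a standard-SQL dry run; retry with legacy SQL on a 400.

    `run_dry_query(use_legacy_sql)` should call client.query() with a
    QueryJobConfig where dry_run=True and use_legacy_sql as given.
    """
    try:
        return run_dry_query(use_legacy_sql=False)
    except BadRequest:
        # Known bug: standard-SQL dry runs against Bigtable external
        # tables return 400, so fall back to legacy SQL.
        return run_dry_query(use_legacy_sql=True)
```

The callable indirection keeps the fallback logic separate from the client setup shown in the question.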