
From time to time we get an error from Google while working with the BigQuery API:

      File "../.venv/lib/python3.4/site-packages/google/cloud/bigquery/dataset.py", line 452, in exists
        query_params={'fields': 'id'})
      File "../.venv/lib/python3.4/site-packages/google/cloud/_http.py", line 293, in api_request
        raise exceptions.from_http_response(response)
    google.cloud.exceptions.ServiceUnavailable: 503 GET https://www.googleapis.com/bigquery/v2/projects//datasets/?fields=id: Error encountered during execution. Retrying may solve the problem.

It is caused by this Python code:

   destination_dataset.exists()

Our system:

  • Python 3.4
  • google-cloud-bigquery==0.26.0
  • google-cloud==0.27.0

Most recent occurrences of the error:

  • 6.02.18 at 10:31pm CET
  • 4.02.18 at 6:30am CET
  • 25.01.19 at 8:50pm CET
  • 25.01.19 at 10:34pm CET

Any ideas why this happens and how we can avoid the error?

You cannot avoid errors like these. BigQuery's SLA (like that of most other public cloud services) is something like 99.99%, not 100%. You need to account for these types of errors in your code by using strategies like exponential back-off-and-retry, e.g. cloud.google.com/storage/docs/exponential-backoff (GCS). – Graham Polley

1 Answer


Adding documentation links here to complement the excellent comment by Graham Polley: see "Truncated Exponential Backoff", and the "Exponential Backoff" sub-chapter of the "Loading Data into BigQuery from a Local Data Source" documentation page.
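As a rough illustration of the pattern those pages describe, here is a minimal, generic retry helper with truncated exponential backoff and jitter. It is a sketch, not part of the google-cloud library; the helper name and parameters are my own, and in your code you would wrap the call that raises `ServiceUnavailable` (e.g. `destination_dataset.exists`):

```python
import random
import time


def retry_with_backoff(func, max_retries=5, base_delay=1.0, max_delay=32.0):
    """Call func(), retrying on exception with truncated exponential backoff.

    Waits base_delay * 2**attempt seconds (plus random jitter), capped at
    max_delay, between attempts. Re-raises the last exception if all
    max_retries attempts fail.
    """
    for attempt in range(max_retries):
        try:
            return func()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: propagate the error
            delay = min(base_delay * (2 ** attempt) + random.random(), max_delay)
            time.sleep(delay)


# Hypothetical usage against the question's code:
#   exists = retry_with_backoff(destination_dataset.exists)
```

In production you would likely catch only transient errors (e.g. `google.cloud.exceptions.ServiceUnavailable`) rather than bare `Exception`, so that permanent failures such as 403s fail fast.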