While playing with the BigQuery API, I noticed that fetching a big table from my local machine is rather time-consuming. That raised the question: can I make it faster by deploying my project to GCP?

I looked through the GCP documentation and haven't found any special treatment of BigQuery for the case where the app is deployed on GCP:

https://cloud.google.com/bigquery/docs/quickstarts/quickstart-client-libraries

https://cloud.google.com/bigquery/docs/query-overview

Is it correct that BigQuery doesn't work any faster, and can't be made to work faster, when the client application is deployed on GCP?

My concern is that I currently pull data out of BigQuery over HTTP requests to a remote host, whereas GCP might be able to use other protocols or techniques for data delivery when the BigQuery host is local to it.
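
For reference, this is roughly how I read the table today (a minimal sketch, assuming the Python google-cloud-bigquery client; the project and table names below are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")                # placeholder project id
table = client.get_table("my-project.my_dataset.big_table")   # placeholder table

# list_rows() pages through the REST tabledata.list endpoint over HTTPS,
# so every row travels from BigQuery's frontend down to my machine.
row_count = sum(1 for _ in client.list_rows(table))
print(f"fetched {row_count} rows")
```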

"Rather time consuming" is a vague term. What do you think is slow? Moving to the cloud will not help with query time, but it will help with data transfer time. It depends on the type of queries you are making. - John Hanley

1 Answer

Check out the BigQuery Storage API (https://cloud.google.com/bigquery/docs/reference/storage) for a faster way to read from tables.

BigQuery supports an API called tabledata.list, which is free but not very fast at transferring a significant number of rows. However, BigQuery recently released the Storage API linked above in beta; it has a different backend implementation and supports parallel reads for faster data transfer.
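
To make the difference concrete, here is a hedged sketch of both read paths, assuming reasonably recent versions of the Python google-cloud-bigquery and google-cloud-bigquery-storage clients (plus pandas/pyarrow for to_dataframe()); the public sample table is just an example:

```python
from google.cloud import bigquery
from google.cloud import bigquery_storage

table_id = "bigquery-public-data.usa_names.usa_1910_current"  # example table
bq_client = bigquery.Client()

# Path 1: tabledata.list -- free, but row pages are fetched serially over HTTPS.
df_rest = bq_client.list_rows(table_id).to_dataframe()

# Path 2: BigQuery Storage API -- parallel read streams and a binary wire format.
# Passing a BigQueryReadClient makes to_dataframe() read via the Storage API.
storage_client = bigquery_storage.BigQueryReadClient()
df_storage = bq_client.list_rows(table_id).to_dataframe(
    bqstorage_client=storage_client
)
```

Note that this only speeds up getting rows out of BigQuery; query execution time itself is unchanged, as the comment above points out.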