While playing with the BigQuery API, I noticed that fetching a big table from my local machine is rather time consuming. That raised the question: would it work quicker if I deployed my project to GCP?
I looked through the GCP documentation and couldn't find any mention of special treatment for BigQuery when the app is deployed on GCP:
https://cloud.google.com/bigquery/docs/quickstarts/quickstart-client-libraries
https://cloud.google.com/bigquery/docs/query-overview
Is it correct that BigQuery is no faster, and can't be made faster, when the client is deployed on GCP?
My concern is that I currently use HTTP requests to a remote host to fetch data from BigQuery, whereas within GCP the platform might use other protocols and techniques for data delivery, since the BigQuery host is local to it.
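For reference, my current approach looks roughly like the sketch below (the table ID and function name are placeholders; it assumes the `google-cloud-bigquery` and `google-cloud-bigquery-storage` packages are installed and default credentials are configured). As I understand it, passing `create_bqstorage_client=True` switches the download path from paginated REST JSON to the gRPC-based BigQuery Storage Read API, which should affect transfer time rather than query time:

```python
def fetch_table(table_id: str):
    """Download a whole BigQuery table as a pandas DataFrame.

    table_id is assumed to be of the form "project.dataset.table".
    """
    # Imported inside the function so this sketch stays importable
    # even without the google-cloud-bigquery package installed.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials
    rows = client.list_rows(table_id)
    # create_bqstorage_client=True downloads rows via the gRPC-based
    # BigQuery Storage Read API instead of paginated REST JSON, which
    # mainly speeds up data transfer, not query execution.
    return rows.to_dataframe(create_bqstorage_client=True)
```

Even with this, the transfer still goes over the public internet from my machine, which is why I am asking whether running inside GCP changes the delivery path.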
"Rather time consuming" is a vague term. What do you think is slow? Moving to the cloud will not help with query time, but it will help with data transfer time. It depends on the type of queries that you are making. - John Hanley