We can query results from Google BigQuery in any supported language using the predefined client libraries -> see docs.
Alternatively, we can run the query once and export the results to Cloud Storage, for example as a .csv file -> see docs on exporting data to GCS.
When we repeatedly need to extract the same data, say 100 times per day, does it make sense to cache the result in Cloud Storage and load it from there, or to re-run the BigQuery query each time?
Which approach is more cost-efficient, and how would I obtain the unit cost of each kind of request so I can estimate the percentage difference?
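As a starting point for that estimate, here is a back-of-the-envelope sketch comparing the two approaches. The price constants are illustrative placeholders, not current GCP rates (on-demand BigQuery pricing is per TiB scanned, GCS charges for storage and per-operation reads); check the official pricing pages for your region before relying on the numbers.

```python
# Rough monthly cost comparison: re-running a BigQuery query on every request
# vs. exporting the result to Cloud Storage once per refresh and reading it
# from there. All prices are ASSUMED example values, not current GCP rates.

BQ_PRICE_PER_TIB = 6.25           # assumed on-demand query price, USD per TiB scanned
GCS_STORAGE_PER_GIB_MONTH = 0.02  # assumed Standard storage price, USD per GiB-month
GCS_READ_PER_10K_OPS = 0.004      # assumed Class B (read) op price, USD per 10k ops

def query_cost(bytes_scanned: int, runs_per_day: int, days: int = 30) -> float:
    """Cost of re-running the query for every request (on-demand pricing)."""
    tib = bytes_scanned / 2**40
    return tib * BQ_PRICE_PER_TIB * runs_per_day * days

def cache_cost(result_bytes: int, runs_per_day: int, days: int = 30,
               refreshes_per_day: int = 1, bytes_scanned: int = 0) -> float:
    """Cost of refreshing the export periodically and serving reads from GCS."""
    storage = (result_bytes / 2**30) * GCS_STORAGE_PER_GIB_MONTH * (days / 30)
    reads = runs_per_day * days / 10_000 * GCS_READ_PER_10K_OPS
    refresh = (bytes_scanned / 2**40) * BQ_PRICE_PER_TIB * refreshes_per_day * days
    return storage + reads + refresh

# Example: query scans 50 GiB, produces a 1 GiB result, 100 requests per day.
scanned = 50 * 2**30
result = 1 * 2**30
direct = query_cost(scanned, runs_per_day=100)
cached = cache_cost(result, runs_per_day=100, bytes_scanned=scanned)
print(f"direct: ${direct:.2f}/month, cached: ${cached:.2f}/month, "
      f"savings: {100 * (1 - cached / direct):.1f}%")
```

To obtain the real `bytes_scanned` for your query, a dry run (e.g. `bq query --dry_run` or the client library's dry-run job config) reports the total bytes that would be processed without actually running the query, which you can multiply by the on-demand price.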