
I'm trying to use Google BigQuery to download a large dataset for the GitHub Data Challenge. I have designed my query and am able to run it in the Google BigQuery console, but I am not allowed to export the results as CSV because they are too large. The recommended help tells me to save them to a table, which, as far as I can tell, requires me to enable billing on my account and make a payment.

Is there a way to save datasets as CSV (or JSON) files for export without payment?

For clarification, I do not need this data on Google's cloud and I only need to be able to download it once. No persistent storage required.


3 Answers

1 vote

If you can enable the BigQuery API without enabling billing on your application, you can try the getQueryResults API call. Your best bet, though, is probably to enable billing: the limited usage you need will most likely stay within the free tier, and even if you are charged it should only be a few cents. Then save your query results to a table and export that table to a Google Cloud Storage object. If the result set is too large, I don't think you'll be able to use the web UI effectively.
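The billing-enabled route described above can be sketched with the `bq` and `gsutil` command-line tools. The dataset, table, and bucket names below are placeholders you would substitute with your own; the query is just an illustrative one against the public GitHub sample dataset.

```shell
# 1. Run the query and save the results into a table
#    (requires billing to be enabled on the project).
bq query --destination_table=mydataset.github_results \
    'SELECT repository_name, COUNT(*) AS event_count
     FROM [publicdata:samples.github_timeline]
     GROUP BY repository_name'

# 2. Export the table to a Google Cloud Storage bucket you own.
#    The export itself is free; the stored file may cost a few cents.
bq extract mydataset.github_results gs://my-bucket/github_results.csv

# 3. Download the file locally, then delete the cloud copy.
gsutil cp gs://my-bucket/github_results.csv .
gsutil rm gs://my-bucket/github_results.csv
```

Since no persistent storage is needed, removing the Cloud Storage object and the destination table right after downloading keeps any charges minimal.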

0 votes

See the documentation on this exact topic:

Summary: Use the extract operation. You can export CSV, JSON, or Avro. Exporting is free, but you need to have Google Cloud Storage activated to put the resulting files there.
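A sketch of the extract operation via the `bq` command-line tool (table and bucket names are hypothetical):

```shell
# Export a table to Cloud Storage as newline-delimited JSON.
# --destination_format accepts CSV, NEWLINE_DELIMITED_JSON, or AVRO.
bq extract --destination_format=NEWLINE_DELIMITED_JSON \
    mydataset.mytable gs://my-bucket/export/results.json

# Tables too large for a single file can be sharded with a wildcard URI;
# BigQuery writes part-000000000000.csv, part-000000000001.csv, and so on.
bq extract mydataset.mytable 'gs://my-bucket/export/part-*.csv'
```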

0 votes

Use the BQ command-line tool: `bq query` with the `--format=csv` flag prints the results as CSV, which you can redirect to a local file.
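For example (the query is illustrative, run against the public GitHub sample dataset):

```shell
# Print query results as CSV straight to a local file.
# --max_rows defaults to 100, so raise it for larger result sets.
bq query --format=csv --max_rows=100000 \
    'SELECT repository_name FROM [publicdata:samples.github_timeline] LIMIT 1000' \
    > results.csv
```

Note that this route pages the results through the API rather than doing a bulk export, so for very large result sets the table-plus-extract approach in the other answers is likely more practical.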