
I have tons of Avro-format files saved in GCS.

I would like to use the BigQuery REST API to load them into BigQuery tables.

Is there a limit for the total amount of data (such as 10 TB) I can load per day?

Thanks, Yefu

On data size you should be safe. The limit is on the number of load jobs per table per day, so as long as you batch your files, you can submit a load job roughly every 90 seconds. – Adrian

1 Answer


You are limited by several quotas [1]:

  • Maximum size per load job — 15 TB across all input files for CSV, JSON, Avro, Parquet, and ORC
  • Maximum number of source URIs in job configuration — 10,000 URIs
  • Maximum number of files per load job — 10 million total files, including all files matching all wildcard URIs

From the above, you can load up to 15 TB per load job.
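For example, a single load job can cover many Avro files at once with a wildcard URI. Here is a minimal sketch using the google-cloud-bigquery Python client (which wraps the REST API); the project, bucket, dataset, and table names are placeholders:

```python
from google.cloud import bigquery

# Placeholder project, bucket, dataset, and table names.
client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
)

# One job may reference up to 10,000 URIs / 10 million matched files
# and 15 TB of input in total.
load_job = client.load_table_from_uri(
    "gs://my-bucket/avro/*.avro",
    "my-project.my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```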

  • Load jobs per table per day — 1,000 (including failures)
  • Load jobs per project per day — 100,000 (including failures)

With these two limits, you can load up to 1,000 * 15 TB = 15 PB per table per day.
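To stay under the per-table job quota while ingesting continuously, you can pace job submissions as Adrian suggests in the comment above. A minimal sketch, assuming your files are already grouped into batches by URI prefix (all names here are placeholders):

```python
import time
from google.cloud import bigquery

# Placeholder project, bucket, dataset, and table names.
client = bigquery.Client(project="my-project")
job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO)

# Hypothetical batches of Avro files, one wildcard URI per batch.
uri_batches = [f"gs://my-bucket/avro/part={i:03d}/*.avro" for i in range(200)]

for uri in uri_batches:
    job = client.load_table_from_uri(
        uri,
        "my-project.my_dataset.my_table",
        job_config=job_config,
    )
    job.result()  # wait for this batch to finish
    # 1,000 load jobs per table per day works out to one job roughly
    # every 86 seconds; sleeping ~90 s keeps you safely under the quota.
    time.sleep(90)
```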

[1] https://cloud.google.com/bigquery/quotas#load_jobs