We are loading a large dataset into BigQuery; the table size is more than 170 TB. We've heard that BigQuery has a limit of no more than 15 TB per load job, regardless of the file format (Avro, Parquet, etc.). If that's true, can you please share any workaround or options for loading such a high volume of data?

1 Answer


Check the quotas page to find the documented limitations: https://cloud.google.com/bigquery/quotas

There is indeed a "Maximum size per load job — 15 TB across all input files for CSV, JSON, Avro, Parquet, and ORC" restriction.

But you should be able to load 170 TB easily across multiple load jobs. Or are you saying that all you have is a single 170 TB file?

If you can use multiple load jobs from multiple files into a single table, the relevant limit is "Load jobs per table per day — 1,000". At 15 TB per job, that allows up to 15 PB per table per day, roughly two orders of magnitude above the 170 TB the question asks about (which needs only about a dozen jobs at the 15 TB limit).
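
For illustration, here's a minimal sketch using the google-cloud-bigquery Python client, assuming the data already sits in Cloud Storage split across many Avro files. The project, dataset, table, bucket, and wildcard patterns below are hypothetical placeholders; group your URIs so each batch stays under the 15 TB per-job limit.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical destination table; replace with your own.
table_id = "my-project.my_dataset.big_table"

# Batches of GCS URIs, each batch sized to stay under 15 TB per load job.
# With up to 1,000 load jobs per table per day, ~12 batches cover 170 TB.
uri_batches = [
    ["gs://my-bucket/export/part-000*.avro"],
    ["gs://my-bucket/export/part-001*.avro"],
    # ... more batches as needed
]

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    # Append each batch to the same table rather than overwriting it.
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

for uris in uri_batches:
    job = client.load_table_from_uri(uris, table_id, job_config=job_config)
    job.result()  # block until this load job completes
    print(f"Loaded {uris}: {job.output_rows} rows")
```

The jobs could also be submitted concurrently rather than sequentially, since each one counts separately against the daily per-table quota.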