I was going through the Google BigQuery documentation and I see there is a 5 TB per-file limit for unencrypted file loads and 4 TB for encrypted file loads, with 15 TB per load job.
I have a hypothetical question: how can I load a text file larger than 16 TB (assuming encryption will bring it into the 4 TB range)? I also see that GCS Cloud Storage has a 5 TB per-file limit.
I have never done this, but here is the approach I have in mind; I am not sure about it and am looking for confirmation. First, split the file. Next, encrypt the pieces and transfer them to GCS. Finally, load them into the BigQuery table (see the sketch below).
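For the last step, here is a rough, untested Python sketch using the google-cloud-bigquery client, assuming the split pieces are already sitting in GCS; the bucket, prefix, and table names are placeholders I made up, not real resources:

```python
# Sketch of the final step: load many split files from GCS into one table.
# "my-bucket", "bigfile-parts", and "my-project.my_dataset.my_table" are
# hypothetical names, and the pieces are assumed to be CSV.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,  # or pass an explicit schema instead
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# A single load job can take a wildcard URI covering many files, subject to
# the 15 TB per-load-job limit mentioned above; for more data than that,
# repeat the job over further prefixes and let WRITE_APPEND accumulate the
# rows in the same table.
job = client.load_table_from_uri(
    "gs://my-bucket/bigfile-parts/part-*",
    "my-project.my_dataset.my_table",
    job_config=job_config,
)
job.result()  # block until the load job completes
```

Is this roughly the right shape, or is there a better way to get past the per-job limit?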