I am trying to do a load operation in BigQuery from GCS files using load_job in Ruby.
The problem is that when I have multiple files in GCS feeding different tables, some of the load jobs might fail due to validation or network issues, leaving inconsistent data in BigQuery. Say I want to load the last hour's data, which is stored in 5 files: even if only 1 of those load jobs fails, I'll be left with bad data for analytics.
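Here is a simplified sketch of what I'm doing now with the google-cloud-bigquery gem (the dataset, table, and file names below are placeholders):

```ruby
require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "analytics" # placeholder dataset name

# Each GCS file targets a different table (placeholder names).
loads = {
  "events"   => "gs://my-bucket/hourly/events.json",
  "sessions" => "gs://my-bucket/hourly/sessions.json"
  # ... 3 more table/file pairs
}

# Start one independent load job per table.
jobs = loads.map do |table, uri|
  dataset.load_job table, uri, format: "json"
end

jobs.each(&:wait_until_done!)

# If any single job failed here, the jobs that succeeded have already
# committed their rows, so the hour's data is now inconsistent.
failed = jobs.select(&:failed?)
```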
Is there a way I can batch all these load jobs into a single atomic request to BigQuery?