I'm getting the error below when loading a 1.3 GB JSON file with 10 million records using bq load --source_format=NEWLINE_DELIMITED_JSON.
If I put only the first 1 million records into a separate file, it loads fine, but when I run the load on the full file, I get this:
Current status: PENDING
Waiting on bqjob_r6ac3e4
BigQuery error in load operation: Error processing job 'my-project-prod:bqjob_r6ac3e4da72b48e4f_000001528037b394_1': Too many errors encountered. Limit is: 0.
Failure details:
- File: 0: An internal error occurred and the request could not be completed.
I've been able to load other large tables, but I always get this error when loading this one. Is there a way to troubleshoot this other than repeatedly splitting the file into smaller and smaller pieces to hunt for the offending line?
(similar to Internal error while loading to Bigquery table)