I have uploaded a .avro file of about 100 MB to Google Cloud Storage. It was converted from an 800 MB .csv file.
When trying to create a table from this file in the BigQuery web interface, I get the following error after a few seconds:
script: Resources exceeded during query execution: UDF out of memory. (error code: resourcesExceeded)
Job ID audiboxes:bquijob_4462680b_15607de51b9
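For completeness, here is a minimal sketch of how I could run the same load through the Python client instead of the web interface (the bucket, dataset, and table names are placeholders, not my real ones):

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses my default project credentials

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.AVRO

# Load the single ~100 MB .avro file from GCS into a new table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/my_file.avro",   # placeholder URI
    "my_dataset.my_table",           # placeholder destination
    job_config=job_config,
)
load_job.result()  # waits for the load job to finish
```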
I checked the BigQuery Quota Policy and, as far as I can tell, my file does not exceed any of its limits.
Is there a workaround, or do I need to split my original .csv into multiple, smaller .avro files?
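If splitting is indeed the way to go, here is roughly how I would do it (a sketch only; the schema and file names below are made up and don't match my real columns):

```python
import pandas as pd
from fastavro import writer, parse_schema

# Made-up schema for illustration; my real .csv has different columns.
schema = parse_schema({
    "name": "Row",
    "type": "record",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "value", "type": "string"},
    ],
})

# Read the original 800 MB .csv in chunks and write one smaller .avro file per chunk.
for i, chunk in enumerate(pd.read_csv("original.csv", chunksize=1_000_000)):
    with open("part-{:05d}.avro".format(i), "wb") as out:
        writer(out, schema, chunk.to_dict(orient="records"))
```

My understanding is that the resulting files could then be loaded in a single job with a wildcard URI such as gs://my-bucket/part-*.avro, but I am not sure whether that actually avoids the error.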
Thanks in advance!