I am trying to export a table from BigQuery to Google Cloud Storage using the following command from the console:
bq --location=<hidden> extract \
    --destination_format CSV \
    --compression GZIP \
    --field_delimiter "|" \
    --print_header=true \
    <project>:<dataset>.<table> \
    gs://<airflow_bucket>/data/zip/20200706_<hidden_name>.gzip
I get the following error:
BigQuery error in extract operation: An internal error occurred and the request could not be completed.
Here is some information about the table in question:
Table ID: <HIDDEN>
Table size: 6.18 GB
Number of rows: 25,854,282
Created: 18.06.2020, 15:26:10
Table expiration: Never
Last modified: 14.07.2020, 17:35:25
Data location: EU
What I'm trying to do here is extract this table into Google Cloud Storage. Since the table is larger than 1 GB, the export gets split into multiple fragments, and I want to assemble all of those fragments into a single archive in a Google Cloud Storage bucket.
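As a side note on reassembly (a sketch, not part of the command above): if the extract is written with a wildcard destination URI so BigQuery produces multiple compressed shards, those shards can in principle be joined byte-for-byte, because the gzip format allows multiple members to be concatenated into one valid stream. A minimal Python demonstration of that property, using made-up shard contents:

```python
import gzip

# Two independently gzip-compressed "shards", standing in for the
# files BigQuery would produce with a wildcard destination URI
# (the contents here are illustrative only).
shard_a = gzip.compress(b"col1|col2\n1|foo\n")
shard_b = gzip.compress(b"2|bar\n3|baz\n")

# The gzip format permits back-to-back members, so simply
# concatenating the raw shard bytes yields one valid gzip stream.
combined = shard_a + shard_b
assert gzip.decompress(combined) == b"col1|col2\n1|foo\n2|bar\n3|baz\n"
```

Note that with --print_header=true each exported shard would carry its own header row, so a real merge would need to account for the repeated headers.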
What is happening here? How do I fix this?
Note: I've replaced the actual names and locations of the table and other information with placeholders such as <hidden>, <airflow_bucket>, or <project>:<dataset>.<table>.