
Exporting a BigQuery table to a Cloud Storage Avro file succeeds:

import google.datalab.bigquery as bq

# bq_table_ref and gs_object are placeholders for the table reference and GCS URI
table = bq.Table(bq_table_ref)
job_id = table.extract(destination=gs_object, format='avro')

while importing a Cloud Storage Avro file into a BigQuery table fails:

table = bq.Table(bq_table_ref)
job_id = table.load(source=gs_object, source_format='avro')

and the %%bq magic alternative fails as well:

%%bq load -m create -f avro -p gs_object -t bq_table_ref

Is loading Avro files not supported within google.datalab.bigquery?


1 Answer


According to the google.datalab.bigquery docs, Table.load accepts only 'csv' or 'json' as the source_format, with 'csv' as the default, so loading Avro files is not supported through this API.
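
If you need Avro specifically, one workaround is to bypass the datalab wrapper and use the google-cloud-bigquery client library directly, since the underlying BigQuery load API does accept Avro. A minimal sketch, assuming google-cloud-bigquery is installed; the dataset, table, and bucket names here are hypothetical placeholders for your own:

from google.cloud import bigquery

client = bigquery.Client()
table_ref = client.dataset('my_dataset').table('my_table')  # hypothetical names

# Configure the load job to read Avro rather than the default CSV.
job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.AVRO

# Start the load from the GCS URI and block until it completes.
load_job = client.load_table_from_uri('gs://my-bucket/data.avro', table_ref, job_config=job_config)
load_job.result()  # raises if the load failed

Alternatively, if you want to stay within datalab, exporting with format='json' and loading back with source_format='json' should round-trip, since JSON is supported in both directions.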