BigQuery generally does a good job of loading Avro data, but "bq load" has a lot of trouble with timestamps and other date/time fields that use the Avro logicalType attribute.
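For reference, the schemas I'm loading look roughly like this (record and field names are illustrative, not my real schema):

```json
{
  "type": "record",
  "name": "Event",
  "fields": [
    {"name": "ts_millis", "type": {"type": "long", "logicalType": "timestamp-millis"}},
    {"name": "ts_micros", "type": {"type": "long", "logicalType": "timestamp-micros"}},
    {"name": "event_date", "type": {"type": "int", "logicalType": "date"}},
    {"name": "ts_iso8601", "type": "string"}
  ]
}
```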
- Values with the Avro logical type timestamp-millis are mangled: a BigQuery TIMESTAMP column interprets them as microsecond timestamps, so they come out off by a factor of 1000.
- The same timestamp-micros integer that loads fine into TIMESTAMP becomes INVALID when loaded into a BigQuery DATETIME, and I can't find an explanation of what would be valid at https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types
- Strings in ISO 8601 format can't load into TIMESTAMP or DATETIME (an "Incompatible types" error), though I believe BigQuery would accept them if I were loading plain JSON.
- The Avro "date" logical type fails to load into DATE (also "Incompatible types").
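To make the first bullet concrete, here is the off-by-1000 effect I'm seeing, sketched in plain Python (this is just arithmetic illustrating the symptom, not BigQuery's actual code path):

```python
from datetime import datetime, timezone

# A value written per the Avro spec as timestamp-millis:
# milliseconds since the Unix epoch.
ts = datetime(2021, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
millis = int(ts.timestamp() * 1000)

# What the loaded data looks like: the same integer treated as
# *microseconds* since the epoch, i.e. 1000x too small.
misread = datetime.fromtimestamp(millis / 1_000_000, tz=timezone.utc)

print(ts)       # 2021-06-01 12:00:00+00:00
print(misread)  # lands back in January 1970
```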
I suppose I could work around these problems by always loading the data into temporary staging fields and using queries to CAST or transform them into the real columns, but that doesn't scale, complicates schema evolution, and doesn't work nicely with streaming. Producing data in Avro with well-defined schemas is supposed to avoid exactly that extra step of transforming the data again for each consumer.
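The workaround I have in mind would be something like the query below (kept as a string here; the table and column names are made up for illustration). TIMESTAMP_MILLIS, DATE_FROM_UNIX_DATE, and CAST-to-DATETIME are standard BigQuery SQL functions, but I haven't verified this exact query against my data:

```python
# Hypothetical staging workflow: load the logicalType fields into plain
# INT64/STRING columns first, then transform into typed columns.
transform_sql = """
SELECT
  TIMESTAMP_MILLIS(ts_millis_raw)   AS event_ts,   -- INT64 millis -> TIMESTAMP
  CAST(ts_iso8601_raw AS DATETIME)  AS event_dt,   -- ISO 8601 STRING -> DATETIME
  DATE_FROM_UNIX_DATE(date_raw)     AS event_date  -- INT days-since-epoch -> DATE
FROM mydataset.staging_table
"""
print(transform_sql)
```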
Is BigQuery really this incompatible with Avro dates and times, or am I doing something dumb?
Or is "bq load" the problem here? Is there a better way to load Avro data?