I am trying to figure out how to load dollar values into a NUMERIC column in BigQuery using an Avro file. I am using Go and the goavro package to generate the Avro file.
It appears that the appropriate Go type for handling money is *big.Rat.
The BigQuery documentation indicates it should be possible to load this via Avro's decimal logical type.
I can see from a few goavro test cases that encoding a *big.Rat into a fixed type with the decimal logical type is possible.
I am writing the data with a goavro.OCFWriter, using the following simple Avro schema:
{
  "type": "record",
  "name": "MyData",
  "fields": [
    {
      "name": "ID",
      "type": ["string"]
    },
    {
      "name": "Cost",
      "type": [
        "null",
        {
          "type": "fixed",
          "size": 12,
          "logicalType": "decimal",
          "precision": 4,
          "scale": 2
        }
      ]
    }
  ]
}
I am attempting to Append a record whose "Cost" field is set as follows:
map[string]interface{}{"fixed.decimal": big.NewRat(617, 50)}
This encodes successfully, but the resulting Avro file fails to load into BigQuery:
Err: load Table MyTable Job: {Location: ""; Message: "Error while reading data, error message: The Apache Avro library failed to parse the header with the following error: Missing Json field \"name\": {\"logicalType\":\"decimal\",\"precision\":4,\"scale\":2,\"size\":12,\"type\":\"fixed\"}"; Reason: "invalid"}
So I am doing something wrong here... Hoping someone can point me in the right direction.