5 votes

The example schema below contains a field whose type is a union of null and string.

Schema

    {
      "type": "record",
      "name": "DataFlowEntity",
      "namespace": "org.sdf.manage.commons.server",
      "fields": [
        {"name": "dataTypeGroupName", "type": ["null", "string"]},
        {"name": "dataTypeName", "type": "string"},
        {"name": "dataSchemaVersion", "type": "string"}
      ]
    }

I want to convert the following JSON object,

Object

    {
      "dataTypeGroupName": "dg_1",
      "dataTypeName": "dt_1",
      "dataSchemaVersion": "1"
    }

into an Avro object corresponding to the above schema. I tried Avro's JsonDecoder with the code snippet below:

    String dataFlowEntity = "{\"dataTypeGroupName\": \"dg_1\", \"dataTypeName\": \"dt_1\", \"dataSchemaVersion\": \"1\"}";
    Schema schema = DataFlowEntity.SCHEMA$;
    InputStream inputStream = new ByteArrayInputStream(dataFlowEntity.getBytes());
    DataInputStream dInputStream = new DataInputStream(inputStream);
    Decoder decoder = DecoderFactory.get().jsonDecoder(schema, dInputStream);
    DatumReader<DataFlowEntity> datumReader = new GenericDatumReader<DataFlowEntity>(schema);
    DataFlowEntity dataFlowEntityObject = DataFlowEntity.newBuilder().build();
    dataFlowEntityObject = datumReader.read(null, decoder);

It fails with the following exception:

    threw exception [org.apache.avro.AvroRuntimeException: org.apache.avro.AvroRuntimeException: Field dataTypeGroupName type:UNION pos:0 not set and has no default value] with root cause
    org.apache.avro.AvroRuntimeException: Field dataTypeGroupName type:UNION pos:0 not set and has no default value
      at org.apache.avro.generic.GenericData.getDefaultValue(GenericData.java:874)
      at org.apache.avro.data.RecordBuilderBase.defaultValue(RecordBuilderBase.java:135)

3 Answers

2 votes

If using Node.js is an option, you can use avsc to do the conversion for you. Calling clone with the wrapUnions option set will automatically wrap values into the first union branch they match.

Using your example:

    var avsc = require('avsc');

    var type = avsc.parse({
      "type": "record",
      "name": "DataFlowEntity",
      "namespace": "org.sdf.manage.commons.server",
      "fields": [
        {"name": "dataTypeGroupName", "type": ["null", "string"]},
        {"name": "dataTypeName", "type": "string"},
        {"name": "dataSchemaVersion", "type": "string"}
      ]
    }, {wrapUnions: true});

    var invalidRecord = {
      "dataTypeGroupName": "dg_1",
      "dataTypeName": "dt_1",
      "dataSchemaVersion": "1"
    };

    var validRecord = type.clone(invalidRecord, {wrapUnions: true});
    // == {
    //   "dataTypeGroupName": {"string": "dg_1"},
    //   "dataTypeName": "dt_1",
    //   "dataSchemaVersion": "1"
    // }
1 vote

Check this project out: https://github.com/allegro/hermes/pull/749/files

You are interested in the JsonAvroConverter. It deserializes JSON without union-type wrappers into Avro-generated objects that do have union types. Internally, it takes the branch types of the union from the schema and tries them one by one. It works excellently in our case.

This class does the job: https://github.com/allegro/json-avro-converter/blob/master/converter/src/main/java/tech/allegro/schema/json2avro/converter/JsonGenericRecordReader.java
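
For reference, here is a minimal sketch of how the converter might be used; the convertToGenericDataRecord method name is taken from the project's documentation, so treat it as an assumption and check the version you pull in:

    // Sketch only: assumes tech.allegro.schema.json2avro's JsonAvroConverter API
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import tech.allegro.schema.json2avro.converter.JsonAvroConverter;

    String json = "{\"dataTypeGroupName\": \"dg_1\", \"dataTypeName\": \"dt_1\", \"dataSchemaVersion\": \"1\"}";
    Schema schema = DataFlowEntity.SCHEMA$;

    JsonAvroConverter converter = new JsonAvroConverter();
    // Accepts plain JSON (no union wrappers) and resolves the ["null","string"] union against the schema
    GenericData.Record record = converter.convertToGenericDataRecord(json.getBytes(), schema);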

Regards!

0 votes

There is a new JSON encoder in the works that should address this common issue:

https://issues.apache.org/jira/browse/AVRO-1582

https://github.com/zolyfarkas/avro

This is something lots of people run into when dealing with Avro.

If you switch your JSON to the following, it should work:

    {
      "dataTypeGroupName": {"string": "dg_1"},
      "dataTypeName": "dt_1",
      "dataSchemaVersion": "1"
    }

This is because Avro's JSON encoding wraps union values in an object keyed by the branch type, unfortunately even for simple unions that just represent an optional field and would not need a wrapper to disambiguate. Avro's intent never seemed to be to generate friendly JSON, but rather to use JSON as a serialization format.
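
To illustrate, here is a minimal sketch of feeding the wrapped JSON through JsonDecoder, using SpecificDatumReader so that you get the generated DataFlowEntity from the question back directly:

    import java.io.ByteArrayInputStream;
    import org.apache.avro.io.DatumReader;
    import org.apache.avro.io.Decoder;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.specific.SpecificDatumReader;

    // Note the {"string": ...} wrapper around the union-typed field
    String wrappedJson = "{\"dataTypeGroupName\": {\"string\": \"dg_1\"}, \"dataTypeName\": \"dt_1\", \"dataSchemaVersion\": \"1\"}";

    Decoder decoder = DecoderFactory.get()
        .jsonDecoder(DataFlowEntity.SCHEMA$, new ByteArrayInputStream(wrappedJson.getBytes()));
    DatumReader<DataFlowEntity> reader = new SpecificDatumReader<>(DataFlowEntity.class);
    DataFlowEntity entity = reader.read(null, decoder);
    // entity.getDataTypeGroupName() should now return "dg_1"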

For more details: https://avro.apache.org/docs/1.7.7/spec.html#json_encoding