
When producing or consuming messages in Kafka with Avro schemas stored in the Schema Registry, is it possible to have the messages automatically converted to Scala case classes?

import java.util.Properties

import io.confluent.kafka.serializers.{AbstractKafkaAvroSerDeConfig, KafkaAvroSerializer}
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

val props = new Properties()
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, classOf[KafkaAvroSerializer].getName)
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, classOf[KafkaAvroSerializer].getName)
props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081")
...

val producer = new KafkaProducer[String, User](props)
val user = User(123, "my-name")
producer.send(new ProducerRecord[String, User]("my-topic", user.id.toString, user))

Currently when I do this I get an error:

java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord

My User object is just a case class like:

case class User(id: Int, name: String)

I believe I need to use some sort of Avro-to-Java class generator so it will serialize/deserialize correctly, right?

Is there a way for me to skip that step and have this somehow automatically map to a Scala case class, using Avro4s or something similar?

What are my options?

Ideally, I also want to store my schemas only in the Schema Registry.


1 Answer


By default, case classes are not Avro records: they don't implement the IndexedRecord interface listed in the error message, which is why the KafkaAvroSerializer rejects them.

You can use projects like avro4s to generate Avro schemas and map them to your classes, but to my knowledge there's no way to skip that step.
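As a rough sketch of what that looks like with avro4s's RecordFormat (assuming avro4s-core is on the classpath and reusing the props from your question; the exact API differs slightly between avro4s versions):

import com.sksamuel.avro4s.{AvroSchema, RecordFormat}
import org.apache.avro.generic.GenericRecord
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

case class User(id: Int, name: String)

// Derive the Avro schema from the case class
val userSchema = AvroSchema[User]

// RecordFormat converts between the case class and an Avro GenericRecord
val format = RecordFormat[User]
val record: GenericRecord = format.to(User(123, "my-name"))

// Same props as in the question; the value type is now GenericRecord,
// which KafkaAvroSerializer accepts and registers with the Schema Registry
val producer = new KafkaProducer[String, GenericRecord](props)
producer.send(new ProducerRecord[String, GenericRecord]("my-topic", "123", record))

On the consumer side you would go the other way with format.from(genericRecord) to get a case class back.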

If you want to skip class generation, use GenericRecord.
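For instance, something along these lines (a sketch that assumes the same producer props from the question and an inlined copy of the schema; the schema could just as well be loaded from a file or fetched from the Schema Registry):

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericRecord}
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Parse the writer schema
val schemaJson =
  """{"type":"record","name":"User","fields":[
    |  {"name":"id","type":"int"},
    |  {"name":"name","type":"string"}
    |]}""".stripMargin
val schema = new Schema.Parser().parse(schemaJson)

// Build the record field by field instead of using a generated class or case class
val record: GenericRecord = new GenericData.Record(schema)
record.put("id", 123)
record.put("name", "my-name")

val producer = new KafkaProducer[String, GenericRecord](props) // same props as above
producer.send(new ProducerRecord[String, GenericRecord]("my-topic", "123", record))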