I am trying to use the Confluent Schema Registry, and it's working for me when I follow an example I found on GitHub (https://github.com/gAmUssA/springboot-kafka-avro).
When the consumer and producer share the same namespace for the model, it works.
When the consumer is in a different project with a different namespace but an identical class (same name and properties), it does not work.
The Confluent Avro deserializer can deserialize the message into a GenericData$Record with the correct values, but it can't cast it to the actual object.
I am trying this:
@Data
@AllArgsConstructor
public class User {
String name;
int age;
}
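For context, the registered Avro schema for this record looks roughly like this (the namespace value here is just an illustration; in my setup it matches the producer's package, not the consumer's):

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.producer.model",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "age", "type": "int"}
  ]
}
```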
...
@KafkaListener(topics = "users", groupId = "group_id")
public void consume(ConsumerRecord<String, User> record) {
log.info(String.format("Consumed message -> %s", record.value().getName()));
}
The code above fails with a casting issue.
When I add specific.avro.reader=true to the consumer properties, it fails as well.
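For reference, the relevant consumer configuration looks roughly like this (the broker and registry URLs are placeholders from my local setup):

```properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group_id
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.kafka.properties.schema.registry.url=http://localhost:8081
spring.kafka.consumer.properties.specific.avro.reader=true
```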
Isn't this the whole purpose of the Schema Registry: to be a central repository so that data can be deserialized using the schema in different projects and even different languages (Python, Java, .NET, etc.)?
What am I missing?