I am using the Kafka Connect framework from Confluent to produce messages from my application servers into a Kafka cluster (ZooKeeper + brokers + Schema Registry for Avro support).
The data I send through Connect is defined by an Avro schema. My schema represents a structured object containing enums; indeed, Apache Avro supports enumeration types. I don't have to register my schema with the Schema Registry manually because the Kafka Connect API does it automatically.
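For illustration, a minimal Avro schema of this shape (the record, field, and symbol names here are hypothetical, not my actual schema) would look like:

```json
{
  "type": "record",
  "name": "Order",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "status", "type": {
        "type": "enum",
        "name": "Status",
        "symbols": ["CREATED", "SHIPPED", "DELIVERED"]
      }
    }
  ]
}
```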
My problem is that Kafka Connect seems to convert enums into Strings. When I try to consume, I see that the schema registered by Connect is not the one I expect, since all my enums have been converted to String. As a result, I cannot consume my data without implementing conversion logic from String back to enum.
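The consumer-side workaround I'm trying to avoid looks roughly like this. This is only a sketch, assuming a hypothetical `Status` enum that mirrors the Avro symbols; the field arrives from Connect as a plain String:

```java
public class EnumRecovery {

    // Hypothetical enum mirroring the symbols declared in the Avro schema.
    public enum Status { CREATED, SHIPPED, DELIVERED }

    // Map the String value that Connect produced back to the enum.
    // Throws IllegalArgumentException if the symbol is unknown.
    public static Status toStatus(String symbol) {
        return Status.valueOf(symbol);
    }

    public static void main(String[] args) {
        // The consumed record exposes the enum field as a String,
        // so every read site needs this extra conversion step.
        Status s = toStatus("SHIPPED");
        System.out.println(s);
    }
}
```

This works, but it duplicates the symbol list in application code and has to be maintained for every enum in every schema, which is exactly what I hoped Connect would spare me.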
I want to keep my logical information as enums while still using Kafka Connect. I dug into the kafka-connect code, and it seems to handle only basic types, not enumeration types.
My current alternative is to build my own producing framework that preserves enums by imitating the Connect framework, but this is time-consuming, and I cannot avoid using enums.
Has anyone managed to produce and consume records containing enums to and from Kafka using kafka-connect?
Any help or experience feedback is welcome. Thanks!