
I want to use a Kafka Connect JDBC sink connector with the Avro converter.

These are my Avro configs:

"key.converter":"io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url" : "http://myurl.com" ,
"value.converter":"io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url" : "http://myurl.com" ,

Schemas are set to false:

key.converter.schemas.enable=false
value.converter.schemas.enable=false

Now, when I start the connector, I get this error:

Caused by: org.apache.kafka.connect.errors.ConnectException: Value schema must be of type Struct

From what I read, Structs are for JSON schemas, right? I should not have any Struct if I am using an Avro schema?

Avro schema types are: record, enum, array, map, union, and fixed, but there is no struct.
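For example, my understanding is that a record type looks like this (field names made up):

{
  "type": "record",
  "name": "MyRecord",
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "name", "type": "string"}
  ]
}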

What am i missing ?

Thanks!


2 Answers

1 vote

An Avro record is deserialized into a Connect Struct data type.

The error is saying your data is not a record.
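
For example, an Avro record with two fields comes out of the AvroConverter as a Connect Struct roughly equivalent to this sketch (field names are just for illustration):

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

// Connect schema equivalent of a two-field Avro record
Schema valueSchema = SchemaBuilder.struct()
        .name("com.example.MyRecord") // hypothetical record name
        .field("id", Schema.INT32_SCHEMA)
        .field("name", Schema.STRING_SCHEMA)
        .build();

// The JDBC sink expects each record value to arrive in this shape
Struct value = new Struct(valueSchema)
        .put("id", 42)
        .put("name", "alice");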

"Schemas are set to false"

Those properties don't mean anything to the Avro converter. Avro always has a schema.
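
For comparison, the schemas.enable flags apply to the JsonConverter, where they control whether each JSON message is wrapped in a schema envelope:

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true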

"I want to use a Kafka Connect JDBC sink connector with the Avro converter."

Then the producer needs to send records with schemas. This includes Avro records or JSON with schemas enabled.
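
For example, a producer that satisfies the JDBC sink could look roughly like this (topic name, broker address, and the schema itself are assumptions for illustration):

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://myurl.com"); // registry from the question

        // An Avro record schema; this is what becomes a Struct on the Connect side
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"MyRecord\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"int\"},"
                + "{\"name\":\"name\",\"type\":\"string\"}]}");

        GenericRecord record = new GenericData.Record(schema);
        record.put("id", 42);
        record.put("name", "alice");

        try (KafkaProducer<String, Object> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "42", record));
        }
    }
}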

0 votes

An Avro structure always has a schema, so you need to set the converters as shown below:

# schema.registry.url and enhanced.avro.schema.support only apply to the
# AvroConverter, so they are not needed for a StringConverter key
key.converter=org.apache.kafka.connect.storage.StringConverter

value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
value.converter.enhanced.avro.schema.support=true
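
For reference, a minimal JDBC sink connector config using these converters could look like this (topic, connection details, and table options are placeholders):

connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=my-topic
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=user
connection.password=password
auto.create=true
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081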