2 votes

I am using the Confluent JDBC Kafka Connect source connector to publish messages to a topic. The source connector sends the data to the topic along with its schema on each poll, and I want to retrieve this schema.

Is this possible? How? Can anyone suggest an approach?

My intention is to create a KSQL stream or table based on the schema built by the Kafka connector on each poll.


1 Answer

2 votes

The best way to do this is to use Avro, in which the schema is stored separately in the Schema Registry and used automatically by Kafka Connect and KSQL.

You can use Avro by configuring Kafka Connect to use the AvroConverter. In your Kafka Connect worker configuration, set:

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081

(Update schema-registry to the hostname where your Schema Registry is running. Since KSQL reads the message value, the value.converter settings are the ones that matter for the stream below.)
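If you'd rather not change the worker defaults, the same converter settings can be overridden per connector in the connector's own configuration. A minimal sketch, assuming a hypothetical connector name, database URL, and topic prefix:

# Hypothetical JDBC source connector config (standalone .properties form)
name=jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:postgresql://db-host:5432/mydb
mode=incrementing
incrementing.column.name=id
topic.prefix=source_
# Override the worker defaults so this connector writes Avro values
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081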

From there, in KSQL you just run:

CREATE STREAM my_stream WITH (KAFKA_TOPIC='source_topic', VALUE_FORMAT='AVRO');

You don't need to specify the schema itself here, because KSQL fetches it from the Schema Registry.
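As a quick sanity check (a sketch, assuming the stream name above), you can ask KSQL to show the columns it inferred from the registered Avro schema:

-- List the streams KSQL knows about
SHOW STREAMS;

-- Show the columns and types inferred from the Avro schema in the Schema Registry
DESCRIBE my_stream;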

You can read more about Converters and serialisers here.

Disclaimer: I work for Confluent, and wrote the referenced blog post.