I am using the Confluent JDBC Sink Connector to capture all changes from a Kafka topic into a database. My messages are plain JSON without any attached schema. For example:
{ "key1": "value1", "key2": 100}
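As far as I understand, when `schemas.enable=true` the JsonConverter instead expects each message to be wrapped in a schema/payload envelope, something like the following (a sketch of that envelope for my example message; I cannot produce this format):

```json
{
  "schema": {
    "type": "struct",
    "fields": [
      { "field": "key1", "type": "string" },
      { "field": "key2", "type": "int32" }
    ],
    "optional": false
  },
  "payload": { "key1": "value1", "key2": 100 }
}
```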
Here is my configuration:
name=sink-mysql-1
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=send_1
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
connection.url=jdbc:mysql://0.0.0.0:3306/test_tbl
connection.user=root
connection.password=root
insert.mode=upsert
pk.mode=kafka
auto.create=true
auto.evolve=true
The issue I am facing: because of a legacy system, I cannot change the message format, so my messages are JSON objects with no schema information. Does the connector support mapping fields, for example mapping a message field A to a database column B?
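To make the desired mapping concrete, I am wondering whether something like a Single Message Transform (e.g. `ReplaceField`) could rename fields before they reach the sink. A sketch of what I have in mind (the field names `fieldA`/`fieldB` are just placeholders; I have not verified this works with schemaless JSON):

```
transforms=rename
transforms.rename.type=org.apache.kafka.connect.transforms.ReplaceField$Value
transforms.rename.renames=fieldA:fieldB
```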
Thanks