
We have modified the Kafka Connect JDBC connector to support a custom converter that converts a single SinkRecord into multiple SinkRecords, in order to support transactional inserts. When creating a sink, one can specify in the configuration properties a class that implements SinkRecordConverter.
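For illustration, such a sink configuration might look like the fragment below. The `record.converter.class` key and the `com.example.MySinkRecordConverter` name are hypothetical, since the actual property name depends on our modification; the other properties follow the standard JDBC sink connector.

```properties
name=jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=orders
connection.url=jdbc:postgresql://localhost:5432/mydb
# Hypothetical property added by our modified connector: the class below
# must implement SinkRecordConverter and be visible to the connector's classloader.
record.converter.class=com.example.MySinkRecordConverter
```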

We then packaged an uber jar containing the implementation of this custom converter and tried to deploy it in two ways:

  1. We placed it in the same folder as kafka-connect-jdbc.
  2. We set plugin.path in connect-distributed.properties to /usr/local/share/java and placed our converter in /usr/local/share/java/myconverter/myconverter-1.0.jar.
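For reference, the worker configuration line behind the second option would be the following (the path is the one from our attempt; each plugin is expected to live in its own subfolder of a plugin.path entry):

```properties
# connect-distributed.properties: directories scanned for plugins at worker startup
plugin.path=/usr/local/share/java
```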

Then we tried to deploy the sink, but in both cases the code that tries to instantiate this converter via reflection fails with a java.lang.ClassNotFoundException.
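The failing instantiation is, in essence, plain reflective loading. A minimal sketch of what such code typically does (the class name below is a JDK stand-in for the custom converter; the connector's actual code path may differ):

```java
// Sketch of reflective instantiation in the style Kafka Connect uses.
// If the jar holding the named class is not visible to the classloader,
// Class.forName throws java.lang.ClassNotFoundException.
public class ConverterLoader {

    public static Object instantiate(String className) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Class<?> clazz = Class.forName(className, true, cl);
        return clazz.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        // java.util.ArrayList stands in for the custom SinkRecordConverter.
        Object converter = instantiate("java.util.ArrayList");
        System.out.println(converter.getClass().getName()); // prints java.util.ArrayList
    }
}
```

Setting a breakpoint inside a method like `instantiate` and inspecting the classloader's URL list is exactly how we confirmed whether the jar was visible.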

We tried to debug the classloading issue by placing a breakpoint where the failure occurs, and observed the following in the two cases:

  • In the first case, the jar appears as one of the jars on the URLClassPath.
  • In the second case, it does not even appear as one of the jars on the URLClassPath.

What is the correct way to add custom converters to kafka-connect-jdbc?


1 Answer


We had two issues in one:

  1. To assemble the jars we were using an SBT plugin named OneJar, which creates a custom classloader.
  2. We needed to access those classes from inside an existing Kafka connector (JDBC), not just from Kafka Connect itself.

The solution we found is the following:

  • We abandoned the uber jar and now deploy all the libs onto the kafka-connect instance using sbt pack.
  • We place the jars physically in the same folder where kafka-connect-jdbc is located.
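The deployment step then reduces to copying everything sbt pack produced into the connector's folder. The sketch below simulates this under a temporary root so it can run anywhere; in reality the source would be sbt pack's output directory (target/pack/lib by default) and the destination the real kafka-connect-jdbc folder, and the jar names are stand-ins.

```shell
# Simulated sbt-pack output directory and connector folder (stand-ins).
PACK_LIB="$(mktemp -d)"                       # in reality: target/pack/lib
DEST="$(mktemp -d)/kafka-connect-jdbc"        # in reality: the connector's folder
mkdir -p "$DEST"
touch "$PACK_LIB/myconverter-1.0.jar" "$PACK_LIB/some-dependency-2.3.jar"

# Copy the converter and all its dependencies next to the connector's own jars.
cp "$PACK_LIB"/*.jar "$DEST/"
ls "$DEST"
```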