
I have started ZooKeeper, the Kafka server, a Kafka producer, and a Kafka consumer. I downloaded the JDBC connector JAR from Confluent, put it on the plugin path, and set plugin.path in connect-standalone.properties. I then ran connect-standalone.bat ....\config\connect-standalone.properties ....\config\sink-quickstart-mysql.properties. It completed without errors, but it printed many warnings, the connector does not start, and my data is not reflected in the tables. What have I missed? Can you please help me out? I see the warnings below:

org.reflections.ReflectionsException: could not get type for name io.netty.internal.tcnative.SSLPrivateKeyMethod
        at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:312)
        at org.reflections.Reflections.expandSuperTypes(Reflections.java:382)
        at org.reflections.Reflections.<init>(Reflections.java:140)
        at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader$InternalReflections.<init>(DelegatingClassLoader.java:433)
        at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:325)
        at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:261)
        at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initPluginLoader(DelegatingClassLoader.java:209)
        at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:202)
        at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:60)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:79)
Caused by: java.lang.ClassNotFoundException: io.netty.internal.tcnative.SSLPrivateKeyMethod
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:310)
        ... 9 more
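For reference, the relevant lines of my connect-standalone.properties look roughly like this (the plugin directory path below is just from my setup; yours will be wherever you placed the JDBC connector JAR):

```properties
# Kafka broker to connect to
bootstrap.servers=localhost:9092

# converters for record keys and values
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# directory (or comma-separated list of directories) scanned for connector plugins;
# backslashes must be escaped in a Java properties file on Windows
plugin.path=C:\\kafka\\plugins
```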

1 Answer


There is no need to write a source connector yourself unless you need to connect Kafka to some exotic data source; popular stores like MySQL are already quite well covered. Confluent's JDBC connector already does what you want:

https://docs.confluent.io/current/connect/kafka-connect-jdbc/index.html

You'll need a working Kafka Connect installation, and then you can "connect" your MySQL tables to Kafka with an HTTP POST to the Kafka Connect REST API. Just specify a comma-separated list of the tables you'd like to use as sources in the table.whitelist attribute. For example, something like this:

curl -X POST $KAFKA_CONNECT_API/connectors -H "Content-Type: application/json" -d '{
    "name": "jdbc_source_mysql_01",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://mysql:3306/test",
        "connection.user": "connect_user",
        "connection.password": "connect_password",
        "topic.prefix": "mysql-01-",
        "poll.interval.ms": 3600000,
        "table.whitelist": "test.accounts",
        "mode": "bulk"
    }
}'
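Once the connector is created, you can verify that it actually started by querying its status (illustrative command against the same $KAFKA_CONNECT_API endpoint as above, typically http://localhost:8083):

```shell
# ask the Connect REST API for the connector's state and any task-level errors
curl -s $KAFKA_CONNECT_API/connectors/jdbc_source_mysql_01/status
```

A healthy connector reports "state": "RUNNING" for both the connector and each of its tasks; a FAILED task includes a stack trace in the response, which is usually the quickest way to see why data isn't flowing.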