
Context: I've installed a Kafka cluster with the Confluent Helm chart on AWS Kubernetes.

I've also configured an Oracle server so I can connect to it with Kafka Connect.

My Kafka Connect configuration:

{
    "name": "oracle-debez",
    "config": {
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "tasks.max": "1",
        "database.server.name": "servername",
        "database.hostname": "myserver",
        "database.port": "1521",
        "database.user": "myuser",
        "database.password": "mypass",
        "database.dbname": "KAFKAPOC",
        "database.out.server.name": "dbzxout",
        "database.history.kafka.bootstrap.servers": "mybrokersvc:9092",
        "database.history.kafka.topic": "my-conf-topic",
        "table.include.list": "MYSCHEMA.MYTABLE",
        "database.oracle.version": "11",
        "errors.log.enable": "true"
    }
}

I've configured it this way and some topics were created:

my-conf-topic: comes with the table DDL
servername
servername.MYSCHEMA.MYTABLE

The 'kafka-poc-dev.MYSCHEMA.MYTABLE' topic contains all the data from the table.

When I start the connector, all the data is saved successfully! But the problem is that new inserts and updates never appear on the topic.
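For what it's worth, the Debezium documentation says the Oracle connector needs supplemental logging enabled on the database and on each captured table for streaming to work. This is a sketch of the statements involved (table name is mine; I have not yet confirmed this fixes the issue):

-- Minimal supplemental logging at the database level
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

-- Full column logging for each captured table, so update events carry whole rows
ALTER TABLE MYSCHEMA.MYTABLE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;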

One more thing: my Oracle is not version 11. My version is Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production, but if I do not set the "database.oracle.version": 11 property, I get this error:

"org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.\n\tat io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:42)\n\tat io.debezium.connector.oracle.xstream.XstreamStreamingChangeEventSource.execute(XstreamStreamingChangeEventSource.java:82)\n\tat io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:140)\n\tat io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:113)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat d.java:834)\nCaused by: oracle.streams.StreamsExa:343)\n\tat io.debezium.connector.oracle.xstream.XstreamStreamingChangeEventSource.execute(XstreamStreamingChangeEventSource.java:70)\n\t... 7 more\n"

Can somebody help me understand what I'm doing wrong here?

Update: now when I create the connector, the table gets locked, and the data is not arriving at the topics...

[Screenshot: table being locked]
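While the snapshot is running, a query like this should show which session is holding the lock (a generic v$lock/v$session sketch, nothing Debezium-specific):

-- Find sessions that are blocking others
SELECT s.sid, s.serial#, s.username, s.program
FROM v$session s
WHERE s.sid IN (SELECT l.sid FROM v$lock l WHERE l.block = 1);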

Thanks!


2 Answers


I'm facing a similar problem, but currently using the LogMiner adapter.

The initial snapshot and streaming work just fine, but I can't get any more update/insert events once I add more connectors to Kafka Connect to monitor different tables and schemas.

Everything just stops working, even though I can see that the LogMiner sessions are still active.

Did you enable GoldenGate replication and archive log mode?
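If not, this is roughly how to check and enable both from SQL*Plus as SYSDBA (a sketch; adjust for your environment):

-- Check whether archive log mode is on
SELECT log_mode FROM v$database;

-- Enable archive log mode (requires a restart through MOUNT state)
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;

-- The XStream adapter also needs the GoldenGate parameter
ALTER SYSTEM SET ENABLE_GOLDENGATE_REPLICATION = TRUE SCOPE=BOTH;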

About the database.oracle.version problem you're facing, you should just use the default value as mentioned here:

https://debezium.io/documentation/reference/connectors/oracle.html#oracle-property-database-oracle-version

"database.oracle.version" : "12+"

Posting as an answer because I can't comment yet.

Hope it helps you somehow.


You are using the container (CDB/PDB) version of Oracle, so you need to pass a database.pdb.name value in your connector properties, for example "database.pdb.name": "MYPDB", where MYPDB stands for your pluggable database name. You must also have a user with LogMiner or XStream privileges.
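To confirm you are on a container database and to find the PDB name to put in database.pdb.name, the standard dictionary views are enough (nothing Debezium-specific):

-- CDB column is YES on a container database
SELECT name, cdb FROM v$database;

-- List the pluggable databases (run from the root container)
SELECT name, open_mode FROM v$pdbs;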