3 votes

Currently I have a sink connector which gets data from topic A and sends it to an external service.

Now I have a use case where, based on some logic, I should send the data to topic B instead of the service. This logic depends on the response of the target service, which returns a response based on the data. Because the data should be sent to the target system every time, I couldn't use the Streams API.

Is that feasible somehow?

Or should I add a Kafka producer manually to my sink connector? If so, are there any drawbacks?


1 Answer

0 votes

The first option is to create a custom Kafka Connect Single Message Transform (SMT) that implements the desired logic, possibly combined with ExtractTopic (depending on what your custom SMT looks like).
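
A minimal sketch of such a transform, assuming the routing decision can be derived from the record itself; the class name RouteByValue and the shouldGoToTopicB() check are hypothetical placeholders. Keep in mind that an SMT only sees the record, not the target system's response:

    import java.util.Map;

    import org.apache.kafka.common.config.ConfigDef;
    import org.apache.kafka.connect.connector.ConnectRecord;
    import org.apache.kafka.connect.transforms.Transformation;

    public class RouteByValue<R extends ConnectRecord<R>> implements Transformation<R> {

        @Override
        public R apply(R record) {
            // Records matching the condition get their topic rewritten to topic_B;
            // what the sink does with that topic name depends on the connector.
            if (shouldGoToTopicB(record)) {
                return record.newRecord(
                        "topic_B",
                        record.kafkaPartition(),
                        record.keySchema(), record.key(),
                        record.valueSchema(), record.value(),
                        record.timestamp());
            }
            return record;
        }

        // Placeholder for your own routing logic.
        private boolean shouldGoToTopicB(R record) {
            return false;
        }

        @Override
        public ConfigDef config() {
            return new ConfigDef();
        }

        @Override
        public void configure(Map<String, ?> configs) {
        }

        @Override
        public void close() {
        }
    }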


The second option is to build your own consumer. For example:

Step 1: Create one more topic alongside topic A

Create one more topic, say topic_a_to_target_system
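
If you prefer to create it programmatically, a minimal sketch with the Java AdminClient could look like this (the bootstrap server, partition count, and replication factor are assumptions; adjust them to your cluster):

    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateRoutingTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption

            try (AdminClient admin = AdminClient.create(props)) {
                // 3 partitions / replication factor 1 are placeholder values.
                NewTopic topic = new NewTopic("topic_a_to_target_system", 3, (short) 1);
                admin.createTopics(Collections.singleton(topic)).all().get();
            }
        }
    }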

Step 2: Implement your custom consumer

Implement a Kafka consumer that consumes all the messages from topic topic_a. At this point, you need to instantiate a Kafka producer and, based on the logic, decide whether each message needs to be forwarded to topic_B or to the target system (i.e., produced to topic_a_to_target_system).
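
A minimal sketch of that consume-route-produce loop, assuming String payloads; the bootstrap server, group id, and the shouldGoToTopicB() check are placeholders for your own configuration and logic:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class RoutingConsumer {
        public static void main(String[] args) {
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092"); // assumption
            consumerProps.put("group.id", "topic-a-router");          // assumption
            consumerProps.put("key.deserializer", StringDeserializer.class.getName());
            consumerProps.put("value.deserializer", StringDeserializer.class.getName());

            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092"); // assumption
            producerProps.put("key.serializer", StringSerializer.class.getName());
            producerProps.put("value.serializer", StringSerializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

                consumer.subscribe(Collections.singletonList("topic_a"));

                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Decide where the message goes; how you make this decision
                        // (e.g. by calling the target service first) is up to you.
                        String destination = shouldGoToTopicB(record.value())
                                ? "topic_B"
                                : "topic_a_to_target_system";
                        producer.send(new ProducerRecord<>(destination, record.key(), record.value()));
                    }
                }
            }
        }

        // Placeholder for the routing logic described in the question.
        private static boolean shouldGoToTopicB(String value) {
            return false;
        }
    }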

Step 3: Start the sink connector on topic_a_to_target_system

Finally, start your sink connector so that it sinks the data from topic topic_a_to_target_system to your target system.
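
The connector itself can stay the same one you run today; essentially only the topics setting changes. A hedged example in properties form, where connector.class is a placeholder for your existing sink connector:

    name=target-system-sink
    # placeholder for the sink connector class you already use
    connector.class=com.example.YourExistingSinkConnector
    tasks.max=1
    topics=topic_a_to_target_system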