
I have the following problem. I want to start a Spring Boot application that reads from and writes to Kafka using @KafkaListener.

I want to:

  1. Initialise the Kafka consumer to the latest offset.
  2. Write messages to Kafka.
  3. Read those messages back from Kafka, using the consumer created in step 1.

I am facing the following problem: step 2 sometimes runs before the Kafka consumer has had time to perform its first poll() operation. The consumer then ignores those messages because it considers them to be before the latest offset.

Is there any way, when using spring-kafka, to ensure that the consumer has polled before the application starts up?
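For reference, the three steps above can be sketched like this (the topic name "events", the group id, and the use of a CommandLineRunner are illustrative assumptions, not from the question):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    // Step 1: the consumer; with auto.offset.reset=latest it starts from the
    // latest offset once it has polled and been assigned partitions.
    @KafkaListener(topics = "events", groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }

    // Steps 2-3: produce at startup. This runner may execute before the
    // listener's first poll(), so records sent here can be skipped as
    // "before latest" -- the race described above.
    @Bean
    public CommandLineRunner producer(KafkaTemplate<String, String> template) {
        return args -> template.send("events", "hello");
    }
}
```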

If you are consuming and producing in the same thread, you can simply force the KafkaTemplate to send synchronously, meaning that the send will block until the message has been sent, e.g. kafkaTemplate.send("topic", "value").get(). The .send(..) method returns a Future, which you can block on by calling .get(). – kkflf
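A minimal sketch of the synchronous send the comment describes (the topic name, payload, and timeout are assumptions); blocking with a timeout is generally safer than an unbounded get():

```java
import java.util.concurrent.TimeUnit;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;

public class SyncSender {

    // send(..) returns a future; calling get(..) blocks the calling thread
    // until the broker has acknowledged the record (or the timeout expires).
    public static SendResult<String, String> sendAndWait(
            KafkaTemplate<String, String> template) throws Exception {
        return template.send("events", "hello").get(10, TimeUnit.SECONDS);
    }
}
```

Note that this only guarantees the broker has received the record; it does not by itself guarantee that the consumer from step 1 has already performed its first poll().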

1 Answer


It is typically the wrong architecture to have a producer and a consumer in the same application. So, there is indeed no guarantee that one of them is ready when the other starts its job.

Anyway, there is a trick to make your application start producing as late as possible.

For this purpose, consider implementing SmartLifecycle and placing the sending logic in its start() implementation.
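A sketch of that trick (the topic name and payload are assumptions). Spring Kafka's listener containers are themselves SmartLifecycle beans started in a late phase (Integer.MAX_VALUE - 100 by default), so a bean in a later phase has its start() called only after the containers have started:

```java
import org.springframework.context.SmartLifecycle;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class StartupProducer implements SmartLifecycle {

    private final KafkaTemplate<String, String> template;
    private volatile boolean running;

    public StartupProducer(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    @Override
    public void start() {
        // Runs after beans in earlier phases -- including the Kafka
        // listener containers -- have been started.
        template.send("events", "hello");
        running = true;
    }

    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }

    @Override
    public int getPhase() {
        // Later than the listener containers' default phase, so start()
        // is called only after they have started.
        return Integer.MAX_VALUE;
    }
}
```

Even so, container start is asynchronous: this makes the send happen after the containers have started, not strictly after the first poll() has completed, which is why this is a trick rather than a guarantee.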