
We currently have a Dead Letter Topic (DLT) configuration in place using Spring Kafka (in a Spring Boot application). We use the DeadLetterPublishingRecoverer within the SeekToCurrentErrorHandler, and we assigned the latter to the ConcurrentKafkaListenerContainerFactory.
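For context, a minimal sketch of such a wiring might look like the following (bean names, the retry back-off, and Spring Kafka 2.3+ are assumptions on my side, not details from the question):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

@Configuration
public class KafkaErrorHandlingConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory,
            KafkaTemplate<String, String> kafkaTemplate) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Publishes a failed record to <originalTopic>.DLT by default
        DeadLetterPublishingRecoverer recoverer =
                new DeadLetterPublishingRecoverer(kafkaTemplate);
        // Retry twice with a 1s pause, then hand the record to the recoverer
        factory.setErrorHandler(
                new SeekToCurrentErrorHandler(recoverer, new FixedBackOff(1000L, 2L)));
        return factory;
    }
}
```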

While processing our first messages, a stupid mistake in our service caused some NullPointerExceptions, and 20% of the messages ended up on the DLT (which is expected behaviour and is perfect for us).

We fixed the bug, but now we want to process those 20% messages again. Possibilities we see:

  • write a small application that copies the messages from the DLT to the original topic
  • add a second @KafkaEventListener in our application which reads from the DLT

Solution 2 is my preferred solution, as moving the messages back to the original topic also implies that other consumer groups get them again (which should normally be OK, as all of our services are idempotent).
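For completeness, the first option (a small copier application) could be sketched with the plain Kafka clients; the broker address, group id, and topic names ("XXX.DLT" back to "XXX") are assumptions for illustration:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class DltCopier {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "dlt-copier");
        consumerProps.put("auto.offset.reset", "earliest");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            consumer.subscribe(Collections.singletonList("XXX.DLT"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                if (records.isEmpty()) {
                    break; // backlog drained, we are done
                }
                for (ConsumerRecord<String, String> record : records) {
                    // Re-publish each dead-lettered record to the original topic
                    producer.send(new ProducerRecord<>("XXX", record.key(), record.value()));
                }
                consumer.commitSync();
            }
        }
    }
}
```

Note that, as you say, this re-delivers the messages to every consumer group on the original topic, so it only works if all consumers are idempotent.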

I was wondering if there are other best practices to solve this problem. If not, I was also wondering how I can dynamically activate/deactivate the @KafkaEventListener for the DLT (as you don't want to have this listener up all the time).

Thanks for your feedback!

Jochen


1 Answer


Solution number 2 looks perfect to me.

I was also wondering how I can dynamically activate/deactivate the @KafkaEventListener for the DLT (as you don't want to have this listener up all the time)

You can use the @KafkaListener property autoStartup, introduced in version 2.2.

@Autowired
private KafkaListenerEndpointRegistry registry;

@KafkaListener(id = "123", topics = "XXX.DLT", autoStartup = "true")
public void processDltRecord(ConsumerRecord<String, String> record) {
    // do your processing
}

// After you are done:
registry.getListenerContainer("123").stop();