
This is my first Spring Boot/Kafka project and my first Stack Overflow post.

I'm using Spring Boot 2.1.1 and spring-kafka 2.2.7.RELEASE. I am trying to configure Spring's SeekToCurrentErrorHandler with a DeadLetterPublishingRecoverer so that messages that fail deserialization are sent to a different topic. The new DLT topic is not being created.

While I can see the deserialization failure logged as an ERROR in the application logs/IDE console (and subsequent messages are processed when I feed the topic manually), the "originalTopic.DLT" topic is not created, so the bad message is never written to it. The Spring documentation says: “By default, the dead-letter record is sent to a topic named originalTopic.DLT (the original topic name suffixed with .DLT) and to the same partition as the original record”.

Instead, I see the failed message in the log file (.log) along with the valid messages of the topic listed in the @KafkaListener annotation.

I am trying to write the error message as-is to the .DLT topic for further error processing.

Here is the configuration I have so far. Any direction regarding where I'm going wrong would be really helpful.

I referred to the following links to figure out a solution: https://docs.spring.io/spring-kafka/reference/html/#serdes, Configuring Spring Kafka to use DeadLetterPublishingRecoverer, and SeekToCurrentErrorHandler: DeadLetterPublishingRecoverer is not handling deserialize errors. But the issue I am facing is that the .DLT topic is not being created.

@EnableKafka
@Configuration
@ConditionalOnMissingBean(type = "org.springframework.kafka.core.KafkaTemplate")
public class SubscriberConfig {
    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Autowired
    private KafkaTemplate<Object, Object> kafkaTemplate;

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
        props.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
        props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class.getName());
        props.put(JsonDeserializer.KEY_DEFAULT_TYPE, "java.lang.String");
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.sample.main.entity.Transaction");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, Transaction> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(),
                new JsonDeserializer<>(Transaction.class, false));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Transaction> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Transaction> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setErrorHandler(new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(kafkaTemplate), 3));
        return factory;
    }

    @KafkaListener(topics = "${spring.kafka.subscription.topic}", groupId = "json")
    public void consume(@Payload Transaction message, @Headers MessageHeaders headers) {
        // Business logic...
        this.sendMsgToNewTopic(newTopicName, transformedTrans);
    }
}

Console output:

2019-07-29 15:28:03 ERROR LoggingErrorHandler:37 - Error while processing: ConsumerRecord(topic = trisyntrans, partition = 0, offset = 10, CreateTime = 1564432082456, serialized key size = -1, serialized value size = 30, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = this is failed deserialization)
org.springframework.kafka.support.converter.ConversionException: Failed to convert from JSON; nested exception is com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'this': was expecting 'null', 'true', 'false' or NaN
 at [Source: (String)"this is failed deserialization"; line: 1, column: 5]
    at org.springframework.kafka.support.converter.StringJsonMessageConverter.extractAndConvertValue(StringJsonMessageConverter.java:128)
    at org.springframework.kafka.support.converter.MessagingMessageConverter.toMessage(MessagingMessageConverter.java:132)
    at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.toMessagingMessage(MessagingMessageListenerAdapter.java:264)
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:74)
    at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:50)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:1275)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:1258)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1219)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1200)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1120)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:935)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:751)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:700)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.lang.Thread.run(Thread.java:748)
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'this': was expecting 'null', 'true', 'false' or NaN
 at [Source: (String)"this is failed deserialization"; line: 1, column: 5]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:679)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._reportInvalidToken(ReaderBasedJsonParser.java:2839)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._reportInvalidToken(ReaderBasedJsonParser.java:2817)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._matchToken(ReaderBasedJsonParser.java:2606)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._matchTrue(ReaderBasedJsonParser.java:2558)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:717)
    at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4141)
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4000)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3042)
    at org.springframework.kafka.support.converter.StringJsonMessageConverter.extractAndConvertValue(StringJsonMessageConverter.java:125)
    ... 15 more

An example of a non-conforming message could be a simple string such as "This is a test message".
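
For reference, a standalone producer along these lines can publish such a non-JSON value and trigger the failure (this is a hypothetical test helper, not part of the configuration above; the broker address is an assumption, and the topic name is the one from the console output):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BadMessageProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // The value is plain text, not JSON, so the consumer-side JsonDeserializer fails on it.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("trisyntrans", "This is a test message"));
        }
    }
}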

Comment from jumping_monkey: Did you manage to get the original payload (as-is) in the .DLT?
Comment from beginner1234 (OP): No, I have not been able to.

1 Answer


You have to create the DLT topic yourself.

The framework will do it for you if you add a NewTopic bean to the application context:

@Bean
public NewTopic dlt(@Value("${spring.kafka.subscription.topic}") String mainTopic) {
    return new NewTopic(mainTopic + ".DLT", 10, (short) 3);
}

This works as long as there is a KafkaAdmin @Bean in the application context (if you are using Spring Boot, one will be auto-configured for you).
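
If you are not running under Spring Boot, a minimal sketch of declaring that KafkaAdmin bean yourself could look like the following (the configuration class name is made up; the bootstrap-servers property is the one already used in the question's config):

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.admin.AdminClientConfig;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaAdmin;

@Configuration
public class KafkaAdminConfig {

    // Without Boot's auto-configuration, KafkaAdmin has to be declared explicitly
    // so that NewTopic beans (such as the .DLT topic above) are provisioned on startup.
    @Bean
    public KafkaAdmin kafkaAdmin(@Value("${spring.kafka.bootstrap-servers}") String bootstrapServers) {
        Map<String, Object> config = new HashMap<>();
        config.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        return new KafkaAdmin(config);
    }
}

Also note that, by default, the recoverer publishes the dead-letter record to the same partition as the failed record, so the .DLT topic should have at least as many partitions as the original topic; otherwise you can pass a destination-resolving BiFunction to the DeadLetterPublishingRecoverer constructor to pick the target topic/partition yourself.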