
I have a Spring Kafka application with a read-process-write pattern. I want to make sure that the Kafka transaction rolls back on any producer error, so that the record is re-consumed using a SeekToCurrentErrorHandler. The default behavior seems to be to log the producer error and continue processing/committing. To override this default behavior, I have implemented a ProducerListener that throws an exception in its onError method. Is this the recommended approach for ensuring a rollback, and is that the intent behind Spring providing us with this listener hook?
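For reference, a minimal sketch of the ProducerListener approach described above (the onError signature matches Spring Kafka 2.4; the class name and exception wrapping are illustrative). Note that for asynchronous failures this callback runs on the producer network thread, not the listener thread:

```java
import org.apache.kafka.clients.producer.ProducerRecord;
import org.springframework.kafka.KafkaException;
import org.springframework.kafka.support.ProducerListener;
import org.springframework.stereotype.Component;

@Component
public class FailFastProducerListener implements ProducerListener<Integer, TransactionResponse> {

    // Invoked when a send fails; for asynchronous failures this runs on
    // the producer network thread, so a thrown exception may never reach
    // the transactional listener thread
    @Override
    public void onError(ProducerRecord<Integer, TransactionResponse> record, Exception exception) {
        throw new KafkaException("Producer error for topic " + record.topic(), exception);
    }
}
```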

Logs showing an exception followed by a commit (the exception didn't result in a rollback):

2020-04-02 18:20:18.314|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                                o.s.kafka.core.KafkaTemplate|TRACE|                                 456|              d3410ae8-c964-41e7-98be-6706a6f2b3b2| Sending: ProducerRecord
2020-04-02 18:20:18.345|[                      kafka-producer-network-thread | producer-13]|                           org.apache.kafka.clients.Metadata|ERROR|                                    |                                                  | [Producer clientId=producer-13, transactionalId=tx-0] Topic authorization failed for topics 
2020-04-02 18:20:18.354|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.s.LoggingProducerListener|ERROR|                                 456|              d3410ae8-c964-41e7-98be-6706a6f2b3b2| Exception thrown when sending a message with key='170854907' org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [loyalty-retail-outlet-trans-resp-dev1]
2020-04-02 18:20:18.367|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|      o.s.k.l.KafkaMessageListenerContainer$ListenerConsumer|INFO |                                    |                                                  | Sending offsets to transaction: {loyalty-retail-outlet-earn-req-dev-5=OffsetAndMetadata{offset=2220, leaderEpoch=null, metadata=''}}
2020-04-02 18:20:18.368|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|TRACE|                                    |                                                  | CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@5abb103c, txId=tx-0] sendOffsetsToTransaction({loyalty-retail-outlet-earn-req-dev-5=OffsetAndMetadata{offset=2220, leaderEpoch=null, metadata=''}}, earn-unit-test)
2020-04-02 18:20:18.769|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.t.KafkaTransactionManager|DEBUG|                                    |                                                  | Initiating transaction commit
2020-04-02 18:20:18.769|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|DEBUG|                                    |                                                  | CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@5abb103c, txId=tx-0] commitTransaction()
2020-04-02 18:20:18.816|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|TRACE|                                    |                                                  | CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@5abb103c, txId=tx-0] close(PT5S)

The records are produced within a Kafka listener using Kafka Template (read-process-write pattern).

Kafka Template config

    @Bean
    public KafkaTemplate<Integer, TransactionResponse> kafkaTemplate(
            ProducerFactory<Integer, TransactionResponse> producerFactory
            , ProducerListener<Integer, TransactionResponse> producerListener) {
        KafkaTemplate<Integer, TransactionResponse> kafkaTemplate = new KafkaTemplate<>(producerFactory);
//        kafkaTemplate.setProducerListener(producerListener);
        return kafkaTemplate;
    }

application.yml

spring:
  kafka:
    producer:
      transaction-id-prefix: tx-
      acks: all
      key-serializer: org.apache.kafka.common.serialization.IntegerSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      properties:
        enable.idempotence: true
        delivery.timeout.ms: 180000

listener

    @KafkaListener(topics = "${earn.request.topic}", clientIdPrefix = "EarnConsumer", containerFactory = "earnListenerContainerFactory")
    public void listen(List<TransactionRequest> requestList,
                       @Header(KafkaHeaders.GROUP_ID) String groupId,
                       @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partitions,
                       @Header(KafkaHeaders.OFFSET) String offsets,
                       @Header(KafkaHeaders.RECEIVED_TOPIC) String topic) {

send response method (executes within the listener code)

    public void sendResponse(TransactionResponse transactionResponse) {
        kafkaTemplate.send(earnResponseTopic, transactionResponse.getEventSummary().getMemberId(), transactionResponse);
    }
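Note that `KafkaTemplate.send()` is asynchronous, so a producer failure may not surface on the listener thread at all. A variant that blocks on the returned future, so the failure is raised inside the transactional listener and can abort the transaction, might look like this (the timeout value and exception wrapping are illustrative):

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import org.springframework.kafka.KafkaException;

public void sendResponse(TransactionResponse transactionResponse) {
    try {
        // Block until the broker acks (or the send fails) so any
        // producer exception surfaces inside the transactional listener
        kafkaTemplate.send(earnResponseTopic,
                        transactionResponse.getEventSummary().getMemberId(),
                        transactionResponse)
                .get(30, TimeUnit.SECONDS); // timeout is illustrative
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        throw new KafkaException("Interrupted while sending response", e);
    } catch (ExecutionException | TimeoutException e) {
        throw new KafkaException("Failed to send response", e);
    }
}
```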

container config

    @Bean
    public ConcurrentKafkaListenerContainerFactory<Integer, EarnTransactionRequest> earnListenerContainerFactory(
            ConsumerFactory<Integer, EarnTransactionRequest> consumerFactory
            , SeekToCurrentBatchErrorHandler seekToCurrentBatchErrorHandler
            , KafkaTransactionManager ktm
    ) {
        ConcurrentKafkaListenerContainerFactory<Integer, EarnTransactionRequest> containerFactory = new ConcurrentKafkaListenerContainerFactory<>();
        containerFactory.setConsumerFactory(consumerFactory);
        containerFactory.setBatchListener(true);
        containerFactory.setBatchErrorHandler(seekToCurrentBatchErrorHandler);
        containerFactory.setConcurrency(numConcurrentConsumers);

        containerFactory.getContainerProperties().setTransactionManager(ktm);
        containerFactory.getContainerProperties().setAckOnError(false);
        containerFactory.getContainerProperties().setAckMode(ContainerProperties.AckMode.BATCH);
        containerFactory.getContainerProperties().setCommitLogLevel(LogIfLevelEnabled.Level.INFO);
        containerFactory.getContainerProperties().setLogContainerConfig(true);
        containerFactory.getContainerProperties().setMissingTopicsFatal(true);

        return containerFactory;
    }

EDIT: Simplified application

    @Component
    public class QuickTest {

        private final String responseTopic;
        private final KafkaTemplate<Integer, TransactionResponse> kafkaTemplate;

        public QuickTest(@Value("${response.topic}") String responseTopic,
                         KafkaTemplate<Integer, TransactionResponse> kafkaTemplate) {
            this.responseTopic = responseTopic;
            this.kafkaTemplate = kafkaTemplate;
        }

        @KafkaListener(topics = "${request.topic}", clientIdPrefix = "Consumer")
        public void listen(TransactionRequest request) {
            kafkaTemplate.send(responseTopic, 123456789, null);
        }
    }

Logs from the start of one transaction to the next:


2020-04-03 19:04:54.901|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|      o.s.k.l.KafkaMessageListenerContainer$ListenerConsumer|TRACE|Processing ConsumerRecord(topic = req-dev, partition = 1, leaderEpoch = 4, offset = 2185, CreateTime = 1585642853682, serialized key size = -1, serialized value size = 184, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = {})

2020-04-03 19:04:54.901|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.t.KafkaTransactionManager|DEBUG|Creating new transaction with name [null]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT

2020-04-03 19:04:54.901|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|DEBUG|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] beginTransaction()

2020-04-03 19:04:54.901|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.t.KafkaTransactionManager|DEBUG|Created Kafka transaction on producer [CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1]]

2020-04-03 19:04:54.902|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|             o.s.k.l.a.RecordMessagingMessageListenerAdapter|DEBUG|Processing [GenericMessage [payload={"eventSummary": {"eventId": "102"}}]]

2020-04-03 19:04:54.902|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                                o.s.kafka.core.KafkaTemplate|TRACE|Sending: ProducerRecord(topic= resp-test, partition=null, headers=RecordHeaders(headers = [], isReadOnly = false), key=123456789, value=null, timestamp=null)

2020-04-03 19:04:54.902|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|TRACE|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] send(ProducerRecord(topic= resp-test, partition=null, headers=RecordHeaders(headers = [], isReadOnly = false), key=123456789, value=null, timestamp=null))

2020-04-03 19:04:54.928|[                       kafka-producer-network-thread | producer-8]|                        o.apache.kafka.clients.NetworkClient|WARN |[Producer clientId=producer-8, transactionalId=transactionx-g21.req-dev.1] Error while fetching metadata with correlation id 22 : { resp-test=TOPIC_AUTHORIZATION_FAILED}

2020-04-03 19:04:54.928|[                       kafka-producer-network-thread | producer-8]|                           org.apache.kafka.clients.Metadata|ERROR|[Producer clientId=producer-8, transactionalId=transactionx-g21.req-dev.1] Topic authorization failed for topics [ resp-test]

2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.s.LoggingProducerListener|ERROR|Exception thrown when sending a message with key='123456789' and payload='null' to topic  resp-test:

org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [ resp-test]
2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                                o.s.kafka.core.KafkaTemplate|DEBUG|Failed to send: ProducerRecord(topic= resp-test, partition=null, headers=RecordHeaders(headers = [], isReadOnly = false), key=123456789, value=null, timestamp=null)

org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [ resp-test]
2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                                o.s.kafka.core.KafkaTemplate|TRACE|Sent: ProducerRecord(topic= resp-test, partition=null, headers=RecordHeaders(headers = [], isReadOnly = false), key=123456789, value=null, timestamp=null)

2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|      o.s.k.l.KafkaMessageListenerContainer$ListenerConsumer|TRACE|Ack: ConsumerRecord(topic = req-dev, partition = 1, leaderEpoch = 4, offset = 2185, CreateTime = 1585642853682, serialized key size = -1, serialized value size = 184, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = {"eventSummary": {"eventId": "102"}})

2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|      o.s.k.l.KafkaMessageListenerContainer$ListenerConsumer|DEBUG|Sending offsets to transaction: {req-dev-1=OffsetAndMetadata{offset=2186, leaderEpoch=null, metadata=''}}

2020-04-03 19:04:54.928|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|TRACE|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] sendOffsetsToTransaction({req-dev-1=OffsetAndMetadata{offset=2186, leaderEpoch=null, metadata=''}}, g21)

2020-04-03 19:04:55.043|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.t.KafkaTransactionManager|DEBUG|Initiating transaction commit

2020-04-03 19:04:55.043|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|DEBUG|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] commitTransaction()

2020-04-03 19:04:55.090|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|TRACE|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] close(PT5S)

2020-04-03 19:04:55.091|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|      o.s.k.l.KafkaMessageListenerContainer$ListenerConsumer|TRACE|Processing ConsumerRecord(topic = req-dev, partition = 1, leaderEpoch = 4, offset = 2186, CreateTime = 1585644055280, serialized key size = -1, serialized value size = 184, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = {"eventSummary": {"eventId": "104"})

2020-04-03 19:04:55.091|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                             o.s.k.t.KafkaTransactionManager|DEBUG|Creating new transaction with name [null]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT

2020-04-03 19:04:55.091|[ org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1]|                         o.s.k.c.DefaultKafkaProducerFactory|DEBUG|CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@43926a5b, txId=transactionx-g21.req-dev.1] beginTransaction()

1 Answer


The error handler runs within the transaction. You should leave it null; when the transaction rolls back, the AfterRollbackProcessor will re-seek the records so they are redelivered. See the Transactions chapter in the reference manual.

The container needs a KafkaTransactionManager.

See Transactions and After-Rollback Processor.

You should not need to do anything in a ProducerListener.
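Under that guidance, a container factory with no error handler, relying on the after-rollback processor for redelivery, might look like the sketch below (the DefaultAfterRollbackProcessor with a BackOff requires Spring Kafka 2.3+, and the retry settings are illustrative):

```java
@Bean
public ConcurrentKafkaListenerContainerFactory<Integer, EarnTransactionRequest> earnListenerContainerFactory(
        ConsumerFactory<Integer, EarnTransactionRequest> consumerFactory,
        KafkaTransactionManager<Integer, TransactionResponse> ktm) {
    ConcurrentKafkaListenerContainerFactory<Integer, EarnTransactionRequest> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.getContainerProperties().setTransactionManager(ktm);
    // No error handler: after the transaction rolls back, the
    // after-rollback processor re-seeks the unprocessed records,
    // here retrying with no delay up to 2 times
    factory.setAfterRollbackProcessor(
            new DefaultAfterRollbackProcessor<>(new FixedBackOff(0L, 2L)));
    return factory;
}
```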

EDIT

I added authorization configuration to get a TopicAuthorizationException and everything worked as I would have expected (the commit fails)...

    @KafkaListener(id = "ktest24", topics = "ktest24")
    public void listen(String in) {
        System.out.println("1:" + in);
        this.template.send("ktest24-out", "sendInTx");
    }
1:foo
2020-04-03 14:10:26.619 ERROR 75695 --- [est24.ktest24.0] o.s.k.support.LoggingProducerListener   
 : Exception thrown when sending a message with key='null' and payload='sendInTx' to topic ktest24-out:

org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [ktest24-out]

2020-04-03 14:10:26.619 ERROR 75695 --- [  ktest24-0-C-1] essageListenerContainer$ListenerConsumer
 : Send offsets to transaction failed

org.apache.kafka.common.errors.TopicAuthorizationException: Not authorized to access topics: [ktest24-out]

2020-04-03 14:10:26.620 ERROR 75695 --- [  ktest24-0-C-1] o.s.k.core.DefaultKafkaProducerFactory  
 : commitTransaction failed: CloseSafeProducer [delegate=org.apache.kafka.clients.producer.KafkaProducer@84412c5, txId=tx-ktest24.ktest24.0]
2020-04-03 14:10:31.627 ERROR 75695 --- [  ktest24-0-C-1] essageListenerContainer$ListenerConsumer
 : Transaction rolled back
1:foo
...