I am using Spring with Kafka to consume data from Kafka topics. I have configured the listener concurrency as 10, so different threads poll the broker and process the messages. Even after a message has been processed successfully, the same message is delivered again to the consumer, sometimes on a different thread. We are able to process the received messages well within the configured max.poll.interval.ms=1500000 (25 minutes).
Please find the Kafka consumer properties configured below. I have enabled auto-commit through Kafka.
group.id=ips-request-group  // consumer group id
enable.auto.commit=true     // auto commit offsets
auto.commit.interval.ms=60000  // auto commit interval
session.timeout.ms=100000   // session timeout
request.timeout.ms=610000   // request timeout
fetch.max.wait.ms=500       // max fetch wait
heartbeat.interval.ms=20000 // heartbeat interval
auto.offset.reset=latest    // consume latest messages
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer    // key
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer  // value
max.poll.records=10         // max records per poll
max.poll.interval.ms=1500000  // max poll interval in ms
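A common cause of redelivery with auto-commit is exceeding max.poll.interval.ms between two poll() calls: the broker then considers the consumer dead, rebalances the group, and the records whose offsets were not yet committed are redelivered to another consumer thread. A back-of-the-envelope check of the per-record time budget under this configuration (a minimal sketch; the helper name `perRecordBudgetMs` is illustrative, not part of any Kafka API):

```java
public class PollBudget {
    // With max.poll.records records returned per poll, each record must be processed
    // within maxPollIntervalMs / maxPollRecords on average, or the next poll() comes
    // too late and the group coordinator triggers a rebalance (causing redelivery).
    static long perRecordBudgetMs(long maxPollIntervalMs, int maxPollRecords) {
        return maxPollIntervalMs / maxPollRecords;
    }

    public static void main(String[] args) {
        // Values from the configuration above: 1,500,000 ms interval, 10 records per poll.
        long budget = perRecordBudgetMs(1_500_000L, 10);
        System.out.println(budget + " ms per record"); // 150000 ms = 2.5 minutes
    }
}
```

If any single batch of 10 records takes longer than 25 minutes end to end, a rebalance and redelivery are expected even though each individual record seemed to process "in time".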
Could you please help me resolve the duplicate delivery of messages to the Kafka consumer?
Comment: Reduce max.poll.records or increase max.poll.interval.ms. – Gary Russell
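Following that suggestion, one hedged adjustment would look like the fragment below (the values are illustrative, not a recommendation; tune them to your actual per-record processing time):

```
# Option 1: fewer records per poll, so the next poll() happens sooner
max.poll.records=1

# Option 2: allow more time between polls for the same batch size
max.poll.interval.ms=3000000
```

Either change widens the margin between the time spent processing a batch and the poll-interval deadline, which is what prevents the rebalance-driven redelivery.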