0 votes

We're using Kafka Streams to write data to a sink topic. I'm running the Avro console consumer from the command line to check whether there is data in the sink topic:

bin/kafka-avro-console-consumer --topic sink.output.topic --from-beginning --new-consumer --bootstrap-server

I see data when I run the consumer while the Kafka Streams application is running, but if I stop the consumer and run it again after a few minutes, I don't see any data. A few possibilities:

1) Is this because Kafka Streams is wiping out the records from the output topic every time it pushes records to the sink?

2) Or is this just a consumer-related issue?

Kafka Streams does not delete any data from output topics. – Matthias J. Sax

2 Answers

0 votes

I believe this is because --from-beginning only takes effect when the consumer group hasn't committed an offset yet. Have you tried using --offset earliest instead?
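
A minimal sketch of two ways to read the topic from the start again, assuming Kafka 0.11+ and a broker at localhost:9092; the group name console-group and partition 0 are placeholders rather than values from the question, and the Avro consumer may additionally need --property schema.registry.url=... depending on your setup:

# Option 1: reset the group's committed offsets to the earliest position,
# then re-run the console consumer with that same --group
bin/kafka-consumer-groups --bootstrap-server localhost:9092 \
  --group console-group --topic sink.output.topic \
  --reset-offsets --to-earliest --execute

# Option 2: read a single partition explicitly from the earliest offset
# (the console consumer's --offset option requires --partition)
bin/kafka-avro-console-consumer --bootstrap-server localhost:9092 \
  --topic sink.output.topic --partition 0 --offset earliest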

0 votes

From your description, the issue seems to be the retention time. The data might have been removed by the time you ran the consumer the second time. You can configure the retention time.

Example: log.retention.hours=168
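
As a hedged sketch: log.retention.hours=168 is the broker-wide default set in server.properties. If you only want to lengthen retention for the sink topic, a per-topic override can be set with kafka-configs, assuming a reasonably recent broker reachable at localhost:9092 (placeholder address):

# per-topic override; retention.ms is in milliseconds (604800000 ms = 7 days)
bin/kafka-configs --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name sink.output.topic \
  --add-config retention.ms=604800000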