0 votes

I'm trying to produce a message using kafka-console-producer, but I end up with the following error.

$ /usr/bin/kafka-console-producer --broker-list confluent-kafka-0-service.ms-kafka-internal.svc:9092 --topic testTopic --producer.config ~/etc/kafka/client_security.properties

[2020-08-09 06:37:52,844] INFO Kafka version: 5.4.2-ccs (org.apache.kafka.common.utils.AppInfoParser)
[2020-08-09 06:37:52,845] INFO Kafka commitId: 2626d8cfb686c23e (org.apache.kafka.common.utils.AppInfoParser)
[2020-08-09 06:37:52,845] INFO Kafka startTimeMs: 1596955072646 (org.apache.kafka.common.utils.AppInfoParser)

Hi Welcome to Confluent Kafka
[2020-08-09 06:38:55,451] ERROR Error when sending message to topic testTopic with key: null, value: 2 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Topic testTopic not present in metadata after 60000 ms.

I have also run the consumer command below to check for messages, but no messages appear there either.

/usr/bin/kafka-console-consumer --bootstrap-server confluent-kafka-0-service.ms-kafka-internal.svc:9092 --topic testTopic --from-beginning --consumer.config ~/etc/bmw/kafka/client_security.properties
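
In case it is useful, this is roughly how I would expect to check whether the topic actually exists on the broker (using the same security properties file as the consumer); I have not verified these exact options against this cluster, so treat it as a sketch:

/usr/bin/kafka-topics --bootstrap-server confluent-kafka-0-service.ms-kafka-internal.svc:9092 --list --command-config ~/etc/bmw/kafka/client_security.properties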

Any immediate help would be much appreciated.

Thanks, Mina


2 Answers

0 votes

Note that your producer and consumer point to different config paths:

--consumer.config ~/etc/bmw/kafka/client_security.properties

vs

--producer.config ~/etc/kafka/client_security.properties

Maybe you just have the wrong config file path?
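
As a quick sanity check (the two paths below are just the ones from your commands; adjust if neither is right):

ls -l ~/etc/kafka/client_security.properties ~/etc/bmw/kafka/client_security.properties

and then re-run the producer with whichever file actually exists, for example:

/usr/bin/kafka-console-producer --broker-list confluent-kafka-0-service.ms-kafka-internal.svc:9092 --topic testTopic --producer.config ~/etc/bmw/kafka/client_security.properties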

0 votes
This is the entire output of the producer command:
I have no name!@confluent-kafka-0:/$ kafka-console-producer --broker-list confluent-kafka-0-service.ms-kafka-internal.svc:9092 --topic testTopic --producer.config ~/etc/bmw/kafka/client_security.properties
[2020-08-10 09:46:00,186] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2020-08-10 09:46:03,389] INFO ProducerConfig values:
        acks = 1
        batch.size = 16384
        bootstrap.servers = [confluent-kafka-0-service.ms-kafka-internal.svc:9092]
        buffer.memory = 33554432
        client.dns.lookup = default
        client.id = console-producer
        compression.type = none
        connections.max.idle.ms = 540000
        delivery.timeout.ms = 120000
        enable.idempotence = false
        interceptor.classes = []
        key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
        linger.ms = 1000
        max.block.ms = 60000
        max.in.flight.requests.per.connection = 5
        max.request.size = 1048576
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
        receive.buffer.bytes = 32768
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 1500
        retries = 3
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism = GSSAPI
        security.protocol = SSL
        security.providers = null
        send.buffer.bytes = 102400
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = https
        ssl.key.password = [hidden]
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = /etc/bmw/kafka/client_certs/kafka-client-internal-keystore.jks
        ssl.keystore.password = [hidden]
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = /etc/bmw/kafka/bmw_certs/bmw-truststore.jks
        ssl.truststore.password = [hidden]
        ssl.truststore.type = JKS
        transaction.timeout.ms = 60000
        transactional.id = null
        value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
 (org.apache.kafka.clients.producer.ProducerConfig)
[2020-08-10 09:46:11,786] INFO Kafka version: 5.4.2-ccs (org.apache.kafka.common.utils.AppInfoParser)
[2020-08-10 09:46:11,786] INFO Kafka commitId: 2626d8cfb686c23e (org.apache.kafka.common.utils.AppInfoParser)
[2020-08-10 09:46:11,786] INFO Kafka startTimeMs: 1597052771592 (org.apache.kafka.common.utils.AppInfoParser)
>Om namo Narayanaya nama
Jai Sri ram
[2020-08-10 09:47:21,186] ERROR Error when sending message to topic testTopic with key: null, value: 23 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Topic testTopic not present in metadata after 60000 ms.
>^C[2020-08-10 09:47:25,485] INFO [Producer clientId=console-producer] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer)
org.apache.kafka.common.KafkaException: Producer closed while send in progress
        at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:888)
        at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:865)
        at kafka.tools.ConsoleProducer$.send(ConsoleProducer.scala:75)
        at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:57)
        at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: Requested metadata update after close
        at org.apache.kafka.clients.producer.internals.ProducerMetadata.awaitUpdate(ProducerMetadata.java:104)
        at org.apache.kafka.clients.producer.KafkaProducer.waitOnMetadata(KafkaProducer.java:1029)
        at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:885)
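
In case the topic simply does not exist on the brokers (auto-creation may be disabled here; I have not confirmed that), creating it explicitly would look roughly like this, reusing the same security properties file:

/usr/bin/kafka-topics --bootstrap-server confluent-kafka-0-service.ms-kafka-internal.svc:9092 --create --topic testTopic --partitions 1 --replication-factor 1 --command-config ~/etc/bmw/kafka/client_security.properties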

Thanks,
Mina