
I have used the Confluent HDFS Connector to move data from Kafka topics to HDFS log files. But when I run this command:

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties \
    etc/kafka-connect-hdfs/quickstart-hdfs.properties

I get the following error. How can I solve this problem, and what is causing it?

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
[2017-06-03 13:44:41,895] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:142)


1 Answer


This happens when you read data with the connector and have key.converter and value.converter set to the AvroConverter, but the input topic contains data that was not serialized by the schema-registry-aware AvroSerializer. That serializer prefixes every message with a magic byte and a four-byte schema ID; when the converter does not find that prefix, it fails with "Unknown magic byte!".
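For reference, the worker file you are running with, etc/schema-registry/connect-avro-standalone.properties, ships with converter settings along these lines (the localhost URL is the quickstart default; point it at your own Schema Registry):

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081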

You have to match your converter to the input data. In other words, configure a converter that can deserialize the format actually written to the topic.
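For example, if the topic actually holds schemaless JSON (an assumption here; check what your producer writes), a minimal override in the worker properties would be:

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

Note that the HDFS connector's default Avro output format still needs a schema, so fully schemaless input may require further changes. Conversely, if you want to keep the AvroConverter, make sure the producer uses the Schema Registry's KafkaAvroSerializer, for example by testing with ./bin/kafka-avro-console-producer, which writes messages in the expected wire format.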