
I'm using an Avro schema to write data to a Kafka topic. Initially, everything worked fine. After adding one new field (scan_app_id) to the Avro schema, I'm facing this error.

Avro file:

{
  "type": "record",
  "name": "Initiate_Scan",
  "namespace": "avro",
  "doc": "Avro schema registry for Initiate_Scan",
  "fields": [
    { "name": "app_id", "type": "string", "doc": "3 digit application id" },
    { "name": "app_name", "type": "string", "doc": "application name" },
    { "name": "dev_stage", "type": "string", "doc": "development stage" },
    { "name": "scan_app_id", "type": "string", "doc": "unique scan id for an app in Veracode" },
    { "name": "scan_name", "type": "string", "doc": "scan details" },
    { "name": "seq_num", "type": "int", "doc": "unique number" },
    { "name": "result_flg", "type": "string", "doc": "Y indicates results of scan available", "default": "Y" },
    { "name": "request_id", "type": "int", "doc": "unique id" },
    { "name": "scan_number", "type": "int", "doc": "number of scans" }
  ]
}

Error: Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"Initiate_Scan","namespace":"avro","doc":"Avro schema registry for Initiate_Scan","fields":[{"name":"app_id","type":{"type":"string","avro.java.string":"String"},"doc":"3 digit application id"},{"name":"app_name","type":{"type":"string","avro.java.string":"String"},"doc":"application name"},{"name":"dev_stage","type":{"type":"string","avro.java.string":"String"},"doc":"development stage"},{"name":"scan_app_id","type":{"type":"string","avro.java.string":"String"},"doc":"unique scan id for an App"},{"name":"scan_name","type":{"type":"string","avro.java.string":"String"},"doc":"scan details"},{"name":"seq_num","type":"int","doc":"unique number"},{"name":"result_flg","type":{"type":"string","avro.java.string":"String"},"doc":"Y indicates results of scan available","default":"Y"},{"name":"request_id","type":"int","doc":"unique id"},{"name":"scan_number","type":"int","doc":"number of scans"}]}

INFO Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:1017)
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Register operation timed out; error code: 50002
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:182)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:203)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:292)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:284)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:279)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:61)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:93)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:54)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:768)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:745)
    at com.ssc.svc.svds.initiate.InitiateProducer.initiateScanData(InitiateProducer.java:146)
    at com.ssc.svc.svds.initiate.InitiateProducer.topicsData(InitiateProducer.java:41)
    at com.ssc.svc.svds.initiate.InputData.main(InputData.java:31)

I went through the Confluent documentation about the 50002 error, which says

A schema should be compatible with the previously registered schema.

Does this mean I cannot change/update an existing schema?

How to fix this?

Can you share the configuration properties of the schema registry along with the Avro schemas? – Giorgos Myrianthous
Updated with the Avro schema details and the complete error. – Eshwar P
In our case, this error is thrown when we run the app the first time with a new app ID, but it runs when we run it a second time with the same app ID. How can we debug that? – Cyber Knight

1 Answer


Actually, the linked page says 50002 is "Operation timed out". If the schema were indeed incompatible, the response would say so explicitly.

In any case, if you add a new field, you are required to define a default value.

That way, any consumer using the newer schema that reads older messages knows what value to use for that field.
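In the posted schema only result_flg declares a default. A backward-compatible way to add the new field is to give it one as well; the empty-string default below is an illustrative placeholder, not a value taken from the question:

```json
{
  "name": "scan_app_id",
  "type": "string",
  "doc": "unique scan id for an app in Veracode",
  "default": ""
}
```

Alternatively, make the field nullable with "type": ["null", "string"] and "default": null, so consumers can distinguish "not present in the old data" from a real value.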

A straightforward list of the allowed Avro schema changes that I found is from Oracle.

Possible errors are:

  • A field is added without a default value
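Before producing, you can also ask the Schema Registry directly whether a candidate schema is compatible with the latest registered version, using its REST compatibility endpoint. This sketch assumes the registry is reachable at localhost:8081 and the subject follows the default TopicNameStrategy naming (topic name plus a -value suffix); the single-field schema in the payload is a shortened stand-in for the real one:

```shell
# POST the candidate schema (as an escaped JSON string) to the
# compatibility endpoint for the subject's latest version.
curl -s -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Initiate_Scan\",\"namespace\":\"avro\",\"fields\":[{\"name\":\"app_id\",\"type\":\"string\"}]}"}' \
  http://localhost:8081/compatibility/subjects/<topic-name>-value/versions/latest
```

A response of the form {"is_compatible": true} means the change can be registered; false means a rule such as the missing-default one above was violated.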