16
votes

I'm using the Kafka Schema Registry for producing/consuming Kafka messages. For example, I have two fields, both of string type; the pseudo-schema looks like this:

{"name": "test1", "type": "string"}
{"name": "test2", "type": "string"}

But after producing and consuming for a while, I needed to modify the schema to change the second field to long type, and the registry threw the following exception:

Schema being registered is incompatible with an earlier schema; error code: 409

I'm confused: if the Schema Registry cannot handle schema upgrades/changes, then why should I use the Schema Registry at all, or for that matter, why use Avro?


4 Answers

17
votes

Changing a field's type (for example from string to long) is not a backward-compatible change, so it is rejected while the subject is in BACKWARD compatibility mode, which is the default. As a workaround you can change the compatibility rules for the Schema Registry.
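
You can check which level the registry is currently enforcing via the config endpoint (the response shown is what I'd expect for a default setup):

# Check the global compatibility setting
$ curl http://localhost:8081/config

It should return something like {"compatibilityLevel":"BACKWARD"}.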

According to the docs:

The schema registry server can enforce certain compatibility rules when new schemas are registered in a subject. Currently, we support the following compatibility rules.

Backward compatibility (default): A new schema is backward compatible if it can be used to read the data written in all previous schemas. Backward compatibility is useful for loading data into systems like Hadoop since one can always query data of all versions using the latest schema.

Forward compatibility: A new schema is forward compatible if all previous schemas can read data written in this schema. Forward compatibility is useful for consumer applications that can only deal with data in a particular version that may not always be the latest version.

Full compatibility: A new schema is fully compatible if it’s both backward and forward compatible.

No compatibility: A new schema can be any schema as long as it’s a valid Avro schema.

Setting compatibility to NONE should do the trick.

# Update compatibility requirements globally
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' \
    http://localhost:8081/config

And the response should be

{"compatibility":"NONE"}

That said, I generally discourage setting compatibility to NONE, whether globally or on a single subject, unless absolutely necessary.
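
If you do have to relax compatibility, you can scope the change to a single subject instead of changing it globally (the subject name below is just an example):

# Update compatibility requirements for one subject only
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' \
    http://localhost:8081/config/test-value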

4
votes

See https://docs.confluent.io/current/avro.html; you might need to add a "default": null to the field.

You can also delete the existing schema and register the updated one.
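
Before re-registering, you can also ask the registry whether an updated schema would be accepted against the latest registered version. A rough sketch, with an illustrative subject name and inline schema:

# Test a candidate schema against the latest version of the subject
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"record\", \"name\": \"Test\", \"fields\": [{\"name\": \"test1\", \"type\": \"string\"}, {\"name\": \"test2\", \"type\": \"long\"}]}"}' \
    http://localhost:8081/compatibility/subjects/test-value/versions/latest

With the default BACKWARD setting, this particular string-to-long change should come back as {"is_compatible": false}.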

3
votes

You can simply add the new field with a default value. Note that a null default is only valid when the field's type is a union that includes "null", for example:

{"name": "test3", "type": ["null", "string"], "default": null}
1
vote

If you only need the new schema and don't need the previous schemas in the Schema Registry, you can delete the older schemas as shown below.

I've tested this with confluent-kafka and it worked for me.

Deletes all schema versions registered under the subject "Kafka-value"

curl -X DELETE http://localhost:8081/subjects/Kafka-value

Deletes version 1 of the schema registered under subject "Kafka-value"

curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/1

Deletes the most recently registered schema under subject "Kafka-value"

curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/latest
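
If you are not sure what is registered in the first place, you can list subjects and versions before deleting (same illustrative subject "Kafka-value"):

# List all registered subjects
curl http://localhost:8081/subjects

# List the schema versions registered under the subject "Kafka-value"
curl http://localhost:8081/subjects/Kafka-value/versions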

Ref: https://docs.confluent.io/platform/current/schema-registry/schema-deletion-guidelines.html