To be clear, I am not trying to use Kafka as the data store for event sourcing, merely to replicate events.
The Confluent Schema Registry for Kafka seems very interesting in that it can validate the schemas of messages sent by producers to a topic. However, from what I understand, it treats each topic like a container file: one schema per topic.
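For example, if I understand the default behaviour correctly, the Avro serializer registers the value schema under a subject derived from the topic name ("<topic>-value"), so a producer like the sketch below (the topic name file-events and the localhost URLs are made up) can only ever associate a single schema with the topic:

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class FileEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081");

        // As far as I can tell, the serializer registers the value schema under
        // the subject "file-events-value", so every record sent to this topic
        // has to be compatible with that single subject.
        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // producer.send(new ProducerRecord<>("file-events", fileId, record));
        }
    }
}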
This restriction doesn't work for an event-sourced stream, where a single aggregate like File will have multiple message schemas: FileCreated, FileMoved, FileCopied, FileDeleted. Putting each of these on a separate topic would be complicated and error prone.
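For example, a consumer rebuilding the state of a File from four separate topics (topic names made up) would look something like the sketch below, and since Kafka only guarantees ordering within a single topic-partition, a FileMoved event could be consumed before the FileCreated it depends on:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class FileEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "file-projector");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081");

        try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
            // One topic per event type: ordering is only guaranteed within a
            // topic-partition, so events for the same file can be observed out
            // of order across these four subscriptions.
            consumer.subscribe(Arrays.asList("file-created", "file-moved", "file-copied", "file-deleted"));
            while (true) {
                ConsumerRecords<String, Object> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, Object> record : records) {
                    // Would have to re-sequence events by the "date" field here.
                }
            }
        }
    }
}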
Does there exist a tool like Schema Registry which supports multiple schemas for the same topic?
Update
To clarify, each of the messages above would have a different schema. For example:
FileCreated:
{
  "type": "record",
  "name": "FileCreated",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "name", "type": "string" },
    { "name": "path", "type": "string" },
    { "name": "size", "type": "string" },
    { "name": "mimeType", "type": "string" },
    { "name": "user", "type": "string" },
    { "name": "date", "type": "long" }
  ]
}
FileMoved:
{
  "type": "record",
  "name": "FileMoved",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "from", "type": "string" },
    { "name": "to", "type": "string" },
    { "name": "date", "type": "long" },
    { "name": "user", "type": "string" }
  ]
}
FileDeleted:
{
  "type": "record",
  "name": "FileDeleted",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "date", "type": "long" },
    { "name": "user", "type": "string" }
  ]
}
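What I'm effectively after is a registry that can treat the topic's schema as the union of all the event records. Just to illustrate the shape of the thing (the class and constant names here are made up, and the strings are inline copies of the three schemas spelled out above), Avro's Java API can already express such a union:

import org.apache.avro.Schema;

import java.util.Arrays;

public class FileEventsSchema {
    // Inline copies of the three schema definitions shown above.
    static final String FILE_CREATED_JSON =
        "{\"type\":\"record\",\"name\":\"FileCreated\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"name\",\"type\":\"string\"},"
        + "{\"name\":\"path\",\"type\":\"string\"},{\"name\":\"size\",\"type\":\"string\"},"
        + "{\"name\":\"mimeType\",\"type\":\"string\"},{\"name\":\"user\",\"type\":\"string\"},"
        + "{\"name\":\"date\",\"type\":\"long\"}]}";
    static final String FILE_MOVED_JSON =
        "{\"type\":\"record\",\"name\":\"FileMoved\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"from\",\"type\":\"string\"},"
        + "{\"name\":\"to\",\"type\":\"string\"},{\"name\":\"date\",\"type\":\"long\"},"
        + "{\"name\":\"user\",\"type\":\"string\"}]}";
    static final String FILE_DELETED_JSON =
        "{\"type\":\"record\",\"name\":\"FileDeleted\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"date\",\"type\":\"long\"},"
        + "{\"name\":\"user\",\"type\":\"string\"}]}";

    public static void main(String[] args) {
        Schema.Parser parser = new Schema.Parser();
        // A top-level union: any message on the topic must match one branch.
        Schema fileEvents = Schema.createUnion(Arrays.asList(
            parser.parse(FILE_CREATED_JSON),
            parser.parse(FILE_MOVED_JSON),
            parser.parse(FILE_DELETED_JSON)));
        System.out.println(fileEvents.toString(true));
    }
}

I just don't see a way to get Schema Registry to manage a subject like that for a single topic.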