I've been reading a few articles about using Kafka and Kafka Streams (with a state store) as an Event Store implementation.
- https://www.confluent.io/blog/event-sourcing-using-apache-kafka/
- https://www.confluent.io/blog/event-sourcing-cqrs-stream-processing-apache-kafka-whats-connection/
The implementation idea is the following:
- Store entity changes (events) in a Kafka topic
- Use Kafka Streams with a state store (which by default uses RocksDB) to update and cache the entity snapshot
- Whenever a new command is executed, get the entity snapshot from the store, execute the operation on it, and continue with step #1
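To make the intended workflow concrete, here is a toy in-memory sketch of the three steps (plain Python stand-ins, not a real Kafka client; all names like `event_topic`, `state_store`, and `handle_command` are illustrative):

```python
event_topic = []   # step 1: the event log (stands in for the Kafka topic)
state_store = {}   # step 2: entity snapshots (stands in for the Streams state store)

def apply_event(snapshot, event):
    """Fold one event into an entity snapshot (domain-specific logic)."""
    snapshot = dict(snapshot or {})
    snapshot[event["field"]] = event["value"]
    return snapshot

def run_projector():
    """Step 2: replay the topic into the state store.
    In the real setup this runs asynchronously inside the Streams app."""
    state_store.clear()
    for event in event_topic:
        eid = event["entity_id"]
        state_store[eid] = apply_event(state_store.get(eid), event)

def handle_command(entity_id, field, value):
    """Step 3: read the snapshot, run the operation, emit a new event (step 1)."""
    snapshot = state_store.get(entity_id)  # may be stale, see below
    # ... validate the command against `snapshot` here ...
    event_topic.append({"entity_id": entity_id, "field": field, "value": value})
```

In the sketch, `run_projector()` is called explicitly, but in the real architecture the Streams app updates the store on its own schedule, which is exactly where my question comes from.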
The issue with this workflow is that the state store is updated asynchronously (step 2), so when a new command is processed, the retrieved entity snapshot might be stale: it may not yet reflect the events produced by previous commands.
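The stale-read window can be demonstrated with a small simulation (again plain Python, no real Kafka; `deposit`, `event_log`, and `snapshot_store` are made-up names for this example). Two commands arrive before the projector catches up, so the second one reads a snapshot that misses the first command's event:

```python
event_log = []        # the event topic
snapshot_store = {}   # the state store, updated only when the projector runs

def project_pending():
    """What the Streams app eventually does: fold the log into the store."""
    snapshot_store.clear()
    for entity_id, amount in event_log:
        snapshot_store[entity_id] = snapshot_store.get(entity_id, 0) + amount

def deposit(entity_id, amount):
    """Command handler: read the snapshot, decide, append an event.
    Returns the balance it *saw*, which may be stale."""
    seen_balance = snapshot_store.get(entity_id, 0)
    event_log.append((entity_id, amount))
    return seen_balance

# Two commands arrive back to back, before the projector runs:
first = deposit("acct-1", 100)   # sees balance 0 (correct)
second = deposit("acct-1", 50)   # also sees balance 0 (stale: misses the +100)
project_pending()                # store now catches up to 150
```

Any validation inside the second command (e.g. an overdraft or uniqueness check) would have run against the stale balance, which is the problem I'm asking about.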
Is my understanding correct? Is there a simple way to handle such a case with Kafka?