I want to put streaming SQL (rule definitions) into Kafka messages to be consumed by Flink for CEP. Is this a good approach?
I know that dynamic pattern definition is not supported in Flink, and I need to apply rules that can change over time to an unbounded event stream.
To give an example:
There is a UI where users define rules for their devices. Imagine a device shadowing service (such as AWS IoT Hub) that keeps the state of each physical device. My idea is to store each device's specific rules in its shadow, so that when a shadow actor receives sensor data, it emits that data with the device's rules attached, to be consumed by Flink (via Kafka) acting as a rule engine. In other words, I want Flink to process incoming sensor data together with its rules, which can differ for every device.
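To make the idea concrete, here is a minimal sketch (in Python, with a hypothetical message schema I made up for illustration) of the rule-carrying envelope a shadow actor might publish to Kafka, and of the kind of per-event evaluation I imagine Flink doing, e.g. inside a map/process function:

```python
import json

# Hypothetical envelope published by a shadow actor: the sensor reading
# plus the device's current rules, serialized together in one message.
envelope = {
    "deviceId": "thermostat-42",   # hypothetical device id
    "reading": {"temperature": 78.5},
    "rules": [                     # rules expressed as data, not code
        {"field": "temperature", "op": ">", "value": 75.0, "action": "ALERT_HIGH_TEMP"},
        {"field": "temperature", "op": "<", "value": 10.0, "action": "ALERT_LOW_TEMP"},
    ],
}

# Supported comparison operators for the rule language.
OPS = {
    ">": lambda a, b: a > b,
    "<": lambda a, b: a < b,
    "==": lambda a, b: a == b,
}

def evaluate(message: dict) -> list:
    """Apply each rule carried in the message to its own reading.

    This mirrors what a per-event operator in Flink would do: since every
    message carries its rules, the operator needs no shared rule state.
    """
    reading = message["reading"]
    fired = []
    for rule in message["rules"]:
        value = reading.get(rule["field"])
        if value is not None and OPS[rule["op"]](value, rule["value"]):
            fired.append(rule["action"])
    return fired

# Simulate a Kafka round-trip (serialize, then deserialize) and evaluate.
msg = json.loads(json.dumps(envelope))
print(evaluate(msg))  # → ['ALERT_HIGH_TEMP']
```

The point of the sketch is that the rules travel with the event, so the evaluator itself stays stateless and per-device rules can change without redeploying the Flink job.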