
I want to put streaming SQL statements in Kafka to be consumed by Flink for CEP. Is this a good approach?

I know that dynamic pattern definition is not supported in Flink, and I need to apply rules that can change to an unbounded event stream.

To give an example:

There is a UI where users define rules for their devices. Imagine a device-shadowing service (such as AWS IoT Hub) that keeps the state of each physical device. My idea is to store every device's specific rules in its shadow, so that when sensor data is received by a shadow actor, the actor emits the data with its rules attached, to be consumed (via Kafka) by Flink acting as a rule engine. So I want Flink to process my incoming sensor data (with its rules), which could be different for every device.
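The enriched message a shadow actor could emit might look like the following sketch (a minimal illustration only; the field names and rule encoding are hypothetical, and the real schema would depend on your shadow service and serialization format):

```java
import java.util.List;

// Hypothetical payload a shadow actor emits to Kafka: the raw sensor
// reading plus the device-specific rules attached by the shadow.
public class EnrichedReading {
    public final String deviceId;
    public final double temperature;   // example sensor value
    public final List<String> rules;   // e.g. "temperature > 30"

    public EnrichedReading(String deviceId, double temperature, List<String> rules) {
        this.deviceId = deviceId;
        this.temperature = temperature;
        this.rules = rules;
    }

    public static void main(String[] args) {
        EnrichedReading r = new EnrichedReading(
                "device-42", 31.5, List.of("temperature > 30"));
        // Downstream, Flink would deserialize this and evaluate the
        // attached rules against the reading.
        System.out.println(r.deviceId + " carries " + r.rules.size() + " rule(s)");
    }
}
```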

It is unclear what you are asking. You need to give more detail about your use case. Moreover, Flink's stream SQL is not integrated with its CEP library yet; this is ongoing work. – Fabian Hueske
Please update your question instead of adding comments. Thank you. – Fabian Hueske

1 Answer


What I understood from your question is that you want to process different streams with different rules. If that is the case, you should send the streams from these devices to separate Kafka topics and then start multiple CEP instances in Flink.

StreamExecutionEnvironment environment1 = StreamExecutionEnvironment.createLocalEnvironment(1);
...
StreamExecutionEnvironment environmentN = StreamExecutionEnvironment.createLocalEnvironment(1);

Each instance should subscribe to one Kafka topic (each topic representing one device), and you add a different pattern in each Flink instance.
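The "different pattern per instance" idea boils down to keying rules by device. As a minimal stdlib-only sketch of that dispatch logic (the predicates here stand in for the CEP patterns you would register in each Flink instance; all names and thresholds are illustrative, not part of Flink's API):

```java
import java.util.Map;
import java.util.function.DoublePredicate;

public class PerDeviceRules {
    // Each device (i.e. each Kafka topic / Flink instance) gets its own rule.
    static final Map<String, DoublePredicate> RULES = Map.of(
            "thermostat", v -> v > 30.0,   // alert on high temperature
            "pressure",   v -> v < 0.8     // alert on low pressure
    );

    // Returns true when the device's rule fires for the given value.
    static boolean matches(String device, double value) {
        DoublePredicate rule = RULES.get(device);
        return rule != null && rule.test(value);
    }

    public static void main(String[] args) {
        System.out.println(matches("thermostat", 31.0)); // true
        System.out.println(matches("pressure", 1.2));    // false
    }
}
```

In an actual Flink deployment, each `matches` predicate would instead be a CEP `Pattern` applied to the stream that instance consumes from its topic.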