
I need to run two separate jobs, but I also need these two jobs to communicate somehow. For example:

Source1 -> Operator1 -> Sink1
Source2 -> Operator2 -> Sink2

At some point, job 2 needs to know when Operator1 in job 1 has triggered something, so that Operator2 in job 2 can start, or so that job 2 can simply do something with the result Operator1 produced in job 1. I hope I'm not asking for anything crazy.

Kind regards.


1 Answer


You can use a message queue or pub/sub system (e.g., Kafka or Pulsar) so that Sink1 and Source2 are the same resource. The stream produced as output by job 1 then becomes the input to job 2.
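The pattern can be sketched in plain Python, with an in-memory queue standing in for the Kafka/Pulsar topic (all names here are illustrative, not Flink or Kafka API; in a real deployment you would use Flink's Kafka connector for Sink1 and Source2):

```python
import queue

# Stands in for a Kafka topic: Sink1 publishes here, Source2 consumes here.
topic = queue.Queue()

def job1(events):
    """Job 1: Source1 -> Operator1 -> Sink1 (publish to the shared topic)."""
    for event in events:      # Source1: read input events
        result = event * 2    # Operator1: example transformation
        topic.put(result)     # Sink1: publish to the topic
    topic.put(None)           # end-of-stream marker (illustrative only)

def job2():
    """Job 2: Source2 (consume from topic) -> Operator2 -> Sink2."""
    results = []
    while True:
        msg = topic.get()     # Source2: consume from the same topic
        if msg is None:       # stop at the end-of-stream marker
            break
        results.append(msg + 1)  # Operator2: example transformation
    return results            # Sink2: collect the output

job1([1, 2, 3])
print(job2())  # [3, 5, 7]
```

The key design point is that the two jobs never call each other directly: job 2 only reacts to messages appearing on the topic, so either job can be restarted or redeployed independently.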

You might also take a look at Stateful Functions, another API built on the Flink runtime. It offers the possibility of implementing remote functions that are deployed independently. See this Flink Forward talk for an intro.