Is it possible in Apache Flink to create an application that consists of multiple jobs, which together form a pipeline to process some data?
For example, consider a process with an input/preprocessing stage, a business-logic stage, and an output stage. In order to be flexible during development and (re)deployment, I would like to run these as independent jobs.
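To make the intent concrete, here is a toy sketch in plain Python (deliberately not Flink code) of the structure I have in mind: three stages connected by buffers, where each stage could in principle be restarted on its own. The stage names and the filtering logic are made up for illustration; in the real setup each function would be a separate Flink job, and the queues stand in for whatever mechanism would pipe and buffer data between them.

```python
# Toy simulation of the desired pipeline (plain Python, NOT Flink).
# Each function represents what would be an independent Flink job;
# the Queue objects stand in for the buffering transport between jobs.
from queue import Queue

def preprocess(records, out: Queue):
    # Input/preprocessing stage: normalize raw records.
    for r in records:
        out.put(r.strip().lower())

def business_logic(inp: Queue, out: Queue):
    # Business-logic stage: here, keep only records mentioning "flink"
    # (the filter itself is just a placeholder).
    while not inp.empty():
        r = inp.get()
        if "flink" in r:
            out.put(r)

def sink(inp: Queue):
    # Output stage: collect the final results.
    collected = []
    while not inp.empty():
        collected.append(inp.get())
    return collected

raw = ["  Apache FLINK  ", "Hadoop", "Flink jobs "]
q1, q2 = Queue(), Queue()
preprocess(raw, q1)
business_logic(q1, q2)
results = sink(q2)
print(results)  # ['apache flink', 'flink jobs']
```

The question is essentially whether Flink itself offers a native equivalent of those queues between separately deployed jobs.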
Is it possible in Flink to build this and directly pipe the output of one job into the input of another, without external components? If yes, where can I find documentation about this, and can it buffer data while one of the jobs is being restarted? If no, does anyone have experience with such a setup and can point me to a possible solution?
Thank you!