I'm trying to get a handle on the architecture of Spring Cloud Data Flow, specifically for orchestrating purely batch pipelines. I've deployed all the components in Kubernetes and noticed that I can omit the Skipper component's middleware (Kafka / RabbitMQ) and still run batch jobs successfully through the server component.
Per the docs, Skipper is used to orchestrate streaming pipelines. Is Skipper (and by extension the Kafka/RabbitMQ middleware) a necessary piece of Spring Cloud Data Flow when it's used purely for batch jobs or tasks?
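For context, I was planning to disable the streaming features entirely on the server. This is a sketch of the env section I'd add to the SCDF server Deployment — the property names are my assumption based on the feature-toggle settings I've seen referenced (`spring.cloud.dataflow.features.*`), so correct me if they're wrong:

```yaml
# Sketch: SCDF server Deployment env vars (Kubernetes), batch/task-only mode.
# Property names assumed from the SCDF feature-toggles; adjust if they differ.
env:
  - name: SPRING_CLOUD_DATAFLOW_FEATURES_STREAMS_ENABLED
    value: "false"   # turn off stream orchestration (so no Skipper?)
  - name: SPRING_CLOUD_DATAFLOW_FEATURES_TASKS_ENABLED
    value: "true"    # keep task/batch support on
```

With streams disabled like this, would the server even attempt to contact Skipper, or does it still expect it to be reachable at startup?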