
We are currently migrating a complex Spring Boot batch + admin UI system to a Spring Cloud Task infrastructure that will be managed by Spring Cloud Data Flow.

As the first phase of a POC, we must still be able to package all of our Spring Batch jobs in the same deployment JAR, run them one by one with custom job parameters, and support some kind of REST API for executing the jobs/tasks remotely.

We removed all spring-batch-admin dependencies and added spring-cloud-starter-task.

We also adapted the Boot application to the Spring Cloud Task programming model.
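For context, here is a minimal sketch of what that adaptation looks like for us, assuming spring-cloud-starter-task and spring-boot-starter-batch are on the classpath; the fooJob/fooStep names are illustrative:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

// @EnableTask registers the app with the Spring Cloud Task programming model;
// @EnableBatchProcessing keeps the existing Spring Batch job configuration working.
@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }

    // One of several Job beans packaged in the same JAR; each job keeps its own name
    // so it can be selected individually at launch time.
    @Bean
    public Job fooJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("fooJob")
                .start(steps.get("fooStep")
                        .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                        .build())
                .build();
    }
}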

After registering the JAR with Spring Cloud Data Flow, we were unable to define a task that triggers only a specific job with custom parameters.

We have read the official documentation and Stack Overflow issues without finding anything more promising.

Thanks!


1 Answer


In order to accomplish what you are looking to do, there are two steps:

  1. Repackage your batch jobs in a Spring Boot über jar. This should be pretty straightforward.
  2. Create task definitions for each job you want to run. Once your über jar is created and registered with Spring Cloud Data Flow, you'll need to create the task definitions. Each definition will look something like the following, where the über jar contains a batch job named fooJob and one named barJob:

dataflow:> app register --name batchJobs --type task --uri <URI to über jar>
dataflow:> task create --name fooBatchJob --definition "batchJobs --spring.batch.job.names=fooJob"
dataflow:> task create --name barBatchJob --definition "batchJobs --spring.batch.job.names=barJob"
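Once the definitions exist, each one can be launched on demand, and custom job parameters can be passed as command line arguments via the shell's --arguments option (the parameter name below is just an example):

dataflow:> task launch fooBatchJob --arguments "--someParameter=someValue"

Because spring.batch.job.names is baked into each task definition, only the named job runs for that launch, which gives you one launchable task per batch job out of a single deployment JAR.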