
We have a REST API written in Spring Boot. Part of this application is a Spring Batch job that runs every day. I wanted an exit code to be returned to the shell script starting the application once the Spring Batch job was complete, so I added a System.exit() in my main method. Only then did I realize that this would cause the entire Spring Boot application to exit, which we don't want. I am looking for a way for the Spring Batch job to execute, return an exit code to the shell script that's calling it, and have the Spring Boot application still up and running.

I know that I can have the Spring Batch job scheduled to run at specific times as part of the running Spring Boot application, and it would run at those times while the entire application was up. But the trouble is that the application runs on multiple OLTP servers, and we can't have the Spring Batch jobs scheduled there, since that would execute the same job on several OLTP servers at the same time and cause some instances of the job to fail.

For the above reason, we are looking to run the application on OLTP as well as a single batch server. Our aim is that the Spring Batch jobs will run through the batch server, and the application will end with a System.exit() on this batch server (so that it can return an exit code to the shell script), while the actual Spring Boot application will be made available through OLTP, and won't have a System.exit() statement. I don't see how this is possible, but if all this even remotely makes sense to someone else, I would love to hear their opinion.
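What the question describes (one codebase, a web mode that stays up and a batch mode that exits with a code) is commonly handled with Spring profiles plus Spring Boot's `ExitCodeGenerator` / `SpringApplication.exit(...)`. The sketch below shows the core of that idea framework-free, as an assumption about how one might map a Spring Batch `BatchStatus` to a shell exit code; the class and method names are hypothetical.

```java
// Hypothetical sketch: map a batch job's final status string to a shell
// exit code. In a real Spring Boot app, ExitCodeGenerator and
// SpringApplication.exit(...) provide this mapping; here it is shown
// without any framework dependency.
public class BatchExitCodes {

    /** 0 for success, non-zero otherwise, mirroring shell conventions. */
    public static int toExitCode(String batchStatus) {
        switch (batchStatus) {
            case "COMPLETED": return 0;
            case "FAILED":    return 1;
            default:          return 2; // STOPPED, UNKNOWN, etc.
        }
    }

    public static void main(String[] args) {
        String status = args.length > 0 ? args[0] : "COMPLETED";
        int code = toExitCode(status);
        System.out.println("exit code: " + code);
        // The batch-profile entry point would call System.exit(code) here;
        // the web-profile entry point never calls System.exit at all.
    }
}
```

The shell script starting the batch-profile instance can then read `$?` as usual, while the OLTP instances run the same jar with the web profile and no exit call.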

I don't believe you can return an exit code from a process until the process completes. That being said, if you ran the jobs outside of the main server (in independent JVMs), like how Spring Cloud Data Flow works, you'd have that functionality. – Michael Minella

2 Answers

2
votes

Separate the batch logic; keeping it inside the REST application seems like a very bad idea. Create separate Spring Boot applications for the different functions.

If you insist on keeping them together, take a flag-driven approach. Keep a "job active" flag in a shared repository, and before starting the job, check whether some other instance has already started it. You can also give the job a name and identifying parameters so that Spring Batch refuses to run the same job instance more than once.
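The flag-driven approach above can be sketched as an atomic "claim" on a named job. This is only an illustration: the `ConcurrentHashMap` stands in for whatever shared store the instances actually see (a database row, a Redis key), and all names here are hypothetical.

```java
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the flag-driven approach: before launching a job,
// an instance atomically claims a flag keyed by job name. Only the first
// claimant runs the job; the others skip it. A ConcurrentHashMap stands in
// for a shared database table or Redis key visible to every OLTP server.
public class JobClaimRegistry {

    private final ConcurrentHashMap<String, String> activeJobs =
            new ConcurrentHashMap<>();

    /** Returns true only for the first instance that claims the job name. */
    public boolean tryClaim(String jobName, String instanceId) {
        return activeJobs.putIfAbsent(jobName, instanceId) == null;
    }

    /** Release the flag when the job finishes so the next run can proceed. */
    public void release(String jobName) {
        activeJobs.remove(jobName);
    }
}
```

Usage on each instance would look like: `if (registry.tryClaim("dailyJob", hostName)) { try { runJob(); } finally { registry.release("dailyJob"); } }` — the losing instances simply do nothing.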

2
votes

Our aim is that the Spring Batch jobs will run through the batch server, and the application will end with a System.exit()

Dude, that in and of itself isn't possible. If you want your batch job to run via a shell sub-process and read the process exit code, don't use Spring Batch. It's not meant for that. Spring Batch is for processing many records with retry logic via batch jobs, and for giving you a framework/design pattern for reading records and writing them to another source.

So just extract the Java logic that is being batched into a simple Spring Boot app. Probably package it as a jar file that is used by both the REST handler and the stand-alone Spring Boot app. It appears that you are really managing the job from your shell scripts, not from Spring Batch.
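The extraction this answer suggests can be sketched as follows: the record-processing logic lives in a plain class (shared as a jar), and a thin stand-alone launcher runs it once and turns the result into a process exit code. All names are hypothetical; the REST application would call the same logic class directly and never exit.

```java
// Hypothetical sketch of "extract the logic into a jar": DailyJobLogic is
// the shared processing code, DailyJobLauncher is the thin stand-alone
// entry point run by the shell script on the batch server.
public class DailyJobLauncher {

    /** Shared job logic, packaged in a jar used by both apps. */
    static class DailyJobLogic {
        boolean runOnce() {
            // ... read records, process them, write them out ...
            return true; // placeholder: report success
        }
    }

    /** Translate the job outcome into a shell-style exit code. */
    static int run(DailyJobLogic logic) {
        return logic.runOnce() ? 0 : 1;
    }

    public static void main(String[] args) {
        int code = run(new DailyJobLogic());
        System.out.println("batch finished with code " + code);
        System.exit(code); // only this stand-alone launcher ever exits
    }
}
```

The design point is that `System.exit()` now lives only in the launcher's `main`, so the REST application on the OLTP servers can reuse `DailyJobLogic` without any risk of shutting itself down.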