I have a Dataflow pipeline consisting of two sequential batch jobs. The first batch job completes successfully, but the second one never starts.
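For reference, this is roughly how the composed task is defined and launched via the Dataflow shell (the task and job names here are illustrative, not my actual ones):

```
dataflow:> task create my-composed-task --definition "batch-job-1 && batch-job-2"
dataflow:> task launch my-composed-task
```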
I started the Dataflow server with the embedded H2 DB and pointed Spring Batch to the same H2 instance via application.properties. After the first step of the pipeline completes, I can see the batch execution records in that same DB instance.
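The datasource configuration in each batch application's application.properties looks roughly like this (assuming the Dataflow server exposes its embedded H2 over TCP on the default port 19092 with the default `dataflow` database name; adjust if yours differs):

```properties
spring.datasource.url=jdbc:h2:tcp://localhost:19092/mem:dataflow
spring.datasource.username=sa
spring.datasource.password=
spring.datasource.driver-class-name=org.h2.Driver
```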
My composed-task-runner application seems to pick up the Dataflow server's datasource correctly: it inherits the datasource from the server, and the properties are shown in the Dashboard's task execution section.
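In the Dashboard's task execution detail, the inherited datasource arguments look something like this (values illustrative, matching the config above):

```
--spring.datasource.url=jdbc:h2:tcp://localhost:19092/mem:dataflow
--spring.datasource.username=sa
--spring.datasource.driverClassName=org.h2.Driver
```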
There are no errors in the logs, only entries from the successful execution of the first batch job.
My TASK_EXECUTION entries:
What could be the problem? And why are there two entries in the TASK_EXECUTION table for the first step? Judging by the task_name, both entries belong to the first batch step only.