I have a job with a single chunk step executed in parallel (8 partitions):
- Reader: JdbcCursorItemReader
- Processor: uses JdbcTemplate to call the database (one thread per partition)
- Writer: writes to a file
I use a JdbcCursorItemReader to read millions of rows from a shared Postgres database (v9.2); other users access the database at the same time.
Spring Batch version: 3.0.6
The problem is that the job and its steps stay blocked in STARTED status after some hours of execution, without any errors in the logs.
After the job blocks, the pg_stat_activity table is empty (I think the processor threads were killed) and the job status is still STARTED.
Does anyone have any idea why the job and the parallel steps are blocked in STARTED status?
Thank you for your help
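For context, here is a minimal Java-config sketch of the kind of setup described above. This is not the asker's actual code: the class name, table, key column, chunk size, and partition key range are all assumptions. It also sets a query timeout on the cursor reader, so that a statement stuck on the database fails loudly instead of leaving the step silently blocked in STARTED:

```java
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.jdbc.core.ColumnMapRowMapper;

@Configuration
@EnableBatchProcessing
public class PartitionedExportConfig { // name is an assumption

    // One worker step per partition: cursor reader -> (processor) -> file writer.
    @Bean
    public Step workerStep(StepBuilderFactory steps, DataSource dataSource) {
        return steps.get("workerStep")
                .<Map<String, Object>, Map<String, Object>>chunk(1000) // chunk size is an assumption
                .reader(reader(dataSource, null, null))
                .writer(items -> { /* write the chunk to the partition's output file */ })
                .build();
    }

    // Step-scoped so each of the 8 partitions opens its own cursor over its key range.
    @Bean
    @StepScope
    public JdbcCursorItemReader<Map<String, Object>> reader(
            DataSource dataSource,
            @Value("#{stepExecutionContext['minId']}") Long minId,
            @Value("#{stepExecutionContext['maxId']}") Long maxId) {
        JdbcCursorItemReader<Map<String, Object>> reader = new JdbcCursorItemReader<>();
        reader.setDataSource(dataSource);
        reader.setSql("SELECT * FROM big_table WHERE id BETWEEN ? AND ?"); // table/keys are assumptions
        reader.setPreparedStatementSetter(ps -> {
            ps.setLong(1, minId);
            ps.setLong(2, maxId);
        });
        reader.setRowMapper(new ColumnMapRowMapper());
        // Without a timeout, a statement that hangs on the shared database
        // can leave the step in STARTED with nothing in the logs.
        reader.setQueryTimeout(600);
        return reader;
    }

    // Master step that fans the worker out over 8 partitions on local threads.
    @Bean
    public Step partitionedStep(StepBuilderFactory steps, Step workerStep, Partitioner partitioner) {
        return steps.get("partitionedStep")
                .partitioner("workerStep", partitioner)
                .step(workerStep)
                .gridSize(8) // the 8 partitions from the question
                .taskExecutor(new SimpleAsyncTaskExecutor())
                .build();
    }
}
```

Note that if the JVM dies (OOM kill, shutdown) mid-run, Spring Batch has no chance to update the job repository, so the job and step executions remain in STARTED permanently; that state then has to be cleaned up manually in the metadata tables.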

Comment (truncated): "…STARTED. This is explained in the docs here: docs.spring.io/spring-batch/docs/4.3.x/reference/html/…" - Mahmoud Ben Hassine