
I have a job with a single chunk step executed in parallel (8 partitions):

  • Reader: JdbcCursorItemReader
  • Processor: uses JdbcTemplate to call the database (one thread per partition)
  • Writer: writes to a file

I use a JdbcCursorItemReader to read millions of rows from a shared Postgres database (v9.2); other users access the database at the same time.

Spring Batch version: 3.0.6
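For context on what "8 partitions" typically means here, the split is usually done by a column-range `Partitioner` that gives each worker step a contiguous ID range to read. This is a hedged, self-contained sketch of that range-splitting logic only (the names `RangePartitionDemo`, `partition`, and the example ID range are assumptions for illustration, not taken from the question):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of column-range partitioning, as commonly used with a
// Spring Batch partitioned step: split [minId, maxId] into gridSize
// contiguous ranges, one per parallel worker. Pure Java, no framework.
public class RangePartitionDemo {

    /** Returns partition name -> {startId, endId}, covering [minId, maxId]. */
    static Map<String, long[]> partition(long minId, long maxId, int gridSize) {
        Map<String, long[]> partitions = new LinkedHashMap<>();
        // Ceiling-style chunk size so the ranges cover the whole interval.
        long targetSize = (maxId - minId + 1) / gridSize + 1;
        long start = minId;
        for (int i = 0; i < gridSize && start <= maxId; i++) {
            long end = Math.min(start + targetSize - 1, maxId);
            partitions.put("partition" + i, new long[] {start, end});
            start = end + 1;
        }
        return partitions;
    }

    public static void main(String[] args) {
        // Example: 1,000,000 rows split across 8 workers.
        Map<String, long[]> p = partition(1, 1_000_000, 8);
        p.forEach((name, range) ->
            System.out.println(name + ": " + range[0] + " - " + range[1]));
    }
}
```

In a real partitioned step, each `{startId, endId}` pair would be placed in a partition's `ExecutionContext` and used as parameters of the reader's SQL `WHERE` clause.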

The problem is that the job and its steps remain blocked in STARTED status after some hours of execution, without any errors in the log.

Before it blocks: [screenshot]

After it blocks: the pg_stat_activity table is empty (I suspect the processor connections were killed) and the job status is still STARTED.

[screenshots]

Does anyone have any idea why the job and its parallel steps are blocked in STARTED status?

Thank you for your help

Please share your job/step configuration code so we can help you. If your job is killed abruptly, Spring Batch will not have a chance to update the job status correctly in the job repository, so it will be stuck at STARTED. This is explained in the docs here: docs.spring.io/spring-batch/docs/4.3.x/reference/html/… - Mahmoud Ben Hassine
My problem is why the job is blocked, because I run into problems when I start a new execution after some hours. Is it a problem with the job configuration? The reader? Or does Postgres kill the transactions? - DevJava
Before talking about restarting your job and how to do that, we need to understand why it is blocked in that status. Is your job still running and blocked in the STARTED state? Was it killed in the middle of processing? - Mahmoud Ben Hassine

1 Answer


Job configuration: [screenshots]

My problem is why the job is blocked in STARTED status, because I run into problems when I start a new execution.
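As noted in the comments, if the previous execution was killed abruptly, the job repository keeps the execution at STARTED, which then interferes with launching a new execution. A commonly used manual cleanup is to mark the dead execution as FAILED directly in the metadata tables. This is only a sketch: it assumes the default Spring Batch table names, a Postgres database, and that you have verified no JVM is still running that execution (`:jobExecutionId` is a placeholder you must supply):

```sql
-- Manual cleanup sketch: mark a dead execution as FAILED so the job can
-- be restarted. Only run this when you are certain the execution is no
-- longer actually running anywhere.
UPDATE BATCH_STEP_EXECUTION
   SET STATUS = 'FAILED', EXIT_CODE = 'FAILED', END_TIME = now()
 WHERE JOB_EXECUTION_ID = :jobExecutionId AND STATUS = 'STARTED';

UPDATE BATCH_JOB_EXECUTION
   SET STATUS = 'FAILED', EXIT_CODE = 'FAILED', END_TIME = now()
 WHERE JOB_EXECUTION_ID = :jobExecutionId AND STATUS = 'STARTED';
```

After this, a restart of the job instance should be accepted by the job repository, but it does not explain *why* the execution died; for that, check the Postgres server logs and any connection/statement timeouts on the shared database.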