I'm trying to process some steps in a Spring Batch job in parallel. The XML configuration for the job is as follows:
<batch:job id="job" job-repository="jobRepository">
    <batch:split id="split" task-executor="taskExecutor">
        <batch:flow>
            <batch:step id="step1">
                <batch:tasklet transaction-manager="txManager">
                    <batch:chunk reader="reader1"
                                 processor="processor1"
                                 writer="writer1"
                                 commit-interval="1" />
                </batch:tasklet>
            </batch:step>
        </batch:flow>
        <batch:flow>
            <batch:step id="step2">
                <batch:tasklet transaction-manager="txManager">
                    <batch:chunk reader="reader2"
                                 processor="processor2"
                                 writer="writer2"
                                 commit-interval="1" />
                </batch:tasklet>
            </batch:step>
        </batch:flow>
    </batch:split>
</batch:job>
The taskExecutor is a SimpleAsyncTaskExecutor.
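For completeness, the executor is declared more or less like this (a plain bean, nothing special configured on it):

<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor" />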
In each chunk I'm using a reader, a processor and a writer, all of which depend on Seam (2.2.2).
When the steps run single-threaded, they all work fine. But when they run in parallel, they fail because no Seam context is available, obviously because a new thread is created and no Seam lifecycle (Lifecycle.beginCall()) is started on it.
How can I make sure the Seam lifecycle is started when my chunk is being processed? I really don't want to start the lifecycle in my reader and end it in my writer; it should be started when the tasklet is executed and ended when the tasklet completes.
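To make clear what I'm after, here is a rough, untested sketch of a StepExecutionListener that I'd register on both steps via Spring Batch's listener support. The class name is made up, and I'm assuming that Lifecycle.endCall() is the right counterpart to beginCall() and that the listener callbacks run on the same worker thread as the chunk processing:

import org.jboss.seam.contexts.Lifecycle;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

// Hypothetical listener: open a Seam call context before the step runs
// and close it afterwards, so the reader/processor/writer find a Seam
// context on the thread started by the SimpleAsyncTaskExecutor.
public class SeamLifecycleStepListener implements StepExecutionListener {

    public void beforeStep(StepExecution stepExecution) {
        Lifecycle.beginCall();
    }

    public ExitStatus afterStep(StepExecution stepExecution) {
        Lifecycle.endCall();
        return stepExecution.getExitStatus();
    }
}

Is a step listener like this the right place to hook in, or is there a cleaner way to tie the Seam lifecycle to the tasklet execution?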