
I have a job with two steps, and both of them are partitioned steps.

The steps are basically a Lucene indexing step (first step) and a Lucene searching step (second step). Until now I was creating the Lucene reader, searcher, and writer objects in the step Partitioner and passing them to the relevant step components using the @Value annotation.
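For context, the per-partition wiring presumably looked something like the sketch below. Class names, key names, and the per-partition directory scheme are illustrative, not from the original code, and keeping live Lucene objects in an ExecutionContext only works while the context stays in memory, since IndexWriter is not Serializable:

```java
import java.io.IOException;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// Hypothetical sketch of the original setup: the Partitioner creates a
// Lucene writer per partition and exposes it through that partition's
// ExecutionContext, where step-scoped components can inject it via
// @Value("#{stepExecutionContext['lucene.writer']}").
public class LucenePartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            // One writer per partition, each on its own directory.
            context.put("lucene.writer", createWriter("index-" + i));
            partitions.put("partition" + i, context);
        }
        return partitions;
    }

    private IndexWriter createWriter(String dir) {
        try {
            return new IndexWriter(FSDirectory.open(Paths.get(dir)),
                    new IndexWriterConfig(new StandardAnalyzer()));
        } catch (IOException e) {
            throw new IllegalStateException("Could not open index " + dir, e);
        }
    }
}
```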

Now there is a requirement change: sometimes I need the writer object from another thread (another partitioned step) in the current thread (the current partitioned step), but Lucene doesn't allow creating multiple writer objects for the same directory, so I need to share these objects across partitioned steps.

So the solution is to create a job-level Map that contains all these objects, fetch the objects from the map whenever I need them in a step component (reader, processor, or writer), and finally dispose of/close all of them at job end.

How can I achieve this? That is, is it OK to put these objects into jobExecution.getExecutionContext() in JobExecutionListenerSupport's beforeJob(JobExecution jobExecution) method?

How do I get specific objects from this map in the reader, processor, and writer? I won't know them at configuration time, only at run time.

Finally, I will close these objects in JobExecutionListenerSupport's afterJob(JobExecution jobExecution) method.
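A minimal sketch of that lifecycle, assuming a single shared index, illustrative key names and path, and a job repository whose ExecutionContext can hold live objects (IndexWriter is not Serializable, so this will not survive persistence to a JDBC repository):

```java
import java.io.IOException;
import java.nio.file.Paths;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;

// Creates the shared Lucene objects once, before any partitioned step
// runs, and closes them after the whole job finishes.
public class LuceneResourceListener extends JobExecutionListenerSupport {

    @Override
    public void beforeJob(JobExecution jobExecution) {
        try {
            IndexWriter writer = new IndexWriter(
                    FSDirectory.open(Paths.get("/path/to/index")),
                    new IndexWriterConfig(new StandardAnalyzer()));
            // Single writer shared by all partitions of both steps.
            jobExecution.getExecutionContext().put("lucene.writer", writer);
        } catch (IOException e) {
            throw new IllegalStateException("Could not open Lucene index", e);
        }
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        IndexWriter writer = (IndexWriter)
                jobExecution.getExecutionContext().get("lucene.writer");
        if (writer != null) {
            try {
                writer.close();
            } catch (IOException e) {
                throw new IllegalStateException("Could not close Lucene index", e);
            }
        }
    }
}
```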

I am using annotation-based Spring Batch configuration and Lucene 6.0.

Please suggest.


1 Answer


The approach I suggested in my question (putting these objects into jobExecution.getExecutionContext() in beforeJob(JobExecution jobExecution), then closing the resources in afterJob(JobExecution jobExecution)) works fine for me. I don't see any issues.

I simply inject @Value("#{jobExecutionContext}") into the components where I need these objects.
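For example, a step-scoped component can receive the whole context map and pull the shared object by key; the component class and the key "lucene.writer" are illustrative (it must match whatever key was used in beforeJob):

```java
import java.util.Map;

import org.apache.lucene.index.IndexWriter;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

// Step-scoped so the SpEL expression is resolved at run time, after the
// job ExecutionContext has been populated in beforeJob.
@Component
@StepScope
public class LuceneIndexingWriter {

    private final IndexWriter writer;

    // Inject the whole job ExecutionContext as a map and look up by key;
    // alternatively, inject the object directly with
    // @Value("#{jobExecutionContext['lucene.writer']}").
    public LuceneIndexingWriter(
            @Value("#{jobExecutionContext}") Map<String, Object> jobContext) {
        this.writer = (IndexWriter) jobContext.get("lucene.writer");
    }
}
```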