
I need to configure my batch job flow like this:

XML File reader -> Item processor#1 (Process item from the source XML File) -> Item processor#2 (Process item from the processor#1 result) -> Item writer#1 (Write the item from the processor#1) -> Item writer#2 (Write the item from the processor#2)

My custom processor looks like this:

public class MyCompositeProcessor implements ItemProcessor<MySource, MyCompositeResult> {

    private ItemProcessor<MySource, MyResult1> myProcessor1;
    private ItemProcessor<MyResult1, MyResult2> myProcessor2;

    @Override
    public MyCompositeResult process(MySource item) throws Exception {
        MyResult1 result1 = myProcessor1.process(item);
        MyResult2 result2 = myProcessor2.process(result1);
        return new MyCompositeResult(result1, result2);
    }
}


public class MyCompositeWriter implements ItemWriter<MyCompositeResult> {

    private ItemWriter<MyResult1> myWriter1;
    private ItemWriter<MyResult2> myWriter2;

    @Override
    public void write(List<? extends MyCompositeResult> items) throws Exception {
        List<MyResult1> results1 = new ArrayList<>();
        List<MyResult2> results2 = new ArrayList<>();
        for (MyCompositeResult item : items) {
            results1.add(item.getResult1());
            results2.add(item.getResult2());
        }
        myWriter1.write(results1);
        myWriter2.write(results2);
    }
}

Is that a good approach? I saw some examples of CompositeItemProcessor and CompositeItemWriter, but none of them is suitable for my case.

Thanks in advance.


1 Answer


If I understand correctly, your requirement is the following:

item reader → item processor 1  → item processor 2
                   ↓                   ↓
              item writer 1       item writer 2

The composite item processor/writer route won't work in your case, because it chains the delegates like this:

item reader → composite item processor    →   composite item writer
              (processor 1 → processor 2)     (writer 1 → writer 2)
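To see why, here is a minimal, framework-free sketch of what that chaining does (plain Java, no Spring Batch types; the delegate names and the `String`/`Integer` payloads are purely hypothetical stand-ins): the intermediate result of processor 1 is consumed by processor 2, so only the final output of the chain ever reaches the writers.

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class CompositeChainingDemo {

    // A composite item processor chains its delegates: p2(p1(item)).
    // Only the final result of the chain survives the composition.
    static List<Integer> processChunk(List<String> chunk) {
        Function<String, Integer> processor1 = String::length; // stand-in for processor 1
        Function<Integer, Integer> processor2 = n -> n * 10;   // stand-in for processor 2
        Function<String, Integer> composite = processor1.andThen(processor2);
        return chunk.stream().map(composite).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Every writer behind a composite writer would see only this list;
        // the intermediate results of processor 1 (1, 2, 3) are gone.
        System.out.println(processChunk(List.of("a", "bb", "ccc"))); // [10, 20, 30]
    }
}
```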

That chaining is not what you want. Your requirement is specific enough that you would need a custom ChunkProcessor implementation, specifically a composite chunk processor:

item reader → chunk processor 1  → chunk processor 2
              (item processor 1)   (item processor 2)
              (     ↓          )   (     ↓          )
              (item writer 1   )   (item writer 2   )
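The idea above can be sketched in plain Java (this is not the actual Spring Batch `ChunkProcessor` SPI, just a hypothetical model of the pattern): each stage pairs one processor with its own writer, and the composite threads the intermediate chunk from stage 1 into stage 2, so each writer receives the results of its own processor.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;

public class CompositeChunkProcessorSketch {

    /** One stage: its own processor feeding its own writer. */
    static class Stage<I, O> {
        final Function<I, O> processor;
        final Consumer<List<O>> writer;

        Stage(Function<I, O> processor, Consumer<List<O>> writer) {
            this.processor = processor;
            this.writer = writer;
        }

        /** Process a whole chunk, write it, and hand it to the next stage. */
        List<O> process(List<I> chunk) {
            List<O> out = new ArrayList<>();
            for (I item : chunk) {
                out.add(processor.apply(item));
            }
            writer.accept(out);
            return out;
        }
    }

    /** Runs both stages over one chunk; returns what each writer received. */
    static List<List<?>> run(List<String> chunk) {
        List<List<?>> written = new ArrayList<>();
        Stage<String, Integer> stage1 = new Stage<>(String::length, written::add);
        Stage<Integer, Integer> stage2 = new Stage<>(n -> n * 10, written::add);
        stage2.process(stage1.process(chunk));
        return written;
    }

    public static void main(String[] args) {
        // Writer 1 gets the intermediate results; writer 2 gets the final ones.
        System.out.println(run(List.of("a", "bb", "ccc"))); // [[1, 2, 3], [10, 20, 30]]
    }
}
```

In real Spring Batch terms this would mean wiring each stage's processor/writer pair yourself inside one custom component, instead of composing the processors and writers separately.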