0 votes

How do I configure Spring Batch (using Java Configuration) to write all chunks in a single transaction?

I have a simple Spring Batch job with a single step. The step consists of a JdbcCursorItemReader, a custom item processor, and a custom item writer. Currently I set the chunk size on the StepBuilder, which also seems to set the commit interval to the same value. On one hand, I don't want to load all items into memory and therefore need chunk-oriented processing. On the other hand, the processor needs a single transaction over all the items, not just the ones in the current chunk. It won't store the items in memory, and it's fine if #write is called several times.
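For reference, a setup like the one described might look roughly like this in Java Configuration (a sketch with illustrative type and bean names, using the pre-5.0 StepBuilderFactory API):

```java
@Configuration
@EnableBatchProcessing
public class JobConfig {

    @Bean
    public Step step(StepBuilderFactory steps,
                     JdbcCursorItemReader<Input> reader,
                     ItemProcessor<Input, Output> processor,
                     ItemWriter<Output> writer) {
        return steps.get("step")
                // the chunk size here also becomes the commit interval:
                // each chunk of 10 items is committed in its own transaction
                .<Input, Output>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
```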

How can this be achieved using Java Configuration?


2 Answers

1 vote

IMO, this approach breaks the way Spring Batch is designed.

The whole idea behind batch processing in Spring is that you commit every chunk and that Spring Batch keeps track of which records have already been processed, in order to provide restart, skip, and failure-handling strategies.

If you really need a transaction over the whole process, why use Spring Batch at all? You could use the reader and writer directly in your own simple loop, something like this:

transaction.open();
boolean exhausted = false;
while (!exhausted) {
  List<Object> currentChunk = new ArrayList<>();
  for (int i = 0; i < chunkSize; i++) {
      Object readItem = jdbcCursorItemReader.read();
      if (readItem == null) {
         exhausted = true;
         break;
      }
      Object processedItem = processor.process(readItem);
      if (processedItem != null) {
          currentChunk.add(processedItem);
      }
  }

  yourWriter.write(currentChunk);
}
transaction.commit();
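To make the single-commit behaviour of that loop concrete, here is a self-contained plain-Java sketch with a simulated transaction (all names are illustrative; `transaction.open()` above is pseudocode, and a real implementation would use a JDBC or Spring transaction instead of the `Tx` stub):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class SingleTxChunkLoop {

    // Simulated transactional writer: writes are buffered and become
    // "visible" only when commit() is called once at the end.
    static class Tx {
        final List<String> pending = new ArrayList<>();
        final List<String> committed = new ArrayList<>();
        void write(List<String> chunk) { pending.addAll(chunk); }
        void commit() { committed.addAll(pending); pending.clear(); }
    }

    static List<String> run(Iterator<String> reader, int chunkSize, Tx tx) {
        boolean exhausted = false;
        while (!exhausted) {
            List<String> chunk = new ArrayList<>();
            for (int i = 0; i < chunkSize; i++) {
                if (!reader.hasNext()) {
                    exhausted = true;
                    break;
                }
                // Stand-in for the item processor.
                chunk.add(reader.next().toUpperCase());
            }
            if (!chunk.isEmpty()) {
                tx.write(chunk); // write is called once per chunk
            }
        }
        tx.commit(); // single commit covering every chunk
        return tx.committed;
    }

    public static void main(String[] args) {
        Tx tx = new Tx();
        System.out.println(run(List.of("a", "b", "c", "d", "e").iterator(), 2, tx));
        // prints [A, B, C, D, E]
    }
}
```

Note that only a bounded chunk of items is held in memory at a time, while nothing is committed until the whole input has been processed, which is exactly the behaviour asked for in the question.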
0 votes

Writing all chunks in a single transaction is opposed to Spring Batch's design and philosophy.
If you want, you can read all the items, put them into one big chunk, and write it; you will achieve the same result.
Check Spring Batch chunk processing for a possible solution.
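One possible (untested) way to express the "one big chunk" idea is to give the step a completion policy instead of a fixed chunk size; all type and bean names here are illustrative:

```java
@Bean
public Step singleChunkStep(StepBuilderFactory steps,
                            ItemReader<Input> reader,
                            ItemProcessor<Input, Output> processor,
                            ItemWriter<Output> writer) {
    return steps.get("singleChunkStep")
            // DefaultResultCompletionPolicy completes the chunk only when
            // read() returns null, so the whole input forms one chunk that
            // is written and committed in a single transaction.
            .<Input, Output>chunk(new DefaultResultCompletionPolicy())
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .build();
}
```

Be aware that this buffers all processed items in one chunk, which conflicts with the memory constraint stated in the question.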