3
votes

How can we profile a Spring Batch application for job execution time? Is it possible to know how much time is spent in the readers and writers on average? If so, what would be an efficient or smarter way to do it?

For example, 10 million records are read from a database and written out to a file in CSV format, with the file size amounting to, say, 4 GB.

Is it possible to know the average time taken by the reader and writer for each chunk [fetch size, say, 20000 and commit-interval of 20000]?

1

1 Answer

2
votes

First, it might be a good idea to look at the implementation of the Spring Batch Admin project, as this is what I based my own implementation on.

However, in general you can get quite a lot of detail (start times, end times, read counts, write counts, skips, etc.) from the JobExplorer and JobSupportDao. This is available for the overall job execution as well as for the individual steps within the job, so you can compute quite a few metrics and build a profile of the job.
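To illustrate, here is a minimal sketch of turning those step-level numbers into per-chunk averages. The Spring Batch wiring is left in comments (the values would come from a `StepExecution` fetched via `JobExplorer`); the arithmetic itself is plain Java. The duration figure in `main` is a made-up example, not a measured result.

```java
// Sketch: computing per-chunk averages from Spring Batch step metrics.
// In a real job these numbers would come from a StepExecution obtained
// through JobExplorer, e.g.:
//   stepExecution.getReadCount(), stepExecution.getWriteCount(),
//   stepExecution.getCommitCount(), getStartTime() / getEndTime().
public class StepMetrics {

    /** Average wall-clock time per committed chunk, in milliseconds. */
    public static double averageChunkMillis(long stepDurationMillis, long commitCount) {
        if (commitCount == 0) return 0.0;
        return (double) stepDurationMillis / commitCount;
    }

    /** Expected number of chunks for a given item count and commit interval. */
    public static long expectedChunks(long itemCount, long commitInterval) {
        return (itemCount + commitInterval - 1) / commitInterval; // ceiling division
    }

    public static void main(String[] args) {
        // 10 million items with a commit-interval of 20000 -> 500 chunks.
        long chunks = expectedChunks(10_000_000L, 20_000L);
        // If the step ran for a hypothetical hour (3,600,000 ms),
        // each chunk averaged 7200 ms.
        double avg = averageChunkMillis(3_600_000L, chunks);
        System.out.println(chunks + " chunks, avg " + avg + " ms/chunk");
    }
}
```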

Another approach is to use a set of listeners that log specifics as chunks and steps complete. It really depends on what level of profiling you want.
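As a sketch of the listener approach, the class below accumulates per-chunk timings. In a real job it would implement `org.springframework.batch.core.ChunkListener` and be registered on the step; the Spring interface is omitted here so the timing logic stands on its own.

```java
// Sketch of a chunk-timing listener. The two callback methods mirror
// ChunkListener.beforeChunk(ChunkContext) and afterChunk(ChunkContext);
// in a real job this class would implement that interface and be
// registered on the step configuration.
public class ChunkTimingListener {
    private long chunkStartNanos;
    private long totalNanos;
    private long chunkCount;

    // Corresponds to ChunkListener.beforeChunk(ChunkContext)
    public void beforeChunk() {
        chunkStartNanos = System.nanoTime();
    }

    // Corresponds to ChunkListener.afterChunk(ChunkContext)
    public void afterChunk() {
        totalNanos += System.nanoTime() - chunkStartNanos;
        chunkCount++;
    }

    /** Average chunk duration observed so far, in milliseconds. */
    public double averageChunkMillis() {
        return chunkCount == 0 ? 0.0 : totalNanos / 1_000_000.0 / chunkCount;
    }

    public long chunkCount() {
        return chunkCount;
    }
}
```

The same pattern works one level lower with `ItemReadListener` and `ItemWriteListener` if you want reader time and writer time separated out rather than whole-chunk timings.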