I am trying to come up with a formula to calculate the throughput (number of requests per time unit) from the following properties of the JTL log:
- Timestamp (ts)
- Time (t)
- Total number of requests
Looking at the timestamps, I was not completely sure whether they refer to the time when the request was sent or when the response was received (my main point of confusion here). Judging by the values, the first option seems more likely. So, assuming this, I've come up with the following:
Throughput = (NumRequests / (max(ts + t) - min(ts))) * 1000
Can anyone tell me if I'm right with this?
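In case it is useful, here is a minimal sketch of that calculation over a CSV-format JTL. I'm assuming the default column names timeStamp (ms since epoch) and elapsed (ms), and that the file was saved with a header row (jmeter.save.saveservice.print_field_names=true); adjust the names if your save configuration differs:

```python
import csv

def throughput_from_jtl(path):
    """Requests per second from a CSV JTL, assuming timeStamp
    holds the sample *start* time in milliseconds."""
    starts, ends = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = int(row["timeStamp"])  # start of the sample, ms since epoch
            t = int(row["elapsed"])     # elapsed time of the sample, ms
            starts.append(ts)
            ends.append(ts + t)
    # NumRequests / (max(ts + t) - min(ts)), converted from ms to seconds
    return len(starts) / (max(ends) - min(starts)) * 1000

print(throughput_from_jtl("results.jtl"))
```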
Update (and thanks for the response, @BlackGaff):
The point is that I need to execute my tests and gather the results in a non-GUI environment for some automated processes, so I can't really use the Aggregate Report (unless there is a way to run it from the command line). The use of max & min is an attempt to find the earliest and latest ts values across a set of requests (within the same test). Also, if I configure the thread group with a ramp-up period other than 0 (so the load is distributed), the ts values I get are different. And yes, as you already mentioned, I'm effectively looking for the difference between the startTime of the first sample and the endTime of the last sample. Apart from that, I found a parameter in jmeter.properties:
# Put the start time stamp in logs instead of the end
#sampleresult.timestamp.start=true
So depending on this parameter, it seems I should also adjust the calculation: if ts records the end time of each sample instead of the start, the start and end of the run have to be derived differently, as sketched below.
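For illustration, here is how the duration term would change depending on that property (a sketch, assuming ts is the end time when the property is left at its commented-out default):

```python
def duration_ms(samples, ts_is_start=True):
    """samples: list of (ts, t) pairs in milliseconds.

    Returns the wall-clock span from the start of the first sample
    to the end of the last one, for either timestamp convention."""
    if ts_is_start:
        # ts marks when each sample started
        return max(ts + t for ts, t in samples) - min(ts for ts, _ in samples)
    # ts marks when each sample ended
    return max(ts for ts, _ in samples) - min(ts - t for ts, t in samples)
```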
NOTE: I'm still curious about how to calculate this from the JTL file, but if anyone needs to get these numbers from the command line, try adding the "Generate Summary Report" listener; at the end of the execution the JMeter log will contain a line similar to the following:
2011/03/10 22:31:42 INFO - jmeter.reporters.Summariser: Generate Summary Results = 200 in 9.8s = 20.5/s Avg: 95 Min: 75 Max: 315 Err: 0 (0.00%)
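If it helps anyone scripting around this, here is a hypothetical helper for pulling the totals out of that Summariser line. The regex is derived from the example above, so the exact format may vary between JMeter versions:

```python
import re

# Matches e.g. "Generate Summary Results = 200 in 9.8s = 20.5/s"
SUMMARY_RE = re.compile(
    r"Generate Summary Results\s*=\s*(\d+) in ([\d.]+)s = ([\d.]+)/s"
)

def parse_summary(line):
    """Return (num_samples, duration_seconds, throughput) or None."""
    m = SUMMARY_RE.search(line)
    if m is None:
        return None
    samples, seconds, rate = m.groups()
    return int(samples), float(seconds), float(rate)

line = ("2011/03/10 22:31:42 INFO - jmeter.reporters.Summariser: "
        "Generate Summary Results = 200 in 9.8s = 20.5/s "
        "Avg: 95 Min: 75 Max: 315 Err: 0 (0.00%)")
print(parse_summary(line))  # (200, 9.8, 20.5)
```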