
I would like to understand the JMeter output in depth.

  1. I am confused by the "throughput rate" concept. Does it mean that the server can only handle 48.1 requests/min at the given load, or does it mean something else? What is the difference between the total throughput rate and the throughput rate shown for individual requests? In my case there are 8 requests sent, and each individual request shows a throughput rate of 6.1/min. Please explain.

  2. I need to suggest changes to the server side and explain the JMeter report. Please suggest how I can explain what needs to be done.

The total summary report is as below:

Total Users: 100, Ramp-up time: 1000 s, Total Samples: 800, Min: 325 ms, Max: 20353 ms, Std. Dev: 4524.91, Throughput: 48.1/min, Error: 0.38%

Thanks in advance.


1 Answer


As per the JMeter Glossary:

Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.

The formula is: Throughput = (number of requests) / (total time).
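A minimal sketch of that formula, using the numbers from the question (800 samples). The elapsed time of roughly 998 s is not in the report; it is back-computed here from the stated 48.1/min and should be treated as an assumption:

```python
def throughput_per_min(num_requests: int, elapsed_seconds: float) -> float:
    """Throughput = (number of requests) / (total time), scaled to per-minute."""
    return num_requests / elapsed_seconds * 60

# 800 samples over an assumed ~998 s of elapsed time reproduces ~48.1/min,
# which is consistent with the 1000 s ramp-up dominating the test duration.
print(round(throughput_per_min(800, 998), 1))  # → 48.1
```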

So you are providing a "load" of about 0.8 requests per second, which is quite low.
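To see where 0.8 requests per second comes from, and how the per-request figure of 6.1/min relates to the total (assuming the 8 samplers run sequentially inside one thread group, so each sampler sees roughly 1/8 of the total throughput):

```python
total_per_min = 48.1          # total throughput from the summary report

# Convert to requests per second
per_second = total_per_min / 60
print(round(per_second, 2))   # → 0.8

# With 8 samplers per iteration, each individual sampler's throughput
# is roughly the total divided by 8:
per_sampler = total_per_min / 8
print(round(per_sampler, 1))  # → 6.0 (close to the reported 6.1/min)
```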

JMeter provides a test element that controls this "Throughput" value, so you can choose whether to simulate "N" concurrent users or to send "N" requests per second. Take a look at the How to use JMeter's Constant Throughput Timer guide for more details on implementing a goal-oriented load test scenario with JMeter.