I'm testing my Apache & PHP setup (the default configuration on Ubuntu) with the `ab` tool. With 2 concurrent connections I get fairly satisfactory results:
ab -k -n 1000 -c 2 http://localserver/page.php
Requests per second: 184.81 [#/sec] (mean)
Time per request: 10.822 [ms] (mean)
Time per request: 5.411 [ms] (mean, across all concurrent requests)
Given that it's a virtual machine with low memory, that's acceptable. Now I want to test a more realistic scenario: requests spread across 100 users (read: connections) connected at the same time:
ab -k -n 1000 -c 100 http://localserver/page.php
Requests per second: 60.22 [#/sec] (mean)
Time per request: 1660.678 [ms] (mean)
Time per request: 16.607 [ms] (mean, across all concurrent requests)
This is much worse. While overall throughput dropped only about threefold (from 184 to 60 #/sec), the time per request from a user's perspective rose sharply (from roughly 11 ms to over 1.6 seconds on average). The longest request took over 8 seconds, and manually connecting to the local server with a web browser took almost 10 seconds during the tests.
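For what it's worth, I can re-run the concurrent test and dump the latency percentiles to a CSV file to get a better view of the tail (the output file name here is just an example):

# -e writes a CSV with the time needed to serve each percentage of requests (1%..100%)
ab -k -n 1000 -c 100 -e percentiles.csv http://localserver/page.php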
What could be the cause and how can I optimize the concurrency performance to an acceptable level?
(I'm using the default configuration as shipped with Ubuntu Linux Server.)
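In case it's relevant, this is roughly how I inspect what that default configuration actually is; the config file path is an assumption based on the stock Ubuntu layout and may differ between releases:

# Which MPM is Apache running? (prefork is the usual default with mod_php)
apache2ctl -V | grep -i mpm

# Current process/connection limits for the prefork MPM
grep -i -E 'StartServers|MinSpareServers|MaxSpareServers|MaxClients|MaxRequestWorkers|ServerLimit' /etc/apache2/apache2.conf

# Keep-alive settings, since ab is run with -k
grep -i -E 'KeepAlive|MaxKeepAliveRequests|KeepAliveTimeout' /etc/apache2/apache2.conf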