4 votes

I have deployed a Java EE application on Linux with Apache Tomcat 7.0.42.

Everything works fine when I load test with 100 users using JMeter (100 concurrent thread requests).

But as soon as I change the number of users (threads) to 1000, the server chokes and returns a "Connection refused" error for all requests after roughly the first 600.

I have done all the fine-tuning I can in the application; it is little more than a static web page now, and it still comes back with the error.

Server configuration: Ubuntu, 8 vCPUs / 32 GB RAM / 960 GB HDD

PS: The same application works well on AWS (Amazon Web Services), so you can rule out any problem with my machine running JMeter (the client).

    org.apache.http.conn.HttpHostConnectException: Connection to http://a.b.c.d:8080 refused
    at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:190)
    at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
    at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:645)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:480)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
    at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:286)
    at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:62)
    at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1088)
    at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1077)
    at org.apache.jmeter.threads.JMeterThread.process_sampler(JMeterThread.java:428)
    at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:256)
    at java.lang.Thread.run(Unknown Source)
Caused by: java.net.ConnectException: Connection timed out: connect
    at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
    at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
    at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
    at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
    at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
    at java.net.PlainSocketImpl.connect(Unknown Source)
    at java.net.SocksSocketImpl.connect(Unknown Source)
    at java.net.Socket.connect(Unknown Source)
    at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
    at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
    ... 12 more
3
How many concurrent connections does your Tomcat instance support? How long does the operation performed on each connection take? What is the connection timeout? It would help if you posted your server.xml. - JohnMark13

3 Answers

2 votes

Try adjusting the maxThreads and acceptCount attributes of the HTTP Connector in server.xml:

Each incoming request requires a thread for the duration of that request. If more simultaneous requests are received than can be handled by the currently available request processing threads, additional threads will be created up to the configured maximum (the value of the maxThreads attribute). If still more simultaneous requests are received, they are stacked up inside the server socket created by the Connector, up to the configured maximum (the value of the acceptCount attribute). Any further simultaneous requests will receive "connection refused" errors, until resources are available to process them.

Reference: http://tomcat.apache.org/tomcat-7.0-doc/config/http.html
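
For example, a Connector tuned along these lines in conf/server.xml (the values here are illustrative and should be sized to your workload, not copied verbatim):

    <!-- maxThreads: maximum request-processing threads;
         acceptCount: queue length for connections once all threads are busy -->
    <Connector port="8080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               maxThreads="1000"
               acceptCount="500"
               redirectPort="8443" />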

1 vote

Thank you all!!

The problem was actually with the network: when we tested using different IP addresses (IP spoofing), all requests were successful. The network was treating the single-source traffic as a DoS attack.

Thanks all. I had already tried maxThreads and acceptCount and done a lot of tuning in Linux.

So the lesson learned is: run the performance test from a machine located in the same zone/network as the server under test.
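
For reference, the IP spoofing mentioned above is done in JMeter via the "Source address" field of the HTTP Request sampler. A minimal sketch of the corresponding entry in the .jmx test plan (the HTTPSampler.ipSource property name and the address are illustrative assumptions, and each spoofed address must actually be bound to the load generator's network interface):

    <!-- inside the HTTPSamplerProxy element: source IP used for the request -->
    <stringProp name="HTTPSampler.ipSource">10.0.0.101</stringProp>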

0 votes

Possibly 1000 concurrent requests (all within one second) is unrealistic. A better test would be to distribute the 1000 requests over an interval of time.

e.g.: the image shows 100 requests executed over a period of 60 seconds, i.e., almost two requests per second.

[screenshot: JMeter test showing 100 requests spread over 60 seconds]
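
In JMeter, spreading requests over an interval is controlled by the Thread Group's ramp-up period. A minimal sketch of the relevant Thread Group settings in a .jmx test plan (values are illustrative, matching the 100-requests-over-60-seconds example above):

    <!-- 100 threads started evenly over 60 seconds, i.e. almost 2 per second -->
    <stringProp name="ThreadGroup.num_threads">100</stringProp>
    <stringProp name="ThreadGroup.ramp_time">60</stringProp>

The same values can be set in the GUI as "Number of Threads (users)" and "Ramp-Up Period (in seconds)".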