
I am using Jetty to start a main class as a service. Below is an extract of the code that demonstrates the problem.

    // Serve static files from the current directory on port 8080
    Server server = new Server(8080);
    ResourceHandler resource_handler = new ResourceHandler();
    resource_handler.setDirectoriesListed(true);
    resource_handler.setWelcomeFiles(new String[] {"index.html"});
    resource_handler.setResourceBase(".");
    // Anything the ResourceHandler does not serve falls through to the DefaultHandler
    HandlerList handlers = new HandlerList();
    handlers.setHandlers(new Handler[] {resource_handler, new DefaultHandler()});
    server.setHandler(handlers);
    server.start();
    server.join();

I am running this application from Eclipse on a Windows machine with 2 CPUs.

After I start this application, I execute the program below from a different machine. It spawns 100 concurrent threads, each executing a simple HTTP request to get the index.html page.

    ExecutorService service = Executors.newFixedThreadPool(100);
    for (int i = 0; i < 10000; i++) {
      service.execute(new Runnable() {
        public void run() {
          try {
            long startTime = System.nanoTime();
            URL url = new URL("http://localhost:8080");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            String response = conn.getResponseMessage(); // this triggers the actual request
            long endTime = System.nanoTime();
            System.out.println(response + ":" + ((endTime - startTime) / 1000000) + " (ms)");
          } catch (Exception e) {
            e.printStackTrace();
          }
        }
      });
    }
    service.shutdown();

When I execute this program, the server's CPU usage immediately reaches 100% (on both cores).

I have tried modifying the maxThreads, acceptor threads, and buffer size parameters as suggested in https://wiki.eclipse.org/Jetty/Howto/High_Load.

But even then, the CPU stays stuck between 95% and 100% under high load.
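Roughly, the tuning I tried looks like the sketch below; these are the standard embedded Jetty classes (QueuedThreadPool, ServerConnector), but the numbers are only placeholders, not the exact values I used:

    // Size the thread pool explicitly instead of using the default
    QueuedThreadPool threadPool = new QueuedThreadPool(200, 10);   // maxThreads, minThreads
    Server server = new Server(threadPool);

    // Connector with explicit acceptor/selector counts and accept queue size
    ServerConnector connector = new ServerConnector(server, 1, 2); // acceptors, selectors
    connector.setPort(8080);
    connector.setAcceptQueueSize(1024);
    server.addConnector(connector);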

So the question is: is there any configuration I am missing that would reduce the CPU usage? Or is this expected, and can it only be addressed by adding more CPUs or by clustering the service?

Thanks for your help.


1 Answer


You state that you run the ExecutorService part on a different machine, but the URL is http://localhost:8080. That's a no-no in load testing.

Some advice:

  • Don't put the client load and the server load on the same machine (and don't cheat by putting them on two different VMs on a single physical machine).
  • Use multiple client machines, not just one (when the Jetty developers test load characteristics, we use at least a 10:1 ratio of client machines to server machines).
  • Don't test over loopback, virtual network interfaces, localhost, etc. Use a real network interface.
  • Don't test with unrealistic load scenarios. Real-world usage of your server will be a majority of HTTP/1.1 pipelined connections with multiple requests per physical connection. Some on fast networks, some on slow networks, some even on unreliable networks (think mobile).
  • If you must use HttpURLConnection, understand how it manages its HTTP version and connections (such as keep-alive or HTTP/1.1 close), and make sure you read the response body content, close the streams, and disconnect() the connection (see the sketch below).
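
For example, a single request that cooperates with keep-alive looks roughly like the sketch below (the host name is a placeholder; point it at the server's real network address, not localhost):

    URL url = new URL("http://your-server-host:8080/index.html"); // real network address, not loopback
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("GET");
    int status = conn.getResponseCode(); // this is what actually performs the request
    // Read (and discard) the entire response body so the connection can be reused
    try (InputStream in = conn.getInputStream()) {
        byte[] buf = new byte[8192];
        while (in.read(buf) != -1) {
            // drain
        }
    }
    conn.disconnect(); // release the connection once you are done with it
    System.out.println("Status: " + status);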

Finally, be sure you are testing load in realistic ways.