I have 4 Vert.x mock APIs behind an Nginx load balancer. When I run a JMeter load test with 250 users, the result is the same whether I use 1 Vert.x node or all 4, e.g. with 1 Vert.x node (0 s latency) I get 995 TPS, and with all 4 nodes the result is the same. How can I improve TPS by increasing the number of backends? P.S. when I add a timer to create backend latency, the TPS drops significantly (950 → 180). Is this due to an error in my code?
Server: Linux x64, JMeter 3.0 with 250 users / 125 s ramp-up
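For context, this is the kind of Nginx upstream I mean — a minimal round-robin sketch, where the host names and ports are placeholders for my four Vert.x nodes, not my exact config:

```nginx
upstream vertx_mock {
    # Four Vert.x mock nodes; hosts/ports are placeholders
    server vertx1:8080;
    server vertx2:8080;
    server vertx3:8080;
    server vertx4:8080;
    keepalive 64;                     # reuse upstream connections under load
}

server {
    listen 80;
    location / {
        proxy_pass http://vertx_mock;
        proxy_http_version 1.1;        # required for upstream keepalive
        proxy_set_header Connection ""; # clear Connection header for keepalive
    }
}
```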
// --- Vert.x mock service ---------------------------
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;
import io.vertx.core.http.HttpServer;
import io.vertx.core.http.HttpServerResponse;
import io.vertx.core.json.Json;
import io.vertx.core.json.JsonObject;
import io.vertx.ext.web.Route;
import io.vertx.ext.web.Router;
import java.util.concurrent.TimeUnit;
import org.apache.log4j.Logger;
import org.apache.log4j.PropertyConfigurator;

public class App extends AbstractVerticle {

    private static final Logger LOGGER = Logger.getLogger("InfoLogging");

    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        PropertyConfigurator.configure(System.getProperty("user.dir") + "/log4j.properties");

        HttpServer httpServer = vertx.createHttpServer();
        Router router = Router.router(vertx);

        Route ELKPaymentResponse = router
                .post("/:param/amount")
                .produces("application/json")
                .handler(routingContext -> {
                    routingContext.request().bodyHandler(bodyHandler -> {
                        HttpServerResponse response = routingContext.response();
                        // response.setChunked(true);
                        // A Java string literal cannot span lines; build the body here
                        String jsonResponse = "{"
                                // Mock service here
                                + "}";
                        // Delay the response by 1 s to simulate backend latency
                        vertx.setTimer(TimeUnit.SECONDS.toMillis(1), l -> {
                            JsonObject json = new JsonObject(jsonResponse);
                            response.putHeader("Content-Type", "application/json; charset=UTF-8")
                                    .setStatusCode(200)
                                    .end(Json.encodePrettily(json));
                        });
                    });
                });

        // The original snippet never started the server; bind the router and listen
        httpServer.requestHandler(router).listen(8080); // port is a placeholder
    }
}
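On the timer question: with a closed workload of 250 JMeter users, Little's Law bounds throughput at concurrency / response time, so a 1 s `setTimer` delay caps the test at roughly 250 TPS no matter how many backends sit behind Nginx. A back-of-the-envelope check using the numbers from the question (the class name is mine, just for illustration):

```java
// Little's Law: throughput <= concurrency / response time
public class TpsCeiling {
    public static void main(String[] args) {
        int concurrentUsers = 250;            // JMeter thread count
        double responseTimeSeconds = 1.0;     // the vertx.setTimer delay
        double maxTps = concurrentUsers / responseTimeSeconds;
        System.out.println("TPS ceiling with 1 s latency: " + maxTps);
    }
}
```

The measured 180 TPS is below that 250 TPS ceiling, so the drop from 950 is expected with a 1 s delay rather than necessarily a code error.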