We are evaluating how to test performance on a single-page application (SPA) which relies heavily on JavaScript and dynamic content (updated via Ajax).
Popular load-testing tools like Apache JMeter or Gatling are able to generate huge loads with little hardware by sending plain HTTP requests, but they do not execute any JavaScript code.
We would like to measure performance as perceived by the client, that is, by an end user sitting in front of a browser (with all the rendering and JavaScript overhead included).
We have the following alternatives in mind:
1. Use Selenium Grid to run a test script concurrently on several machines. Each instance submits latency data collected at runtime to a central repository (a rough sketch of what I mean is below).
2. Similar to the above, except that only one Selenium script instance is started. We would then use JMeter/Gatling to generate the heavy load in the background (see the second sketch below).
3. Do not use Selenium at all. While running a load test with JMeter/Gatling, manually inspect the behaviour of the page using tools like Firebug, YSlow, etc.
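For (1), the kind of per-instance measurement I have in mind is roughly the following. The grid hub URL, the target URL, and the `report()` stub are all placeholders; I'm using the Navigation Timing API as an approximation of what the user actually waited for:

```java
import java.net.URL;

import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class PerceivedLatencyProbe {
    public static void main(String[] args) throws Exception {
        // Attach to the Selenium Grid hub (the hub URL is a placeholder).
        WebDriver driver = new RemoteWebDriver(
                new URL("http://grid-hub:4444/wd/hub"),
                DesiredCapabilities.firefox());
        try {
            driver.get("http://our-spa.example.com/");

            // Navigation Timing reports the load time as the browser saw it,
            // rendering and JavaScript execution included.
            long perceivedLoadMs = (Long) ((JavascriptExecutor) driver).executeScript(
                    "return window.performance.timing.loadEventEnd"
                    + " - window.performance.timing.navigationStart;");

            // Ship the sample to the central repository; the reporting
            // endpoint/API would be ours to define, so this is just a stub.
            report("spa-home", perceivedLoadMs);
        } finally {
            driver.quit();
        }
    }

    private static void report(String page, long millis) {
        System.out.println(page + "\t" + millis + " ms");
    }
}
```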
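For the background load in (2), I'd try to replay the Ajax endpoints the SPA actually hits, so the synthetic traffic stays close to the real thing. A minimal Gatling sketch of that idea (the endpoints, pauses, and user counts below are made up):

```java
import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;

import io.gatling.javaapi.core.ScenarioBuilder;
import io.gatling.javaapi.core.Simulation;
import io.gatling.javaapi.http.HttpProtocolBuilder;

public class BackgroundLoadSimulation extends Simulation {

    // Point the protocol at the application under test (placeholder URL).
    HttpProtocolBuilder httpProtocol = http
            .baseUrl("http://our-spa.example.com")
            .acceptHeader("application/json");

    // Replay the Ajax calls the SPA really makes, so the background load
    // resembles browser traffic (these endpoints are made up).
    ScenarioBuilder scn = scenario("spa-background-load")
            .exec(http("load shell").get("/"))
            .pause(2)
            .exec(http("poll updates").get("/api/updates"))
            .pause(1)
            .exec(http("search").get("/api/search?q=foo"));

    {
        // Ramp up 500 virtual users over two minutes while the single
        // Selenium instance measures perceived latency in parallel.
        setUp(scn.injectOpen(rampUsers(500).during(120))).protocols(httpProtocol);
    }
}
```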
Each approach has its strengths and drawbacks. With (1) I'm able to create the most "realistic" load, but it doesn't scale; (2) scales through JMeter/Gatling, but I have to make sure the load they generate doesn't deviate too much from what the application would actually receive; and (3) is the easiest to set up, but it's obviously time-consuming since it's not automated.
So far I could not find any tool, framework, or even a set of guidelines on this topic, so I would like to know from the experts here at SO: what kind of approach do you use for this type of test?
This entry on the Selenium Grid FAQ states that *"conducting performance/load testing with real browser is a pretty bad idea as it is hard/expensive to scale the load and the actual load is very inconsistent..."*. I can only think that they refer to "backend performance/load testing". Otherwise, if I'm interested in end-user performance, what's the alternative?