0 votes

My issue in detail - Steps followed:

1. Recorded a few steps covering the login page, then the home page, and then the logout functionality, using (a) BlazeMeter (recorded the steps, extracted the .jmx file, and imported it into JMeter) and (b) the HTTPS Test Script Recorder (by setting a proxy).
2. Added a Listener - View Results Tree.
3. Ran the test.

Observation: most of the steps in the Results Tree appear as passed. HOWEVER, when I analyse the HTML of the response (in the Response data tab), the home page and logout page are not loaded at all; the response still shows the login page.

Can someone please help me get through this issue?

Many thanks in advance.


2 Answers

0 votes

Most probably your test scenario simply fails to perform the login. There could be different reasons for this; the most common mistakes are:

  1. Missing HTTP Cookie Manager. Modern web applications rely heavily on cookies for authentication, session management, and storing client information
  2. Missing correlation. Modern web applications widely use dynamic parameters, e.g. for security reasons or client-state tracking
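To see why a missing cookie manager produces exactly the symptom described in the question, here is a minimal, self-contained Python sketch (this is an illustration, not JMeter itself; `fake_server`, the paths, and the session value are all invented) of a server that issues a session cookie at login and silently re-serves the login page when the cookie is not sent back:

```python
# Invented in-memory "server": issues a session cookie at login and
# requires it on every later request, like a cookie-protected web app.
SESSION_ID = "abc123"

def fake_server(path, cookies):
    """Return (page body, cookies set by the response)."""
    if path == "/login":
        # Login succeeds and hands back a session cookie.
        return "Welcome - home page", {"JSESSIONID": SESSION_ID}
    if cookies.get("JSESSIONID") == SESSION_ID:
        return f"{path} content", {}
    # No valid session cookie: the app serves the login page again,
    # still with HTTP 200 - the sampler "passes" but the page is wrong.
    return "Login page", {}

# Client WITHOUT a cookie manager: the login response's cookie is dropped.
fake_server("/login", {})
body_no_mgr, _ = fake_server("/home", {})   # gets "Login page"

# Client WITH a cookie manager: cookies are stored and re-sent.
jar = {}
_, set_cookies = fake_server("/login", jar)
jar.update(set_cookies)
body_with_mgr, _ = fake_server("/home", jar)  # gets "/home content"
```

In JMeter, adding an HTTP Cookie Manager to the Thread Group gives you the behaviour of the second client automatically.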

So first of all, try adding an HTTP Cookie Manager to your test plan. If that doesn't help, record the same scenario one more time and compare the two generated scripts. Any values that differ between the recordings need to be correlated, to wit:

  • You will need to extract the value from the previous response using a suitable JMeter PostProcessor and store it in a JMeter Variable
  • In the next request, replace the recorded hard-coded value with the variable from the previous step.
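The two correlation steps above can be sketched in plain Python (the `csrf_token` field name and the HTML snippet are invented examples); JMeter's Regular Expression Extractor PostProcessor performs the same extract-and-substitute job:

```python
import re

# Response to the previous request (e.g. GET /login) - an invented example page.
login_page = '''
<form action="/login" method="post">
  <input type="hidden" name="csrf_token" value="d41d8cd98f00b204">
  <input name="user"><input name="pass" type="password">
</form>
'''

# Step 1: extract the dynamic value with a regex (in JMeter: a Regular
# Expression Extractor storing the match into a JMeter Variable).
match = re.search(r'name="csrf_token" value="([^"]+)"', login_page)
csrf_token = match.group(1)

# Step 2: use the extracted variable instead of the recorded hard-coded
# value in the next request's parameters (in JMeter: ${csrf_token}).
post_body = {"user": "demo", "pass": "secret", "csrf_token": csrf_token}
```

If the hard-coded recorded token were sent instead, the server would reject the login on every replay even though the recording originally worked.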
0 votes

> Observation: most of the steps in the Results Tree appear as passed. HOWEVER, when I analyse the HTML of the response (in the Response data tab), the home page and logout page are not loaded at all; the response still shows the login page.

Yes, this is why every test step is expected to check its results. A simple HTTP 200 (OK) response is not sufficient, as unexpected but valid pages may be returned that are inappropriate for the business process. If your virtual user continues while off track, the odds are high that you will run into an unhandled exception (HTTP 500) as you begin sending data to the server that is out of context for the state of the business process flow.
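A content check of the kind described can be sketched like this (plain Python standing in for JMeter's Response Assertion; the page text and function name are invented for illustration):

```python
def check_step(response_body, expected_text):
    """Pass only if the expected marker text appears in the response body,
    regardless of the HTTP status code - analogous to a JMeter Response
    Assertion on the text response in "Contains" mode."""
    return expected_text in response_body

# An HTTP 200 response that actually re-served the login page:
body = "<html><h1>Please log in</h1></html>"

got_login = check_step(body, "Please log in")   # True  - login page came back
got_home = check_step(body, "Welcome, user")    # False - home page never loaded
```

With a check like this attached to each sampler, the "passing" steps in the question would correctly be flagged as failures.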

This is one of the ways you can gauge the maturity of a performance tester and the value of what is being delivered. If, when you look at a script, you find that the tester is not checking for expected results in the response content (versus just the status code), you can be assured that the tester's maturity is low and the value of the delivery matches. You will likely find corresponding weaknesses in the handling of data (dynamic and user-provided), monitoring, analysis, and so on.

As a tester, each step has an expected result. Check for it. This is true for manual testing. This is true for automated functional testing. This is true for performance testing. And it is true independent of the tool(s) used.