
Welcome To Loadrunner Helpline By Bhupendra Varshney

Monday, September 17, 2012

Loadrunner Analysis Graphs

1) Summary page displays the summarized analysis of the load test on a single page.


2) Running Vusers displays the number of Vusers that executed Vuser scripts, and their status, during each second of a load test. This graph is useful for determining the Vuser load on your server at any given moment.


3) Hits per Second displays the number of hits made on the Web server by Vusers during each second of the load test. This graph helps you evaluate the amount of load Vusers generate, in terms of the number of hits.


4) Throughput displays the amount of throughput (in bytes) on the Web server during the load test. Throughput represents the amount of data that the Vusers received from the server at any given second. This graph helps you evaluate the amount of load Vusers generate, in terms of server throughput.


5) Transaction Summary displays the number of transactions that passed, failed, stopped, or ended with errors. Transactions are defined in the Vuser script (see the sketch after this list).


6) Average Transaction Response Time displays the average time taken to perform transactions during each second of the load test. This graph helps you determine whether the performance of the server is within acceptable minimum and maximum transaction performance time ranges defined for your system.


7) Connections per Second displays the number of new TCP/IP connections opened and the number of connections shut down during each second of the load test.


8) Transaction Response Time Under Load displays average transaction response times relative to the number of Vusers running at any given point during the load test. This graph helps you view the general impact of Vuser load on performance time and is most useful when analyzing a load test which is run with a gradual load.


9) Transaction Response Time (Percentile) displays the percentage of transactions that were performed within a given time range. This graph helps you determine the percentage of transactions that meet the performance criteria defined for your system.
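
The transaction graphs above (items 5, 6, 8 and 9) are driven by the transactions defined in the Vuser scripts. Below is a minimal sketch of a web (HTTP/HTML) Action in LoadRunner's C scripting language; the transaction name, URL and check text are illustrative assumptions, not taken from a real application.

Action()
{
    /* Register a text check before the request; the number of matches
       is saved into the parameter LoginOK. The search text is illustrative. */
    web_reg_find("Text=Welcome",
                 "SaveCount=LoginOK",
                 LAST);

    /* Everything between lr_start_transaction and lr_end_transaction is
       timed and reported in the Transaction Summary and response time graphs. */
    lr_start_transaction("BP1_Login");

    web_url("login",
            "URL=http://myapp.example.com/login",   /* hypothetical URL */
            "Resource=0",
            "Mode=HTML",
            LAST);

    /* Mark the transaction as passed or failed so the Transaction Summary
       graph reflects the real outcome, not just the HTTP status. */
    if (atoi(lr_eval_string("{LoginOK}")) > 0)
        lr_end_transaction("BP1_Login", LR_PASS);
    else
        lr_end_transaction("BP1_Login", LR_FAIL);

    return 0;
}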



Performance Testing Process


The performance testing of an application is carried out based on its NFRs (Non-Functional Requirements). These are normally gathered in a discussion/meeting with the client/Business Analysts during the project planning phase. Generally the NFRs are given in terms of SLAs such as page or transaction response time, Hits per Second, Throughput, etc.




Planning a Performance Test

The following questions need to be answered by the client/BA while planning a performance test –

1) Business Processes to be tested

2) Anticipated user load on the application

3) Division of user load among the business processes (based on current logs)

4) Architecture of application

5) Environment for performance testing



For example, suppose that in point 1 three business processes are identified for performance testing – Business Process 1, Business Process 2 and Business Process 3; in point 2 the anticipated application load is 200 virtual users; and in point 3 the share of each business process is 50%, 30% and 20% respectively. The actual virtual-user load for these business processes will therefore be 100, 60 and 40 respectively (a worked sketch of this split follows).
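
As a rough sketch of the arithmetic above, the per-process Vuser count is simply the total load multiplied by each process's share. The total and percentages below are only the illustrative figures from the example.

#include <stdio.h>

/* Sketch of the Vuser split from the example above. The total load and
   the percentage shares are illustrative figures only. */
int main(void)
{
    const int    total_vusers = 200;
    const char  *process[]    = { "Business Process 1",
                                  "Business Process 2",
                                  "Business Process 3" };
    const double share[]      = { 0.50, 0.30, 0.20 };   /* must sum to 1.0 */
    int i;

    for (i = 0; i < 3; i++)
        printf("%-18s : %.0f Vusers\n", process[i], total_vusers * share[i]);

    return 0;
}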



Designing Performance Test Scenarios

Once the planning of performance testing is complete, the performance testing team designs the test cases based on the business processes provided by the client/BA. These include the step-by-step functionality with expected results and the data requirements for each page. This helps the performance tester automate the scripts and anticipate the master/external data requirements for the automated scripts.



Scripting Performance Test Scenarios

After test case design is complete, the performance tester records the business processes using the testing tool (LoadRunner) and then enhances the scripts (a sketch of typical enhancements follows). These scripts are reviewed by a technical lead/performance architect.
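
As an illustration of the enhancements typically applied after recording (a sketch only – the boundaries, parameter names and URLs are assumed, not taken from any specific application), a recorded request is usually correlated and parameterized before review:

Action()
{
    /* Correlation: capture a server-generated token from the next
       response into the parameter SessionId. Boundaries are assumed. */
    web_reg_save_param("SessionId",
                       "LB=sessionId=",
                       "RB=\"",
                       "Ord=1",
                       LAST);

    web_url("home",
            "URL=http://myapp.example.com/home",    /* hypothetical URL */
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_start_transaction("BP1_Search");

    /* Parameterization: {Product} is read from a data file defined in VuGen;
       the captured {SessionId} replaces the hard-coded recorded value. */
    web_submit_data("search",
                    "Action=http://myapp.example.com/search",   /* hypothetical */
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=sessionId", "Value={SessionId}", ENDITEM,
                    "Name=query",     "Value={Product}",   ENDITEM,
                    LAST);

    lr_end_transaction("BP1_Search", LR_AUTO);

    return 0;
}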




Test Data Generation

The testing/dev team will generate the data for test execution.



Test Scenario Design using Loadrunner Controller

The performance tester will design the scenario/mix for scheduling the Vuser groups and configuring monitors. For configuring server resource counters, the testing team can use two approaches depending on the platform –

1) UNIX monitor configuration using rstatd daemon utility (Please refer to Solaris counters )

2) Windows monitor configuration using perfmon (Please refer to Windows Counters)



Test Execution and Monitoring using Loadrunner Controller

After designing the test scenarios, the tester executes the test run and monitors the graphs, e.g. server resource graphs, Running Vusers, Hits per Second, Throughput, Transaction Response Time, Connections per Second, etc. The server logs can also be monitored during test execution.



Test Result analysis using Loadrunner Analyzer

After test execution the results are collated, and using the LoadRunner Analysis interactive graphs the report can be published as an HTML, Excel and/or Word document. This report contains a summary of the test and the graphs as per the requirement. (Please refer to the Analyzer Graphs section.)