Introduction

Performance Testing Guidance for Web Applications

J.D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, and Dennis Rea
Microsoft Corporation

September 2007

Performance Testing Guidance for Web Applications provides an end-to-end approach for implementing performance testing. Whether you are new to performance testing or looking for ways to improve your current performance-testing approach, you will gain insights that you can tailor to your specific scenarios.

The information in this guide is based on applied use in customer scenarios. It reflects the lessons learned from multiple performance-testing professionals. The guidance is task-based and presented in the following parts:

  • Part I, “Introduction to Performance Testing,” gives you an overview of common types of performance testing, key concepts, and common terms used in performance testing.
  • Part II, “Exemplar Performance Testing Approaches,” shows you the seven core activities of performance testing. This part also shows you how to apply performance testing in different environments, including Agile and CMMI® software development.
  • Part III, “Identify the Test Environment,” shows you how to collect information about your project that you will need for your performance tests. This includes collecting information on system architecture, the physical deployment, user activities, and any relevant batch processes.
  • Part IV, “Identify Performance Acceptance Criteria,” shows you how to determine your performance testing objectives. You will also learn how to achieve clarity around your various performance goals and requirements, from a performance testing perspective.
  • Part V, “Plan and Design Tests,” shows you how to model the workload and user experience to design more effective performance tests.
  • Part VI, “Execute Tests,” walks you through the main activities of actual performance testing.
  • Part VII, “Analyze Results and Report,” shows you how to organize and present your findings in a way that is useful based on the audience and the intent of the report.
  • Part VIII, “Performance-Testing Techniques,” shows you the core techniques for performing load and stress testing.

Scope of This Guide

This guide is focused on Web application performance testing. It provides recommendations on the following:

  • Managing and conducting performance testing in both dynamic (e.g., Agile) and structured (e.g., CMMI) environments.
  • Performance testing, including load testing, stress testing, and other types of performance-related testing.
  • Core activities of performance testing: identifying objectives, designing tests, executing tests, analyzing results, and reporting.

Although many of the topics addressed in this guide apply equally well to other types of applications, all of them are explained from a Web application perspective, both for consistency and because that perspective is most intuitive to the majority of anticipated readers.

This guide is intended to be tool-agnostic: none of the concepts presented here require a specific tool to accomplish, although some techniques call for a certain class of tool, such as a load-generation tool.

This guide does not directly address performance tuning. Performance tuning is extremely application- and technology-specific and thus is not a good fit for the style and format of the guide. The guide does, however, address high-level approaches around how performance testing and performance tuning activities overlap and feed one another.

Why We Wrote This Guide

We wrote this guide to accomplish the following:

  • To consolidate real-world lessons learned around performance testing.
  • To present a roadmap for end-to-end performance testing.
  • To narrow the gap between state of the art and state of the practice.

Features of This Guide

  • Approach for performance testing.  The guide provides an approach that organizes performance testing into logical units to help you incrementally adopt performance testing throughout your application life cycle.
  • Principles and practices.  These serve as the foundation for the guide and provide a stable basis for recommendations. They also reflect successful approaches used in the field.
  • Processes and methodologies.  These provide steps for managing and conducting performance testing. For simplification and tangible results, they are broken down into activities with inputs, outputs, and steps. You can use the steps as a baseline or to help you evolve your own process.
  • Life cycle approach.  The guide provides end-to-end guidance on managing performance testing throughout your application life cycle, to reduce risk and lower total cost of ownership (TCO).
  • Modular.  Each chapter within the guide is designed to be read independently. You do not need to read the guide from beginning to end to benefit from it. Use the parts you need.
  • Holistic.  The guide is designed with the end in mind. If you do read the guide from beginning to end, it is organized to fit together in a comprehensive way. The guide, in its entirety, is better than the sum of its parts.
  • Subject matter expertise.  The guide exposes insight from various experts throughout Microsoft and from customers in the field.

Who Should Read This Guide

This guide is intended for anyone who needs the resources, patterns, and practices to conduct effective performance testing.

How to Use This Guide

You can read this guide from beginning to end, or you can read only the relevant parts or chapters. You can adopt the guide in its entirety for your organization or you can use critical components to address your highest-priority needs.

Ways to Use the Guide

There are many ways to use this comprehensive guidance. The following are some suggestions:

  • Use it as a mentor.  Use the guide as your mentor for learning how to conduct performance testing. The guide encapsulates the lessons learned and experiences gained by many subject matter experts.
  • Use it as a reference.  Use the guide as a reference for learning the do’s and don’ts of performance testing.
  • Incorporate performance testing into your application development life cycle.  Adopt the approach and practices that work for you and incorporate them into your application life cycle.
  • Use it when you design your performance tests.  Design applications using the principles and best practices presented in this guide. Benefit from lessons learned.
  • Create training.  Create training based on the concepts and techniques used throughout the guide.

Organization of This Guide

You can read this guide from end to end, or you can read only the chapters you need to do your job.

Parts

The guide is divided into eight parts:

  • Part I, Introduction to Performance Testing
  • Part II, Exemplar Performance Testing Approaches
  • Part III, Identify the Test Environment
  • Part IV, Identify Performance Acceptance Criteria
  • Part V, Plan and Design Tests
  • Part VI, Execute Tests
  • Part VII, Analyze Results and Report
  • Part VIII, Performance-Testing Techniques

Approach Used in This Guide

The primary task of any testing activity is to collect information that helps stakeholders make informed decisions about the overall quality of the application being tested. Performance testing additionally tends to focus on identifying bottlenecks in a system, tuning a system, establishing a baseline for future testing, and determining compliance with performance goals and requirements. In addition, the results of performance testing and analysis can help you estimate the hardware configuration required to support the application(s) when you “go live” to production operation.
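
For example (an illustrative calculation, not a figure from this guide): if testing shows that a single Web server sustains roughly 200 requests per second while still meeting response-time goals, and peak production load is projected at 500 requests per second, you would plan for at least three load-balanced servers, plus headroom for growth and failover.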

The performance-testing approach used in this guide consists of the following activities:

  • Activity 1. Identify the Test Environment.  Identify the physical test environment and the production environment as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations. Having a thorough understanding of the entire test environment at the outset enables more efficient test design and planning and helps you identify testing challenges early in the project. In some situations, this process must be revisited periodically throughout the project’s life cycle.
  • Activity 2. Identify Performance Acceptance Criteria.  Identify the response time, throughput, and resource utilization goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern. Additionally, identify project success criteria that may not be captured by those goals and constraints; for example, using performance tests to evaluate what combination of configuration settings will result in the most desirable performance characteristics.
  • Activity 3. Plan and Design Tests.  Identify key scenarios, determine variability among representative users and how to simulate that variability, define test data, and establish metrics to be collected. Consolidate this information into one or more models of system usage to be implemented, executed, and analyzed.   
  • Activity 4. Configure the Test Environment.  Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test. Ensure that the test environment is instrumented for resource monitoring as necessary.
  • Activity 5. Implement the Test Design.  Develop the performance tests in accordance with the test design.
  • Activity 6. Execute the Test.  Run and monitor your tests. Validate the tests, test data, and results collection. Execute validated tests for analysis while monitoring the test and the test environment (see the sketch after this list).
  • Activity 7. Analyze Results, Report, and Retest.  Consolidate and share results data. Analyze the data both individually and as a cross-functional team. Reprioritize the remaining tests and re-execute them as needed. When all of the metric values are within accepted limits, none of the set thresholds have been violated, and all of the desired information has been collected, you have finished testing that particular scenario on that particular configuration.
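
To make these activities concrete, the following is a minimal sketch of Activities 2, 6, and 7 in Python, using only the standard library. It is illustrative only: the target URL, the number of virtual users, and the threshold values are assumptions made for the example, not recommendations from this guide, and a real performance test would normally use a dedicated load-generation tool and richer monitoring.

    # Minimal load-test sketch. TARGET_URL, the user counts, and the
    # thresholds below are illustrative assumptions, not values from this guide.
    import statistics
    import threading
    import time
    import urllib.request

    TARGET_URL = "http://localhost:8080/"  # hypothetical application under test
    VIRTUAL_USERS = 10                     # concurrent simulated users
    REQUESTS_PER_USER = 20

    # Activity 2: example acceptance criteria (assumed numbers).
    MAX_AVERAGE_SECONDS = 0.5
    MAX_90TH_PERCENTILE_SECONDS = 1.0

    response_times = []
    lock = threading.Lock()

    def virtual_user():
        # Activity 6: issue requests and record each response time.
        for _ in range(REQUESTS_PER_USER):
            start = time.perf_counter()
            try:
                urllib.request.urlopen(TARGET_URL, timeout=10).read()
            except OSError:
                continue  # a real test would count and report errors
            with lock:
                response_times.append(time.perf_counter() - start)

    threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    if not response_times:
        raise SystemExit("No successful responses; nothing to analyze.")

    # Activity 7: compare the collected metrics against the acceptance criteria.
    average = statistics.mean(response_times)
    p90 = sorted(response_times)[int(len(response_times) * 0.9) - 1]
    print(f"average: {average:.3f}s  90th percentile: {p90:.3f}s")
    if average <= MAX_AVERAGE_SECONDS and p90 <= MAX_90TH_PERCENTILE_SECONDS:
        print("Acceptance criteria met for this scenario and configuration.")
    else:
        print("Acceptance criteria not met; analyze, tune, and retest.")

In practice, you would run such a script (or its equivalent in a load-testing tool) while also monitoring resource utilization on the servers under test, as described in Activities 4 and 6.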

Feedback on the Guide

We have made every effort to ensure the accuracy of this guide and its companion content. If you have comments on this guide, send e-mail to PerfTest@microsoft.com.

We are particularly interested in feedback regarding the following:

  • Technical issues specific to recommendations
  • Usefulness and usability issues

The Team Who Brought You This Guide

This guide was created by the following team members:

  • J.D. Meier
  • Carlos Farre
  • Prashant Bansode
  • Scott Barber
  • Dennis Rea

Contributors and Reviewers

Alan Ridlehoover; Clint Huffman; Edmund Wong; Ken Perilman; Larry Brader; Mark Tomlinson; Paul Williams; Pete Coupland; Rico Mariani

External Contributors and Reviewers

Alberto Savoia; Ben Simo; Cem Kaner; Chris Loosley; Corey Goldberg; Dawn Haynes; Derek Mead; Karen N. Johnson; Mike Bonar; Pradeep Soundararajan; Richard Leeke; Roland Stens; Ross Collard; Steven Woody

Tell Us About Your Success

If this guide helps you, we would like to know. Please tell us by writing a short summary of the problems you faced and how this guide helped you out. Submit your summary to MyStory@Microsoft.com.
