Life-Cycle Performance Testing for Eliminating Last-Minute Surprises

Scott Barber

PerfTestPlus, Inc.

October 2007

Summary: For years, serious performance testers and developers who have a performance focus have been dreaming of the day when they would have a tool that had the features and capabilities that Microsoft Visual Studio Team System brings them. The market asked for it, and Visual Studio Team System delivered.

Contents

Introduction

Software-Development Model

Test Subsystems, Components, and System-Wide Performance

Integration into Daily Tasks and Work Environment

Intra-Team Test Sharing

Guidance from the Experience of Experts

Virtually Unlimited Extensibility

Conclusion

Introduction

The predominant paradigm for performance testing over the last 10 years can be summarized as "add two weeks to the end of the development cycle to conduct a series of preplanned performance tests, to ensure that the application is ready to go live." This paradigm has been propagated by virtually all of the currently popular software-development project-management models and performance-testing support tools. Unfortunately, it leads to finding serious last-hour performance and stability problems—the kind of problems that should be found and fixed much earlier in the development cycle.

Microsoft Visual Studio Team System is changing this paradigm. Visual Studio Team System enables software-development teams to begin the process of ensuring that application performance is addressed throughout the development life cycle—starting at the very beginning of the project, and continuing through production-level monitoring of the system's performance. This is accomplished without the addition of expensive, unfamiliar tools; without extending the length of the development cycle; and without having to worry about last-minute surprises.

The keys to enabling this advancement in the prevalent performance-testing paradigm include:

  • A software-development model that focuses on identifying and collaboratively testing and validating application qualities-of-service from project kick-off.
  • A single performance-testing tool that enables not just collection of end-to-end response time, but also component-based performance analysis.
  • Performance-testing tools that are seamlessly integrated into the developer's, tester's, and manager's daily work environment.
  • The ability for testers and developers to share their tests and expertise collaboratively, to highlight and address performance concerns as soon as they are detectable.
  • Context-sensitive guidance that is designed to assist both developers and testers in addressing many of the challenges that are inherent in performance testing.
  • A performance-testing tool that enables the team to go well beyond simple load generation by leveraging the power and extensibility of the Microsoft Visual Studio development system.

Software-Development Model

From day one, Visual Studio Team System has been developed from a value-up paradigm, as opposed to a work-down paradigm. This is particularly well suited to performance testing. One of the core problems facing performance testing today is the belief that there is no value in starting performance testing until the application has reached a state of virtual completion. While it is true that an application must be virtually complete before its performance in production can be predicted accurately, most performance issues are detectable and fixable much sooner. Simply asking, "What test could I conduct, or what data could I collect, right now that is most likely to uncover an existing performance issue? How can I enable the team to detect a performance regression later, based on the work that has been completed to this point?", and then acting on the results of that testing effort, invariably mitigates the risk of not uncovering a performance issue until it is too late to resolve it before the scheduled release date.

This is particularly important in performance testing because, unlike in many other types of testing, when a team works from a work-down paradigm, performance testing frequently comes to a standstill from the moment a performance issue is detected until that issue is resolved. The next scheduled test in a work-down model generally depends on the attribute that was the focus of the previous test performing acceptably before it can render valuable and useful results. Additionally, the work-down paradigm starts from the unrealistic expectation that there will be no significant change to requirements, functionality, or system architecture. The value-up approach embraces the reality that change happens—allowing the time that comes immediately after a change to be spent adding specific value to the project, instead of causing a pause in productivity while the plan is retooled to account for the change.

Every aspect of Visual Studio Team System is tailored to transparent integration of this value-up paradigm into the software-development process. In the case of performance testing, this means that critical performance issues can be detected and dealt with early in the development cycle, instead of mere hours before the application is scheduled for release.

Test Subsystems, Components, and System-Wide Performance

One of the more common reasons that development teams give for not testing performance early is that the performance-testing tools that they have are really effective only for generating load and collecting end-to-end user-response time. These teams add that they would love to start performance testing earlier in the process, if their tool could test application components independently as part of the development process, before executing integrated testing that includes all of the parts. At Microsoft, for example, the Microsoft Office server teams have had great success in performance testing by first testing each server component on its own to ensure that it is performing well, and then testing all of the parts together. They report that they are finding critical performance issues earlier, that it is easier to diagnose and resolve the issues that they do find, and that they face far fewer surprises as they approach their release date. This process has been enabled by the relative ease with which Visual Studio Team System can be used for component-based performance testing.

While other load-generation tools require complex performance-test harnesses, third-party plug-ins, and/or proprietary resource monitors to be installed and configured on various application servers, with Visual Studio Team System creating and executing one of these tests is only a matter of walking through a few wizards and selecting the components, servers, and/or resources in which you are interested.
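
To make this concrete, here is a minimal sketch of the kind of component-level test that this approach enables: an ordinary Visual Studio Team System unit test that times a single component in isolation. The OrderCatalog class, its Search method, and the 200-millisecond budget are hypothetical stand-ins, not part of any real product; the attributes come from the unit-testing framework that ships with Visual Studio Team System.

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical stand-in for one of your application's components.
    public class OrderCatalog
    {
        public void Search(string term)
        {
            // A real implementation would query the catalog; stubbed here.
            System.Threading.Thread.Sleep(50);
        }
    }

    [TestClass]
    public class OrderCatalogPerfTests
    {
        [TestMethod]
        public void SearchStaysWithinResponseBudget()
        {
            OrderCatalog catalog = new OrderCatalog();

            System.Diagnostics.Stopwatch timer =
                System.Diagnostics.Stopwatch.StartNew();
            catalog.Search("widget");   // exercise the component in isolation
            timer.Stop();

            // Fail fast if this component alone consumes more than its
            // assumed share of the end-to-end response-time budget.
            Assert.IsTrue(timer.ElapsedMilliseconds < 200,
                "Search took " + timer.ElapsedMilliseconds + " ms; budget is 200 ms.");
        }
    }

Because it is just a unit test, the same method can later be added to a load test to observe how the component behaves under concurrency, with no separate harness required.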


Figure 1

Reports show that about 25 percent of all performance problems are related to architectural and hardware-configuration issues that, given the right tools, can be detected very early in the development cycle. By using the capabilities that are built in to Visual Studio Team System, developers can find and fix architectural and configuration issues early, and avoid all of the backtracking and reworking that are associated with finding those issues only during a system-wide performance test near the end of the development cycle.


Figure 2

Integration into Daily Tasks and Work Environment

During a typical day of performance testing, a tester will frequently have to switch back and forth between the load-generation tool, a configuration-management system, and a project- and status-reporting system. Depending on the tool suite that is in use, the tester may also need a defect-tracking system, a test-case–management system, a requirements-management system, an e-mail program to keep up with notifications from these diverse and disconnected systems, and, periodically, an external integrated development environment (IDE) in which to develop test harnesses and custom code to enhance the tests. A developer who is supporting performance testing will frequently have to toggle between an IDE, a defect-management system, a requirements-management system, several terminal-service windows to monitor remote machines, and an e-mail program for notifications—all while looking over the tester's shoulder to view the status of the test that is being executed. Needless to say, this kind of environment switching is not conducive to productivity.

Visual Studio Team System makes all of that a thing of the past. All of the functionality of those different applications is available inside the IDE. With the simplest of customizations, all of those functions are at the user's fingertips by hovering over a sidebar or, at worst, a few clicks away by changing the view of the workspace. Having all of this power in a single window—along with project, development, testing, and bug-resolution tasks all being managed as classes of work items—means that testers and developers no longer have to break their rhythm and concentration to accomplish tasks in a logical sequence.


Figure 3

Let's face it: Most testers jot down their bugs in a notebook (to be entered at the end of the day), so that they don't have to switch back and forth; and most developers check the defect-tracking system only once or twice a day, because it's inconvenient. As a result, testers frequently report that the end of the day is spent retesting to get their bug reports up to standard; and developers complain that they have just checked in the very section of code that contains a defect, without fixing it, because they hadn't looked at the defect-management system in a few hours.

Intra-Team Test Sharing

Recently, many test-tool vendors have taken the position of opposing what they term the "Mega IDE," claiming that it is unnecessary and too complex for testers to use. Not only does this insult the abilities of testers—and disregard the fact that expert performance testers have been begging for a tool that is based on a fully featured IDE and programming language, to give them the ability to solve the unique technical challenges that must be overcome to test virtually every new technology or application—but it also ignores the value that is gained by enabling development-focused and test-focused members of the team to collaborate and share test assets. For example, none of today's bestselling performance-testing tools allows the performance tester to integrate the developer's functional or performance-based unit tests into their tests, to assist in narrowing down the root cause of an observed performance defect right from the developer's tool. Neither can developers instrument their code to evaluate the effectiveness of a performance fix by simply re-executing the performance test that uncovered the defect, without leaving the IDE. Further, none of the currently popular performance-testing tools natively versions tests, test results, and code together, transparently, into the same configuration-management system—something that is a virtual necessity in today's increasingly auditable and regulated environments.

Visual Studio Team System changes all of that. By using its built-in capabilities, Visual Studio Team System will:

  • Allow developers or testers to create a performance session from a unit test (see the sketch after this list).
  • Allow developers or testers to execute performance tests concurrently, with no need for additional software on their workstations.
  • Allow developers or testers to execute performance tests against debug builds (with unit tests) to assist with root-cause analysis, without the need for one another's assistance.
  • Provide transparent configuration management of all work items, along with source code.
  • Extend the native capabilities of the performance-testing tool and share those extensions.
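
As one illustration of the first and third items, here is a sketch of a shared unit test instrumented with named timers. The TestContext.BeginTimer and EndTimer calls report a "ReserveStock" transaction when the test runs inside a Visual Studio Team System load test (standalone test runs may not support these calls); the InventoryService class and its Reserve method are hypothetical stand-ins for your own code.

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical stand-in for the component under test.
    public class InventoryService
    {
        public bool Reserve(string sku, int quantity)
        {
            return quantity > 0;   // stubbed; real code would hit the data store
        }
    }

    [TestClass]
    public class InventoryServiceTests
    {
        private TestContext testContextInstance;

        // Visual Studio assigns this property before each test run.
        public TestContext TestContext
        {
            get { return testContextInstance; }
            set { testContextInstance = value; }
        }

        [TestMethod]
        public void ReserveStockCompletes()
        {
            InventoryService service = new InventoryService();

            // Reports a named transaction when run inside a load test.
            TestContext.BeginTimer("ReserveStock");
            bool reserved = service.Reserve("SKU-1234", 5);
            TestContext.EndTimer("ReserveStock");

            Assert.IsTrue(reserved, "Reservation failed.");
        }
    }

The same test file lives in version control beside the application code, so the test, its results, and the code that it exercises are versioned together.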

Guidance from the Experience of Experts

When Microsoft decided to expand Visual Studio from a development environment into a life-cycle management tool, it solicited information from, hired, and/or contracted some of the world's leading experts—not just to help design the new tools, but to help write guidance about how to use those tools effectively to build higher-quality software.

Microsoft guidance is not like other help screens, tool training, or tool-related instructions that you might have seen. It is not thinly veiled marketing material that highlights what the tools do well, hides what they do not, and invents a set of "best practices" to ensure that you try to do only what the tools are good at doing. Instead, Microsoft guidance is written and/or reviewed by independent industry experts who happen to use the Microsoft tools to do their jobs better. In fact, many Microsoft guidance modules don't even mention a Visual Studio tool or Microsoft. The modules that do mention a Visual Studio tool do so in the context of demonstrating how to complete a particular task, or apply a particular concept, by using the tool. Better still, Microsoft guidance is organized contextually by task, not by product feature; so, instead of scrolling through an index of features and trying to guess which one might help you with the task at hand, you can simply search for guidance about the task—whether you are seeking pointers about how to approach the task cognitively or about how to complete it by using the tool.

In the case of performance testing, Microsoft has engaged Scott Barber—Chief Technologist of PerfTestPlus, Inc., and internationally recognized performance-testing expert—throughout the design and development of the tool, as well as to write and review its performance-testing guidance. (For more about Scott, please see his professional summary.) As a result, the Microsoft performance-testing guidance directly addresses both (1) how to use Visual Studio 2005 Team Edition for Software Testers to conduct the kind of testing that will help avoid the surprise of poor performance in production, and (2) the management, training, business, resource, schedule, communication, and sociopolitical challenges that plague performance testing today.

Microsoft's performance-testing guidance addresses such topics as:

  • What are the different types of performance testing, and what value do they add to a project?
  • How can we effectively write performance-related requirements, goals, and objectives during the specification or design phase?
  • How can we conduct meaningful performance testing earlier in the life cycle, without extending the project schedule?
  • How do we communicate performance data and issues across the entire team in a way that everyone understands?
  • What should we do if we don't have the budget to build a test environment that mirrors production?
  • What can we do to shorten the time between detecting a performance issue and resolving it?
  • How do we bridge the gap between technical jargon and business risks and implications?

All the while, it explains things such as:

  • How do I create my first script?
  • How do I add conditional navigation to my scripts? (A sketch follows this list.)
  • What are my options for managing test data?
  • How do I employ unit tests as part of my performance test and vice versa?
  • What do I do with all of my test data to ensure that the tests are auditable?
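
For the conditional-navigation question, a coded web test is the usual route: a wizard-recorded declarative test can be converted to code, after which ordinary C# control flow decides which requests to issue. The sketch below assumes a hypothetical shopping site and a hypothetical "UserType" context parameter supplied by test data; the WebTest base class and WebTestRequest come from the Microsoft.VisualStudio.TestTools.WebTesting namespace.

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    public class CheckoutWebTest : WebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            yield return new WebTestRequest("http://localhost/shop/cart.aspx");

            // Conditional navigation: guests visit an extra page that
            // registered users skip. "UserType" is a hypothetical context
            // parameter bound from a data source or an earlier extraction rule.
            if ((string)Context["UserType"] == "guest")
            {
                yield return new WebTestRequest("http://localhost/shop/guest.aspx");
            }

            yield return new WebTestRequest("http://localhost/shop/checkout.aspx");
        }
    }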

Virtually Unlimited Extensibility

When it comes right down to it, all load-generation tools lag at least six months, and sometimes several years, behind the latest development technologies. Of course, it is these new and evolving technologies that pose the biggest performance risks to your applications. What this means is that, no matter what tool you use, it is a fairly safe bet that at some point during the performance-testing effort you are going to require some kind of customization, modification, extension, or plug-in to achieve your performance-testing goals. Most of the popular tools on the market make this degree of tool enhancement extremely difficult for performance testers, by forcing them to code in a limited-functionality or proprietary programming language, or by trying to "simplify" test creation by restricting access to the actual test code in favor of buttons that automatically accomplish the most common tasks for you.

Visual Studio Team System solves this problem, too. It enables the performance tester to work from an intuitive point-and-click, drag-and-drop, configuration-wizard environment that keeps all of the common performance-testing tasks simple, and it includes an expert view with all of the code generated in a .NET programming language—generally the same language with which your team's developers are most familiar.


Figure 4

This means that anything that can be coded by using Visual Studio can be integrated into a performance test. Even if the performance tester doesn't know how to code a solution to a particular problem, they have the entire development team to turn to for support, without having to ask those developers to translate their knowledge into some language that they either never knew or don't use regularly.
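
As one illustration, the sketch below shows among the simplest possible extensions: a custom web-test plug-in written in ordinary C#. The WebTestPlugin base class is part of the web-testing API; the "AuthToken" context value that this plug-in stores is hypothetical.

    using Microsoft.VisualStudio.TestTools.WebTesting;

    public class AddAuthTokenPlugin : WebTestPlugin
    {
        // Runs before each web-test iteration; any request, extraction rule,
        // or other plug-in can read the value back from the test's context.
        public override void PreWebTest(object sender, PreWebTestEventArgs e)
        {
            e.WebTest.Context["AuthToken"] =
                System.Guid.NewGuid().ToString("N");
        }
    }

Because the plug-in is just a class in the test project, any developer on the team can write or review it without ever opening the load-testing tool itself.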

Conclusion

For years, serious performance testers and developers who have a performance focus have been dreaming of the day when they would have a tool that had the features and capabilities that Visual Studio Team System brings them. The ability for developers and performance testers to collaborate and share code; the ability to jointly version tests, test results, and source code; a single tool that supports both early life-cycle, component-based performance testing and late life-cycle, end-to-end performance testing; a single environment to manage all of their daily activities; help screens that actually help them solve complex challenges, instead of simply explaining what a button does; and the ability to extend the functionality of their tool to accommodate state-of-the-art technologies, before they become state-of-the-practice—all at an affordable price—is what the market asked for and what Visual Studio Team System delivered.


About the author

Scott Barber is the Chief Technologist of PerfTestPlus, Inc., Executive Director of the Association for Software Testing, and cofounder of the Workshop on Performance and Reliability. Scott is internationally recognized as an expert performance tester. Some of his other specialties include developing customized testing methodologies for individual organizations, embedded systems testing, testing biometric identification and security systems, group facilitation, and authoring instructional materials. Scott is an international keynote speaker and contributor to various software-testing publications. He is a member of ACM, IEEE, American MENSA, and the Context-Driven School of Software Testing, and is a signatory to the Manifesto for Agile Software Development. Scott is active in his personal mission of improving the state of performance testing across the industry by collaborating with other industry authors, thought leaders, and expert practitioners, as well as volunteering his time to establish and grow related industry organizations. His tireless dedication to the advancement of software testing in general and performance testing in particular is often referred to as a hobby in addition to a job, due to the enjoyment that Scott gains from his efforts.