Test Run

Custom Test Automation with Team System

Dr. James McCaffrey

Code download available at: TestRun2008_Launch.exe (205 KB)

Contents

Wrapping a Simple Test Automation Script
Extended Test Results
Publishing Test Results to Team Foundation Server
Wrapping Up

There is no single best way to test software. In addition to manual testing, depending on your particular development environment, you may be using commercial test automation frameworks, open source and in-house test automation frameworks, and custom test automation scripts. All these approaches have pros and cons.

Custom test automation scripts have the advantages of being quick to write and providing maximum flexibility. However, a downside to using custom test automation is manageability. Your testing effort can become overwhelmed by the sheer volume of test scripts, test case data, and test results. Luckily, Visual Studio® 2005 Team System provides you with the ability to manage custom test automation. Let me show you what I mean using a couple of screen shots. First, consider the test automation script shown executing in Figure 1.

Figure 1 Typical Custom Test Automation Script


The test automation is a very short JavaScript script that is performing module testing on a method named TriMax that simply returns the largest of three integers. The TriMax method is housed inside a classic COM DLL file. Although simple and effective, test automation using this type of approach has several weaknesses. Where are the test results being stored? Against which build of the DLL is this test run being executed? How can these test results be shared with other members of the development team? Are these results related to any open bugs? Visual Studio Team System is designed to handle such issues, as Figure 2 shows.

Figure 2 Managing Custom Automation with Team System


Visual Studio 2005 Team System consists of Team Foundation Server (TFS) plus one of the client-side editions of Visual Studio. You can think of TFS as an intelligent back-end data warehouse that stores and manages all the data associated with a software development project, including source code, bugs, test results, specification documents, and much more.

The custom test automation functionality I describe in this column is provided in Visual Studio Team Edition for Testers. You can download the 180-day evaluation version of Visual Studio 2005 Team Suite, which includes the functionality of all editions, plus Team Foundation Server from microsoft.com/downloads.

I'll present three examples in this column. The first shows how to use VSTE for Testers to take a very simple piece of custom test automation and wrap it into a test that can be managed by the Team System. The second example expands on the first by showing you how to modify custom test automation so you can create test case results that have more detailed information than a simple pass or fail result. My third example shows how to use Team Foundation Server to store and manage test run result data.

Before diving in, let me briefly describe the module under test so you'll understand the simple custom test automation, and so you can create the module if you want to. (If you are familiar with creating COM objects using ATL, you may want to skip ahead to the next section.)

My module is a Win32® C++ method named TriMax. Using Visual Studio, I created a new Project using a C++ ATL template and named it MyCOMLib. I accepted the ATL wizard's default settings, which instruct Visual Studio to create a DLL. In the project's Solution Explorer, I right-clicked on the project name and selected Add | Class from the context menu. Next, from the ATL category, I selected the ATL Simple Object Template. Then, in the ATL Simple Object Template wizard, I typed "MyMethods" into the C++ Short Name field, which automatically populated the remaining seven name fields (.h file as MyMethods.h, Class as CMyMethods, .cpp file as MyMethods.cpp, Coclass as MyMethods, Type as MyMethodsClass, Interface as IMyMethods, and ProgID as MyCOMLib.MyMethods).

After finishing the wizard, I switched to the Class View window, right-clicked on the IMyMethods interface and selected Add | Add Method from the context menu. In the Add Method wizard, I named my method TriMax. Next I added three input parameters, named x, y, and z, all of type LONG, and then added a LONG* retval parameter to hold my method's return value. After clicking the finish button in the Add Method wizard, I switched back to the Solution Explorer view in Visual Studio and double-clicked on MyMethods.cpp in the Source Files folder. Here is the simple code for my method under test:

STDMETHODIMP CMyMethods::TriMax(LONG x, LONG y, LONG z, LONG* pRes)
{
    if (x > y && x > z)
        *pRes = x;
    else if (y > z)
        *pRes = y;
    else
        *pRes = z;
    return S_OK;
}

I pressed Ctrl+Shift+B to perform a local build of my COM object. And, finally, I used Windows Explorer to verify that the resulting MyCOMLib.dll file was created successfully in the \MyCOMLib\Debug subdirectory. As you can see, my TriMax method is unrealistically simple; I just want something to perform test automation against. Let me emphasize that even though the examples that follow use JavaScript test automation code to test a Win32 COM object, the principles of using Visual Studio Team System to manage custom test automation apply to any type of custom test automation (including C#, Visual Basic® .NET, Windows PowerShell®, and the like) targeting any type of system under test (including .NET managed libraries, Windows® form-based applications, ASP.NET Web applications, ASP.NET Web services, and so on).
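If you would rather not build the COM object, the TriMax logic is easy to mirror in script form as a test oracle for generating expected values. The triMax function below is a hypothetical JScript re-implementation of the C++ method, not the shipped DLL code:

```javascript
// Hypothetical JScript re-implementation of the C++ TriMax method,
// usable as a test oracle when generating expected values.
function triMax(x, y, z) {
    // mirror the C++ logic: return the largest of three integers
    if (x > y && x > z)
        return x;
    else if (y > z)
        return y;
    else
        return z;
}
```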

Wrapping a Simple Test Automation Script

Now let's look at the simple test automation script in Figure 3. Later I'll show you how to integrate the script into VSTE for Testers and describe the advantages of doing so. My script begins by displaying a couple of messages using the Echo method of the Windows Script Host. Next I instantiate the COM object that contains my method under test:

Figure 3 Simple Test Automation Script

// testSimple.js
WScript.Echo("\nBegin test run\n");
WScript.Echo("Testing TriMax() in MyCOMLib.dll\n");

var o = new ActiveXObject("MyCOMLib.MyMethods");

var cases = new Array(
    "001:3*7*5:7",
    "002:0*0*0:1",
    "003:2*4*6:6" );

var allPassed = true;

for (var i = 0; i < cases.length; ++i) {
    var tokens = cases[i].split(":");
    var id = tokens[0];
    var inputs = tokens[1];
    var expected = tokens[2];

    var temp = inputs.split("*");
    var arg1 = temp[0];
    var arg2 = temp[1];
    var arg3 = temp[2];

    WScript.Echo("==============");
    WScript.Echo("Case ID  = " + id);
    WScript.Echo("Inputs   = " + arg1 + " " + arg2 + " " + arg3);

    var actual = o.TriMax(arg1, arg2, arg3);

    WScript.Echo("Expected = " + expected);
    WScript.Echo("Actual   = " + actual);

    if (actual == expected) {
        WScript.Echo("Pass");
    }
    else {
        WScript.Echo("**FAIL**");
        allPassed = false;
    }
}

WScript.Echo("==============");

if (allPassed == true) {
    WScript.Echo("\nAll test cases passed");
    WScript.Quit(0);
}
else {
    WScript.Echo("\nOne or more cases failed");
    WScript.Quit(1);
}

var o = new ActiveXObject("MyCOMLib.MyMethods");

Because I am testing a classic COM object, I can reference the object using the object's ProgID. Additionally, because I compiled the COM object on my test host machine, the object information was stored in my system registry. Note that if I had copied the MyCOMLib.dll file to my test host machine from a different machine, I would have needed to register the DLL by calling regsvr32.exe using the Run method of the WScript.Shell object. My example script stores its test case data internally:

var cases = new Array(
    "001:3*7*5:7",
    "002:0*0*0:1",
    "003:2*4*6:6" );

var allPassed = true;

As a general rule, it is better to store test case data externally to the test automation code. External storage allows test cases to be shared and modified more easily than internally stored data. However, internal data is simpler and sometimes a good design choice.

In this simple example, I have three test cases. The first test of the TriMax(x,y,z) method has an ID of 001, inputs of 3, 7, and 5, and an expected result of 7. The second test case has a deliberate failure for demonstration purposes. Notice that I initialize a variable named allPassed to true. This variable tracks whether or not all the individual test cases pass. My test automation iterates through each test case, parsing out the test case ID, the inputs, and the expected value. With those values in hand, my script invokes the method under test and compares the actual result with the expected result to determine a pass/fail result for the individual test case:

var actual = o.TriMax(arg1, arg2, arg3);

if (actual == expected) {
    WScript.Echo("Pass");
}
else {
    WScript.Echo("**FAIL**");
    allPassed = false;
}
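The parsing step can be isolated into a small helper, which makes the case-string format easy to unit test on its own. Here parseCase is a hypothetical helper name; the column's script performs this work inline:

```javascript
// Sketch: parse one colon-delimited test case string of the form
// "id:arg1*arg2*arg3:expected" into its parts.
function parseCase(caseString) {
    var tokens = caseString.split(":");   // id : inputs : expected
    var inputs = tokens[1].split("*");    // inputs are '*'-delimited
    return {
        id: tokens[0],
        args: inputs,
        expected: tokens[2]
    };
}
```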

If I find any failing test case, I set the value of the allPassed variable to false. Finally, my example test script finishes by using the WScript.Quit method to set the process exit code to 0 or 1:

if (allPassed == true) {
    WScript.Echo("\nAll test cases passed");
    WScript.Quit(0);
}
else {
    WScript.Echo("\nOne or more cases failed");
    WScript.Quit(1);
}

If you intend to manage simple custom test automation with VSTE for Testers, your automation must set an exit code of 0 to indicate that the overall test result is pass or an exit code of 1 to indicate failure. This means that if you are going to integrate existing test automation with VSTS, you will likely have to edit your automation because test automation rarely sets the exit code value. Exactly how you set the exit code varies slightly from language to language. For example, in a .NET-compliant language program, you can use the System.Environment.ExitCode property, and in a Windows PowerShell script, you can use the exit statement.
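The convention itself is trivial to factor out. The following one-liner is a minimal sketch of the mapping a Generic Test expects; exitCodeFor is a hypothetical helper name, not part of any VSTE API:

```javascript
// Minimal sketch of the Generic Test exit-code convention:
// 0 means the overall run passed, 1 means it failed.
function exitCodeFor(allPassed) {
    return allPassed ? 0 : 1;
}
```

In the WSH script above, the final lines would become WScript.Quit(exitCodeFor(allPassed)).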

At this point you can execute the test automation script locally, as shown in Figure 1. But you can also run custom automation from within VSTE for Testers. Select File | New | Project. The New Project dialog has a Test Projects type with a Test Documents subtype. Select Test Documents and you will see a Test Project template. Next, give the test project a name such as MyTestProject and specify a location for the project root directory.

A Test Project can hold one or more tests. Each test can hold one or more individual test cases. By default, you will get a Test Project that contains a general help text document named AuthoringTests.txt and the skeletons for two tests: a manual test and a unit test. You can delete these three files in the Solution Explorer window.

Now right-click on your Test Project name and select Add | New Test from the context menu. In the Add New Test dialog box, select the Generic Test template, name it Simple.GenericTest, and then click OK. VSTE for Testers has built-in templates for load tests, unit tests, and Web tests. The Generic Test type is essentially a wrapper for custom test automation and has all the plumbing necessary to enable custom automation to interoperate with VSTE for Testers. The screenshot in Figure 4 shows you what the process of wrapping custom test automation as a Generic Test looks like.

Figure 4 Creating a Simple Generic Test


The first field specifies an existing test program to run. This value must be the path to an .exe file. (While I used an absolute path in this example, you can additionally specify the script path as an environment variable, a relative path, or a deployment path.) In the case of script-based test automation, this is the cscript.exe program, usually located at C:\Windows\System32\. If your test automation is an .exe itself, such as a C# program, you would specify the full path to your automation. The second field specifies command-line arguments that are to be passed to the .exe file specified in the first field. In this example, the value would be the full path to the script, such as C:\VSTTIntegration\Scripts\testSimple.js. These are the only two fields you need to specify in order to run simple script-based tests with internal test case storage.

There are several ways to run tests from within VSTE for Testers. One simple way is to display all tests in your project by clicking on Test | Windows | Test View. Then select the test you wish to run, right-click on the test name, and click Run Selection. The screenshot in Figure 5 shows an example.

Figure 5 Executing a Generic Test


The Test Results window displays the overall pass/fail result. If you double-click on the result, you get detailed results including the test run duration, start time, end time, and a copy of the console output. A test result is stored by default on the host machine as an XML file with a .trx (test results XML) file extension in a TestResults subdirectory. By default, these .trx file names are stamped with user name / machine name / date / time, making them easy to organize. The actual structure of .trx files is somewhat complex but easy to interpret. For example, a small part of the .trx file generated by this example Generic Test is:

<errorInfo>
  <message type="System.String">
    The test failed with exit code 1.
  </message>
  <stackTrace type="System.String" />
</errorInfo>
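The name-date-time stamping idea can be sketched as a small function. This is purely illustrative: stampedTrxName is a hypothetical helper, and the exact file name pattern VSTE uses may differ from the one shown here:

```javascript
// Hypothetical illustration of user/machine/date/time stamping for a
// results file name; the exact VSTE pattern may differ.
function stampedTrxName(user, machine, when) {
    function pad(n) { return n < 10 ? "0" + n : "" + n; }
    var date = when.getFullYear() + "-" + pad(when.getMonth() + 1) +
               "-" + pad(when.getDate());
    var time = pad(when.getHours()) + "_" + pad(when.getMinutes()) +
               "_" + pad(when.getSeconds());
    return user + "_" + machine + " " + date + " " + time + ".trx";
}
```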

The example I've just described wraps custom test automation that has test case data embedded into the automation code. If your automation reads test case data from an external source, you may need to edit one of the Generic Test fields. Suppose your test automation resembles this:

var fsi = new ActiveXObject("Scripting.FileSystemObject");
var fi = fsi.OpenTextFile(".\\testCases.txt");
while (!fi.AtEndOfStream) {
    line = fi.ReadLine();
    // parse test case data
    // call method under test
    // determine pass/fail
}

Here the script uses a relative file path to read test case data from a file located in the same directory as the script. When a Generic Test executes, it copies the executables into a working directory, which means the test case data would not be found. One solution to this is to specify a hardcoded path in the Working Directory field when you create a Generic Test. A slightly more flexible approach is to set the values of one or more environment variables such as %TestDir%, %TestLocation%, and %TestOutputDirectory%, and then use these environment variables to control your test execution. However, the most flexible option is to declare the script as a deployment item in Visual Studio. By doing this, your test case data will be copied into the output directory so that it can be referenced via a local, relative path.

Extended Test Results

Now you've seen how to use VSTE for Testers to manage simple custom test automation. You can also create test automation that saves additional information and more detailed test results. The basic idea is to code your test automation so that it saves intermediate detailed test results as a simple XML file that conforms to a special SummaryResult.xsd schema. VSTE for Testers automatically uses the emitted simple XML results file to create the more complex .trx results file that is managed by Visual Studio Team System. This two-step process is much easier than trying to write test automation that directly produces a .trx test results file. It is easiest to understand this process by working backward. The listing in Figure 6 shows an XML test results file that conforms to SummaryResult.xsd and so can be automatically converted to a .trx results file.

Figure 6 Intermediate Extended Test Results File

<SummaryResult>
  <TestName>MyCOMLib Test Run</TestName>
  <TestResult>Failed</TestResult>
  <DetailedResultsFile>
    C:\AdditionalResults\Details.txt
  </DetailedResultsFile>
  <InnerTests>
    <InnerTest>
      <TestName>001</TestName>
      <TestResult>Passed</TestResult>
    </InnerTest>
    <InnerTest>
      <TestName>002</TestName>
      <TestResult>Failed</TestResult>
    </InnerTest>
  </InnerTests>
</SummaryResult>

The file in Figure 6 has required overall TestName and TestResult values and an optional overall DetailedResultsFile value. Other optional values defined by the SummaryResult.xsd schema include InnerTest to store the name and results of test cases that make up the overall Generic Test, and ErrorMessage values.

The SummaryResult.xsd schema file is installed with VSTE for Testers and is typically located at C:\Program Files\Microsoft Visual Studio 8\Xml\Schemas\. The schema defines test result types other than simply Passed and Failed. Additional defined result types are Aborted, Error, Inconclusive, NotRunnable, NotExecuted, Discontinued, Warning, InProgress, Pending, PassedButRunAborted, and Completed.
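Because VSTE for Testers will reject an intermediate results file containing a TestResult value the schema does not define, it can be worth guarding the values your automation emits. The sketch below uses the result-type list just described; isValidResult is a hypothetical helper name:

```javascript
// Sketch: guard against emitting a TestResult value that
// SummaryResult.xsd does not define.
var VALID_RESULTS = ["Passed", "Failed", "Aborted", "Error",
    "Inconclusive", "NotRunnable", "NotExecuted", "Discontinued",
    "Warning", "InProgress", "Pending", "PassedButRunAborted",
    "Completed"];

function isValidResult(result) {
    for (var i = 0; i < VALID_RESULTS.length; ++i) {
        if (VALID_RESULTS[i] === result) return true;
    }
    return false;
}
```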

Exactly how you write your custom test automation to emit an intermediate extended XML test case results file depends on the programming language you are using and the structure of your automation logic. If you want to use the InnerTest tag, the SummaryResult.xsd schema dictates that you must specify the overall test result before you specify the result of each test case. This means you cannot build up your intermediate XML results file in a strictly sequential way while processing your test case data.

There are many approaches you can take. The listing in Figure 7 shows one way to modify the test automation script presented in Figure 3 to emit an extended results file. The overall idea is to build up the XML result file as one long string by capturing the results of each test case as each case is processed and then constructing the final string after the overall test result is known.

Figure 7 Test Script that Emits an Extended Results File

// testExtended.js
function main() {
    WScript.Echo("\nBegin test run\n");

    var o = new ActiveXObject("MyCOMLib.MyMethods");

    var cases = new Array(
        "001:3*7*5:7",
        "002:0*0*0:1",
        "003:2*4*6:6" );

    var allPassed = true;
    var innerTestText = new Array(cases.length);

    for (var i = 0; i < cases.length; ++i) {
        var tokens = cases[i].split(":");
        var id = tokens[0];
        var inputs = tokens[1];
        var expected = tokens[2];

        var temp = inputs.split("*");
        var arg1 = temp[0];
        var arg2 = temp[1];
        var arg3 = temp[2];

        WScript.Echo("==============");
        WScript.Echo("Case ID  = " + id);
        WScript.Echo("Inputs   = " + arg1 + " " + arg2 + " " + arg3);

        var actual = o.TriMax(arg1, arg2, arg3);

        WScript.Echo("Expected = " + expected);
        WScript.Echo("Actual   = " + actual);

        innerTestText[i] = "<TestName>" + id + "</TestName>";
        innerTestText[i] += "<TestResult>"; // inner test result

        if (actual == expected) {
            WScript.Echo("Pass");
            innerTestText[i] += "Passed";
        }
        else {
            WScript.Echo("FAIL");
            innerTestText[i] += "Failed";
            allPassed = false;
        }
        innerTestText[i] += "</TestResult>";
    } // main loop

    WScript.Echo("==============");

    var extendedResults = "<SummaryResult>";
    extendedResults += "<TestName>MyCOMLib Test Run</TestName>";
    extendedResults += "<TestResult>"; // overall meta-result

    if (allPassed == true) {
        extendedResults += "Passed</TestResult>";
    }
    else {
        extendedResults += "Failed</TestResult>";
    }

    extendedResults += "<InnerTests>";
    for (var j = 0; j < innerTestText.length; ++j) {
        extendedResults += "<InnerTest>";
        extendedResults += innerTestText[j];
        extendedResults += "</InnerTest>";
    }
    extendedResults += "</InnerTests>";
    extendedResults += "</SummaryResult>";

    var fso = new ActiveXObject("Scripting.FileSystemObject");
    var f = fso.CreateTextFile("C:\\VSTTIntegration\\Scripts\\results.xml");
    f.WriteLine(extendedResults);
    f.Close();
} // main()

main()

I create an array of strings, one string for each test case, to hold the InnerTest values:

var innerTestText = new Array(cases.length);

Inside the main processing loop I write out the TestName value and the start of the TestResult value:

innerTestText[i] = "<TestName>" + id + "</TestName>";
innerTestText[i] += "<TestResult>"; // inner test result

After I process an individual test case, I can supply a test result value and add the closing TestResult tag:

if (actual == expected) {
    WScript.Echo("Pass");
    innerTestText[i] += "Passed";
}
else {
    WScript.Echo("FAIL");
    innerTestText[i] += "Failed";
    allPassed = false;
}
innerTestText[i] += "</TestResult>";

After the main processing loop has finished and all my InnerTest data has been constructed, I can create the first part of my extended results file:

var extendedResults = "<SummaryResult>";
extendedResults += "<TestName>MyCOMLib Test Run</TestName>";
extendedResults += "<TestResult>"; // overall meta-result

if (allPassed == true) {
    extendedResults += "Passed</TestResult>";
}
else {
    extendedResults += "Failed</TestResult>";
}

I finish by adding the InnerTest data I saved earlier and writing the entire string as an XML file:

extendedResults += "<InnerTests>";
for (var j = 0; j < innerTestText.length; ++j) {
    extendedResults += "<InnerTest>";
    extendedResults += innerTestText[j];
    extendedResults += "</InnerTest>";
}
extendedResults += "</InnerTests>";
extendedResults += "</SummaryResult>";

var fso = new ActiveXObject("Scripting.FileSystemObject");
var f = fso.CreateTextFile("C:\\VSTTIntegration\\Scripts\\results.xml");
f.WriteLine(extendedResults);
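One caveat when building XML by string concatenation: if a test name or message can ever contain characters such as &lt; or &amp;, the resulting file will not be well-formed XML. A minimal escaping helper handles this; escapeXml is a hypothetical name I'm introducing for illustration:

```javascript
// Sketch: escape the characters that are illegal in XML text content
// before concatenating values (such as test names) into the results string.
function escapeXml(s) {
    return s.replace(/&/g, "&amp;")   // must be replaced first
            .replace(/</g, "&lt;")
            .replace(/>/g, "&gt;");
}
```

In the script above, the TestName line would become innerTestText[i] = "&lt;TestName&gt;" + escapeXml(id) + "&lt;/TestName&gt;".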

There are many other approaches you can take to emit XML results. My column "Five Ways to Emit Test Results as XML" in the June 2006 issue of MSDN® Magazine (msdn.microsoft.com/msdnmag/issues/06/06/TestRun) describes some of these techniques.

At this point, I have a custom test automation script that writes detailed test results to a results.xml file that conforms to the SummaryResult.xsd schema. To use this automation from within VSTE for Testers, I create a new Generic Test around the automation using the technique I described earlier, but with one change. After selecting Add | New Test and naming the test (say, Extended.GenericTest), I check the Summary Results File checkbox in the test's definition pane and enter the location of the results file in the associated textbox (say, C:\VSTTIntegration\Scripts\Results\results.xml). Now when I run my Generic Test, VSTE for Testers will run my automation, find the results.xml intermediate file, and automatically use the data there to create the final, detailed name-date-time stamped .trx result file. Very neat!

Publishing Test Results to Team Foundation Server

Wrapping custom test automation as a Generic Test using VSTE for Testers gives you a great way to manage tests and test results. If you are using Team Foundation Server in your development environment, you can publish test results to the TFS data warehouse to give you even better management of test results.

The TFS back-end data warehouse is extremely complex, so you do not want to try to save .trx test results directly to the underlying SQL tables. However, you can save test result data to TFS using VSTE for Testers. The process is very simple. After running a Generic Test, connect to your TFS project by selecting Tools | Connect to Team Foundation Server from the main menu. In the Test Results window menu, click the Publish icon. You will see a Publish Test Results dialog similar to the one shown in Figure 8.

Figure 8 Publishing Test Results to Team Foundation Server


You can select one or more loaded test results and then choose the appropriate build of the system under test from a dropdown control. Note that your test results file must be located in your test output directory in order for it to be available for publishing to the data warehouse. After clicking OK, your test results will be published to the TFS data warehouse. This approach allows anyone on your development team to easily view your test results.

One way to view results from within VSTE for Testers is to connect to your TFS project and expand the Builds folder in the Team Explorer window. If you double-click on the appropriate build type, you will see a list of all specific builds of that type. If you double-click on a particular build entry, you will see the data associated with that build, including published test results. Even better, because TFS bug report data can be associated with published test results, you can view built-in reports such as Tests Failing without Active Bugs, or you can create custom reports that use this data.

Notice that in order to publish test results to Team Foundation Server, you must associate those test results to a particular build of the system under test. This means that you must have a TFS build system in place. But what if you are performing regression testing on a legacy component that is no longer being actively built, or even a component for which the source code no longer exists? One simple solution is to create a dummy project, perform a single build of that project, and then use this build number as a placeholder to target your test results against. For more on this, take a look at the Team System column written by Brian Randell, "Team Foundation Server Version Control," in the January 2007 issue of MSDN Magazine (msdn.microsoft.com/msdnmag/issues/07/01/TeamSystem).

Wrapping Up

The Visual Studio 2005 Team Edition for Testers has the ability to create a wide range of built-in test types, such as unit tests and load tests. These well-known test types are tightly integrated into the development process and are generally associated with test-driven development. However, there are many possible development situations where you have legacy test automation or wish to create custom test automation. For example, you may be outsourcing your test effort and for security reasons do not want the testing team to have direct access to your system's source code. VSTE for Testers allows you to wrap any kind of custom test automation into a Generic Test type. In addition to providing a centralized management structure for your custom test automation, you automatically gain extra results data such as the start time, end time, and duration of each test run, as well as a log of any console output.

Generic Test results are stored as .trx files on the local host machine, and they can easily interoperate with other test systems if necessary because they are stored in a standard XML format. Although you can configure custom test automation to record a simple pass or fail result, VSTE for Testers also gives you the ability to record extended results, including result types such as Inconclusive and Aborted. You do this by modifying your custom test automation to produce an intermediate XML results file that conforms to a SummaryResult.xsd schema. VSTE for Testers automatically uses the intermediate XML file to produce a detailed .trx file.

In development scenarios where you are using Team Foundation Server to manage source code, specification documents, bug data, and the build process, you can publish test results to the TFS data warehouse. Publishing test results to TFS allows these results to be viewed by all other members of the development team and makes it easy to determine the relationships between bugs, builds, and test results.

Visual Studio 2008 has just been released, and, from what I've seen so far, you will have all the existing ability to manage custom test automation, plus additional test management functionality that will help you create better software systems.

Thanks to Howard Dierking for suggesting the topic of this column, and to Team System expert Brian Randell for reviewing it.

Send your questions and comments for James to testrun@microsoft.com.

Dr. James McCaffrey works for Volt Information Sciences, Inc., where he manages technical training for software engineers working at the Microsoft campus in Redmond. He has worked on several Microsoft products including Internet Explorer and MSN Search. James is the author of .NET Test Automation Recipes (Apress, 2006) and can be reached at jmccaffrey@volt.com or v-jammc@microsoft.com.