Unit Testing

Apply Test-Driven Development to your Database Projects

Jamie Laflen

This article discusses:

  • The benefits of TDD
  • Bringing unit testing to your database development
  • Combining T-SQL and .NET–compliant languages
  • Connections, test conditions, and transactions
This article uses the following technologies:
Visual Studio 2008, SQL Server

Code download available at: DatabaseUnitTesting2008_Launch.exe (174 KB)

Contents

Unit Testing in Software Development
Database Unit Testing
Defining Connections
Managing T-SQL Scripts
Modifying Generated Test Code
Transactions
System.Transactions
Data-Driven Testing
Custom Test Conditions
Defining the Test Condition
Modifying Test Execution
Looking Forward

In November 2006 Microsoft released Visual Studio® Team System Database Edition, also known as DBPro or Data Dude, which introduced a product lifecycle methodology to database development. DBPro also introduced a database unit-testing designer that makes it easy to either generate or write T-SQL-oriented unit tests to verify database objects prior to deployment. In this article, I will dig into database unit testing, explain how the feature was implemented, and show you how to perform unit testing of your own database projects.

Unit Testing in Software Development

In software development, unit testing plays a critical role in ensuring that the proper level of quality is maintained and that milestones are met. Although most developers agree that unit tests are important, few take the time to write them when testing is deferred to the end of a product cycle. To address this issue, the test-driven development (TDD) methodology was conceived. Using this methodology, developers write unit tests for individual features before actually implementing those features. This process allows the developer to see the API from the outside in, thus providing the opportunity for the simplest design possible.

Today, many developers working in C# and Visual Basic® follow TDD by creating a unit test to exercise a method on a class that has not yet been written and then implementing the bare minimum amount of code needed to compile the test. Once the test has compiled and failed, the developer writes only the code necessary to make the test pass in the next run. With each iteration, all unit tests are run to verify that there are no regression errors. The application and unit-test code are then checked into source code control. As a result, each unit test is run at least twice for the method being developed: once to fail and once to pass.

To be useful, a unit test must set up its execution environment before it is run so that it can consistently verify the code it is meant to test. Once the test has been implemented and verified, it is checked into source code control and will be used for the rest of the product's lifetime to ensure that the method continues to behave as expected.
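The red-green cycle described above can be sketched with the Visual Studio unit-testing framework. The Calculator class and Add method here are hypothetical, invented only to illustrate the flow:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CalculatorTests
{
    // Written first, before Calculator.Add has any logic. The first
    // run fails (red); the developer then implements just enough
    // code to make it pass (green) and reruns the whole suite.
    [TestMethod]
    public void Add_ReturnsSumOfOperands()
    {
        Calculator calculator = new Calculator();
        Assert.AreEqual(5, calculator.Add(2, 3), "Add returned the wrong sum");
    }
}

// The bare minimum needed to make the test compile; the body is
// filled in only after the test has been seen to fail.
public class Calculator
{
    public int Add(int left, int right)
    {
        return left + right;
    }
}
```

The same discipline applies to the database tests discussed next; only the shape of the test environment changes.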

TDD is more challenging in database development because you need to provide a consistent test environment to verify code prior to checking in changes. For a database unit test to run successfully it must have the appropriate schema as well as the proper data. This means that in order to run tests, other developers must duplicate the environment you used when the unit tests were created. This is a tall order to fill without building a lot of costly infrastructure. Fortunately, DBPro provides features that allow database developers to follow the same unit-testing process that other developers follow, as in this scenario:

  1. Get the latest test and database projects from source code control and add a new stored procedure to the database project. Configure the procedure's input and output parameters.
  2. Right-click on the new stored procedure and select "create unit tests." Add the unit test to the existing test project. A new test class is added to the test project with T-SQL that will execute the new stored procedure with default input parameters.
  3. Modify the input parameters to contain expected values and add a test condition to verify that the stored procedure returns the expected results.
  4. Run the test. As part of test setup, the database project is deployed to the local SQL Server® instance and the data generation plan is executed to populate the newly deployed database with expected test data. The test executes against the local SQL Server instance and fails (as expected because the stored procedure's logic has not yet been implemented).
  5. Implement the stored procedure's logic and run the test again. The updated stored procedure is deployed, and the test passes.
  6. Run all tests to verify the database and then check the stored procedure and unit-test code into source code control.

The features provided by DBPro include: the generation of stub T-SQL unit tests from existing functions, stored procedures, or triggers; automatic deployment of database project updates to a local sandbox instance; generation of data as part of the test environment setup using a data generation plan; and execution of T-SQL tests against a target database. Although these features can be used together, that isn't required. For instance, a database unit test can be written from scratch and data generation does not have to be executed prior to every test run.

Database Unit Testing

The database unit-testing feature reflects two somewhat competing goals: the desire on the part of the database developer to use familiar T-SQL scripts to express the unit-test logic and the desire to take advantage of powerful Microsoft® .NET Framework-targeted languages such as C# or Visual Basic. These competing goals are reconciled by encapsulating the test in T-SQL code and then hosting the T-SQL in C# or Visual Basic .NET code. The unit-test designer generates both, enabling either (or both) to be modified to meet individual needs.

Database unit testing with Team System provides or supports the features you'd want if you were designing a database testing framework yourself. You need to be able to execute a test with the proper context. For example, you need to utilize the credentials your Web tier would use when connecting to the database or be able to use a different set of credentials to access or modify the underlying tables before and after a test has executed. It is also helpful to be able to run scripts that set up or reset database state both around every test in a test class and around an individual test.

Finally, you'd like to extend the set of test conditions available by adding conditions specific to your database, leverage transactions to control modifications to database state, and use data to drive multiple runs of your test with different input and output parameters. In some cases, to use these features you'll need additional code. Let's see what that entails.

Defining Connections

As I mentioned, it's important for a database test harness to allow tests to execute using a connection other than the one used to verify and set up the database. To support this goal, Microsoft created the concept of a connection context to contain all the information necessary to define the connection and how it should be used to execute the test. The base connection context contains an open connection, the provider factory (msdn2.microsoft.com/system.data.common.dbproviderfactory) to use when creating additional ADO.NET objects, as well as the timeout and DbTransaction to use when executing a test.

Using the context type, you define two context instances—execution context and privileged context—to support test setup, execution, and verification. As is clear from its name, execution context is used to execute the unit test. It should represent the configuration of the application that uses the database. Privileged context is used to verify unit tests as well as perform all other modification operations (including deployment and data generation). The definitions of these two contexts are stored in the accompanying sample test project's app.config file, in a special section defined for database unit testing.

Managing T-SQL Scripts

From a code perspective, database unit testing generates C# or Visual Basic .NET code that targets the unit-testing framework in Visual Studio. In the unit-testing framework, a class is used as a container to hold tests (each test is defined as a method). There are a number of opportunities to add scripts in the unit-testing designer to support the scripting features I've described. All scripts, other than the test script, are executed using the privileged context because they are meant to set up or verify the test environment. A script will be executed one or more times depending on where it is added, so let's look in more detail at the scripts and how they are executed (see Figure 1).

Figure 1 Test Scripts and Their Execution

Script | Execution
Test Initialization | This T-SQL script is used to set up the test environment prior to each test's execution. It is executed before every test in the class.
Pre-Test | The setup prior to a specific test being executed. The pre-test script is available so that you can perform a set-up action for the test or verify that the database is in the appropriate state for the test to take place.
Test | The test itself.
Post-Test | The cleanup after a specific test has executed. The post-test script is supplied so that you can restore or verify the state of the database.
Test Cleanup | This T-SQL script is used to restore the test environment after the test has completed. It is executed after every test in the class.
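As a concrete illustration, a test-initialization script might reset a table that every test in the class depends on. The table and values below are hypothetical, invented only to show the kind of work such a script does (it runs under the privileged context before each test):

```sql
-- Hypothetical test-initialization script: restore the Customers
-- table to a known state before every test in the class.
DELETE FROM dbo.Customers;

SET IDENTITY_INSERT dbo.Customers ON;
INSERT INTO dbo.Customers (CustomerID, Name)
VALUES (1, 'Test Customer');
SET IDENTITY_INSERT dbo.Customers OFF;
```

A pre-test script would do the same kind of work, but only for the single test it is attached to.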

Now let's see how to create, access, and modify scripts in the database unit-testing designer. First, create a database unit test by selecting New Test from the Test menu, and then select the Database Unit Test template. Once the wizard has completed, you will see a dialog prompting you to configure the project—cancel this for now (I'll come back to it later). In Solution Explorer a new file called DatabaseUnitTest1 will be created, and the associated designer will be opened so you can create the first unit test.

In Figure 2 there are three interesting areas. First, the top-left dropdown allows you to select between common scripts and tests defined in the class. When a new test class is created, a default test called DatabaseTest1 is added to the class; you can rename this default test using the Rename button and add additional tests using the button with the green plus symbol.

Figure 2 New Unit Test


Second, the dropdown to the left of that button allows you to switch between scripts. If Common scripts has been selected in the dropdown on the left, the test initialization/cleanup scripts that are run before and after every test are available to be edited. If a test is selected instead, the pre-test/test/post-test scripts are available to be edited.

Third, the Test Conditions panel provides you access to the test conditions that operate on the result of the T-SQL script execution to verify that the results are what you expected. The dropdown is enabled once you click on the hyperlink "Click here to create." This dropdown provides access to the test conditions registered on your machine.

As you can see from the UI, there is a consistent pattern for all scripts. Everywhere you can define a script, you can also define one or more test conditions that can be used to verify that the script executed as you expected. Verification of tests is crucial when writing unit-test code. It is not enough to just execute a piece of code; you also need to verify that the execution did what you expected.

At this point, you are probably wondering how the test knows which database to connect to and what the privileged context I mentioned is all about. Well, to answer these questions you need to go back to that dialog I had asked you to cancel earlier. To get to the dialog, select the Test Project and go to Test | Database Test Configuration (see Figure 3).

Figure 3 Configuring the Unit Test


In the first dropdown option, you select the connection string that will be used to execute your test. This connection string will probably use Windows® authentication by default, but you can specify SQL Server authentication. If you do, the password will be stripped out, encrypted, and stored in the registry (the password will be retrieved from the registry during test execution).

In the second dropdown, specify a different set of credentials for modifying and verifying database state. For instance, if you want to execute your tests using the credentials the application will use (for example, the Web tier communicating with the database), you may want to use a different set of credentials with higher privileges to verify the database state. All other scripts, including deployment and data generation, use this connection string. If you do not specify a connection string, the execution connection string is used.

In the third dropdown, you select a database to deploy prior to executing any tests. If you select a database project (and select the default configuration), a differencing build will be performed and the change deployed to the database specified in the execution connection string. If you have not made any changes, this is a quick operation (thanks to changes in DBPro SP1) because MSBuild and the deployment script detect that no changes have been made. After the tests have been executed, you can see the results of the database deployment by clicking on the hyperlink at the top of the Test Results tool window.

In the very last dropdown, you select a data-generation plan that should be deployed prior to test execution. This plan will run every time your tests run, so you will want to configure a small amount of data to reduce the time it takes for your tests to actually begin execution.

After you click OK on the dialog, the information is stored in the test project's app.config file, as shown in Figure 4. Up to this point, I've shown you the major features exposed by the UI—now let's drill into the actual code.

Figure 4 App.config

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="DatabaseUnitTesting"
             type="Microsoft.VisualStudio.TeamSystem.Data.UnitTesting.Configuration.DatabaseUnitTestingSection, Microsoft.VisualStudio.TeamSystem.Data.UnitTesting, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
  </configSections>
  <DatabaseUnitTesting>
    <DataGeneration ClearDatabase="true" />
    <ExecutionContext Provider="System.Data.SqlClient"
        ConnectionString="Data Source=(local)\sqlexpress;Initial Catalog=Database1Sandbox;Integrated Security=True;Pooling=False"
        CommandTimeout="60" />
    <PrivilegedContext Provider="System.Data.SqlClient"
        ConnectionString="Data Source=(local)\sqlexpress;Initial Catalog=Database1Sandbox;Integrated Security=True;Pooling=False" />
  </DatabaseUnitTesting>
</configuration>

Modifying Generated Test Code

The unit-testing API was designed to meet your need to change or augment the way the test harness works out of the box. Microsoft expected that users would want to modify the code that's generated, write their own test conditions, and replace the default test service with their own implementation. Figure 5 lists the classes that make up the database unit-testing API. (Also see msdn2.microsoft.com/microsoft.visualstudio.teamsystem.data.unittesting for more information.)

Figure 5 Classes in the Database Unit-Testing API

Class | Purpose
TestCondition | The base class for all test conditions used to verify test scripts. A subclass of Component, it supports both design-time and run-time execution. At design time, you configure the test condition with the values you expect during test execution; at run time, the test condition uses those design-time values to verify the actual test results.
DatabaseTestClass | The base class for all database unit tests in Visual Studio. The definition of this class integrates into the Visual Studio design environment and is responsible for serializing the code in InitializeComponent.
ConnectionContext | Encapsulates a connection to a database.
DatabaseTestActions | A component that serves as the design-time representation of a database unit test. Instances of this class host the DatabaseTestAction instances that comprise one test.
ExecutionResult | Represents the result of executing one batch during test execution.
DatabaseTestAction | A component whose instances contain a script and a collection of test conditions.
DatabaseTestService | The default implementation of the test harness. It implements database deployment, data generation, creation of execution and privileged connection contexts, and test execution.

The default instance of the test service is accessed through a static property on the DatabaseTestClass. If you would like to, you can replace the default implementation with your own and override any of the above implementations.

There are several reasons why you may want to modify the generated Visual Basic or C# code. The two most common scenarios are to execute your unit tests in a transaction or to take advantage of the unit-testing harness's data-driven testing feature.

Transactions

Unit tests that are executed in a shared environment must be careful not to dirty the execution environment; modifying the expected data or schema could cause subsequent tests to fail unexpectedly. Authors of database unit tests must be especially careful because it is very easy to make this kind of mistake.

That said, not surprising to any database developer, the way database unit-test authors avoid modifying the underlying database is through the use of transactions. Here I will show you three ways you can leverage transactions. Inside the executed T-SQL, use this transaction:

begin transaction
-- Execute test
-- We can clean up when the test passes successfully; otherwise, it
-- is up to the engine to clean up the transaction when the
-- connection is closed
rollback transaction

In an ADO.NET transaction, execute the code shown in Figure 6. To use System.Transactions, take a look at Figure 7.

Figure 7 Using System.Transactions

[TestMethod()]
public void SystemTransactionsExample()
{
    using (TransactionScope transactionScope =
        new System.Transactions.TransactionScope(
            TransactionScopeOption.Required))
    {
        base.ExecutionContext.Connection.EnlistTransaction(
            Transaction.Current);
        base.PrivilegedContext.Connection.EnlistTransaction(
            Transaction.Current);

        DatabaseTestActions testActions = this.SystemTransactionsExampleData;

        // Execute the pre-test script
        Trace.WriteLineIf((testActions.PretestAction != null),
            "Executing pre-test script...");
        ExecutionResult[] pretestResults = TestService.Execute(
            this.PrivilegedContext, this.PrivilegedContext,
            testActions.PretestAction);

        // Execute the test script
        Trace.WriteLineIf((testActions.TestAction != null),
            "Executing test script...");
        ExecutionResult[] testResults = TestService.Execute(
            this.ExecutionContext, this.PrivilegedContext,
            testActions.TestAction);

        // Execute the post-test script
        Trace.WriteLineIf((testActions.PosttestAction != null),
            "Executing post-test script...");
        ExecutionResult[] posttestResults = TestService.Execute(
            this.PrivilegedContext, this.PrivilegedContext,
            testActions.PosttestAction);
    }
}

Figure 6 ADO.NET Transaction

[TestMethod()]
public void ADONetTransactionSample()
{
    base.ExecutionContext.Transaction =
        base.ExecutionContext.Connection.BeginTransaction();
    try
    {
        DatabaseTestActions testActions = this.ADONetTransactionSampleData;

        // Execute the test script
        Trace.WriteLineIf((testActions.TestAction != null),
            "Executing test script...");
        ExecutionResult[] testResults = TestService.Execute(
            this.ExecutionContext, this.ExecutionContext,
            testActions.TestAction);
    }
    finally
    {
        base.ExecutionContext.Transaction.Rollback();
    }
}

Of these three options, T-SQL and ADO.NET transactions work only if you're not using the pre/post-test scripts to set up the database or verify the results of the test; remember, the pre/post scripts are executed using a different connection, so their modifications will not be rolled back. With System.Transactions, the privileged and execution connections have already been created, so they must be explicitly enlisted in the ambient transaction created when the TransactionScope is instantiated; changes made on either connection are then rolled back.

System.Transactions

Because using ambient transactions (as in the System.Transactions example shown previously) is the most comprehensive solution, let's drill into it further. (You may also visit msdn2.microsoft.com/ms172152 for more background.) The ambient transaction is used across two different connections that may have different lifetimes, so the Distributed Transaction Coordinator (DTC) service must be used to manage the transaction's lifetime. This means that in order to use System.Transactions, the DTC service must be running on the database server for the test to pass. If the service is not started, your test will fail with an exception like the following:

Test method TestSamples.DatabaseUnitTest1.SystemTransactionsExample threw exception: System.Data.SqlClient.SqlException: MSDTC on server '(LOCAL)\SQLEXPRESS' is unavailable.
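If you hit this error on a local development machine, the DTC service can typically be started from an elevated Windows command prompt (how the service is administered in your environment may differ):

```
net start msdtc
```

Alternatively, the service can be set to start automatically via the Services control panel so the test run does not depend on starting it by hand.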

Modifying each method to use an ambient transaction is tedious at best; most likely, you will want to group all the transactional tests together in one class and then modify the class so that a transaction is started and stopped for each test (see Figure 8).

Figure 8 Using Transactions for Each Test

TransactionScope _ambientTransaction;

[TestInitialize()]
public void TestInitialize()
{
    // Create the transaction prior to the connections being
    // created. When the connections are created they will
    // automatically enlist in the transaction
    _ambientTransaction = new TransactionScope(
        TransactionScopeOption.Required);
    base.InitializeTest();
}

[TestCleanup()]
public void TestCleanup()
{
    // Since Complete() was not called, disposing of the ambient
    // transaction will cause all changes to be rolled back
    _ambientTransaction.Dispose();
    base.CleanupTest();
}

Data-Driven Testing


The test harness that shipped with Visual Studio 2005 allows you to run a unit test multiple times, each time with a different set of input parameters. Documentation describing how to use this harness can be found in "Coding a Data-Driven Unit Test" at msdn2.microsoft.com/ms182527 and in "Walkthrough: Using a Configuration File to Define a Data Source" at msdn2.microsoft.com/ms243192.

The database unit-testing API supports data-driven testing by allowing you to supply zero or more DbParameter instances to the Execute method. To understand how the supplied DbParameter instances are used, think of the SQL test script as parameterized SQL that directly references the DbParameter instances as variables in the code. An example is shown in Figure 9.

Figure 9 Supplying DbParameter Instances

[DataSource("System.Data.SqlClient",
    "Data Source=(local)\\sqlexpress" +
    ";Initial Catalog=TestDB_DataDriven;Integrated Security=True",
    "DecisionAllocationTestData", DataAccessMethod.Sequential),
 TestMethod()]
public void DataDriveTestSample()
{
    // Set up and execute the test
    DatabaseTestAction test = new DatabaseTestAction();
    DbParameter customerID = CreateParameter("@customerID",
        DbType.Int32, ParameterDirection.Input, "CustomerID");
    DbParameter offerID = CreateParameter("@offerID",
        DbType.Int32, ParameterDirection.Input, "OfferID");
    DbParameter decision = CreateParameter("@decision",
        DbType.StringFixedLength, ParameterDirection.Input, "Decision");
    DbParameter actualVault = CreateParameter("@vault",
        DbType.Int32, ParameterDirection.Output);
    DbParameter actualBroker = CreateParameter("@broker",
        DbType.Int32, ParameterDirection.Output);
    DbParameter actualMarket = CreateParameter("@market",
        DbType.Int32, ParameterDirection.Output);

    test.SqlScript = @"
declare @rc int
execute @rc = AllocatePositionsByDecision @customerID, @offerID,
    @decision, @vault out, @broker out, @market out
";

    DatabaseTestClass.TestService.Execute(
        base.ExecutionContext, base.PrivilegedContext, test,
        customerID, offerID, decision,
        actualVault, actualBroker, actualMarket);

    // Verify the results
    int expectedVault = (int)base.TestContext.DataRow["ExpectedVault"];
    int expectedBroker = (int)base.TestContext.DataRow["ExpectedBroker"];
    int expectedMarket = (int)base.TestContext.DataRow["ExpectedMarket"];
    Assert.AreEqual(expectedVault, actualVault.Value, "Vault incorrect");
    Assert.AreEqual(expectedBroker, actualBroker.Value, "Broker incorrect");
    Assert.AreEqual(expectedMarket, actualMarket.Value, "Market incorrect");
}

private DbParameter CreateParameter(string paramName, DbType paramType,
    ParameterDirection paramDirection)
{
    return CreateParameter(paramName, paramType, paramDirection, null);
}

private DbParameter CreateParameter(string paramName, DbType paramType,
    ParameterDirection paramDirection, string dataColumnName)
{
    DbParameter newParameter =
        base.ExecutionContext.Provider.CreateParameter();
    newParameter.DbType = paramType;
    newParameter.Direction = paramDirection;
    newParameter.ParameterName = paramName;
    if (paramDirection == ParameterDirection.Input ||
        paramDirection == ParameterDirection.InputOutput)
    {
        newParameter.Value = base.TestContext.DataRow[dataColumnName];
    }
    return newParameter;
}

As you can see in Figure 9, the test method is decorated with an attribute that informs the test harness that it expects to be called once for each row of data retrieved from the DecisionAllocationTestData table in the TestDB_DataDriven database on my local SQL Server Express instance. The DatabaseTestAction class represents an action in a test composed of a test script to execute and test conditions to evaluate the results. You'll notice that I create five DbParameter instances representing the variables referenced in the test script; it is important to note that some of these parameters are output parameters. The CreateParameter method is just a small helper method that creates the instance and optionally populates it with data from the test context.

The test script you see next is defined to reference the DbParameter instances I'm providing as arguments to the Execute method. If you're familiar with ADO.NET code, you'll see this is simply parameterized SQL.

Next I call the Execute method on the default TestService instance, passing the DbParameter instances as arguments. After execution I verify the results. This is simply a matter of retrieving the expected values from the DataRow and comparing them with the output parameters.

The example in Figure 9 demonstrates that the database unit-testing API can be targeted directly rather than using the designer. There I am using data retrieved from a test table to drive execution of a stored procedure; the stored procedure being tested uses the first three input values to calculate the number of shares that will be retrieved from various locations to fulfill the customer's decision. This calculation is complex, depends only on the inputs, and has simple outputs. Therefore it makes sense to write this test as data-driven rather than duplicating the same test many times varying only the data.

At this point, I have covered several different ways you can modify or use the existing API to leverage transactions or create data-driven tests. The database unit-testing API can be extended or modified in several ways, though, by adding extra test conditions or by creating your own test service. So now let's look at ways you can extend the framework.

Custom Test Conditions

The database unit-testing feature comes with six test conditions that can be used to verify the execution of your unit test. Although these test conditions are sufficient for basic use, most users will quickly find that they need to verify a resultset rather than a single value, so here I'll build two additional test conditions. These two test conditions, plus the LoggingTestService, are also part of a new set of power tools for Visual Studio 2008 released by the Team System Data team. To download the power tools, see go.microsoft.com/fwlink/?Linkid=107300.

The first test condition is ChecksumCondition. At design time, it generates a hash of the expected results. At execution time, the actual results are hashed and the resulting value is compared with the expected result. The second, ExpectedSchemaCondition, collects and stores the expected schema at design time. At execution time, the actual schema is compared with the expected schema.
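To make the checksum idea concrete, here is a minimal sketch of hashing a result set. This is not the actual ChecksumCondition implementation; the serialization format (cell values joined with delimiters) and the choice of SHA-1 are my own assumptions, made only to illustrate the comparison technique:

```csharp
using System;
using System.Data;
using System.Security.Cryptography;
using System.Text;

static class ResultSetChecksum
{
    // Serialize every cell of every row into one delimited string
    // and hash it; two result sets with identical values in
    // identical order produce identical checksums, so the expected
    // hash captured at design time can be compared with the actual
    // hash at execution time.
    public static string Compute(DataTable results)
    {
        StringBuilder buffer = new StringBuilder();
        foreach (DataRow row in results.Rows)
        {
            foreach (object value in row.ItemArray)
            {
                buffer.Append(value).Append('|');
            }
            buffer.Append('\n');
        }
        using (SHA1 sha = SHA1.Create())
        {
            byte[] hash = sha.ComputeHash(
                Encoding.UTF8.GetBytes(buffer.ToString()));
            return Convert.ToBase64String(hash);
        }
    }
}
```

A hash comparison like this reports only pass or fail; when a mismatch needs diagnosis, the actual results still have to be inspected by hand, which is the trade-off for not storing the full expected resultset.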

When utilizing test conditions, a user creates a database unit-test class and adds the test conditions from a list of available conditions (a reference to the assembly containing the new conditions is added automatically when the new test class is added to the project).

The user can access the test condition's configuration by clicking the ellipsis button in the Configuration property. The resulting dialog, shown in Figure 10, allows the user to specify and execute the query that retrieves the expected data; in this figure, the data displayed is random (nonsensical) data generated by the DBPro Data Generator. When first displayed, the T-SQL from the test is provided in the textbox. The user can then modify the test code to reflect the query she actually wants and click the Retrieve button to execute the code against the server and display the results. Once the user is satisfied that the results are what was expected, she clicks OK, and the test condition configures itself accordingly.

Figure 10 Configuring the Test Condition


The user doesn't have to interact with the condition any further. Now that you have an idea of what the end result will look like, you can create these test conditions. The bulk of the interesting code is in the design-time behavior. I will walk through the high points in the following section; the complete code is available at msdn.microsoft.com/msdnmag/code08.aspx.

Defining the Test Condition

When defining a new test condition, I find that it is easiest to define the skeleton and registration of the condition first and verify that the skeleton works before developing the actual condition logic. Because the two sample test conditions are very similar, I defined a common base class, as you see in Figure 11. Once the project compiled, I then defined the registration XML file, called UnitTesting.Samples.extensions.xml, shown in Figure 12.

Figure 12 UnitTesting.Samples.extensions.xml

<?xml version="1.0" encoding="utf-8"?>
<extensions assembly="UnitTesting.Samples, Version=1.0.0.0, Culture=neutral, PublicKeyToken=382814fd17d42dea"
            version="1"
            xmlns="urn:Microsoft.VisualStudio.TeamSystem.Data.Extensions"
            xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="urn:Microsoft.VisualStudio.TeamSystem.Data.Extensions Microsoft.VisualStudio.TeamSystem.Data.Extensions.xsd">
  <extension type="UnitTesting.Samples.TestConditions.ExpectedSchemaCondition"
             enabled="true" />
  <extension type="UnitTesting.Samples.TestConditions.ChecksumCondition"
             enabled="true" />
</extensions>

Figure 11 Common Base Class Defined

namespace UnitTesting.Samples.TestConditions { public abstract class DataSetTestCondition : TestCondition { //... } } namespace UnitTesting.Samples.TestConditions { [DisplayName("Checksum")] public class ChecksumCondition : DataSetTestCondition { // ... } } namespace UnitTesting.Samples.TestConditions { [DisplayName("Expected schema" )] public class ExpectedSchemaCondition : DataSetTestCondition { // ... } }

I like to add the extensions file directly to the project and set it to automatically copy to the output directory so that it is easy to reference and use during development and deployment. The final step is setting up automatic deployment by copying the extensions file to the DBPro directory and installing the test conditions assembly into the Global Assembly Cache (GAC). This can be accomplished through a post-build step or by running a command prompt with elevated privileges. I used the following script in my project's post-build step:

set TargetNameVar=$(TargetName)
set DevEnvDirVar=$(DevEnvDir)
set WindowsSDKVar=$(ProgramFiles)\Microsoft SDKs\Windows\V6.0A
copy /Y %TargetNameVar%.extensions.xml "%DevEnvDirVar%..\..\DBPro\*"
copy /Y %TargetNameVar%.pdb "%DevEnvDirVar%PublicAssemblies\*"
"%WindowsSDKVar%\Bin\gacutil" /if %TargetNameVar%.dll

If you want to run the script as a separate .bat file, modify the first three lines to retrieve values from environment variables (instead of MSBuild variables).

After this initial setup has been completed, I can see the two new conditions in the test conditions dropdown. I added the DisplayNameAttribute (msdn2.microsoft.com/system.componentmodel.displaynameattribute) to the test conditions so they appear as Checksum and Expected schema. Now that I have finished setup, I need to develop the test conditions' design-time experience so that the user can click on the ellipses in the property browser and configure the condition instance.

If you have designed components for Windows Forms in Visual Studio, then you are probably already familiar with the UITypeEditor in the .NET Framework. The UITypeEditor (msdn2.microsoft.com/system.drawing.design.uitypeeditor) and the EditorAttribute (msdn2.microsoft.com/system.componentmodel.editorattribute) enable the component developer to design a custom editing experience and assign that experience (through the EditorAttribute) to a component property that is edited in the IDE. In this case, I want the experience to be a modal dialog, so I define the editor as shown in Figure 13. Since both test conditions are configured the same way, I define the property on the base class:

Figure 13 Defining Editor Style

namespace UnitTesting.Samples.TestConditions.Design
{
    class DataSetEditor : UITypeEditor
    {
        public override UITypeEditorEditStyle GetEditStyle(
            ITypeDescriptorContext context)
        {
            return UITypeEditorEditStyle.Modal;
        }

        public override object EditValue(
            ITypeDescriptorContext context,
            IServiceProvider provider, object value)
        {
            MessageBox.Show("DataSetEditor::EditValue");
            return value;
        }
    }
}

[DesignOnly(true)]
[Editor(
    typeof(UnitTesting.Samples.TestConditions.Design.DataSetEditor),
    typeof(System.Drawing.Design.UITypeEditor))]
public string Configuration
{
    get { return "Press to configure"; }
    set { }
}

When I compile and deploy the test conditions, I see the Configuration property, and when I click on the ellipses, the message box displays. At this point, all the code needed to integrate with Visual Studio has been completed and I can concentrate on the logic of the test conditions themselves.

First I create a form—this will be displayed by the DataSetEditor I just created—that allows the user to configure the DataSet expected by the test condition. This form (named DataSetSelectorDialog) is seeded with the T-SQL from the test and allows the user to execute that T-SQL against a target server in order to preview the data prior to configuring the test condition. Once the user is happy with the DataSet, he can click OK on the dialog, signaling that the test condition should use the DataSet, and control returns to the editor.

Next, I modify the editor to display and process the form's content after the user clicks OK. Normally, a UI editor passes back the value that should be stored in the property being edited, but in this case I want the property to continue displaying the "Press to configure" text and instead pass the DataSet back to the edited test condition. Since the two conditions process the DataSet differently, I define a virtual method on the base class and have the editor call this method to set the DataSet instance; the derived classes override the method to add their specific implementations. With these changes, the EditValue method now looks like Figure 14. There, I retrieve the IUIService to display a modal dialog; this service integrates the display of the modal dialog with the rest of the Visual Studio windowing behavior.

Figure 14 EditValue Method

public override object EditValue(
    ITypeDescriptorContext context,
    IServiceProvider provider, object value)
{
    IUIService uiService = (IUIService)provider
        .GetService(typeof(IUIService));
    DataSetTestCondition cond = context.Instance as DataSetTestCondition;
    if (uiService != null && cond != null)
    {
        DataSetSelectorDialog dsDialog = new DataSetSelectorDialog();
        if (cond.Site != null)
        {
            dsDialog.Text = string.Format(CultureInfo.CurrentCulture,
                "Configuration for {0}", cond.Site.Name);
        }
        dsDialog.Query = cond.GetTestScript();
        dsDialog.StartPosition = FormStartPosition.CenterParent;
        if (uiService.ShowDialog(dsDialog) == DialogResult.OK)
        {
            value = dsDialog.Query;
            cond.ConfigureExpectedDataSet(dsDialog.Result);
        }
    }
    return value;
}

Now that you've seen the design-time behavior of the test conditions, let's see what happens when the test script is executed against the target server and the results of that execution are collected inside a DataSet. After the test is executed, an instance of the ExecutionResult class is created and populated with information about the test's execution (including the DataSet). The test harness then loops through all test conditions that have been configured to verify the execution results and calls the Assert method on each test condition. Both sample conditions retrieve the DataSet from the first ExecutionResult in the results array. ChecksumCondition then generates a hash of the actual DataSet and compares it with the hash generated at design time, while ExpectedSchemaCondition deserializes the empty DataSet that represents the expected data tables (and their columns) and compares them with the DataSet generated by the test.
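To make the checksum comparison concrete, here is a minimal, self-contained sketch of how a DataSet hash might be computed. Note that DataSetChecksum and its Compute method are hypothetical names for illustration only, not part of the DBPro API, and the shipped ChecksumCondition may well hash differently.

```csharp
using System;
using System.Data;
using System.IO;
using System.Security.Cryptography;

public static class DataSetChecksum
{
    // Hypothetical helper: serialize the DataSet (schema plus data) to
    // XML and hash the resulting bytes. Two DataSets with identical
    // schema and rows produce identical XML and thus identical hashes.
    public static string Compute(DataSet ds)
    {
        using (MemoryStream stream = new MemoryStream())
        {
            ds.WriteXml(stream, XmlWriteMode.WriteSchema);
            using (SHA1 sha = SHA1.Create())
            {
                return Convert.ToBase64String(sha.ComputeHash(stream.ToArray()));
            }
        }
    }
}
```

A condition built on a helper like this would store Compute over the expected DataSet at design time and compare it with Compute over the actual DataSet inside Assert.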

Modifying Test Execution

The default behavior for connection strings, database deployment, and test execution is accessed through the instance of the DatabaseTestService class that is registered on the static TestService property on the DatabaseTestClass. The methods on the DatabaseTestService class are marked as virtual so that they can be overridden individually or all together to change how tests are executed. You might want to modify the default behavior if, for example, you're using a custom connection string encryption/decryption mechanism, you're modifying the way database deployment or data generation is performed, or you're automatically logging execution results.

Each of these modifications (and more) can be achieved by writing your own test service class and then registering that test service with the harness so that it can be used during test execution. In this sample, I'm going to write a test service that will log the results of a test execution to a file using DataSet serialization. The service can be configured to log no results, all results, or only failed results. This simple service makes it easy to see the result of your tests' execution within the harness.

First, I define the class; since I am simply modifying the way tests are executed, the class looks like the one in Figure 15. As you can see, LoggingTestService subclasses DatabaseTestService and overrides the Execute method. Next, I need to add results logging. Note that the default implementation of Execute runs the test script and then evaluates the test conditions; this implementation will not meet my needs, however, because I don't have access to the execution results in the event of verification failure. Therefore, to meet my goals, I'll instead evaluate the assigned test conditions myself after calling the base Execute method and then log the results accordingly. The complete Execute implementation appears in Figure 16.

Figure 15 LoggingTestService Class

namespace UnitTesting.Samples
{
    public class LoggingTestService : DatabaseTestService
    {
        public override ExecutionResult[] Execute(
            ConnectionContext scriptExecutionContext,
            ConnectionContext privilegedExecutionContext,
            DatabaseTestAction action,
            params System.Data.Common.DbParameter[] sqlParameters)
        {
            // Execute the test
            ExecutionResult[] results = base.Execute(
                scriptExecutionContext, privilegedExecutionContext,
                action, sqlParameters);
            return results;
        }
    }
}

Figure 16 Execute Implementation

public override ExecutionResult[] Execute(
    ConnectionContext scriptExecutionContext,
    ConnectionContext privilegedExecutionContext,
    DatabaseTestAction action,
    params System.Data.Common.DbParameter[] sqlParameters)
{
    // We want to evaluate the conditions ourselves so that we
    // can log the results from every test's execution
    List<TestCondition> conditions = new List<TestCondition>();
    if (action != null)
    {
        conditions.AddRange(action.Conditions);
        action.Conditions.Clear();
    }

    // Execute the test
    ExecutionResult[] results = base.Execute(
        scriptExecutionContext, privilegedExecutionContext,
        action, sqlParameters);

    // Verify and log the results
    bool verificationFailed = false;
    try
    {
        foreach (TestCondition condition in conditions)
        {
            if (condition.Enabled)
            {
                condition.Assert(privilegedExecutionContext.Connection, results);
            }
        }
    }
    catch
    {
        verificationFailed = true;
        throw;
    }
    finally
    {
        if (results != null &&
            (Verbosity == LogVerbosity.AllResults ||
             (verificationFailed && Verbosity == LogVerbosity.OnlyFailure)))
        {
            // Serialize the dataset to an .xml file to be visualized
            string resultPrefix = Guid.NewGuid().ToString("N");
            for (int resultIndex = 0; resultIndex < results.Length; resultIndex++)
            {
                string dsFilePath = Path.GetFullPath(string.Format(
                    CultureInfo.CurrentCulture, "{0}-{1}.xml",
                    resultPrefix, resultIndex));
                ExecutionResult result = results[resultIndex];
                result.DataSet.WriteXml(dsFilePath,
                    System.Data.XmlWriteMode.IgnoreSchema);
                Trace.TraceInformation("Log {0} {1}", resultIndex, dsFilePath);
            }
        }
    }
    return results;
}

This implementation still uses the base Execute implementation, but, afterward, it verifies the results using the action's test conditions and writes the results of that execution to a file. I'm using a GUID to generate a unique name because this method will be called multiple times during a single test's execution. It is also important to check for null results because it is common for Execute to be called with no action.

Once the service has been compiled into an assembly, I still need to register it so that it is used during test execution. In my case, I want the service to be registered for all tests in my assembly but to log only when a test fails, so I modified the IntializeAssembly method to look like the following:

[AssemblyInitialize()]
public static void IntializeAssembly(TestContext ctx)
{
    LoggingTestService testService = new LoggingTestService();
    testService.Verbosity = LoggingTestService.LogVerbosity.OnlyFailure;
    DatabaseTestClass.TestService = testService;

    // Setup the test database based on settings in the
    // configuration file
    DatabaseTestClass.TestService.DeployDatabaseProject();
    DatabaseTestClass.TestService.GenerateData();
}

Now, when I run a unit test that fails, the log file will be written to the test run directory, and the path to the file will appear in the Test Results window for that test.

To look at the results that caused the test failure, I created a very simple Windows Forms application (it is in the accompanying sample code download) that deserializes and displays the DataSet. Before I wrap up, I should mention that the Visual Studio test harness executes each test assembly in its own AppDomain. Because the service instance is registered via a static property, the service registration will only persist for the lifetime of the AppDomain. This does mean that you will need to reregister the test service during the initialization of each test assembly. Also, you should take a look at the sidebar, "Team System for Database Professionals: Resources", for more information on making database schema changes.
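As a sketch of what such a viewer might do (the LogViewerSketch class and its Load helper below are hypothetical names, not code taken from the sample download), the logged XML can be read back into a DataSet with ReadXml; because Execute wrote the file with XmlWriteMode.IgnoreSchema, ReadXml infers the schema from the data:

```csharp
using System.Data;

public static class LogViewerSketch
{
    // Hypothetical helper: deserialize a logged result file back into
    // a DataSet. The file was written with XmlWriteMode.IgnoreSchema,
    // so ReadXml infers tables and columns from the XML elements.
    public static DataSet Load(string path)
    {
        DataSet ds = new DataSet();
        ds.ReadXml(path);
        return ds;
    }
}
```

A viewer would then bind the resulting tables to a grid or similar control for display.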

Looking Forward

I've touched on many features of DBPro in this article, but the best way to get a feel for what you can do is to get your hands dirty. I suggest you experiment with the product, including working with the data generation API and the new Visual Studio 2008 power tools. If you encounter any problems or just have questions, the forum is always open at forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=725&SiteID=1. In addition, check out Sachin Rekhi's whitepaper titled "Database Unit Testing with Team Edition for Database Professionals" at msdn2.microsoft.com/bb381703.

Jamie Laflen is a Tech Lead on the DBPro team; he is responsible for the unit testing, data generation, and build/deploy features. Prior to joining the development team, Jamie was part of Microsoft Services and helped customers implement .NET applications inside Visual Studio.