Microsoft Strategy for Universal Data Access

 

By David Lazar

May 1998

Summary: Explains Microsoft's data access strategy. (20 printed pages) Also covers:

  • Design goals for MDAC
  • Microsoft's commitment to Universal Data Access
  • Universal Data Access cross-platform capabilities

Contents

Overview and Scope
Customer Requirements for Data Access Technologies
Definition of the Universal Data Access Strategy
Design Goals for Microsoft Data Access Components
Universal Data Access Is a High-Performance Architecture
Solutions Built with Universal Data Access Components Are Reliable
Microsoft Commitment to Universal Data Access
Broad Industry Support for Universal Data Access
Data Access SDK
How Universal Data Access Supports Data on Multiple Platforms
How Universal Data Access Differs from Other Strategies
Universal Data Access: A Road Map for the Future
Conclusion

Overview and Scope

This paper provides an in-depth look at Microsoft® Universal Data Access, Microsoft Corporation's broadly supported strategy for providing access to information across an organization, from the desktop to the enterprise.

The technologies discussed in this paper are multifaceted, spanning many areas of computing, including architecture, programming, networking, and platform integration. The issues are critically important because data and information are at the heart of almost any computer system, and the efficient and effective use of information is what provides business value and strategic advantage. Today, these issues are magnified as organizations begin to broadly implement applications that leverage the Internet and mobile computing.

Access to information is being required in new scenarios, and the complexity of that information continues to grow. Organizations previously had data on the mainframe and in various DBMSs. Now important information is also found in mail stores, file systems, Web-based text and graphical files, and more. Organizations that are able to leverage all of this information, that can expand rather than replace current UNIX and mainframe systems to embrace client/server systems and the Internet, will thrive.

This paper is intended to explain the Microsoft strategy for helping organizations achieve maximum business advantage by organizing and accessing their information efficiently.

Customer Requirements for Data Access Technologies

Microsoft has solicited extensive feedback from data access customers on the criteria they use in judging and selecting data access technologies and products. Our research has shown that there are four main criteria used in the decision-making process:

  • High-performance access to data. Simply stated, new data access methods need to provide the same level of performance that is possible through the data provider's proprietary API. This means that customers don't plan to compromise their number one criterion, performance, for any other benefit. Similarly, services that augment the native capabilities of data providers must be held to the same standards: They must not stand in the way of native performance. And, because applications and components must frequently scale to support hundreds or thousands of concurrently connected users, performance should be maintained as usage grows.
  • Reliability. Customers want their database solutions to perform reliably. They use terms such as "rock-solid" and "fail-safe" to describe their requirements in this area. Underlying these statements is the need to minimize maintenance and support costs, and to reduce the total cost of ownership.
  • Vendor commitment. Customers indicate that they are making strategic commitments to vendors of data access technologies, and are looking for reciprocation. They indicate that database decisions are long term, that they are purchasing not a single release but a string of database and related product releases. On the flip side, customers are very wary of becoming too dependent on a single vendor, a situation they term "vendor lock-in" or "vendor tie-in." Finally, new technologies should evolve gracefully from current ones, to avoid costly replacement of existing capabilities.
  • Broad industry support. This is defined as market share, as well as the support of vendors of related products and technologies. For customers, broad industry support is a more important gauge than the blessing of a standards body when choosing data access products. Broad industry support carries many benefits—safety in numbers, availability of skilled people to work with the products, and products that work together without expensive integration and customization.

This paper has been structured to explain how the Microsoft Universal Data Access strategy meets these criteria. After the terms and technologies are defined, separate sections address how Microsoft has met or intends to meet each of the criteria.

Definition of the Universal Data Access Strategy

Universal Data Access is a platform, application, and tools initiative that defines and delivers both standards and technologies. It is a key element of the Microsoft foundation for application development, the Microsoft Windows® Distributed interNet Applications (DNA) architecture.

Today, companies building client/server and Web-based database solutions seek maximum business advantage from the data and information distributed throughout their organizations. Universal Data Access provides high-performance access to a variety of data and information sources on multiple platforms and an easy-to-use programming interface that works with practically any tool or language, leveraging the technical skills developers already have. The technologies that support Universal Data Access enable organizations to create easy-to-maintain solutions and use their choice of best-of-breed tools, applications, and data sources on the client, middle tier, or server.

Figure 1.

Another benefit of Universal Data Access is that it does not require expensive and time-consuming movement of all corporate data into a single data store, nor does it require commitment to a single vendor's products. Universal Data Access is based on open industry specifications with broad industry support and works with all major established database products. Universal Data Access is an evolutionary step from standard interfaces such as Open Database Connectivity (ODBC), Remote Data Objects (RDO), and Data Access Objects (DAO), and it significantly extends the functionality of these well-known and well-tested technologies.

A Unified Data Access Model Based on COM

One strength of the Microsoft Universal Data Access strategy is that it is delivered through a common set of modern, object-oriented interfaces. These interfaces are based on the Microsoft Component Object Model (COM), the most widely implemented object technology in the world. COM has become the choice of developers worldwide because it provides the following:

  • The richest integrated services, including transactions, security, message queuing, and data access to support the broadest range of application scenarios.
  • The widest choice of tools from multiple vendors using multiple development languages.
  • The largest customer base for customizable applications and reusable components.
  • Proven interoperability with users' and developers' existing investments.

Because of the consistency and interoperability afforded through COM, the Microsoft Universal Data Access architecture is open and works with virtually any tool or programming language. It also enables Universal Data Access to provide a consistent data access model at all tiers of the modern application architecture.

The Microsoft Universal Data Access architecture exposes COM-based interfaces optimized for both low-level and high-level application development: OLE DB and ADO, respectively.

Definition of OLE DB

OLE DB is the Microsoft strategic system-level programming interface to data across the organization. OLE DB is an open specification designed to build on the success of ODBC by providing an open standard for accessing all kinds of data. ODBC was created to access relational databases. OLE DB is designed for relational and nonrelational information sources, including: mainframe ISAM/VSAM and hierarchical databases; e-mail and file system stores; text, graphical, and geographical data; custom business objects; and more.

Figure 2.

OLE DB defines a collection of COM interfaces that encapsulate various database management system services. These interfaces enable the creation of software components that implement such services. OLE DB components consist of data providers, which contain and expose data; data consumers, which use data; and service components, which process and transport data (such as query processors and cursor engines). OLE DB interfaces are designed to help components integrate smoothly so that OLE DB component vendors can bring high-quality OLE DB components to market quickly. In addition, OLE DB includes a bridge to ODBC to enable continued support for the broad range of ODBC relational database drivers available today.

Definition of ActiveX Data Objects

Microsoft ActiveX® Data Objects (ADO) is the Microsoft strategic application-level programming interface to data and information. ADO provides consistent, high-performance access to data and supports a variety of development needs, including the creation of front-end database clients and middle-tier business objects, using applications, tools, languages, or Internet browsers. ADO is designed to be the one data interface needed for one-to-multitier client/server and Web-based data-driven solution development.

ADO provides an easy-to-use application-level interface to OLE DB, which provides the underlying access to data. ADO is implemented with a small footprint, minimal network traffic in key scenarios, and optimized interaction between the front end and data source—all to provide a lightweight, high-performance interface. ADO is easy to use because it is called using a familiar metaphor—the COM automation interface, available from all leading RAD, database tools, and languages on the market today. And because ADO was designed to combine the best features of—and eventually replace—RDO and DAO, it uses similar conventions with simplified semantics to make it a natural next step for today's developers.
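
To illustrate the automation metaphor, the following is a minimal sketch of ADO called from Visual Basic; the data source name and query are hypothetical placeholders, not an excerpt from any shipping sample.

    ' Minimal sketch: ADO called through the COM automation interface from
    ' Visual Basic. The data source name and query are hypothetical.
    Dim cn, rs
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "DSN=Pubs"                          ' connect through an existing ODBC data source
    Set rs = cn.Execute("SELECT au_lname FROM authors")
    Do While Not rs.EOF
        Debug.Print rs("au_lname")              ' writes each last name to the Immediate window
        rs.MoveNext
    Loop
    rs.Close
    cn.Close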

Design Goals for Microsoft Data Access Components

Microsoft Data Access Components (MDAC) is a set of redistributable technologies that implement the Universal Data Access strategy. MDAC consists of the latest versions of ActiveX Data Objects (ADO), OLE DB components, and Open Database Connectivity (ODBC), which have now been released as an integrated set. Developers creating client/server and Web-based data-driven solutions select the components they need and call them from their choice of tools, applications, languages, or Web browser to create complete database solutions.

In designing MDAC, Microsoft set a number of guiding project goals, including the following:

  • Meet the key customer requirements outlined in the section "Customer Requirements for Data Access Technologies" in this paper. These requirements include performance, reliability, strongly committed vendors, and broad industry support. Discussions of how Universal Data Access and MDAC meet these requirements are provided in separate sections later in this document.
  • Enable access to the broadest possible range of data sources, focusing on those that are most important to customers now and in the future. To address this goal, we've taken a three-pronged approach:
    • Provide the best possible reuse of existing ODBC drivers, so organizations can continue to leverage their investments in these technologies. Continue to enhance strategic ODBC drivers from Microsoft as OLE DB components become broadly available.
    • Build OLE DB providers for the most important Microsoft and non-Microsoft data sources.
    • Create a comprehensive strategy for enabling the rapid development of high-quality, high-performance third-party OLE DB components to expose all mainstream data sources now and in the future.
  • Provide a migration path that will enable customers to leverage their investments in and experience with DAO, RDO, and ODBC.

These are the project goals guiding the development of the Microsoft Data Access Components to support and enhance the Universal Data Access strategy.

Next, we will take a closer look at each of the key customer requirements for data access technologies—performance, reliability, strongly committed vendors, and broad industry support—and detail specifically how Universal Data Access and MDAC meet these requirements.

Universal Data Access Is a High-Performance Architecture

We have seen that performance is of paramount concern to developers and users of data access technologies. Universal Data Access consequently has been designed with performance as its number one goal. This section will examine how the Universal Data Access technologies support this requirement and the positive implications for users.

Flexible, Component-Based Services Model

OLE DB achieves high performance by using a flexible, component-based services model. Rather than having a prescribed number of intermediary layers between the application and the data, OLE DB requires only as many components as are needed to accomplish a particular task. For example, suppose a user wants to run a query. Consider four scenarios:

  • The data resides in a relational database for which there currently exists an ODBC driver, but no native OLE DB provider. The application uses ADO to talk to the OLE DB Provider for ODBC, which then loads the appropriate ODBC driver. The driver passes the SQL statement to the DBMS, which retrieves the data.
  • The data resides in Microsoft SQL Server™ or another data source for which there is a native OLE DB provider. The application uses ADO to talk directly to the OLE DB provider for Microsoft SQL Server. No intermediaries are required.
  • The data resides in Microsoft Exchange Server, for which there is an OLE DB provider, but which does not expose an engine to process SQL queries. The application uses ADO to talk to the Microsoft Exchange data provider and calls upon an OLE DB query processor component to handle the querying.
  • The data resides in the Microsoft Windows NT® file system in the form of documents. Data is accessed by using a native OLE DB provider over Microsoft Index Server, which indexes the content and properties of documents in the file system to enable efficient content searches.

In all four cases, the application can query the data. The user's needs are met with a minimum number of components. In each case, additional components are used only if needed, and only the required components are invoked. This demand-loading of reusable and shareable components greatly contributes to high performance when OLE DB is used.
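
The consumer code also stays the same across these scenarios. As a rough sketch (the server, database, and credentials are hypothetical; MSDASQL and SQLOLEDB name the OLE DB Provider for ODBC and the native SQL Server provider), only the connection string changes between the first two scenarios:

    ' The same ADO consumer code runs against either provider; only the
    ' connection string differs. Server and database names are hypothetical.
    Dim cn, rs
    Set cn = CreateObject("ADODB.Connection")

    ' Scenario 1: relational data behind an existing ODBC driver.
    ' ADO talks to the OLE DB Provider for ODBC, which loads the driver.
    cn.Open "Provider=MSDASQL;DSN=Pubs"

    ' Scenario 2: data in Microsoft SQL Server through its native OLE DB provider.
    ' (Shown as a comment; a connection can be open against only one source at a time.)
    ' cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=pubs"

    Set rs = cn.Execute("SELECT title FROM titles")   ' identical consumer code in both cases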

Increasing Development Efficiency with OLE DB

ODBC has been a very important and successful data access standard. OLE DB's improved architecture provides a significant advantage over ODBC: providers no longer have to implement an SQL relational engine to expose data. With ODBC, services such as cursoring and query processing must be implemented by every driver writer, which represents overhead both for the driver author and for the end user. (How many cursor engines and query processors do you need on one machine?) With OLE DB, reusable service components handle these processing chores for a variety of data providers. This simplifies the process of writing data providers, which means they should come online faster and be of higher quality, and it reduces the number of components installed on data consumer machines.

ADO Performance Advantages

As with OLE DB, ADO is designed for high performance. To achieve this, it reduces the amount of solution code developers must write by "flattening" the coding model. DAO and RDO, the object models that preceded ADO, are highly hierarchical. To return results from a data source, the programmer has to start at the top of the object model and traverse down to the layer that contains the recordset. The ADO object model is not hierarchical: the programmer can create a recordset in code and be ready to retrieve results by setting two properties, then execute a single method to run the query and populate the recordset with results. This approach dramatically decreases the amount and complexity of code the programmer needs to write, and less code running on the client or in a middle-tier business object translates to higher performance. ADO also retrieves data from recordsets more efficiently and improves scalability by minimizing overhead in simple scenarios.
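
A minimal sketch of the flattened pattern follows; the connection string and query are hypothetical placeholders.

    ' The flattened ADO model: create a Recordset directly, set two
    ' properties, and call one method. There is no object hierarchy to traverse.
    Dim rs
    Set rs = CreateObject("ADODB.Recordset")
    rs.ActiveConnection = "DSN=Pubs"                 ' where the data lives (hypothetical DSN)
    rs.Source = "SELECT au_lname FROM authors"       ' what to retrieve
    rs.Open                                          ' run the query and populate the recordset

    ' For comparison, DAO requires walking the hierarchy:
    ' DBEngine -> Workspace -> Database -> Recordset.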

Minimal Network Traffic in Key Internet Scenarios

Microsoft has designed OLE DB and ADO for the Internet, implementing a "stateless" model in which client and server can be disconnected between data access operations. MDAC contains a Remote Data Service component that provides efficient marshaling of data between the middle tier or server and the client, including support for batch updates, as well as an efficient client-side cursor engine that can process data locally without constant server requests. Thus MDAC provides greater local functionality and higher performance for Internet applications than previous versions and other approaches.
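
The sketch below shows the disconnected pattern using ADO's client-side cursor engine; the data source and column names are hypothetical, and the ADO constants are spelled out because VBScript does not reference the ADO type library.

    ' Stateless, disconnected data access with the client-side cursor engine.
    Const adUseClient = 3
    Const adOpenStatic = 3
    Const adLockBatchOptimistic = 4

    Dim cn, rs
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "DSN=Pubs"                                ' hypothetical data source

    Set rs = CreateObject("ADODB.Recordset")
    rs.CursorLocation = adUseClient                   ' cache rows in the client cursor engine
    rs.Open "SELECT * FROM authors", cn, adOpenStatic, adLockBatchOptimistic

    Set rs.ActiveConnection = Nothing                 ' disconnect; no server state is held
    cn.Close

    rs.Fields("phone").Value = "408 555-0100"         ' edit rows locally, with no server round trips

    cn.Open "DSN=Pubs"                                ' reconnect later
    Set rs.ActiveConnection = cn
    rs.UpdateBatch                                    ' marshal all pending changes in one batch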

Ultimately, the performance of Microsoft Data Access Components will be judged in comparison with that of native access methods. The goal for OLE DB is to establish it as the native interface to major Microsoft and non-Microsoft data stores. To accomplish this, performance tuning for all key scenarios (transactions, decision support, etc.) and with major data sources will be of paramount concern. The number one design goal for ADO and OLE DB is performance, and the architectural foundations to achieve this are in place.

Solutions Built with Universal Data Access Components Are Reliable

As we have seen, reliability is one of the primary requirements for organizations managing and supporting data access applications. Universal Data Access aims to address this need in three areas: increasing the manageability of client-side components, enabling strong run-time coordination and control capabilities by the server, and delivering well-tested components.

Increasing the Manageability of Client-Side Components

One of the most important tools that organizations can use to increase reliability and decrease support costs is the reduction of the number of components to support on client PCs. The Universal Data Access strategy and the Microsoft Data Access Components support this approach in several key ways:

  • Universal Data Access supports new multitier and Web deployment models, in which data access logic and business logic are centralized on middle-tier servers. Front ends provide presentation services by using browser-based interfaces, or by using custom and packaged applications. In this model, application functionality is mainly centralized, not distributed to end-user PCs, thus reducing the number of components to manage on those machines.

  • Microsoft Data Access Components, version 1.5, which includes OLE DB and ADO, ships as part of these Microsoft system products:

    • Client components, packaged for easy deployment, ship with Microsoft Internet Explorer 4.0 and Windows® 98
    • Full server components ship with Microsoft Internet Information Server 4.0 (part of the Windows NT Option Pack for Windows NT 4.0)

    Because OLE DB and ADO ship as system components, organizations can frequently rely on these components being available in the run-time environment and do not have to manage their distribution and maintenance.

  • ADO is tool- and language-independent and in many cases may be able to replace multiple data access libraries on client PCs. Organizations that may have previously supported DAO, RDO, and ODBC on each PC as part of a two-tier system will now be able to get the same functionality by deploying Internet Explorer 4.0 to their clients and using a three-tier architecture. Data access functions previously handled by ODBC clients will move into their middle-tier objects or their servers. Internet Explorer with ADO can thus replace DAO, RDO, and ODBC on client PCs.

Enabling Strong Server-Side Coordination and Control

Universal Data Access enables transactional control of diverse data sources and components by using Microsoft Transaction Server. To achieve this, OLE DB data sources must implement the functionality of a resource manager, which handles transactions local to the data source and enables each data source to participate in distributed transactions. Microsoft Transaction Server provides a distributed transaction coordinator that guarantees atomic operations spanning multiple data sources using a reliable, two-phase commit protocol, and enables applications to scale as user load grows with minimal additional effort.
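
As a simplified illustration (not an excerpt from any Microsoft sample), here is a Visual Basic class-module method running under Microsoft Transaction Server; the method name, connection string, and table names are hypothetical.

    ' Sketch of a middle-tier method in a Visual Basic class module, compiled
    ' into an ActiveX DLL and registered with Microsoft Transaction Server.
    ' Requires a reference to the Microsoft Transaction Server Type Library.
    ' Connection string, tables, and amounts are hypothetical.
    Public Sub TransferFunds(ByVal Amount As Currency)
        On Error GoTo Failed
        Dim cn As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "DSN=Accounts"

        ' Both updates enlist in the transaction that MTS coordinates.
        cn.Execute "UPDATE Savings SET Balance = Balance - " & Amount
        cn.Execute "UPDATE Checking SET Balance = Balance + " & Amount

        GetObjectContext.SetComplete    ' vote to commit; two-phase commit spans all enlisted sources
        Exit Sub
    Failed:
        GetObjectContext.SetAbort       ' vote to abort; MTS rolls back all enlisted work
    End Sub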

Delivering Well-Tested Components

MDAC components are rigorously tested. Because ADO and OLE DB have been shipping in high volume since the release of Internet Information Server 3.0, they have enjoyed significant field usage. With the advent of MDAC, and the commitment to ship ADO and OLE DB in a synchronized fashion, these components are now developed and tested side by side. Finally, after testing the components for interoperability, Microsoft stress-tests MDAC components with the products with which they ship (such as Internet Information Server and Internet Explorer) to guarantee reliable behavior in multithreaded, continuous operation in highly concurrent environments. This testing is designed to help ensure high-performance, highly reliable components that work well in a variety of real-world scenarios.

The net result of the Universal Data Access architecture's three-pronged approach should be significant reductions in configuration and support expenses and a reduced total cost of ownership—in short, a reliable data access architecture.

Microsoft Commitment to Universal Data Access

The choice of data access technologies is extremely strategic for organizations. Typically, other factors, such as the choice of DBMS, largely drive this decision. However, customers have told us that data access decisions are made for the long haul, and consequently need to be considered carefully.

Here are some of the questions that customers have told us they need answered when evaluating a data access vendor:

  • Is the vendor here to support me, not just for the current release, but through a string of releases?
  • Does the vendor have a strong and compelling strategy for data access, and does that strategy mesh with the goals of my organization?
  • Has the vendor withstood the test of time by consistently delivering strong products in these markets?
  • Likewise, has the vendor been able to succeed in these markets, thus proving its ability to deliver technological solutions and build market momentum, resulting in broad support for the technologies?

Based on market share, developer acceptance, and broad industry support for many technologies, we believe that Microsoft has consistently met this set of criteria and has proved to be a market leader in data access technology. We'll now examine Microsoft's commitment to Universal Data Access.

A Short History of Microsoft as a Data Access Vendor

A look at how the Universal Data Access strategy has evolved at Microsoft will help illuminate the long-term commitment the company is making in this area.

Microsoft began investing in data access shortly after the initial release of Microsoft SQL Server 1.0 in 1989. Initial interest in Microsoft SQL Server was high, but the tools available to program it were limited. The SQL standard was in its infancy, but was clearly bound to the coming client/server revolution. Microsoft knew that acceptance of client/server architecture would be highly beneficial and could see that the biggest problem the industry faced was the proliferation of data access interfaces, and the complexity of creating, maintaining, and programming against them.

Open Database Connectivity was the result of these factors. ODBC combined important features to make it extremely attractive:

  • It was SQL-based, making it familiar to most database developers of the day.
  • Its call-level interface supported a broad range of tools.
  • Its plug-in model for database drivers simplified implementation.

As ODBC gained broad support as a standard for data access, it became clear that a standards body should be defining its future. Microsoft turned over the specification for ODBC to the SQL Access Group, made up of a broad range of DBMS, middleware, and tools vendors.

Despite its features, ODBC had a number of shortcomings, which by 1993 were being addressed in the next phase of data access market development. ODBC was programmed as a Windows API, which made it difficult for the majority of customers to use. A number of Microsoft tools and applications could use ODBC through the Microsoft Jet database engine, but ODBC functionality remained accessible only to a small number of API-level programmers. Thus high-level programming models were created, first Data Access Objects and then Remote Data Objects, which simplified the ODBC programming model and made it accessible to a wider range of programmers.

DAO provided Microsoft Access and Microsoft Office programmers with an interface to the Jet database engine. RDO provided programmers using the Visual Basic® programming system with higher-level interfaces to ODBC. These interfaces seemed like natural extensions to the Visual Basic language used in each of these products, and have gained broad usage among database programmers.

By 1995, two major new trends began to shape the next phase of development. These two trends, which are still evolving, are the rise of the Internet as a database applications platform and the rise in importance of nonrelational data, which does not directly fit the database model encapsulated by ODBC.

One might wonder at this stage why the Internet requires new data access technologies. After all, the fundamental goal of connecting people with data remains the same. The Internet, however, presents new data access challenges on many levels:

  • Scale. The Internet is synonymous with more people accessing more data. This amplifies the priorities for effective data management: Performance, reliability, and security take on new importance.
  • Higher volume of clients. With so many clients connecting, the Internet needs a better paradigm for sharing server-side resources.
  • Distributed medium. The Internet is by definition a distributed medium, with many more potential points of failure and network performance issues that go beyond the boundaries typically relevant to information technologists. On the flip side, the medium presents many opportunities to distribute processing tasks for improved service.
  • Casually connected clients. Client and server, previously bound tightly in time and space, are now virtually independent. Client computers need to be able to accomplish more with less support from the server, and at times, without a server connection.
  • New business opportunities. The Internet opens up the possibility for employees, customers, suppliers, and business partners to connect on many levels.
  • New types of data. The Internet, while using existing data and databases in new ways, is also composed of massive amounts of text, images, and other media that will grow in importance. Successful organizations will manage new data types effectively and synthesize meaning from them.

As the Internet catalyzes a major paradigm shift in database management and data access, a related shift is occurring: the emergence of nonrelational data sources. While the Internet highlights the need for management of textual and graphical data, organizations today face a proliferation of data in a variety of DBMS and non-DBMS stores, including desktop applications, mail systems, workgroup and workflow systems, and others. Most established organizations face an even larger challenge: leveraging the data in mainframe and minicomputer flat files as they extend access to this information to intranet-based and Internet-based customers.

Data access today encompasses all of the issues traditionally addressed by DBMS systems, plus a range of new data types, new clients, and new access methods. The Universal Data Access strategy was created to meet this new generation of challenges by leveraging the successful strategies of the past and embracing the architectures of the future.

Universal Data Access Builds on the ODBC Foundation

Universal Data Access is a strategy that includes and builds on the successful foundation of ODBC. ODBC successes include the following:

  • Establishing a market standard for database access. There are more than 170 ODBC drivers available today, providing access to a broad range of data.
  • Achieving portability, so applications can scale to new database platforms as an organization's requirements change.
  • Responding to customers' needs by steadily adding new features and performance improvements to enable better database applications.

The most frequent customer issues surrounding ODBC are related to performance and configuration management, defined as matching database drivers on multiple machines with multiple back-end data sources. Microsoft is aware of these issues and will continue to address them through subsequent ODBC releases, including a new and significantly improved ODBC driver for Oracle.

Moving forward, ODBC is a supported technology under the Universal Data Access umbrella. ODBC in the short and medium term is the best way to access a broad range of relational DBMS-based data due to the high number of drivers available. During this period, with ODBC remaining as a mature technology and OLE DB components becoming available, Microsoft does not want to force customers to choose between the two architectures and make the ensuing trade-offs. Our goal is to enable customers to take advantage of existing ODBC technologies, while adopting the Universal Data Access architecture for new applications.

Microsoft therefore designed an evolutionary strategy for migrating from ODBC to OLE DB. The very first OLE DB provider released by Microsoft was the OLE DB Provider for ODBC. Applications are written to the ADO or OLE DB interface, and the OLE DB Provider for ODBC connects to the ODBC data source. If an organization later decides to change data sources, add data sources, or change from the ODBC driver to a pure OLE DB provider for the existing data source, the application can be adapted with minimal changes.

This evolutionary strategy for migrating from ODBC to OLE DB carries some additional important benefits. Because OLE DB is a component-based architecture with service components providing processing capabilities on an as-needed basis, and because ODBC data sources can expose their data through OLE DB, OLE DB service component features may be invoked against ODBC data. For example, the client cursor service provides Find, Sort, and Filter operations within a result set, so the result set can be reused and further refined without an additional round trip to the server. This capability is not available to a plain ODBC client. As a result, new and existing applications can gain additional data access features by using OLE DB to call broadly supported ODBC drivers.
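
A rough sketch of this local refinement, assuming the ADO client cursor and hypothetical data source and column names:

    ' Refining a result set locally even though the rows came through an ODBC
    ' driver. Data source and column names are hypothetical.
    Const adUseClient = 3

    Dim cn, rs
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=MSDASQL;DSN=Pubs"              ' ODBC data exposed through OLE DB

    Set rs = CreateObject("ADODB.Recordset")
    rs.CursorLocation = adUseClient                  ' invoke the client cursor service component
    rs.Open "SELECT au_lname, state FROM authors", cn

    rs.Sort = "au_lname ASC"                         ' re-sort locally
    rs.Filter = "state = 'CA'"                       ' filter locally
    rs.Find "au_lname = 'Ringer'"                    ' find locally; no additional server round trip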

This is the evolutionary path from ODBC to OLE DB that Microsoft is providing based on consistent customer feedback. Organizations should continue to plan on broad availability and support for ODBC drivers. As they build new applications, they should look to the Universal Data Access architecture, using ADO and OLE DB interfaces. Nonrelational data will be exposed by OLE DB providers. For relational data, organizations may choose between ODBC drivers, and, as they become available, OLE DB providers and components. In the long run, Microsoft believes customer demand will drive the market for OLE DB components and they, too, will become broadly available. Able to freely choose among and mix ODBC and OLE DB components, organizations will benefit from the highest possible application performance and reliability, while gaining new capabilities at a pace that suits their unique requirements.

Universal Data Access Is Strategic for Microsoft

The Universal Data Access strategy is intertwined with most of the major lines of business where Microsoft is serving organization customers, including operating systems, tools, applications, and Internet products. Universal Data Access is designed to work consistently across each of these major product lines, to enable organizations to leverage their data access expertise across teams and projects to build high-performance database solutions accessible to employees, customers, and business partners.

One of the strongest examples of this can be seen in ADO, which provides a single interface to data whether it is called from a developer tool, a Web page, Office, or a custom business object. Applications using a variety of front ends may now use the same high-performance interface to data, featuring a small memory footprint, demand-loaded reusable OLE DB components, and familiar semantics derived from the most widely used Microsoft interfaces. No matter where in the multitier architecture one writes data access code, the interface can be the same—ADO.

Making integrated access to all forms of relational and nonrelational data ubiquitous is strategic for Microsoft products because it enables those products to add value through tools that use the Universal Data Access architecture. Customers are the ultimate beneficiaries as their tools and applications become more highly adept at processing the information they work with every day.

Relationship of Universal Data Access and Windows DNA

The Windows Distributed interNet Applications architecture is the Microsoft architectural framework for building modern, scalable, multitier distributed computing solutions that can be delivered over any network. Windows DNA provides a unified architecture that integrates the worlds of client/server and Web-based application development. Microsoft Universal Data Access, a central part of the Microsoft Windows DNA strategy, provides data access services for Windows DNA applications.

Figure 3. Windows Distributed interNet Applications architecture

Windows DNA addresses requirements at all tiers of modern distributed applications: user interface and navigation, business process, and storage. The core elements of the Windows DNA architecture are these:

  • Pluggable software components
  • Extensible Web browser and application server
  • Richly integrated platform services
  • Scalable distributed operating environment and servers
  • Open protocols and published interfaces
  • Choice of programming languages, tools, and hardware platforms

Because Microsoft Universal Data Access is based on COM, it provides a unified, consistent, and common data access model for all applications built to the Windows DNA model.

Broad Industry Support for Universal Data Access

Organizations using data access components have indicated that in order to invest in the Universal Data Access architecture, they need to see the support of vendors of related products and technologies. For customers, broad industry support carries many benefits—safety in numbers, availability of skilled people to work with the products, and products that work together without expensive integration and customization. This section details the activities in which Microsoft is engaged to solidify and publicize the broad range of companies supporting Universal Data Access.

Supporters of Universal Data Access

The industry reception for Universal Data Access has been very positive. Companies building components in each architectural segment recognize key benefits for their customers—improved performance and functionality, flexibility, and reduced cost. This section will detail the industries and key vendors supporting Universal Data Access, discuss the Microsoft strategy for continued growth in industry support, and explain how additional vendors can participate in the Universal Data Access strategy.

The key industry segments supporting Universal Data Access are as follows:

  • DBMS vendors. These vendors benefit from Universal Data Access by gaining additional high-performance clients for their engines. Their customers benefit by gaining a broader choice of development tools and other supporting technologies and by connecting to and integrating with data from more sources.
  • Development tools vendors. Tools vendors benefit from Universal Data Access by being able to provide access to a broader range of data sources, thus enabling their customers to build richer, more functional applications. Leading tools vendors are participating by licensing MDAC, which includes ADO, OLE DB components and providers, and ODBC components and drivers. Tools vendors have thus gained the ability to enable their customers to create applications that access the vast majority of data sources available today.
  • Data access component builders. By building OLE DB components, data access component builders gain the benefits of a broadly accepted standard environment in which to deploy their products. Their customers can access more data, find more readily available support, and interoperate with a broad array of products across the platform.

Because the list of vendors in each of the above categories is growing rapidly, the reader is asked to visit https://www.microsoft.com/data/ for a complete, updated list. Leading vendors in each industry segment are represented among the list of Universal Data Access supporters.

OLE DB Provider Strategy

To be successful, OLE DB must gain a broad array of native providers and components so that users can connect to virtually any data source, reuse OLE DB service components, and realize performance and reliability benefits.

The tools that OLE DB provider and component vendors use to simplify their work are found in the Data Access SDK. In addition to the data access consumer components discussed in this paper (ADO, OLE DB, and ODBC), users of the SDK receive additional tools, documentation and specifications to help them create high-performance OLE DB components. Provider writers will find the following:

  • Leveling specification. Defines the minimum level of functionality that should be implemented by every provider. In addition, it defines the level of functionality that most consumers will expect and that should be implemented by a provider when supported natively by the data source. To tie these together, it also defines the services that are likely to be supported by various classes of service components.
  • Conformance tests. Includes interface tests to show that a provider correctly implements the interfaces that it supports. Also includes a minimal set of ADO tests that show that a provider works well when in an ADO application. The initial focus of these tests is on the minimum provider interfaces as defined in the leveling specification.
  • OLE DB Simple Provider (OSP) Toolkit. Helps you get started quickly building providers for tabular and other simple data.
  • Microsoft Visual C++® development system template classes for OLE DB providers. These are a part of Visual C++ and are not actually included in the Data Access SDK.
  • OLE DB providers for Microsoft Jet, SQL Server, and Oracle data sources.

These tools simplify the process of writing OLE DB components, provide a framework for creating components that interoperate in well-defined ways, and provide criteria by which OLE DB consumers can easily compare component features. Anyone interested in creating OLE DB components should obtain the Data Access SDK.

Data Access SDK

The Microsoft Data Access SDK is a set of tools and samples designed to help developers create solutions using MDAC. The SDK provides a convenient single source for everything needed to learn about and create data access solutions.

The SDK contains MDAC version 2.0, tools for getting started with data access component development, tools for testing and distributing components, and documentation. The SDK is activity-based with content designed specifically for both consumer and provider writers, and for developers working with various languages and deployment environments.

How Vendors Participate in Universal Data Access

Microsoft is interested in working with vendors of products that support Universal Data Access to help ensure that components address the performance and quality demands of our joint customers. Vendors of DBMS products, development tools, and OLE DB service components should visit https://www.microsoft.com/data/ for updated information on programs, products and services.

How Universal Data Access Supports Data on Multiple Platforms

While the Windows NT operating system is emerging as an important platform for database management, many organizations rely on a mixture of operating systems and database platforms. To be successful, any strategy for providing data access must be equally efficient at accessing data on all major platforms. Universal Data Access provides the foundation for supporting efficient and reliable access to data on today's major computing platforms. Microsoft is actively engaged in supporting third-party development projects involving OLE DB providers for non-Windows-based data. In fact, products using the Universal Data Access architecture to access leading DBMSs on non-Windows platforms are currently available.

Figure 4. Universal data access supports data across the enterprise

Because the OLE DB specification defines interfaces that components support, rather than providing a set of DLLs or actual system components, it is highly portable to other operating environments. OLE DB is based on the COM architecture, which is the Windows object model. This would seem to imply that OLE DB components must run on a Windows-based or Windows NT-based PC; however, this is not the case. OLE DB in fact has two separate approaches that provide portability to non-Windows-based DBMS platforms: a full port of COM, available today from several vendors, and implementations of COM interfaces on non-Windows platforms.

ISG International Software Group Ltd., with its ISG Navigator product, provides an example, using the second approach described above, of MDAC components that integrate data from Windows NT and several non-Windows NT platforms, including the following:

Operating environments:

  • Windows 95
  • Windows NT (Intel and Alpha)
  • HP-UX
  • IBM RS/6000 AIX
  • DEC UNIX
  • Sun Solaris
  • DEC OpenVMS (Alpha and VAX)
  • IBM MVS (planned)

Database platforms:

  • Microsoft SQL Server
  • Oracle
  • Sybase
  • Informix
  • RMS
  • C-ISAM
  • CA-Ingres
  • DB2
  • Adabas (planned)
  • IMS/DB (planned)
  • Rdb/VSAM (planned)
  • MUMPS (planned)

The important thing to recognize about the ISG Navigator product is that its availability and performance prove the ability of the Universal Data Access architecture to integrate data between Windows and non-Windows platforms. It is not the only approach to satisfying the need for multiplatform data access, but it is a solid implementation available and demonstrable today. The Navigator demonstration shows several important features:

  • An Internet-based front end using ADO code called from Visual Basic Scripting Edition in an Active Server Page (a sketch of this pattern follows the list)
  • ADO taking advantage of OLE DB components on both Windows and non-Windows platforms
  • Navigator, in a single SQL query, joining data from multiple data sources, potentially running on several different platforms
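
The following is a minimal Active Server Page sketch of the pattern described in the first bullet; the data source name, table, and column are hypothetical placeholders and do not reproduce the actual Navigator demonstration.

    <%@ LANGUAGE="VBSCRIPT" %>
    <HTML>
    <BODY>
    <%
      ' ADO called from Visual Basic Scripting Edition inside an Active Server
      ' Page. The DSN, table, and column names are hypothetical.
      Dim cn, rs
      Set cn = Server.CreateObject("ADODB.Connection")
      cn.Open "DSN=Navigator"
      Set rs = cn.Execute("SELECT CustomerName FROM Customers")
      Do While Not rs.EOF
          Response.Write rs("CustomerName") & "<BR>"
          rs.MoveNext
      Loop
      rs.Close
      cn.Close
    %>
    </BODY>
    </HTML>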

Broad availability of MDAC components that integrate data on multiple platforms will benefit organizations that support multiple DBMS platforms. An additional benefit is that users can take advantage of new OLE DB capabilities when accessing non-Windows-based data. Powerful new service components, running on front ends or middle-tier servers, can be integrated with any OLE DB provider, including those running on non-Windows platforms. For example, general-purpose query processors, cursor engines, or custom business objects can all add value to non-Windows-based data exposed by OLE DB. Mainframe and UNIX-based databases that previously did not support remoting of data—an essential feature for the Internet and loosely connected scenarios—may now implement it, thus gaining greater use from existing systems and applications. This powerful extensibility and reusability model is a benefit of component-based software written to a broadly supported specification such as OLE DB.

How Universal Data Access Differs from Other Strategies

A number of leading DBMS vendors have begun shipping new databases and updated versions that follow "universal database" strategies. Customers may be curious about how those strategies differ from Universal Data Access.

In the other approaches, data from across the organization is consolidated in the DBMS, and the DBMS is extended with additional processing capabilities to handle new data types. This strategy can be attractive for several reasons:

  • It centralizes data for more efficient management.
  • It places a DBMS "wrapper" around new data types, which can ensure the protection of DBMS security and transaction services, while offering other potential benefits such as content indexing.
  • Many DBMSs are very efficient at serving up data, so organizations can generally expect good performance for applications based on traditional and nontraditional DBMS data.

Microsoft, while recognizing these benefits, believes they may be difficult for some organizations to attain. A universal database approach may require expensive and time-consuming movement to and maintenance of corporate data in the DBMS. It may require tools and applications to support it. And it may require compromises in the selection of supporting products. Customers' applications will need to either implicitly support this architecture, which is unlikely, or be customized to integrate with it, which could be expensive.

It is very important to note that because Universal Data Access does not exclude any data stores, the two strategies can cooperate. In fact, OLE DB providers for a number of new "universal database" products are currently under development. Using the Universal Data Access strategy, customers will be able to use data in their existing databases, universal database servers, desktop applications, mainframes, etc. Organizations that combine Universal Data Access and universal database products will ultimately benefit from a broad choice of best-of-breed tools, applications, and DBMS products available from leading data access vendors.

Universal Data Access: A Road Map for the Future

Going forward, Microsoft has two important vehicles for shipping the supporting components of its Universal Data Access strategy: the Microsoft Data Access Components and the Data Access SDK. Long-term planning and development for both of these products are under way, and customers may be curious as to the future directions for the Universal Data Access strategy expressed in these plans.

MDAC

The mission for Microsoft Data Access Components is to provide, in a consolidated release, the key data access technologies used across Microsoft tools, applications, and platform products. The next MDAC release will be shipped concurrently with the next release of Office. Each subsequent release will be tested and supported for use with the latest versions of the following Microsoft products (see footnote):

  • Windows 95 and later
  • Windows NT version 4.0 and later
  • Internet Explorer version 4.0 and later
  • Internet Information Server version 4.0 and later
  • Microsoft Visual Studio™ (and each of its tools) version 6.0 and later
  • Office 97 and later

There are three project-level design goals for the next release of MDAC:

  • Continue to deliver more and better OLE DB components for all major Microsoft data stores and strategic third-party stores.
  • Respond to input from vendors using the conformance tests, with the goal of making the tests a de facto industry standard for measuring any OLE DB provider.
  • Continue to expand ADO. This goal has two aspects. The first is to make ADO a fuller superset of the capabilities of DAO and RDO, so that developers working with Visual Basic and Office can move to ADO without sacrificing current functionality. The second is to enable new scenarios, including more remote, disconnected capabilities and more sophisticated data models.

Conclusion

Organizations of all sizes today are creating business solutions that leverage data from the desktop to the enterprise. As the types of data and types of access have proliferated, the challenge to create business advantage has remained paramount. Microsoft has designed the Universal Data Access strategy to meet the needs of today's distributed, multiplatform organization building client/server and Web-based data-driven solutions. By building in performance and reliability features, by making Universal Data Access a key part of the Windows DNA architecture, and by enlisting the support of a broad range of industry players, Microsoft is aggressively meeting customer needs.

Universal Data Access helps organizations build on existing systems and data stores as they create new client/server and Web-based solutions. Universal Data Access bridges the gap between existing systems and new technologies to create an evolutionary path for cost-conscious customers. As customers forge new business opportunities, Microsoft will be there to provide tools and technologies to enable success.

Footnote

Not all releases will support back versions of all listed products. For specific system requirements, please consult the MDAC product information, located at https://www.microsoft.com/data/.