A Strategic Approach to Data Transfer Methods


E. G. Nadhan
EDS

Jay-Louise Weldon
EDS

October 2004

Summary: Database management systems have evolved to address the storage and management of terabytes of data; however, the issue of effectively exchanging high volumes of data between and among enterprise applications remains. This paper explores an enterprise-wide data transfer strategy necessary to guide IT practitioners and to enable consistent representation of key business entities across enterprise applications. (16 printed pages)

Contents

Introduction
Background
Target Audience
Problem Definition
Business Process Review
Options Analysis
Conclusion

Introduction

Today, business is driven by having access to the right information at the right time. Information is the lifeline for the enterprise. However, timely access to the right information is complicated by the number and complexity of business applications and the increased volumes of data maintained. Data needs to be shared in order for business processes to be effective across the enterprise. Organizations have a variety of ways in which they can share data. A fully integrated set of applications or access to common databases is ideal. However, if these alternatives are not practical, data must be moved from one application or database to another and the designer must choose from the range of alternatives that exist for data transfer. As a result, data sharing can pose a significant challenge in the absence of an established data transfer strategy for the enterprise.

Most enterprises have acquired or built applications that support the execution of business processes specific to autonomous business units within the enterprise. While these applications serve the specific need of the business units, there continues to be a need to share data collected or maintained by these applications with the rest of the enterprise. In cases where the applications serve as the Systems of Record, the volume of data to be shared is relatively high. Further, enterprises have accumulated huge volumes of data over the past few decades as storage costs have decreased and the amount of activity tracked in the e-Commerce environment has grown beyond that of the mainframe-centric world.

Modern IT organizations face the challenge of storing, managing, and facilitating the exchange of data at unprecedented volumes. While database management systems have evolved to address the storage and management of terabytes of data, the issue of effectively exchanging high volumes of data between and among enterprise applications remains. A sound, enterprise-wide data transfer strategy is necessary to guide IT practitioners and to enable consistent representation of key business entities across enterprise applications.

Background

The mainframe world of the 1970s consisted of punch-card-driven monolithic applications, many of which continue to be the systems of record in organizations today. The advent of the personal computer in the 1980s fostered the client-server world of the early 1990s, in which the PC evolved into a robust client workstation. Processing power continued to grow exponentially, resulting in the introduction of midrange servers that were employed by key business units within the organization. Client-server technology gave these autonomous business units the power to store and use data in their own world. Such autonomy gave birth to many repositories that did a good job of storing isolated pockets of data within the enterprise. N-tier distributed computing in the late 1990s resulted in the creation of additional layers that stored data specific to these business units.

While departmental business processes are not impacted by the proliferation of data across multiple repositories, there is a critical need to leverage data at the enterprise level as well as at the business unit level. For example, organizations need to have an enterprise-level view of the customer and serve their customers as a single logical entity. In today's world of real-time online interaction with customers, end-to-end system response has become a critical success factor as well. The fundamental requirements to provide basic customer service have not changed over the years. However, maintaining and retrieving the data expediently enough to provide such service has become a much more complex process.

In spite of this complexity, enterprises of today need to have access to all these pockets of data as well as the original systems of record. Such access is accomplished either by building connection mechanisms to the various systems or by transferring data between systems at periodic intervals. Enterprise Application Integration (EAI) tools can be applied to move transactions and messages from one application to another. Extract, Transform, and Load (ETL) tools perform the same task but usually move data in bulk.

This article describes the options available to address this problem of data sharing. While the options are not mutually exclusive, they represent logically different design and execution principles.

Target Audience

IT personnel who are faced with the challenges of sharing data between multiple applications within the enterprise would benefit from the contents of this article. Such personnel include IT Enterprise Architects, Data Architects, and Integration Architects, as well as Subject Matter Experts for the key enterprise applications. Process and Functional managers within the enterprise who work closely with the IT architects will develop an appreciation for the complexities of data sharing driven by business process changes.

Problem Definition

Applications often need to make their data accessible to other applications and databases for various reasons. Data may need to be moved from one platform to another or from one geographic location to another. Data may need to be moved to make it accessible to other applications that need it without impacting the performance of the source system. Changes in data may need to be moved to keep two systems in sync. Often, firms will create a shared repository, called an Operational Data Store (ODS), to collect data from source systems and make it available to other systems and databases. Data must then be moved from the originating application to the ODS.

There are many ways to accomplish data transfer and many factors to consider when choosing the alternative that best fits the situation at hand. Efficiencies become critical when the data volume is large. Bulk data transfer may not be a viable alternative due to time constraints. At the same time, identification of changed data can be a challenge.

The example below represents a realistic scenario where this situation manifests itself.

Sample Scenario

A customer-facing website allows subscribers to a service to enroll online. The processes involved in this activity capture a number of relevant data elements about the subscriber. The captured data is immediately housed in a Sales system (e.g., an Order Management System). In this example, the Sales system would be considered the System of Record for these data elements. Subscriber data is, no doubt, critical to the Sales department. At the same time, it is also important to several other business units. For instance, the Billing department will need it to make sure that financial transactions with the subscriber are executed. And the Marketing department may want this data to help design campaigns to cross-sell and up-sell products and services to the subscriber. Therefore, it is crucial that subscriber data be placed, as soon as possible, in an enterprise-accessible Operational Data Store from which it can be made available to those applications that need it.

From a systems standpoint, Figure 1 represents the scenario just described. It illustrates a front-end application storing data into its proprietary System of Record and receiving an acknowledgement of a successful update. This System of Record is constantly populated with high volumes of data that need to be transferred to an Operational Data Store so that they may be shared with the rest of the enterprise. The subsequent sections illustrate the different ways of accomplishing such a transfer.


Figure 1.   Sample Scenario

Figure 1 illustrates the following steps:

  1. Front End Application updates System of Record.
  2. System of Record acknowledges successful update.
  3. Transfer of data to the Operational Data Store.

Depending on the context of the specific problem domain for a given enterprise, there are multiple approaches to effect the transfer of data to the ODS in this scenario. The various approaches involved are described in the sections that follow.

The approaches presented are based on the following assumptions:

  1. For simplicity's sake, we have assumed that there is only one System of Record within the Enterprise for any given data element. Propagation of data to multiple Systems of Record can be accomplished using one or more of these options.
  2. An update to a System Of Record can mean the creation, modification, or even the deletion of a logical record.
  3. The acknowledgement step is the final message to the Front End Application that indicates that all the intermediate steps involved in the propagation of data to the System of Record as well as the Operational Data Store have been successfully completed. Additional acknowledgement steps between selected pairs of nodes might be necessary depending on the implementation context for specific business scenarios.
  4. Metadata, while not directly addressed in this paper, is a crucial consideration for data transfer. It is assumed that all the options discussed entail the capture, manipulation and transfer of metadata. However, the discussion in this paper is limited to the logical flow of data between the different nodes within the end-to-end process.

Business Process Review

There are various options available for engineering the transfer of data within the Sample Scenario defined in Figure 1.

However, prior to exercising any given option, it is prudent to take a step back and review and validate the business need for the transfer of data. The actual transfer of data between systems could be a physical manifestation of a different problem at a logical business process level. A review of the end-to-end processes may expose opportunities to streamline the business process flow, resulting in the rationalization of the constituent applications. Such rationalization could mitigate, and in some cases eliminate, the need for such transfer of data. Some simple questions to ask would include:

  • Why does the data need to be transferred?
  • Why can't the data stay in a single system?

A viable answer to these questions could eliminate the need for such transfer of data. If there is still a clear need for this transfer of data even after a review of the end-to-end business process, there are multiple options available that broadly conform to one or more of these approaches:

  • EAI Technologies
  • ETL Technologies
  • Combinations

The remaining sections explore these different possibilities.

Data Transfer Options

This section describes the architectural options that are available for sharing data between discrete applications. The discussion here is independent of commercial solutions; rather, it focuses on the genres of technologies that are available in the market today.

The options discussed in this section are:

  • Option 1: EAI Real-time Transfer
  • Option 2: EAI Propagation of Incremental Records
  • Option 3: Incremental Batch Transfer (Changed Data Capture)
  • Option 4: Native Replication
  • Option 5: Bulk Refresh using Batch File Transfer
  • Option 6: ETL/ELT Transfer
  • Option 7: Enterprise Information Integration

Option 1: EAI Real-time Transfer

Figure 2 illustrates the manner in which an EAI Integration Broker can facilitate this transfer. This option is application-driven and is most appropriate for data transfers where the updates to the System of Record and the ODS are part of the same transaction. An Integration Broker receives the transaction initiated by the Front End Application, after which it assumes responsibility for the propagation of the data to the System of Record as well as the Operational Data Store. The steps are executed in the following sequence:

  1. Front End Application initiates update to the System of Record.
  2. Integration Broker receives this update from the Front End Application and sends it to the System of Record.
  3. System of Record acknowledges the update.
  4. Integration Broker immediately initiates a corresponding update to the Operational Data Store, thereby effecting an immediate, real-time transfer of this data.
  5. ODS acknowledges the receipt of this update.
  6. Integration broker sends the acknowledgement of these updates to the Front-End Application.


Figure 2.   EAI Real-time Transfer
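
The sequence above can be sketched in Python, with in-memory dictionaries standing in for the System of Record and the ODS. The class and method names are purely illustrative, and a real EAI broker would add transactional messaging, routing, and error compensation that this sketch omits.

```python
# Hypothetical sketch of Option 1: the broker receives the front-end update
# and propagates it to both stores before acknowledging. Dictionaries stand
# in for the System of Record and the Operational Data Store.

class IntegrationBroker:
    def __init__(self, system_of_record: dict, ods: dict):
        self.sor = system_of_record
        self.ods = ods

    def handle_update(self, key: str, record: dict) -> str:
        self.sor[key] = record        # step 2: forward the update to the SoR
        # step 3: SoR acknowledges (a successful write, in this sketch)
        self.ods[key] = dict(record)  # step 4: immediate transfer to the ODS
        # steps 5-6: ODS acknowledges; broker acks back to the front end
        return "ack"

sor, ods = {}, {}
broker = IntegrationBroker(sor, ods)
status = broker.handle_update("subscriber-42", {"name": "Ada", "plan": "gold"})
```

Because the broker owns both writes, the front end receives a single acknowledgement only after both stores have accepted the data, which is what makes this option suitable for same-transaction updates.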

Usage Scenario – Financial Institution

A front end CRM application captures data about prospects calling into the Contact Center. The CRM application propagates the prospect data to an Operational Data Store that contains the basic customer data for enterprise wide reference. This data needs to be propagated to the ODS immediately so that the most current data is available to all the other customer-facing applications like the Automated Teller Machine (ATM), financial centers (branches) and online banking access.

Option 2: EAI Propagation of Incremental Records

This option is application-driven and appropriate for lower priority data. The Front End Application updates the System of Record after which this data is propagated to the ODS through the Integration Broker. This is characteristic of scenarios where there is a tightly coupled portal to an ERP or CRM system. There are two different mechanisms to effect this transfer of data to the ODS:

  • Option 2a: Push to Integration Broker: The System of Record notifies the Integration Broker that it has received new data. The 'push' is frequently triggered by a scheduled requirement, for example, a daily update.
  • Option 2b: Pull from Integration Broker: The Integration Broker polls the System of Record for new data. The 'pull' is frequently triggered by a business event in the application using the ODS, for example, a service transaction that requires up-to-date customer data.

Usage Scenario – Manufacturing Organization

An order entry ERP application is used by Customer Service Representatives to enter orders every hour directly into the backend orders database. New orders received must be transferred to the enterprise service dashboard repository on a daily basis. The enterprise service dashboard provides management a holistic view of the order volume as of the previous business day. The first option could be a daily 'push' of new orders from the ERP application to the dashboard repository. Or, the dashboard could initiate a 'pull' from the orders database through the Integration Broker to provide this data when management requires the latest view of the order volume. Each of these options is explained in further detail below.

Option 2a: Push to Integration Broker

Figure 3 illustrates the EAI propagation of incremental records by having the System of Record push this data to the Integration Broker. The steps are executed in the following sequence:

  1. Front End Application initiates update to the System of Record.
  2. System of Record notifies Integration Broker about receipt of this data after completing the update.
  3. Integration Broker receives this update and sends it to the ODS.
  4. ODS acknowledges the update.
  5. Integration Broker sends an acknowledgement of the successful propagation of this data to the Front End Application.


Figure 3.   Push to Integration Broker
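
A minimal Python sketch of the push mechanism follows; the System of Record itself notifies the broker once its own update completes. All names are illustrative.

```python
# Hypothetical sketch of Option 2a: the System of Record pushes a
# notification to the integration broker after its own update completes,
# and the broker forwards the data to the ODS.

class Broker:
    def __init__(self, ods: dict):
        self.ods = ods

    def on_notify(self, key: str, record: dict) -> str:
        self.ods[key] = dict(record)  # steps 3-4: forward to ODS; ODS acks
        return "ack"

class SystemOfRecord:
    def __init__(self, broker: Broker):
        self.store = {}
        self.broker = broker

    def update(self, key: str, record: dict) -> str:
        self.store[key] = record                   # step 1: apply the update
        return self.broker.on_notify(key, record)  # step 2: push notification

ods = {}
sor = SystemOfRecord(Broker(ods))
ack = sor.update("order-7", {"amount": 120})
```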

Option 2b: Pull from Integration Broker

Figure 4 illustrates the EAI propagation of incremental records by having the Integration Broker poll the System of Record on a regular basis and propagate this data to the ODS. The steps are executed in the following sequence:

  1. Front End Application initiates update to the System of Record.
  2. Integration Broker polls the System of Record to check if any new data has been received.
  3. System of Record responds to the poll.
  4. If there is new data to be propagated, Integration Broker sends an update to the ODS.
  5. ODS acknowledges the update.
  6. Integration Broker sends an acknowledgement of the successful propagation of this data to the Front End Application.


Figure 4.   Pull from Integration Broker
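
The pull mechanism can be sketched as a polling loop over an append-only change log. The sequence-number cursor used here is an assumption made for illustration, not a feature of any particular broker product.

```python
# Hypothetical sketch of Option 2b: the broker polls the System of Record
# on a schedule, pulls any records it has not yet seen, and forwards them
# to the ODS.

class SystemOfRecord:
    def __init__(self):
        self.log = []  # append-only list of (seq, key, record)

    def update(self, key, record):
        self.log.append((len(self.log), key, record))  # step 1: apply update

    def changes_since(self, seq):
        return [e for e in self.log if e[0] >= seq]    # step 3: answer poll

class PollingBroker:
    def __init__(self, sor, ods):
        self.sor, self.ods, self.cursor = sor, ods, 0

    def poll(self):
        changes = self.sor.changes_since(self.cursor)  # step 2: poll the SoR
        for seq, key, record in changes:
            self.ods[key] = dict(record)               # step 4: update ODS
            self.cursor = seq + 1
        return len(changes)  # number of records propagated this cycle

sor, ods = SystemOfRecord(), {}
broker = PollingBroker(sor, ods)
sor.update("cust-1", {"tier": "gold"})
moved = broker.poll()        # propagates the one new record
moved_again = broker.poll()  # nothing new this cycle
```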

Option 3: Incremental Batch Transfer (Changed Data Capture)

Option 3 is data-driven and is used to periodically move new or changed data from the source to the target data store. This option is applicable to scenarios where it is acceptable for the data updated in the System of Record to be provided to other applications after a finite time window (e.g., one day). In such scenarios, the data is transferred on an incremental basis from the System of Record to the ODS. This data sharing option involves capturing changed data from one or more source applications and then transporting this data to one or more target applications in batch. This is graphically depicted in Figure 5. Typical considerations in this option include identifying a batch transfer window during which both the source and target system(s) can extract and transport the data.


Figure 5.   Incremental Batch Transfer

There are two ways to accomplish this:

  • Change Log: System of Record maintains the changed data in dedicated record sets so that the Batch Transfer Program can directly read these record sets to obtain the delta since the last transfer. In this case, the System of Record is responsible for identifying the changed data in real-time as and when the change happens.
  • Comparison to previous: The Batch Transfer Program leverages the data in the base record sets within the System of Record to identify the changed content. In this case, the Batch Transfer Program has the responsibility of comparing the current state of data with earlier states to determine what has changed in the interim.
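
The comparison-to-previous mechanism can be sketched as a dictionary diff in Python; the record layout is hypothetical. Note that deletions (keys present in the previous snapshot but absent now) would need separate handling that this sketch omits.

```python
# Hypothetical sketch of changed-data capture by comparison: the batch
# transfer program diffs the current state of the System of Record against
# the snapshot it kept from the previous run.

def capture_changes(current: dict, previous: dict) -> dict:
    """Return records that are new or modified since the last transfer."""
    return {k: v for k, v in current.items() if previous.get(k) != v}

sor_yesterday = {"lead-1": {"stage": "qualify"}, "lead-2": {"stage": "close"}}
sor_today = {
    "lead-1": {"stage": "propose"},  # modified since last run
    "lead-2": {"stage": "close"},    # unchanged
    "lead-3": {"stage": "qualify"},  # new
}

delta = capture_changes(sor_today, sor_yesterday)
# Only lead-1 and lead-3 would be moved to the ODS in this batch window.
```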

The typical sequence of events for this kind of data sharing is as follows:

  1. Front End Application initiates update to the System of Record.
  2. Batch Transfer Program fetches changed data from System of Record.
  3. Batch Transfer Program updates Operational Data Store.
  4. An acknowledgement is sent to the Front End Application, System of Record and/or the Batch Transfer Program after the Operational Data Store has been successfully updated.

Usage Scenario – Service Provider

The sales force uses a sales leads database that tracks all the leads that the sales representatives are pursuing. The project delivery unit tracks the resources required for sales and delivery related activities. The project delivery unit maps resource requirements to existing projects as well as leads currently in progress. To that end, the leads data is transferred on a daily basis from the sales leads database to the project delivery database through the incremental batch transfer option.

Option 4: Native Replication

Option 4 is a data-driven option that is especially relevant for high-availability situations, e.g., emergency services, where the source and target data stores need to stay in sync virtually all the time. This data sharing option involves the use of native features of database management systems (DBMS) to reflect changes in one or more source databases to one or more target databases. This could happen either in (near) real-time or in batch mode.

The typical sequence of events for native replication is:

  1. Front End Application initiates update to the System of Record.
  2. Native Replication transfers data from System of Record to Operational Data Store.
  3. Operational Data Store sends an acknowledgement of receipt of data back to the System of Record.
  4. The System of Record sends an acknowledgement of the success of the operation to the Front End Application.


Figure 6.   Native Replication

Usage Scenario – Health Care Payer

Claims data is being entered through a two-tier Client Server application to a backend RDBMS by Customer Service Representatives. Updates to the Customer Profile are also made in the System of Record while entering data about the claims. Customer Profile updates are directly replicated into the ODS which serves as the Customer Information File for all the other enterprise applications.

Option 5: Bulk Refresh using Batch File Transfer

This option is data-driven and appropriate when a large amount of data, for example, a reference table of product data, needs to be periodically brought into sync with the System of Record. This option transfers all the data, inclusive of the latest changes, on a periodic basis. All the records are extracted from the System of Record and refreshed into the ODS. Existing records in the ODS are purged during each transfer. Such transfers are typically done in batch mode overnight. Bulk Refresh is well suited for scenarios where there is significant overhead involved in identifying and propagating the incremental changes. The incremental approach can be more error-prone and, therefore, maintenance-intensive.

These types of transfers can be accomplished in one of two ways:

  • Option 5a: File Extract: A program in System of Record extracts all the records into an intermediate file. This file is subsequently loaded into the ODS by another program.
  • Option 5b: Program Extract: A separate program queries the System of Record and transfers each record in real time to the ODS. There is no intermediate file created.

Option 5a: File Extract with Full Refresh

Figure 7 illustrates the file based extraction process for bulk transfer of data. The following steps are executed in this process:

  1. Front End Application initiates update to the System of Record.
  2. System of Record acknowledges the update.
  3. All records are extracted into an Extract File from the System of Record.
  4. Extract File is refreshed into ODS.


Figure 7.   File extract
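
The file-based extract and full refresh can be sketched with Python's csv module, with an in-memory buffer standing in for the intermediate flat file handed off between the two batch programs. The record layout and store names are illustrative.

```python
# Hypothetical sketch of Option 5a: extract all records to an intermediate
# file, then purge and reload the ODS from that file.
import csv
import io

def extract_to_file(sor: dict) -> str:
    buf = io.StringIO()
    writer = csv.writer(buf)
    for key, record in sor.items():  # step 3: extract all records
        writer.writerow([key, record["name"], record["price"]])
    return buf.getvalue()

def refresh_ods(ods: dict, extract: str) -> None:
    ods.clear()  # purge existing ODS records before the refresh
    for key, name, price in csv.reader(io.StringIO(extract)):
        ods[key] = {"name": name, "price": float(price)}  # step 4: reload

sor = {"p1": {"name": "widget", "price": 9.5},
       "p2": {"name": "gadget", "price": 12.0}}
ods = {"stale": {"name": "old", "price": 0.0}}
refresh_ods(ods, extract_to_file(sor))
```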

Option 5b: Program Extract with Full Refresh

Figure 8 illustrates the program-based extraction process for bulk transfer of data. The following steps are executed in this process:

  1. Front End Application initiates update to the System of Record.
  2. System of Record acknowledges the update.
  3. Extract and Load program retrieves and updates all the records from the System of Record into the ODS.


Figure 8.   Program Extract

Unlike the File Extract, retrieval from the System of Record and updates into the ODS are part of a single transaction with no intermediate persistence of the data. The Extract and Load program can be triggered at fixed time intervals or on the occurrence of specific events. For instance, it can run four times a day, or on updates to a critical master table. While this is architecturally similar to Option 3: Incremental Batch Transfer (see Figure 5), the scope is different: here, all data from the System of Record is transferred to the ODS, rather than just an incremental change.
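
The program-based extract can be sketched as a single pass that purges and repopulates the ODS with no intermediate persistence; the names are illustrative.

```python
# Hypothetical sketch of Option 5b: one extract-and-load program reads
# every record from the System of Record and writes it straight into the
# ODS, with no intermediate file. It could be triggered on a schedule or
# by an application event.

def extract_and_load(sor: dict, ods: dict) -> int:
    ods.clear()  # full refresh: purge existing ODS records
    for key, record in sor.items():
        ods[key] = dict(record)  # transfer each record directly
    return len(ods)  # number of records refreshed

sor = {"emp-1": {"dept": "sales"}, "emp-2": {"dept": "ops"}}
ods = {"emp-9": {"dept": "gone"}}
count = extract_and_load(sor, ods)
```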

Usage Scenario – Large Enterprise HR Department

Large international enterprises with thousands of employees have an organizational hierarchy that is spread wide and deep across the globe. A minor change to this hierarchy can have a ripple effect across the organizational layers. While the organizational structure is maintained in a single repository, it is used in a read only mode by other applications from the Operational Data Store.

The organizational structure, thus, must be fully refreshed on a regular basis in the Operational Data Store.

Option 6: ETL/ELT Transfer

Option 6, illustrated in Figure 9, is data-driven and most appropriate where substantial data scrubbing and transformation are required as the data is moved, e.g., for integration into a data warehouse or data mart. This option overlaps with both Option 3: Incremental Batch Transfer and Option 5: Bulk Refresh transfers. The difference is that business logic is applied to the data while it is transported from source to target systems. An ETL tool is often used for this kind of data transfer. Source data is extracted, transformed en route, and then loaded into one or more target databases. The transformations performed on the data represent the business rules of the organization. The business rules ensure that the data is standardized, cleaned, and possibly enhanced through aggregation or other manipulation before it is written to the target database(s). ETL transfer involves the following steps:

  1. Front End Application initiates update to the System of Record.
  2. ETL Transfer Program fetches changed or bulk data from System of Record.
  3. ETL Transfer Program updates Operational Data Store.
  4. An acknowledgement is sent to the Front End Application, System of Record and/or the ETL Transfer Program after the Operational Data Store has been successfully updated.


Figure 9.   ETL Transfer
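
The extract-transform-load sequence can be sketched in Python; the cleansing rules shown are hypothetical examples of the business rules an ETL tool would apply en route.

```python
# Hypothetical sketch of Option 6 (ETL): extract from the source, apply
# business rules en route, then load into the target.

def transform(record: dict) -> dict:
    # Illustrative business rules: standardize, clean, enforce types.
    return {
        "name": record["name"].strip().title(),   # standardize casing
        "state": record["state"].upper(),         # normalize state codes
        "dependents": int(record["dependents"]),  # enforce numeric type
    }

def etl(source: list, target: list) -> None:
    for record in source:                 # extract
        target.append(transform(record))  # transform en route, then load

source = [{"name": "  ada lovelace ", "state": "ny", "dependents": "2"}]
target = []
etl(source, target)
```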

The same applies to ELT transfer as well. The difference between ETL and ELT lies in the environment in which the data transformations are applied. In traditional ETL, the transformation takes place while the data is en route from the source to the target system. In ELT, the data is loaded into the target system first and then transformed within the target system environment. This has become a popular option recently, since significant efficiencies can be realized by manipulating data within database environments (for example, by using stored procedures).
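
The ELT variant can be sketched with SQLite standing in for the target database: the raw data is loaded first, and the transformation then runs as SQL inside the database engine, where a production system might use stored procedures. Table and column names are illustrative.

```python
# Hypothetical sketch of ELT: load raw data into the target environment,
# then transform it in place with SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (name TEXT, state TEXT)")

# Load: raw data lands in the target environment untransformed.
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [("  ada  ", "ny"), ("grace", "ca")])

# Transform: business rules run inside the database after the load.
conn.execute("UPDATE staging SET name = TRIM(name), state = UPPER(state)")

rows = conn.execute("SELECT name, state FROM staging ORDER BY name").fetchall()
```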

Usage Scenario – Health Care Provider

Employers send Entitlement information for Employees and their dependents to Health Care Insurance Payers on a weekly basis, recording all the changes that happened each week. The incoming data is in a format proprietary to the Employer and needs to be converted into the format of the Health Care Provider's backend mainframe system. Summary records have to be created that list the number of dependents and children that each employee has. ETL tools can be used to perform these format and content transformations in batch mode.

Option 7: Enterprise Information Integration

This option is an emerging one and is similar to Business Process Review. It involves the creation of a logical enterprise-wide data model that represents the key business entities and their relationships in a consistent, standardized fashion. The Enterprise Information Integration layer where this model resides has the business intelligence to do the following:

  • Determine the repository that has the most accurate value for each data element.
  • Construct the result set by fetching the right information from the right repository.
  • Propagate updated information to all the affected repositories so that they are in a synchronized state all the time.
  • Provide an enterprise-wide view for all the business entities.

The enterprise-wide data model functions like a virtual database. In some respects, it is a view, in relational database terms, on tables spread across multiple physical databases. As part of its information integration responsibilities, the Enterprise Information Integration (EII) layer can propagate the information to the ODS and the System of Record ensuring that they are synchronized. This is illustrated in Figure 10.

The following execution steps are involved when the EII option is exercised:

  1. Front End Application initiates update to the System of Record through the EII layer.
  2. EII layer updates System of Record.
  3. EII layer updates the Operational Data Store.
  4. Upon successful completion of both updates, the EII layer sends the acknowledgement back to the Front End Application.


Figure 10. Enterprise Information Integration
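
The EII idea can be sketched in Python as a thin layer that routes each read to the repository deemed authoritative for that data element and fans writes out to every affected repository. The routing table and repository names are illustrative assumptions.

```python
# Hypothetical sketch of Option 7: an EII layer presenting one logical
# model over multiple physical repositories.

class EIILayer:
    def __init__(self, repositories: dict, authority: dict):
        self.repos = repositories  # repository name -> backing store
        self.authority = authority # data element -> authoritative repository

    def read(self, key: str, element: str):
        # Fetch the element from the repository with the most accurate value.
        repo = self.repos[self.authority[element]]
        return repo[key][element]

    def update(self, key: str, record: dict) -> str:
        # Steps 2-3: propagate to every affected repository (SoR and ODS).
        for repo in self.repos.values():
            repo[key] = dict(record)
        return "ack"  # step 4: acknowledge back to the front end

sor, ods = {}, {}
eii = EIILayer({"sor": sor, "ods": ods}, {"email": "sor", "segment": "ods"})
eii.update("cust-5", {"email": "a@example.com", "segment": "retail"})
email = eii.read("cust-5", "email")
```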

Compound Scenarios

Apart from the Sample Scenario described at the beginning of this paper and the usage scenarios described under each option, there are complex situations where the various options for data transfer need to be evaluated carefully and a combination of the relevant ones applied. These scenarios include, but are not limited to:

  • Populating a DW or an ODS with data from operational systems
  • Populating data marts from a DW or an ODS
  • Back propagating integrated data into applications
  • Combinations of application-to-application and application-to-ODS data transfers

The first three of these scenarios can be handled using Business Process Review and/or Option 1: EAI Real-time Transfer through Option 7: Enterprise Information Integration described above. Application-to-application scenarios involve a mix of the above options, and two types are discussed here in detail.

Option 8a: Application-to-Application Transfer with Cross-reference

Option 8a is appropriate when the EAI tool must perform a simple lookup during data transfer. For example, while transferring data from a Sales application (X) to a Finance application (Y), the current account code, based on the transaction type in the Sales transaction, must be looked up and added to the transaction during transfer. The business requirement in this scenario, graphically depicted in Figure 11, is to transfer data from application X to application Y. As part of this transfer, there must be manipulations performed on the data that require cross-reference tables (such as looking up codes and translating them into meaningful values in the target system). While real-time EAI transfer can effect the transfer of data from application X to application Y, ETL transfer can be used to transfer cross-reference data from these systems into a cross-reference data construct (represented as XREF in the diagram).

Note:   Option 5a: File Extract with Full Refresh or Option 5b: Program Extract with Full Refresh could also be used to update the XREF table.
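
The cross-reference lookup in Option 8a can be sketched as follows; the transaction types and account codes are made up for illustration.

```python
# Hypothetical sketch of Option 8a: enrich each transaction in flight with
# a value looked up from a cross-reference (XREF) table, e.g. mapping a
# sales transaction type to the current finance account code.

XREF = {"NEW": "4000-REV", "RENEWAL": "4100-REV", "REFUND": "5200-EXP"}

def transfer_with_xref(sales_txn: dict, xref: dict) -> dict:
    enriched = dict(sales_txn)
    # Look up the account code for this transaction type during transfer.
    enriched["account_code"] = xref[sales_txn["txn_type"]]
    return enriched

finance_txn = transfer_with_xref({"txn_type": "RENEWAL", "amount": 250}, XREF)
```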

Option 8b: Application-to-Application Transfer with Static Data

Option 8b represents a situation where the data from application X must be augmented with data from application Z during transfer to application Y. For example, a transaction from the Sales application (X) must be augmented by product cost data from the Inventory application (Z) during transfer into the Finance application (Y). In this scenario, depicted in Figure 12, data is transferred from application X to Y. At the same time, updating application Y also involves receiving other data from secondary applications that are static – or at least relatively static compared to the real-time nature of transfer from X to Y. Here, EAI is used to achieve the transfer of some of the data from X to Y. ETL transfer is used to prepare and provide the additional data that application Y requires from a secondary application (Z) into an ODS. EAI then fetches the additional data from the ODS to populate application Y.

Note:   Any one of Option 6: ETL/ELT Transfer through Option 7: Enterprise Information Integration could be used for the update of the ODS.
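
A sketch of Option 8b follows: the real-time transfer is enriched with relatively static data that ETL has previously staged in the ODS from the secondary application. All names and values are illustrative.

```python
# Hypothetical sketch of Option 8b: augment the X-to-Y transfer with static
# data (here, product cost from an inventory application Z) that ETL has
# already staged in the ODS on its own, slower cadence.

ods_product_costs = {"sku-1": 4.25, "sku-2": 7.80}  # staged by ETL from Z

def transfer_augmented(sales_txn: dict, ods_costs: dict) -> dict:
    enriched = dict(sales_txn)
    # EAI fetches the additional, static data from the ODS en route to Y.
    enriched["unit_cost"] = ods_costs[sales_txn["sku"]]
    return enriched

finance_txn = transfer_augmented({"sku": "sku-2", "qty": 3}, ods_product_costs)
```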

Options Analysis

The most appropriate option for an environment is based on the data transfer requirements and constraints specific to that environment. There are several procedural, architectural and financial criteria that have to be taken into account while determining the most suitable option for an environment. This section outlines the key criteria to be considered followed by a ranking of each option in the context of these criteria. While there may very well be other applicable options or combinations of these options as discussed under Compound Scenarios, this section focuses on the basic options (1 through 6) described earlier.

Business Process Review and Enterprise Information Integration have been excluded from the analysis, since neither actually involves the transfer of data. The criteria can be classified into Requirements and Constraints, as shown in Table 1. Requirements are typically architectural in nature and driven by business needs. Constraints define the parameters within which the solution must be architected, keeping the overall implementation and maintenance effort in mind.

Figure 11.   A2A Transfer with Cross-Reference

Figure 12.   A2A Transfer with Static Data

Table 2 outlines the characteristics of Option 1: EAI Real-time Transfer through Option 6: ETL/ELT Transfer in the context of these criteria. Note that Business Process Review and Option 7: Enterprise Information Integration have not been analyzed in Table 2. Business Process Review is a revision of existing business processes that may result in the implementation of any one of the other options. Option 7: Enterprise Information Integration concerns the logical representation of information at the enterprise level; any one of Option 1: EAI Real-time Transfer through Option 6: ETL/ELT Transfer may be used in conjunction with the EII model.

Table 1.   Evaluation Criteria

Table 2.   Options Evaluation

Conclusion

There are many approaches available to enterprises for effecting data transfer between and among their business applications. Enterprises should first conduct a Business Process Review to confirm the necessity of the transfer. Once the need is confirmed, there are multiple options, enabled by EAI and ETL technologies, for effecting the data transfer. In some cases, a combination of options may be needed to address the complete set of data transfer requirements within an enterprise. The process driving such transfers should determine the technology and the tool employed, rather than having the technology define the process. Large enterprises typically employ an optimal mixture of all three strategies: Business Process Review, EAI, and ETL. Enterprise Information Integration is emerging as another viable option in this space. The right option, or combination of options, for a given scenario depends upon several criteria, some of which are requirements-driven while others are constraints. This paper has presented the most significant criteria to consider and provided an evaluation of each option against them.

Endnotes

1 Effective data sharing requires a common understanding of the meaning and structure of data for the provider and the receiver. Metadata – data about data – is the vehicle for achieving that understanding. When data is shared or physically transferred between parties, metadata also must be exchanged. It is the designer's responsibility to ensure the appropriate metadata is captured and transferred in all data transfer situations.
2 An integration broker is a component that routes the messages exchanged between applications. It facilitates the conditional transfer of messages between applications based on predefined rules driven by business logic and data synchronization requirements.
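The rule-driven routing described in endnote 2 can be sketched as follows. This is a toy illustration of the concept only; the message types, rule predicates, and application names are all invented, and a commercial integration broker would express such rules in its own configuration language rather than in application code.

```python
# Illustrative sketch of an integration broker: messages exchanged
# between applications are routed to destinations based on predefined
# rules driven by business logic. All names here are hypothetical.

ROUTING_RULES = [
    # (predicate over the message, destination application)
    (lambda m: m["type"] == "ORDER" and m["amount"] > 10000, "finance"),
    (lambda m: m["type"] == "ORDER", "fulfillment"),
]

def route(message):
    """Return the destination applications whose rules match the message."""
    return [dest for predicate, dest in ROUTING_RULES if predicate(message)]

large_order = route({"type": "ORDER", "amount": 20000})
small_order = route({"type": "ORDER", "amount": 50})
```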


About the Authors

E G Nadhan
Principal, EDS
Easwaran.Nadhan@eds.com
E G Nadhan is a Principal with the EDS Extended Enterprise Integration group. With over 20 years of experience in the software industry, Nadhan is responsible for delivering integrated EAI and B2B solutions to large-scale customers.

Jay-Louise Weldon
Managing Consultant, EDS
Jaylouise.weldon@eds.com
Jay-Louise Weldon is a Managing Consultant with EDS' Business Intelligence Services group. Jay-Louise has over 20 years of experience with business intelligence solutions and database and system design.

Special Acknowledgement: The authors thank Carleen Christner, Managing Consultant with the EDS Extended Enterprise Integration group, for her thorough review of the paper and the feedback she provided on its content and format.

This article was published in the Architecture Journal, a print and online publication produced by Microsoft. For more articles from this publication, please visit the Architecture Journal website.

© Microsoft Corporation. All rights reserved.