Section 4: Legacy and Business Partner Integration: Using Service-Oriented Architecture for Integration

 

Brian Travis
Architag International Corporation

November 2003

Applies to:
    Microsoft® Visual Studio® .NET Enterprise Architect Edition

Summary: Learn how service-oriented architecture (SOA) can be used in business-to-business, application-to-application, and service-to-service integration. (38 printed pages)

To see an overview of the entire project, read FoodMovers: Building Distributed Applications using Microsoft Visual Studio .NET.

Contents

Integration
Hop on the Message Bus
Item Maintenance Process
Query Inventory Database
Supplier Orders
Store Orders
Services-to-Services Integration
Conclusion

Old systems and new systems need to live together and, most importantly, communicate important business data with one another. However, programs do not always support and understand each other's data formats, communications protocols, and languages. In addition, some programs provide humans with a view of the system; these user interfaces are designed to be used by humans, not by other programs. Furthermore, the programs that must communicate often live in different organizations. How are we to integrate all of these systems?

Our fictitious enterprise, FoodMovers Distribution Company, uses Visual Studio .NET and the .NET Framework to create business-to-business and application-to-application interfaces. The key to this functionality is "Service-Oriented Architecture," or SOA.

In Section 3, I discussed SOA, and how the FoodMovers application creates service interfaces called "managers." These managers expose business processes and data in a reliable, scalable, and secure way by decoupling the business logic and data layers from the presentation layer.

In this section, I will use those managers to allow our business partners to update their items in our inventory database, check inventory levels for their products that we carry, and place orders. I will also show that service-oriented architectures can be used in application-to-application and service-to-service integration.

Integration

Integration takes two major forms. The first I will deal with is integration of internal systems. This is usually called application-to-application integration or A2A. It is also called enterprise application integration, or EAI.

The second form is the integration of processes that are beyond our control, usually existing outside of our firewall at one of our business partner locations. This is usually called business-to-business integration, or B2B.

In order to integrate our internal systems, we need to recognize that integration problems exist because applications, no matter what generation or platform they belong to, must communicate with each other by exchanging data. They must do so in an automated manner to accomplish a business process without human intervention. These components can be parts of stand-alone programs that exist in different environments and on different machines. Java programs, COM objects, and mainframe COBOL programs can all belong to different generations and environments but still be part of the same business environment. They need to communicate one way or another.

To achieve this goal, data from one component must be sent to another component. In our FoodMovers scenario, we have a mainframe computer that is running inventory applications. The rest of the system is based on modern tools that communicate easily. We need to integrate the existing mainframe computer while somehow not breaking the programs that have been running for decades.

Internal Integration

We need to integrate our own systems in order to automate our business and offer services to external partners. A business function is spread over different applications, using different systems. We need to automate these systems before we can offer these business functions to our partners or even to our own internal departments, applications, or users.

If our systems depend on slow or inaccurate transfer of data, any automation will not be successful. We will spend most of our time tracking down system errors and other exceptions, rather than focusing on the business goals.

To do the integration, I could write separate interfaces between each of our systems. This has been the traditional way of integrating disparate systems. For example, I could write a bridge that somehow interacts with a system by scraping the screen of one application and providing the data to the next. The problems with such an approach are many, but that doesn't mean it isn't done. In fact, I have found this to be the preferred method of integration for many systems.

Fortunately, there are new techniques and tools available to help ease the burden of internal system integration. These techniques can also make it easier to integrate across enterprise boundaries, allowing us and our trading partners to integrate systems at any level we feel comfortable with.

These new techniques are based on two simple concepts: loosely coupled architectures and message-based protocols.

Web services are popularly considered to be services that are delivered over the Web, but that's a misnomer. In fact, any service that provides information over any network can be called a Web service. The main features that distinguish a Web service, I feel, are that it is loosely coupled, message-based, and expressed using non-proprietary international standards.

Loosely coupled, message-based systems can transcend the differences in platforms of our various systems. They work by changing the way integration is done.

As an illustration, consider the simple task of getting the temperature:

Interrogator: What is the temperature outside?

Respondent: 20 degrees Celsius.

In human interactions, this is a typical request/response scenario. This is a loosely coupled system, because the interrogator does not care about the implementation details of the respondent's reply. It is message-based, because the interrogator sends the respondent a request, and the respondent answers with a response. Contrast this with another approach:

Interrogator: Please read the thermometer outside on the north wall of the building. There is a scale on the side that is calibrated according to the temperature. I want you to give me that reading in Celsius.

Respondent: Well, I'm walking, and I can see the thermometer. I'm looking at it right now, and I can see that it consists of a closed tube with a bulb at the bottom that contains an amount of mercury. As the temperature rises, the mercury expands, and is forced up a tube. Reading the scale, I can tell you that it is 20 degrees Celsius.

In the second example, the interrogator had to know what the object was, and how to instruct the respondent to return the information. The respondent also gave information to the interrogator that might have been useful if the interrogator wanted to have some confidence that the answer was correct.

However, this extra information is burdensome if the interrogator only wanted to know the temperature.

The second example is a tightly coupled method-based architecture.

It is tightly coupled, because implementation details were communicated between the parties. That is, the interrogator had to tell the respondent to refer to a particular thermometer. If the respondent wanted a more accurate reading, he might have told the interrogator to query a digital thermometer that was on the east wall of the building.

It is method-based, because the interrogator asked for a specific methodology for getting the temperature.

Of course, this is just a simple example, and does not necessarily reflect the real-world computer models that are in use today, but the point is that many systems currently work by requiring more intimate knowledge of implementations than they should need.

So how do we get from a tightly coupled, method-based architecture to a loosely coupled, message-based system? The short answer is "standards." The long answer is dependent on the type of integration you are doing.

Since XML was formalized in 1998, there has been a huge amount of work and discussion done on the topics of using standards to integrate legacy systems and allow different architectures to communicate. This work has culminated in a concept called "Web services," and is based on open, vendor-independent standards and toolkit implementations that make the integration of these technologies usable.

Business-to-Business Integration

FoodMovers will use Web services to provide integration of all systems behind the firewall. However, in order to integrate our external trading partners, we have some more problems. Here is an overview of the problem and some history that will be helpful when we talk about our external trading partners.

Companies need to communicate in order to do business. Companies communicate by sending messages back and forth. Messages like contracts, purchase orders, invoices, monetary transactions, insurance claims, and a thousand more. Companies have been sending messages to each other for centuries. This is called "commerce."

Humans have engaged in commerce for thousands of years. Since the earliest days of civilization, we have conducted business with each other face to face. Only recently have we been doing commerce by technological proxy, letting a means other than face-to-face communication be instrumental in leading to a business transaction.

With the Internet and today's faster pace of commerce, communicating electronically is essential for business survival. Think about a simple ink pen. When you buy a pen from an online retailer, a single transaction is involved. If the retailer wants to save transaction costs, the retailer could make the transaction more efficient by using appropriate technologies. Let's say the online retailer shaved two cents off the cost of the transaction. That savings is respectable, but it's only a couple of pennies toward the bottom line of the pen sale.

On the other hand, it took hundreds of transactions to build the pen. The pen manufacturer had to buy steel and ink and springs and dozens of other raw materials from manufacturers. Then the pen maker had to buy production services, marketing services, and management services. Each one of these transactions has a cost. If the manufacturer of the pen could shave a couple of pennies off each transaction between its partners, all of those savings would go to the bottom line.

The savings are real, but so is the complexity of getting all of your business partners to agree on using your syntax for communication. Having one site talk to another is a huge task once you realize that the two sites might be running different applications on different operating systems on incompatible hardware. Put these all together, and you get a nightmare of combinations. In order for business-to-business ecommerce to work, there needs to be some way for all of these systems to work together.

That's the task Microsoft took on in 1999. Their product, BizTalk Server, provides a rich ability to link together different systems speaking different languages, and make them talk to each other and play nice.

Among the tools FoodMovers uses is BizTalk Server, which manages and controls the business-to-business communications between FoodMovers and our suppliers. BizTalk Server integrates systems by making the interchange of documents relatively easy.

Electronic Document Interchange

The granddaddy of electronic interchange is Electronic Data Interchange, or EDI. In the early 1970s, large companies such as Sears and Kmart pioneered electronic business communication. With thousands of stores and suppliers to contend with, these organizations generated and processed mountains of paper. It was obvious that all this paper was getting in the way of productivity, so methods of electronic communication began to emerge. To communicate electronically with these companies, suppliers were forced to develop and maintain a customized interface for each of their electronic business partners.

By the late 1970s, a committee composed of representatives from the transportation sector, the government, and computer manufacturers began to address developing a method of improving and standardizing electronic business communications. The American National Standards Institute (ANSI) chartered the Accredited Standards Committee (ASC) X12 in 1979. Its goal was to develop uniform standards for the electronic interchange of business transactions. X12 was the first of several standard formats for doing business electronically.

Of course, all this electronic business was done over private networks at the time. What would someday become the Internet had begun to develop but was limited to government and academic researchers.

Various X12 committees worked on standards for specific documents—mostly invoices and purchase orders. Each specification had to work for all users in all situations; standards were therefore complex and difficult to implement.

Because of the complexity of the EDI standards, some groups started to branch out and create more industry-specific standards. For example, the National Retail Merchants Association began developing a set of purchase-order message standards for EDI. However, these standards were not well-defined and messages were ambiguous, so retailers and suppliers did not use them. The National Retail Merchants Association subsequently chose to support the X12 standards.

FoodMovers will be communicating with some of its suppliers using an EDI 888 Item Maintenance document. This is one of the X12 standards, and is supported by BizTalk Server.

With that background, let's see how we can deal with these documents, and how we can integrate our external partners.

External partner integration

Integration problems are amplified when business logic resides in two different enterprises. In addition to the common differences of platform, language, and object type, there are technical issues such as firewalls and security. And let's not forget that everything done between two companies depends upon two different IT departments, each of which has a vested interest in making sure its data and business processes are secure and reliable.

In Section 3, I discussed service-oriented architecture, and explained how it offers a loosely-coupled, message-based integration solution. Service-oriented architecture simplifies the integration problem by offering a common interface for the communication. It standardizes the interface between two machines by establishing a contract, instantiated as a WSDL file, and provides an understanding of the format of the data and communications protocols that are used in the integration.

The use of service-oriented architecture can solve both of our integration problems, A2A and B2B.

BizTalk Server

Before we go much further, I need to introduce Microsoft BizTalk Server. BizTalk Server will be used to provide integration between the mainframe and the FoodMovers system, and between our internal managers and non-XML-based partners.

First, some nomenclature used in BizTalk Server:

Organization

An organization is just what you might expect it to be. It is an enterprise with which you want to do business. A trading partner. Organizations are created and managed using the BizTalk Messaging Manager. The first organization you create should be your own. This is called the "Home Organization."

An organization is named using a human-readable string. This is called an "organization identifier." This string becomes the key that BizTalk Server uses to refer to that organization from that point on.

Application

Your home organization will have applications associated with it. These applications are used as sources for creating messages that go to external trading partners or other applications within your home organization.

FoodMovers uses the Update Manager, which communicates with a mainframe inventory control system. In our case, the source and destination organizations are both our home organization.

Likewise, if you want to send a message to an external trading partner organization, it must be sent from one of the applications in your home organization. Since there are so many different types of applications at your home organization, identifying just the home organization as the source is not enough.

Channel

Channels and messaging ports are used together to control the flow of information through the BizTalk Server. The relationship between channels and messaging ports is kind of confusing, so I will use examples to show how they fit together.

A channel defines a set of properties that allow for the processing of messages. Channels indicate the type of processing that the document is to undergo. Once the processing takes place, the document is sent to an associated messaging port. There can be any number of channels for a particular port.

For example, suppose a purchase order arrived at your organization from one of your trading partners, as shown in Figure 1. They use a purchase order schema that is different than the one you use internally, so it is essentially incompatible with your internal system.


Figure 1. Receiving a document through a channel

Fortunately, you have configured your system to take care of this incompatibility by using a transformation map. When you set up the system, you created a map using the BizTalk Mapper. This is a tool that allows you to associate the fields and records in a source document with the appropriate records and fields in your internal data representation.

When the foreign document arrives, the channel knows the structure of the incoming document and the structure the outgoing document should have, and it automatically invokes the transformation map to convert from one structure to the other.

The resulting document is sent to the port associated with incoming purchase orders.

Now, suppose you get the purchase order system up and running, and you want to do business with another trading partner organization that uses yet another purchase order structure to order products from you. No problem. The solution is illustrated in Figure 2.


Figure 2. Adding another trading partner to an existing application

First, you would create a map to transform the new trading partner's purchase order structure to your internal purchase order structure. Then you would set up another organization to describe your new trading partner. As part of the organization setup, you would then set up an associated channel for purchase order processing, using the map you created as the tool that transforms from their structure to yours.

The port that is associated would be the existing port for processing anyone's purchase order. The beauty of this approach is that you can add new trading partners quickly, without writing massive amounts of integration code.

In the FoodMovers application, we will be using a map to transform incoming EDI documents to flat-file documents for the mainframe. We will use a second map to transform XML documents into the same flat-file format.

Messaging port

You can communicate with internal applications or external organizations using "messaging ports." A messaging port can be associated with a particular organization, or can be left unassociated. An unassociated messaging port is called an "open messaging port," and can be associated at a later time when it is required.

Messaging ports take care of a lot of the integration detail needed in a system. A messaging port can be configured to communicate with an organization through SMTP. If you do this, then the details of instantiating an SMTP object and managing the interface are automatically handled for you.

A messaging port can also instantiate a copy of a set of tasks called an "orchestration" and invoke the scheduler engine. This is a way to automatically start processing a schedule upon receipt of a message from a trading partner.

Document definition

A document definition represents a specific type of document that is used in your system. A document definition can identify an inbound or outbound document coming through an associated channel.

A channel will have two document definitions associated with it. If these two document definitions are the same, the channel just acts as a conduit for transferring the document from the inbound location to the outbound location.

If a given channel has two different document definitions associated with it as inbound and outbound documents, then a map must be specified in order to give the outbound document the structure required by the receiving port.

A document definition can have a reference to a document specification. We will be using two maps, each of which has a different inbound document definition. One document will be an EDI stream coming from our EDI suppliers. The other will be an XML stream coming from our more progressive suppliers. The output of both will be the same outbound document definition (mainframe flat file).

Document specification

A document specification is a schema that describes the structure of a document that is to be managed by BizTalk Server. These can be custom XML schemas that are defined by any of your trading partners, or they can be open, international standard schemas that are shared by many companies in many different industries.

An example of the latter is the set of electronic data interchange (EDI) standards. BizTalk Server ships with many schemas that are XML representations of these standard document types.

Schemas are created using the BizTalk Editor, a client tool that ships with all versions of BizTalk Server. All schemas in BizTalk Server are expressed using the XML Data Reduced (XDR) schema syntax.

Document envelope

Just like physical envelopes provide a way to transport physical documents to a desired location, a BizTalk document envelope encapsulates a BizTalk message for transport. BizTalk Server supports several different envelope types in order to provide the most flexibility in integrating business partners that are using different e-commerce applications:

  • Custom XML
  • ANSI X12
  • UN/EDIFACT
  • Flat file delimited
  • Flat file positional
  • Custom
  • Reliable

The reliable format creates and processes envelopes that are compliant with the BizTalk Framework 2.0 specification.

Message queuing

Microsoft Windows Server ships with a technology called Microsoft Message Queuing (MSMQ). Message queues are designed to provide scalability, reliability, load-balancing, and a degree of fault-tolerance to mission-critical applications.

A queue is a system service that allows one application to post messages intended for another application, which picks them up when it is ready. Picture the scenario where you are an insurance company. You receive medical records from many different places: hospitals, doctors, other insurance companies. These documents will be processed by your BizTalk Server, a single machine running on your network.

As with any real-time system, the load of your system will vary depending on such factors as the time of day, the number of clients you have, even the phase of the moon (studies show that more accidents happen during a full moon).

Now you have a problem. If you scale your system to handle reasonable loads, the system will bog down and response time will suffer during peak loads. If you build a system that is designed for the peak load, you will have spent a lot of money, and your system will be sitting largely unused most of the time.

In order to keep response time to a minimum, you should be able to accept these messages quickly, so the transmitting organization can get back to their work. You don't want your customers to wait for you to get around to processing the documents during peak loads.

Enter queuing. With a queue in place, messages can be posted very quickly, releasing the transmitting organization from waiting for your system to respond. During light-load times, your BizTalk Server will pull these documents off the queue right away and process them.

During peak-load times, BizTalk Server might not be able to process everything as soon as it comes in, so the queue starts to fill. During the inevitable lulls, BizTalk Server can catch up, emptying the queue.

This is called asynchronous communication, and is an important part of the foundation of BizTalk Server.

If the peak loads last longer than your system can handle, it is very easy to set up another computer running BizTalk Server to read messages off the same queue. This elegant architecture provides instant scalability, as well as a level of fault-tolerance, because if one server goes down, others can pick up the slack.
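BizTalk Server's receive functions take care of this dequeuing for you, but the underlying idea is easy to sketch in code. The following is a conceptual sketch only; the queue path is hypothetical and the System.Messaging namespace is assumed to be referenced.

// Conceptual sketch of a worker pulling messages off a shared queue.
// BizTalk Server's receive functions do this work for you.
// Requires a reference to System.Messaging.dll and "using System.Messaging;".
// The queue path below is hypothetical.
MessageQueue queue = new MessageQueue(@".\private$\MedicalRecords");
queue.Formatter = new ActiveXMessageFormatter();

while (true)
{
   // Receive blocks until a message arrives, so the worker catches up
   // during lulls and simply waits when the queue is empty.
   Message msg = queue.Receive();
   string document = (string) msg.Body;

   // ... process the document ...
}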

Mapping

Mapping is the process of transforming a document from one document specification to another. BizTalk document mapping uses XSLT as the programming language to transform documents from a source specification (source) to a destination specification (sink).

Document maps are created with the BizTalk Mapper application. The BizTalk Mapper creates an XSLT transformation program. Document mapping is done automatically by a channel if the two document definitions defined by the channel are different.

I will talk more about mapping from XML and non-XML sources later in this section.

Orchestration

Orchestration is the defining of business processes and the order in which they occur. Orchestration is similar to workflow, except that orchestration provides a more integrated process flow between systems.

In BizTalk Server, orchestration is the complete definition of a particular business process, which flows from action task to action task. Most actions can have processes associated with them through orchestration ports. These processes are usually applications or objects that provide business functionality that must be executed in order to achieve the desired business process.

An orchestration is created using the BizTalk Orchestration Designer. The BizTalk Orchestration Designer is a tool that integrates Microsoft Visio technology for defining workflows and integrating those workflows with internal or external processes, queues, or systems. A completed orchestration is saved as an XML document that can be executed as many times as required when a particular business process or transaction is needed.

Hop on the Message Bus

Now that I have introduced BizTalk Server, let's see how it fits in with the rest of the FoodMovers system.

If I want to talk to you personally, I can walk up to you and we can communicate as humans have done for a long time. If we are distant, we can use a technological proxy to assist in the communication process: the telephone. A telephone conversation is a private, one-to-one communication. We can think of this as a matrix, where the sender and the receiver can each be one party or many. This is shown in Table 1.

Table 1. Communication matrix

                 One Sender        Many Senders
One Receiver     Telephone         Poll
Many Receivers   Public address    Conference call

Suppose that I want to have a conversation with several people at once. I can set up a conference call, where everyone can take their turns talking. This is a many-to-many communications technique.

Now suppose that I want to talk to a bunch of people in the office, but I don't want to be interrupted with their feedback. I get on the public address system. This is an example of one-to-many communication.

To complete our graph, suppose I want to invite many people to communicate only with me. An example of this is a poll, where a single person contacts many people for the purpose of getting their opinion.

In the FoodMovers project, we need a way for everyone to communicate in a way that is most effective for a particular purpose. How can we achieve this in a way that is scaleable, secure, and easy to maintain?

As I mentioned earlier, I prefer a loosely-coupled, message-based communication and integration model. In order to make this happen, I need some means for these messages to be sent and received.

In our system, we will use a thing called a "message bus." A message bus is a virtual conduit, sitting in a communications cloud, that resides among the applications. The message bus supports the protocol and features that the components require to communicate and integrate. A message bus can also provide a certain level of protocol translation.

Our message bus works as a publish/subscribe system. The publisher exposes an interface that its subscribers use, but the publisher doesn't care who uses its services or which protocol they use, as long as it is compatible with the standards employed. The message bus is what allows us to enable loosely-coupled, message-based communication. The message bus is illustrated in Figure 3.


Figure 3. Components communicate by sending messages across a message bus

Every component interface understands the characteristics of each message bus that it needs to use. A common message bus that you use every day is the Internet. Web sites publish their information in the form of URLs, and your Web browser subscribes to a published site using the agreed-upon standards HTTP and HTML.

Another message bus is the telephone system. A service (your friend) first publishes her contact information (phone number, spoken language, hours she is accessible, etc). When you want to communicate, you place a request for service (dial her number), which is responded to (she picks up the phone), and the communications can begin.

Since the task of each message bus is to allow for efficient communication, it is possible, even common, for a component to work with more than one message bus. For example, you can call your friend while looking at your Web browser to find out what's playing at the local cinema.

When the components are services, the message bus uses Web services standards. The services in the FoodMovers system communicate with each other using a loosely coupled, message-based communication model. We will be using several different message busses, depending upon the type of communication that is required. This is shown in Figure 4.


Figure 4. FoodMovers uses a loosely coupled communication model using multiple message busses for legacy, business-to-business and services-to-services integration.

Most components use the internal network as the message bus, passing SOAP messages to one another. Anyone outside the firewall uses the Internet as the message bus to get messages into our system.

In order to communicate with BizTalk Server, the Update Manager places messages on a queue managed by Microsoft Message Queuing (MSMQ). These messages are picked up one at a time and processed by BizTalk Server in accordance with the orchestration that was defined.

BizTalk Server then places a message on an IBM technology called MQ Series, which is similar to MSMQ. The mainframe speaks MQ Series and pulls messages off the queue when it is ready for them.

The message bus must support the functions that the business components and services require. For example, the MSMQ and MQ Series message buses both support reliable messaging and transactions, but the SOAP message bus does not. There are work-in-progress specifications that will extend SOAP to support these more advanced communication functions. I will discuss some of the specifications and show the implementation in Section 5, Extensions: Building Blocks for Extra Functionality, which covers service interface extensions using Microsoft's Web Services Enhancements (WSE) toolkit.

Item Maintenance Process

Now let's take a look at all of these technologies in action.

As I described in Section 1, the Item Maintenance process involves getting an Item Maintenance document submitted by our suppliers and processing it to update our mainframe item database.

This process involves business-to-business integration as well as legacy integration. The functional diagram and process flow are shown in Figure 5.

(Functional diagram and process flow panels)

Figure 5. The Item Maintenance process accepts maintenance information in two different forms in order to update the mainframe database.

We are receiving item maintenance documents in two different forms. From our traditional suppliers we get an EDI 888 document, and from our new Web service partners we get an XML document. How we process the document depends on which of these two types it is.

An EDI document is shown below.

ISA*00*          *00*          *08*148033863      *08*9251730000     
*020123*1239*U*00401*000000015*0*T*>~
GS*QF*148033863*2137279234*20020123*1239*16*T*004010UCS~
ST*888*160001~N1*VN*Kraft Foods  
Inc.*9*0013390350~N1*BY*FoodMovers*9*006906804~
G62*09*20020123*W*123904~G53*003~
G39*000013649982*VN*000013649982**5.313*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.363~
G69*160/.04Z KENCO COLMBN COFFEE~
G55*UD*000013649982***1*IN*1*IN*1*IN****0.040*OZ***********0.014*N*L~
G69*160/.04Z KENCO COLMBN COFFEE~
G39*000013649983*VN*000013649983**5.313*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.363~
G69*160/.04Z KENCO KENYAN COFFEE~
G55*UD*000013649983***1*IN*1*IN*1*IN****0.040*OZ***********0.014*N*L~
G69*160/.04Z KENCO KENYAN COFFEE~
G39*000013653491*VN*000013653491**5.375*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.291~
G69*160/.04Z KENCO ESPRESSO COFFEE~
G55*UD*000013653491***1*IN*1*IN*1*IN****0.040*OZ***********0.014*N*L~
G69*160/.04Z KENCO ESPRESSO COFFEE~
G39*000013653492*VN*000013653492**5.937*G*L*6.22*IN*8.70*IN*14.72*IN*0.461*CF
*013006*160*0.040*OZ****CN*987654*N*2.858~
G69*160/.04Z KENCO CAPPUCCINO~
G55*UD*000013653492***1*IN*1*IN*1*IN****0.040*OZ***********0.017*N*L~
G69*160/.04Z KENCO CAPPUCCINO~
G39*000013653523*VN*000013653523**5.375*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.291~
G69*160/.04Z KENCO DECAF COFFEE~
G55*UD*000013653523***1*IN*1*IN*1*IN****0.040*OZ***********0.014*N*L~
G69*160/.04Z KENCO DECAF COFFEE~
G39*000013653524*VN*000013653524**5.005*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.396~
G69*160/.04Z KENCO DARK RST COFFEE~
G55*UD*000013653524***1*IN*1*IN*1*IN****0.040*OZ***********0.014*N*L~
G69*160/.04Z KENCO DARK RST COFFEE~
G39*000013653525*VN*000013653525**5.313*G*L*6.42*IN*8.50*IN*14.41*IN*0.455*CF
*013006*160*0.040*OZ****CN*987654*N*2.218~
...

As you can see, EDI is a terse syntax, optimized for getting the maximum amount of information across the minimum bandwidth. The record type is indicated by the first few characters of the record, tildes (~) mark the end of a record, and asterisks separate the fields within a record.
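BizTalk Server's EDI parser handles the full X12 standard, but the basic record-and-field structure is easy to see with a few lines of C#. The snippet below is purely illustrative; ediText is assumed to hold the raw 888 interchange shown above.

// Illustration of the EDI syntax only; BizTalk Server's parser does the real work.
// ediText is assumed to hold the raw interchange shown above.
string[] segments = ediText.Split('~');
foreach (string segment in segments)
{
   if (segment.Trim().Length == 0)
      continue;

   string[] fields = segment.Trim().Split('*');
   // fields[0] is the segment identifier (ISA, GS, ST, G39, G69, ...);
   // the remaining entries are that segment's data elements.
   Console.WriteLine("{0}: {1} data elements", fields[0], fields.Length - 1);
}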

Suppliers that want to use the SOAP Web service interface can access the system by first grabbing the WSDL document. As I mentioned in Section 3, the WSDL contains everything the client needs to know in order to create a document that calls the Web service.

The WSDL file is shown below. It defines the service and operations of the interface.

<?xml version="1.0"?>
<definitions
   xmlns:http="http://schemas.xmlsoap.org/wsdl/http/"
   xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
   xmlns:s="http://www.w3.org/2001/XMLSchema"
   xmlns:s0="https://foodmovers.com/services/UpdateManager"
   xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"
   xmlns:i0="https://FoodMovers.com/schemas/ItemMaintenanceData"
   xmlns:tm="http://microsoft.com/wsdl/mime/textMatching/"
   xmlns:mime="http://schemas.xmlsoap.org/wsdl/mime/"
   targetNamespace="https://foodmovers.com/services/UpdateManager"
   xmlns="http://schemas.xmlsoap.org/wsdl/">
   <types>
      <xs:schema
         xmlns:mstns="https://FoodMovers.com/schemas/ItemMaintenanceData"
         xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"
         xmlns="https://FoodMovers.com/schemas/ItemMaintenanceData"
         attributeFormDefault="qualified" elementFormDefault="qualified"
         targetNamespace=
            "https://FoodMovers.com/schemas/ItemMaintenanceData"
         id="ItemMaintenanceData"
         xmlns:xs="http://www.w3.org/2001/XMLSchema">
         <xs:element msdata:IsDataSet="true" name="ItemMaintenanceData">
            <xs:complexType>
               <xs:choice maxOccurs="unbounded">
                  <xs:element name="ItemMaintenance">
                     <xs:complexType>
                        <xs:sequence>
                           <xs:element name="UPC" type="xs:string" />
                           <xs:element minOccurs="0" name="SupplierName" 
                              type="xs:string" />
                           <xs:element minOccurs="0" name="Description" 
                              type="xs:string" />
                           <xs:element name="Action" type="xs:string" />
                           <xs:element minOccurs="0" name="WholesalePrice"
                              type="xs:decimal" />
                           <xs:element minOccurs="0" name="RetailPrice" 
                              type="xs:decimal" />
                        </xs:sequence>
                     </xs:complexType>
                  </xs:element>
               </xs:choice>
            </xs:complexType>
         </xs:element>
      </xs:schema>
      <s:schema elementFormDefault="qualified"
         targetNamespace="https://foodmovers.com/services/UpdateManager">
         <s:import 
            namespace="https://FoodMovers.com/schemas/ItemMaintenanceData"/>
         <s:element    name="TransmitMaintenance">
            <s:complexType>
               <s:sequence>
                  <s:element minOccurs="0" name="MaintenanceDoc">
                     <s:complexType>
                        <s:sequence>
                           <s:any namespace="https://FoodMovers.com/schemas/
                              ItemMaintenanceData"/>
                        </s:sequence>
                     </s:complexType>
                  </s:element>
               </s:sequence>
            </s:complexType>
         </s:element>
         <s:element    name="TransmitMaintenanceResponse">
            <s:complexType>
               <s:sequence>
                  <s:element 
                     name="TransmitMaintenanceResult" type="s:int"/>
               </s:sequence>
            </s:complexType>
         </s:element>
      </s:schema>
   </types>
   <message name="TransmitMaintenanceSoapIn">
      <part name="parameters" element="s0:TransmitMaintenance"/>
   </message>
   <message name="TransmitMaintenanceSoapOut">
      <part name="parameters" element="s0:TransmitMaintenanceResponse"/>
   </message>
   <portType name="UpdateManagerServiceSoap">
      <operation name="TransmitMaintenance">
         <documentation>Receives maintenance document and responds 
            with status</documentation>
         <input message="s0:TransmitMaintenanceSoapIn"/>
         <output message="s0:TransmitMaintenanceSoapOut"/>
      </operation>
   </portType>
   <binding name="UpdateManagerServiceSoap"
      type="s0:UpdateManagerServiceSoap">
      <soap:binding transport="http://schemas.xmlsoap.org/soap/http"
         style="document"/>
      <operation name="TransmitMaintenance">
         <soap:operation style="document"
            soapAction="https://foodmovers.com/services/
               UpdateManager/TransmitMaintenance"/>
         <input>
            <soap:body use="literal"/>
         </input>
         <output>
            <soap:body use="literal"/>
         </output>
      </operation>
   </binding>
   <service name="UpdateManagerService">
      <port name="UpdateManagerServiceSoap" 
         binding="s0:UpdateManagerServiceSoap">
         <soap:address location="https://localhost/FoodMovers/
            FoodMoversWebServiceProjects_UpdateManager/
            UpdateManager.asmx"/>
      </port>
   </service>
</definitions>

If the supplier is using Visual Studio .NET, the process of adding the FoodMovers UpdateManager service is as easy as adding a Web Reference to the project. If they are using a J2EE-based Java toolkit, they can use the xrpcc utility to create a set of proxy classes based on the WSDL document.
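For example, once the Web Reference has been added, a supplier-side call might look like the following sketch. The UpdateManagerService proxy name comes from the WSDL above, but the exact parameter type that Visual Studio generates for the MaintenanceDoc element may differ, so treat this as an illustration rather than the generated signature.

// Sketch of a supplier-side call to the FoodMovers Update Manager.
// Assumes the Web Reference generated a proxy class named UpdateManagerService.
// Requires "using System.Xml;".
UpdateManagerService svcUpdate = new UpdateManagerService();

XmlDocument doc = new XmlDocument();
doc.Load("ItemMaintenanceData.xml");   // an ItemMaintenanceData document like the one shown below

// The service replies with the number of ItemMaintenance items it received.
int itemsReceived = svcUpdate.TransmitMaintenance(doc.DocumentElement);
Console.WriteLine("FoodMovers accepted {0} items", itemsReceived);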

However they get the service interface information, the supplier is now ready to send us a SOAP request document. An example of an XML Web service document is shown below.

<ItemMaintenanceData
   xmlns="https://FoodMovers.com/schemas/ItemMaintenanceData">
   <ItemMaintenance>
      <UPC>1090000109</UPC>
      <Action>update</Action>
      <RetailPrice>3.35</RetailPrice>
   </ItemMaintenance>
   <ItemMaintenance>
      <UPC>1111112423</UPC>
      <Description>New Item from Hearty Soup Company</Description>
      <Action>insert</Action>
      <WholesalePrice>1.8</WholesalePrice>
      <RetailPrice>3.35</RetailPrice>
   </ItemMaintenance>
   <ItemMaintenance>
      <UPC>1111112412</UPC>
      <Action>update</Action>
      <WholesalePrice>1.8</WholesalePrice>
   </ItemMaintenance>
   <ItemMaintenance>
      <UPC>4119601013</UPC>
      <Description>Chunky Tomato Basil Soup</Description>
      <Action>update</Action>
      <WholesalePrice>0.77</WholesalePrice>
      <RetailPrice>1.98</RetailPrice>
   </ItemMaintenance>
   <ItemMaintenance>
      <UPC>1111112418</UPC>
      <Action>delete</Action>
   </ItemMaintenance>
</ItemMaintenanceData>

This document would be received encapsulated in a SOAP envelope. It is defined in our project as ItemMaintenanceData.

There are advantages of sending an XML SOAP document over sending an EDI document. I will talk about some of these advantages in Section 5, Extensions: Building Blocks for Extra Functionality.

Update Manager

The Update Manager receives the Item Maintenance document. If the document is an XML SOAP request, the Update Manager replies to the requesting service with the number of ItemMaintenance items it received. This is not a confirmation that the maintenance has been performed. Rather, it just acts as a receipt that the document was retrieved intact and accepted. We will not know whether the item maintenance has been completely performed until the mainframe processes it.

Since the Update Manager does not know anything about EDI, it just accepts the document as is.
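To make the flow concrete, the Web method on the Update Manager might look something like the sketch below. This covers only the XML SOAP case; the member names and the SendToQueue helper are assumptions, not the project's actual code.

// Rough sketch of the service side; names are assumptions, not the actual project code.
// Requires "using System.Web.Services;" in the .asmx code-behind.
[WebMethod]
public int TransmitMaintenance(System.Xml.XmlNode MaintenanceDoc)
{
   // Count the ItemMaintenance elements to return as a receipt.
   int count =
      MaintenanceDoc.SelectNodes("//*[local-name()='ItemMaintenance']").Count;

   // Hand the document to BizTalk Server by placing it on an MSMQ queue
   // (the queuing code is shown below, under Legacy Integration).
   SendToQueue(MaintenanceDoc.OuterXml);

   return count;   // a receipt that the document arrived, not that it was processed
}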

Legacy Integration

Whether the document is in XML or EDI format, the Update Manager sends it to BizTalk Server, which does have knowledge of both types of documents. BizTalk Server will apply a different map (XSLT program) to each document depending upon its format. The output of both of these maps is an identical flat file that goes to the mainframe. By the time the mainframe gets the document, it does not know or care what the original transmitted format was.

Sending the document to BizTalk Server involves attaching to a queue that BizTalk Server is listening to, then placing the document on the queue and getting a receipt. The C# code for this is shown below.

// Requires a reference to System.Messaging.dll and "using System.Messaging;".
// queueName and the document being sent are set by the caller.
MessageQueue queue = new MessageQueue();

if (MessageQueue.Exists(queueName))
{
   queue.Path = queueName;
   Message msg = new Message();
   ActiveXMessageFormatter format = new ActiveXMessageFormatter();

   // Serialize the document into the message body.
   format.Write(msg, this.SerializeXml(invoice));

   if (queue.Transactional) 
   {
      // Send inside an MSMQ transaction so the message is delivered reliably.
      MessageQueueTransaction tx = new MessageQueueTransaction();   
      tx.Begin();
      queue.Send(msg, "ItemMaintenanceData", tx);
      tx.Commit();
   }
   else
   {
      queue.Send(msg);
   }
}

The queue is transactional, which ensures reliable messaging; once a send is initiated, it will be retried until the message is successfully committed to the queue.

BizTalk Server

The BizTalk Server is listening to that queue through a "receive function." Whenever BizTalk Server has some time, it looks at the queue to see if there is anything to process.

When BizTalk Server pulls the document off the queue, it must process the document so it can provide the mainframe application with what it expects. The mainframe does not know anything about XML or EDI. It speaks a language called "flat file." The job of BizTalk Server is to create a flat file so that the maintenance transmittal gets committed to the mainframe database after passing through the mainframe business rules that check the validity of the item information.

BizTalk Server needs to transform the input document to a flat file and place this flat file on the queue that the mainframe is waiting for. The BizTalk Server Messaging service "sniffs" the Item Maintenance document in order to understand what kind of document it is, whether it is an EDI or XML document.

As I mentioned earlier, BizTalk Server has the ability to transform any given input to any given output. In this case, we have an XML input and a non-XML input. The output will always be the same non-XML flat file. So how does it do the mapping?

Once it determines the document type, it invokes one of two maps, which do the transformation from the input to the output. BizTalk uses XSLT to do the mapping. XSLT requires that the input document be XML. Transforming an XML input document to a flat file output just involves invoking the XSLT processor with an appropriate XSLT program. This is shown in Figure 6.


Figure 6. XML-to-XML document mapping
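Conceptually, the work a map does is equivalent to running an XSLT transform over the source document. Outside of BizTalk Server, the same idea can be sketched with the .NET Framework's XSLT processor; the file names here are hypothetical, and in BizTalk Server the channel applies the compiled map for you.

// Conceptual sketch only; inside BizTalk Server the channel applies the map.
// Requires "using System.Xml.Xsl;". File names are hypothetical.
XslTransform map = new XslTransform();
map.Load("ItemMaintenanceToFlatFile.xslt");   // the XSLT program behind the map

// Transform the source XML document into the destination format.
map.Transform("ItemMaintenanceData.xml", "ItemMaintenance.flat");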

Documents that are not defined with XML structure can also be mapped, but a more complex path is taken, as shown in Figure 7.


Figure 7. Non-XML to non-XML document mapping (EDI to flat-file)

In this case, a non-XML document is first converted into an XML document, which is defined by a document specification. This is done by a parser that understands the source document syntax and structure. BizTalk Server ships with a number of parsers for common standards, such as EDI. You can also write your own parser if you have a document format that is not supported by the native BizTalk Server.

The temporary XML document created by the parser is transformed into the destination document.

Once the flat file has been created, BizTalk Server submits the flat file Item Maintenance document to the mainframe by placing it on the MQ Series queue. The mainframe applications have business rules that have been written throughout the years that check whether the items that the supplier has sent can be accepted.

The mainframe is busy doing things during the day, so it processes all of the documents at night. This is an asynchronous process. When the mainframe application is finished with each item maintenance document, it sends a confirmation back to BizTalk Server using FTP. If there is a single problem with the Item Maintenance document, then the whole transmittal document is rejected.

After the mainframe processes the item maintenance document, a separate transaction sends a reply to the supplier. If the transmittal was an EDI document, another EDI document is sent to indicate the success or failure of the maintenance task. If the original document was an XML SOAP message, then another SOAP message is sent with results of the maintenance process.

Query Inventory Database

In addition to item maintenance, our suppliers are able to query our inventory database to see what our stocking level is for certain items. This is a risky proposition, since we are exposing some mission-critical and proprietary data to parties outside of our own organization. We have determined, however, that it is an acceptable risk, since we will only expose the information to suppliers that we have learned to trust over the years.

The Inventory Manager exposes a Web service that will provide our trusted partners with information about levels of certain inventory items. This is shown in Figure 8.

(Functional diagram and process flow panels)

Figure 8. An external Web service is exposed so suppliers have the ability to interrogate inventory levels directly.

Verification of credentials is required before any data is returned. This does two things. First, it authenticates that the party requesting the information has a pre-arranged relationship with us that authorizes them to access our system. Second, it lets us know who is accessing the database so we can expose only the items that the supplier sells. We don't want Supplier A seeing how much of its competitor Supplier B's product is on our warehouse shelves. I will be adding authentication to our system using digital signatures based on X.509 certificates. This is discussed in detail in Section 5, Extensions: Building Blocks for Extra Functionality.

The main reason for allowing our suppliers to check on the level of inventory is to give them some idea when we will need to re-order. In fact, we also provide a facility for suppliers to place orders for their own goods. I will cover that later in this section.

Accessing the Inventory Manager Web service provides the supplier with synchronous access to the inventory level of any items that they supply to us. The communication between Hearty Soup Company and the Inventory Manager Web service is shown in Figure 8.

Again, because the integration is based on standards, FoodMovers and Hearty Soup Company can integrate their services quite easily and in an automated way.
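From Hearty Soup Company's side, the call is an ordinary Web service request. The sketch below shows the idea; the proxy class and operation name are assumptions, since the actual contract comes from the Inventory Manager's WSDL.

// Hypothetical client-side call; the proxy class and operation name are assumptions.
// Authentication with X.509 signatures is added in Section 5 using WSE.
InventoryManagerService svcInventory = new InventoryManagerService();

int unitsOnHand = svcInventory.GetInventoryLevel("1090000109");   // query by UPC
Console.WriteLine("Units on hand: {0}", unitsOnHand);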

Since we are delivering this information over public connections, we are opening ourselves to possible risk of exposing proprietary data to those whom we might not want to see the data. In Section 5, Extensions: Building Blocks for Extra Functionality, I will show how to encrypt a document in Web service transactions, so we can prevent access of the data by any parties other than the ones we trust.

Supplier Orders

FoodMovers takes great care in managing its inventory. The business logic, rules, and data access for ordering is encapsulated in a service interface called Order Manager.

During the day, FoodMovers buyers place orders with suppliers to replace the items that are moving off the warehouse shelves. Orders are created with a WinUI application, shown in Figure 9, that the internal employees use. This application connects to the Order Manager, which enters the order into the Supplier Orders tables (SupplierOrders and SupplierOrderItems).


Figure 9. Supplier order application

This application uses many of the Windows Forms objects that are available in Visual Studio .NET. In fact, it uses a DataTable object to display the shopping cart at the bottom of the screen. This DataTable is bound to a DataGrid object. The code below shows how it is done.

static public UserData datUser = new UserData();

// datCart is a DataTable field declared elsewhere in the form.
datCart.TableName = "ShoppingCart";
DataColumn myDataColumn;

myDataColumn = new DataColumn
   ("UPC", System.Type.GetType("System.String"));
datCart.Columns.Add(myDataColumn);
myDataColumn = new DataColumn
   ("Qty", System.Type.GetType("System.Int32"));
datCart.Columns.Add(myDataColumn);
myDataColumn = new DataColumn
   ("Description", System.Type.GetType("System.String"));
datCart.Columns.Add(myDataColumn);
myDataColumn = new DataColumn
   ("WholesalePrice", System.Type.GetType("System.Decimal"));
datCart.Columns.Add(myDataColumn);
myDataColumn = new DataColumn
   ("RetailPrice", System.Type.GetType("System.Decimal"));
datCart.Columns.Add(myDataColumn);

// Total is an expression column, computed as Qty * WholesalePrice for each row.
myDataColumn = new DataColumn
   ("Total", System.Type.GetType("System.Decimal"), 
   "Qty * WholesalePrice", MappingType.Attribute);
datCart.Columns.Add(myDataColumn);

// Binding the DataTable to the DataGrid keeps the grid in sync with the cart.
this.gridCart.DataSource = datCart;

Notice the code for the Total column. It automatically contains the product of the Qty and WholesalePrice fields. When an item is added to the shopping cart, it is loaded into the datCart object. Since the DataGrid object is bound to the datCart object, it updates automatically. The code for adding an item to the shopping cart is shown below.

// Each list box entry is a tab-delimited string: UPC, retail price,
// wholesale price, and description.
string[] SplitString = 
   lbItems.Items[lbItems.SelectedIndex].ToString().Split('\t');
decimal RetailPrice = Convert.ToDecimal(SplitString[1]);
DataRow myDataRow = datCart.NewRow();
myDataRow["UPC"] = SplitString[0];
myDataRow["Qty"] = 1;
myDataRow["RetailPrice"] = RetailPrice;
myDataRow["WholesalePrice"] = Convert.ToDecimal(SplitString[2]);
myDataRow["Description"] = SplitString[3];
datCart.Rows.Add(myDataRow);
this.gridCart.Refresh();

In Section 3, I talked about how the data access is done. To review, all data tables in the SQL database are reflected as DataSet objects in the C# project. The advantage of using the DataSet object is that filling the object with data and writing data back out to the database is easy to do.
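For example, filling the typed DataSet amounts to a single call on a configured data adapter. The adapter name below is an assumption; its SELECT command and connection are set up in the data access layer described in Section 3.

// Sketch: filling the typed DataSet from the database.
// "adapter" is an assumed SqlDataAdapter configured with a SELECT command.
SupplierOrderData datSupplierOrder = new SupplierOrderData();
adapter.Fill(datSupplierOrder, "SupplierOrders");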

In the case of supplier orders, there are two SQL Server database tables that must be written to when an order is placed. The SupplierOrders table contains header information about an order. This includes the order date, supplier identifier, the date needed, and so on.

There is also a SupplierOrderItems table that contains one record for each item ordered. It is related to the SupplierOrders table by the OrderID field. We created a DataSet by first creating an XML schema that reflected both tables plus a relationship between them. This is shown in Figure 10.


Figure 10. The SupplierOrderData tables are linked with a relationship object.

Visual Studio .NET auto-generated a DataSet class for us, so we don't need to do that work.

Because there are two tables involved, we need to set up a relationship. I use the Relations object to link the OrderID in both tables. This technique is detailed in Section 2, Templates, Policies, Database, and Service Interface Design.
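In code, the link amounts to a single DataRelation between the OrderID columns. The relation name below is illustrative; the typed DataSet generated from the schema already defines an equivalent relation.

// Sketch of the relation; the generated typed DataSet already contains it.
SupplierOrderData data = new SupplierOrderData();
data.Relations.Add("SupplierOrder_OrderItem",
   data.Tables["SupplierOrder"].Columns["OrderID"],
   data.Tables["OrderItem"].Columns["OrderID"]);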

Now we can use this object in our application. When the user finishes selecting items and wants to commit the order, the application must write the data to the SupplierOrder tables in the database. The SupplierOrders DataSet is populated by copying appropriate information from the datCart object. This is shown below.

SupplierOrderData datSupplierOrder = new SupplierOrderData();

// Create the order header row; the real OrderID is assigned by the database.
DataRow drNewOrder = datSupplierOrder.Tables["SupplierOrder"].NewRow();
drNewOrder["OrderID"] = 0;
drNewOrder["OrderDate"] = DateTime.Now;
drNewOrder["NeedBy"] = this.dtpNeedBy.Value;
drNewOrder["SupplierID"] = SupplierID;
drNewOrder["UserID"] = Configuration.AuthenticatedUserID;
datSupplierOrder.Tables["SupplierOrder"].Rows.Add(drNewOrder);

// Create one order item row for each line in the shopping cart.
for (int i = 0; i < datCart.Rows.Count; i++)
{
   DataRow drNewItem = datSupplierOrder.Tables["OrderItem"].NewRow();
   drNewItem["OrderID"] = "0";
   drNewItem["UPC"] = datCart.Rows[i]["UPC"];
   drNewItem["Quantity"] = datCart.Rows[i]["Qty"];
   datSupplierOrder.Tables["OrderItem"].Rows.Add(drNewItem);
}

Once the DataSet is populated, it is a simple matter to insert the order. The OrderManager service exposes an operation called InsertSupplierOrder, which takes the SupplierOrderData as an argument. This is shown below.

public static SupplierOrderData datConfirmOrder;
OrderManagerService svcOrderManager = new OrderManagerService();

datConfirmOrder = svcOrderManager.InsertSupplierOrder(datSupplierOrder);

The InsertSupplierOrder method is located in DataAccess/SupplierOrders.cs. The method is shown below.

public SupplierOrderData InsertSupplierOrder(SupplierOrderData data)
{
   SqlCommand command = dsCommand.InsertCommand;

   command.CommandText = "InsertSupplierOrder";
   command.CommandType = CommandType.StoredProcedure;
   command.Parameters.Clear();
   command.Parameters.Add(new SqlParameter("@OrderDate",
      System.Data.SqlDbType.DateTime,  4, "OrderDate"));
   command.Parameters.Add(new SqlParameter("@NeedBy",    
      System.Data.SqlDbType.DateTime,  4, "NeedBy"));
   command.Parameters.Add(new SqlParameter("@SupplierID",
      System.Data.SqlDbType.Int,       4, "SupplierID"));
   command.Parameters.Add(new SqlParameter("@UserID",    
      System.Data.SqlDbType.NVarChar,   10, "UserID"));
   dsCommand.InsertCommand = command;
   dsCommand.Update(data,"SupplierOrders");

   // get the order number that was returned
   int NewOrderID = 0;
   NewOrderID = Convert.ToInt32(data.SupplierOrder[0].OrderID);

   foreach (DataRow pRow in data.Tables["OrderItem"].Rows)
      pRow["OrderID"] = NewOrderID;

   command.CommandText = "InsertSupplierOrderItems";
   command.CommandType = CommandType.StoredProcedure;
   command.Parameters.Clear();
   command.Parameters.Add(new SqlParameter("@OrderID",   
      System.Data.SqlDbType.Int,       4, "OrderID"));
   command.Parameters.Add(new SqlParameter("@UPC",       
      System.Data.SqlDbType.NVarChar, 10, "UPC"));
   command.Parameters.Add(new SqlParameter("@Quantity",  
      System.Data.SqlDbType.Int,       4, "Quantity"));
   dsCommand.InsertCommand = command;
   dsCommand.Update(data,"OrderItems");

   return data;
}

Even though the two tables are related with a DataRelation, they still need to be updated individually. First, we update the SupplierOrders table by invoking the InsertSupplierOrder stored procedure, which has been specified as the InsertCommand property of the data adapter.

The OrderID field is an auto-number integer (identity), which will be set once the order is entered. After inserting the order, we get that OrderID and step through the collection of SupplierOrderItems and set the OrderID field.

Once this is set, the SupplierOrderItems table is populated using the InsertSupplierOrderItems stored procedure.
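The method assumes a SqlDataAdapter field named dsCommand that is already wired to a database connection. A minimal sketch of that setup is shown below; the constructor and the Configuration.ConnectionString source are assumptions, since the actual wiring lives elsewhere in the DataAccess project.

// Sketch of the assumed data-access plumbing behind InsertSupplierOrder.
// The SupplierOrders constructor and Configuration.ConnectionString are
// assumptions for illustration (System.Data.SqlClient namespace).
private SqlDataAdapter dsCommand;

public SupplierOrders()
{
   SqlConnection connection =
      new SqlConnection(Configuration.ConnectionString);

   dsCommand = new SqlDataAdapter();
   dsCommand.InsertCommand = new SqlCommand();
   dsCommand.InsertCommand.Connection = connection;
}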

Once an order is placed in the database, it is sent to the purchasing department, where it is printed and sent to the supplier for fulfillment. This project does not address that part of the FoodMovers organization.

Supplier-Entered Orders

FoodMovers saw an opportunity to reduce its buyers' workload and to work more closely with some of its suppliers. So, in addition to having internal buyers place orders with an in-house system, we also allow our trusted partners to place orders for certain items. This is similar to the services exposed to those same trusted partners for accessing our inventory database.

Using the service interface exposed by Order Manager, Hearty Soup Company can place orders directly. When it does, the Order Manager runs its business logic and rules in order to authenticate the supplier, validate the order, and then enter it. This is shown in Figure 11.


Figure 11. The Order Manager system enters orders and maintains data integrity of orders for suppliers.

The process is exactly the same whether the order comes from the local WinUI application or from an external partner using a Web service. Of course, there is an authentication requirement for orders received over a public bus. We could build that logic into the application, but there is a better way.

In Section 5, I will use Microsoft's Web Services Enhancements (WSE) toolkit to handle both user authentication and encryption. Using this toolkit has two advantages over doing it ourselves. First, we don't need to create and maintain the code. Second, WSE uses open standards for authentication and encryption, so client applications can be written on any platform that supports those standards.

Store Orders

FoodMovers is a "middleman." We order from suppliers and put their goods on our warehouse shelves. Our customers, the stores, order from us. The Order Manager handles store orders, also.

The ordering system also includes an application for internal FoodMovers staff to place an order on behalf of a particular store. Like the Supplier Order interface, the Store Order interface is a WinUI application that connects to the Order Manager. A typical screen is shown in Figure 12.


Figure 12. Store order application

The internal processing required for placing a store order is essentially the same as placing an order from a supplier. The WinUI application shows a shopping cart, which is copied to the StoreOrderData DataSet object on checkout. This is sent to the InsertStoreOrder method, which is in DataAccess/StoreOrders.cs. Most of this code is described in Section 3.
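For comparison with the supplier-order code shown earlier, a condensed sketch of the store-order checkout path follows. The StoreOrderData table and column names and the StoreID variable are assumptions based on the supplier-order pattern; the actual code is in the WinUI project and DataAccess/StoreOrders.cs.

// Sketch: copying the shopping cart into a StoreOrderData DataSet and
// sending it to the Order Manager. Table and column names are assumed.
StoreOrderData datStoreOrder = new StoreOrderData();

DataRow drNewOrder = datStoreOrder.Tables["StoreOrder"].NewRow();
drNewOrder["OrderID"] = 0;   // identity assigned by the database
drNewOrder["OrderDate"] = DateTime.Now;
drNewOrder["StoreID"] = StoreID;
drNewOrder["UserID"] = Configuration.AuthenticatedUserID;
datStoreOrder.Tables["StoreOrder"].Rows.Add(drNewOrder);

for (int i = 0; i < datCart.Rows.Count; i++)
{
   DataRow drNewItem = datStoreOrder.Tables["StoreOrderItem"].NewRow();
   drNewItem["OrderID"] = 0;
   drNewItem["UPC"] = datCart.Rows[i]["UPC"];
   drNewItem["Quantity"] = datCart.Rows[i]["Qty"];
   datStoreOrder.Tables["StoreOrderItem"].Rows.Add(drNewItem);
}

// Hand the populated DataSet to the Order Manager service.
OrderManagerService svcOrderManager = new OrderManagerService();
StoreOrderData datConfirm = svcOrderManager.InsertStoreOrder(datStoreOrder);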

Store-Entered Orders

In addition to our internal WinUI client for entering store orders, there are several ways for the stores themselves to enter orders.

The functional diagram and process flow are shown in Figure 13.


Figure 13. The Order Manager takes care of orders from stores from a number of different input types.

The Order Manager exposes the same methods (operations) that are used by the WinUI store order interface and the external partner interface. The only thing that is different in the store-entered orders is the interface.

Smaller stores that have a computer can log onto FoodMovers' Web site to place their order one item at a time. In a typical scenario, a store owner would keep track during the day of what items need to be restocked. At the end of the day, the owner would log onto the interactive ordering site and place his order.

This is a convenient way to enter items, and is much more efficient than ordering by phone or fax. The Web site is optimized for smaller stores and provides an interface that is efficient over dial-up connections.

If a store wants to upgrade from this largely manual method, it can buy a wireless device with a barcode scanner running Pocket PC software. This is essentially the same device the warehouse personnel use to receive incoming orders, but has different software. In a typical scenario, a grocer would simply scan or enter the UPC number of the product and enter the quantity desired. The device would send a message directly to the compact interface at FoodMovers through a cellular data connection built into the device. Instant feedback is given, and the grocer knows that the order has been received. This makes inventory maintenance at the grocery store much easier and faster.

Medium-size store chains probably already have some kind of computer infrastructure that tracks sales throughout the day and keeps track of inventory levels for each of its products. This type of system usually provides a list of items that it needs to order every night. Since this list is already available, it would be redundant for someone to enter the data onto a Web site or use a scanner. For these stores, the FoodMovers order interface allows the upload of a Microsoft Excel spreadsheet containing the order information. FoodMovers IT personnel would work with these chains to teach them how to create a compatible spreadsheet and upload it over their Web interface.
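On the server side, one way to read such an uploaded spreadsheet is through the Jet OLE DB provider in ADO.NET, as sketched below. The saved file path, the sheet name, and the column headings (UPC and Quantity) are assumptions about the layout FoodMovers and the chain agree on; the project itself may parse the file differently.

// Sketch: reading an uploaded order spreadsheet (System.Data.OleDb).
// The uploadedFilePath variable, the "Order" sheet name, and the
// UPC/Quantity column headings are assumed conventions.
string connString = "Provider=Microsoft.Jet.OLEDB.4.0;" +
   "Data Source=" + uploadedFilePath + ";" +
   "Extended Properties=\"Excel 8.0;HDR=Yes\"";

DataTable dtUpload = new DataTable();
using (OleDbConnection connection = new OleDbConnection(connString))
{
   OleDbDataAdapter adapter =
      new OleDbDataAdapter("SELECT UPC, Quantity FROM [Order$]", connection);
   adapter.Fill(dtUpload);
}

// Each row can then be copied into the same StoreOrderData DataSet used
// by the WebUI shopping cart before calling InsertStoreOrder.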

Let's take just one as an example. The WebUI form is shown in Figure 14.


Figure 14. The Store Order application allows users to access the order system from anywhere.

When the Check Out button is clicked, the shopping cart object is copied to the StoreOrderItem object, which is updated by using the InsertStoreOrder operation exposed by the Order Manager interface.

The Store Order interfaces show the power of a service-oriented architecture. We wrote the Order Manager once, but we were able to use it to build five distinct processes on four different platforms. By layering the data, business logic, and presentation code, we are able to leverage our development, resulting in a more robust system.

Services-to-Services Integration

Now that I have covered the inventory maintenance and ordering processes, I'd like to talk about integrating Web services internally.

In the FoodMovers scenario, the Warehouse Manager connects directly to the Inventory Manager to provide inventory information for warehouse functions. This service-to-service integration is made easy by the use of Web services standards.

Integrating Web services is much easier because we have exposed a common interface: the same character set, vocabulary, service definitions, and encapsulation characteristics. As a result of this common interface, integration between service interfaces becomes as easy as accessing the services from the user interfaces, as I did earlier in this section.

Receive supplier orders

When a truck from a supplier pulls up to our warehouse, the items in the truck must be loaded into the warehouse and marked in our orders database as received. The warehouse interface has a ReceiveOrders process, which is designed to update the database when items are received.

Warehouse Manager does a sanity check on the items and then sends a request to the Inventory Manager, which updates the database and reports on the status. The functional diagram and process flow are illustrated in Figure 15.


Figure 15. The Warehouse Manager integrates the warehouse receiving functions from a wireless device or a Windows terminal.

The Warehouse Manager can be accessed with a WinUI client if the user has a printout from the truck that must be entered. But there is an alternate interface that allows warehouse users to scan items with a wireless device as they come off the truck.

This Pocket PC device is connected to the warehouse floor system by radio. It can update items immediately and provide instant feedback to the user by way of a confirming beep.

Whichever method is used to receive an item, the item information captured by the clients is now available to the Warehouse Manager. The Warehouse Manager runs its business rules against the item to check that the received item is, in fact, outstanding in the order database.

Once the sanity check is completed, the Warehouse Manager sends the transaction information to the Inventory Manager. This requires the two Web services to communicate. As I have mentioned before, this is the advantage of a service-oriented architecture: business logic and data access are combined to deliver a business outcome, to accomplish a service.

As the Warehouse Manager hands each item to the Inventory Manager, the Inventory Manager marks the item as received in the SupplierOrders and SupplierOrderItems tables and updates the Inventory database to reflect the new goods coming in.
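In code, this service-to-service hop looks just like the client calls shown earlier in this section: the Warehouse Manager holds a proxy to the Inventory Manager Web service and invokes an operation for each received item. The operation and type names in the sketch below are illustrative assumptions, not the actual project names.

// Sketch: the Warehouse Manager passing received items to the Inventory
// Manager through its Web service proxy. The ReceivedItem type, the
// IsOutstanding check, and the MarkItemReceived operation are assumptions.
InventoryManagerService svcInventoryManager = new InventoryManagerService();

foreach (ReceivedItem item in receivedItems)
{
   // Sanity check: the item must belong to an outstanding supplier order.
   if (!IsOutstanding(item.OrderID, item.UPC))
      continue;

   // The Inventory Manager marks the order item received and updates
   // the Inventory table to reflect the new goods.
   svcInventoryManager.MarkItemReceived(item.OrderID, item.UPC, item.Quantity);
}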

Ship to stores

Our last subsystem provides the functionality for fulfilling store orders. Each morning, FoodMovers warehouse personnel must load their trucks for delivery to the stores. The Warehouse Manager system has an interface that prints a shipping manifest for all store orders due that day. This system is illustrated in Figure 16.


Figure 16. The Warehouse Manager provides a function for printing a set of shipping instructions for each truck.

Store orders have been entered during the day, and are sitting in the database with a "need by" date. The Shipping Manifest uses Crystal Reports to create a paper report listing the items that go to each store.

The inventory database records a shelf address for each item, and each store is assigned to a particular driver. The system creates one report for each driver, with all items sorted by shelf address in the warehouse. This report is given to the driver, who knows how to sort the items in the truck to make the most sense when he or she is out on delivery.
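The per-driver sorting can be done with a DataView before the data is handed to Crystal Reports. In the sketch below, the manifest table and its Driver and ShelfAddress columns are assumptions about the query that joins store orders, order items, and inventory shelf locations.

// Sketch: ordering manifest rows by driver and warehouse shelf address.
// dtManifest, Driver, ShelfAddress, and driverName are assumed names.
DataView viewManifest = new DataView(dtManifest);
viewManifest.Sort = "Driver, ShelfAddress";

// One report per driver: filter the sorted view for each driver in turn
// and feed the result to the Crystal Reports document.
viewManifest.RowFilter = "Driver = '" + driverName + "'";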

Conclusion

In this section, we have seen several examples of services interacting with other services, with end-user applications, and with external partners. The architecture of the system lends itself nicely to providing a robust, scaleable system because of the layering of certain aspects of data and logic. The service-oriented architecture approach allows us to define the entire system as a set of services that are exposed by interfaces that provide the correct data to the correct application.

We used several different technologies, including BizTalk Server, mainframe applications, and Web standards, all implemented using Microsoft Visual Studio .NET.

In the coming sections of this project, I will extend the system we have built by using Microsoft's Web Services Enhancements (WSE) and finally deploy, test, and monitor the system. (To see an overview of the entire project, read FoodMovers: Building Distributed Applications using Microsoft Visual Studio .NET.)

The following sections will cover:

Section 5, Extensions: Building Blocks for Extra Functionality

By now, we have created a system using the tools in Visual Studio .NET Enterprise Architect Edition. But we have just a basic system. What about security? What about attachments? What about updates? What about administration? We could develop these ourselves, but it would be nice if there was an alternative to custom development of these pretty standard pieces.

What we need is a concept of interchangeable parts for software development. This has been tried again and again with varying success. The C programming language came with a standard library (stdlib) of functions, such as printf and sscanf, that most C programmers gladly used rather than writing their own. Later, the Microsoft Foundation Class (MFC) for C++ development was made available to programmers working in an object-oriented Windows environment. Who wants to write a dialog box function if there is one available that works and does mostly what is needed?

In this section, I talk about the Web-service version of interchangeable parts. They take the form of standard extensions that are becoming available in the Web services universe. These extensions are part of Microsoft's Web Services Enhancements for Microsoft .NET (WSE). WSE extensions take the form of building blocks that can be integrated into a Web service quickly and easily. We will add attachments and security to our system to show how the building-block approach works.

Section 6, Going Live: Instrumentation, Testing, and Deployment

Once the architecture is designed and the code framework is created using Visual Studio, it is time to describe our plan for deployment and administration. In addition, there are several areas of implementation that need to be addressed before a robust, reliable, and secure architecture is deployed.

First, we need to develop a plan for "instrumentation." By placing "sensors" in our application, we can use instruments to provide a dashboard of our deployed system. Then we need to exercise the system in two areas, test and staging, before finally deploying the system in a production environment.

In this section, I detail a plan for exception and event management, and introduce the concept of "exception mining," which provides a method for wading through the information stream coming from the application to find events that need attention.

About the author

Brian Travis is Chief Technical Officer and Founder of Architag International Corporation, a consulting and training company based in Englewood, Colorado. Brian is an expert in real-world XML implementations. Since founding Architag in 1993, he has created intelligent content management systems and e-business solutions for Architag clients around the world. Brian is also a noted instructor and popular lecturer in XML and related standards. In his role as principal instructor for Architag University, he has been teaching clients about XML in the U.S., Europe, Africa, and Asia.

Brian has lectured at seminars around the world and has written about XML, Web services, and related technologies. His most recent book, Web Services Implementation Guide, is a guide for IT architects and developers who need to integrate internal systems and external business partners. The book provides the basis for understanding the goals of Web services for the Enterprise Architect, Project Architect, Deployment Architect, developer, and manager. It outlines the steps an organization must take in order to align itself with the new world of Web services.