Bryan,

Each of our ESBs acts independently, allowing the one in the Philippines to serve local requirements that don't involve corporate or shared data. At the same time, the local ESB can act as an extension of the corporate ESB when we need to communicate between the two. They can also be configured so that if the main ESB goes down, the other ESB can assume its duties. Most of the commercial ESBs have this capability built in. If you go with something like Mule, you'll probably have more work to make this happen.
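Just to make the "more work" part concrete, it usually boils down to writing your own failover logic in front of the two buses. Here's a bare-bones sketch of the idea - purely illustrative, so the endpoint handling, timeouts, and class name are made up rather than anything from our setup:

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

// Purely illustrative: try the primary ESB endpoint first and, if it is
// unreachable, fall back to the secondary. A real version would also check
// response codes, retry, and persist messages it could not deliver.
public class FailoverSender {

    private final List<String> endpoints; // e.g. corporate bus first, then local bus

    public FailoverSender(List<String> endpoints) {
        this.endpoints = endpoints;
    }

    public int send(byte[] payload) throws IOException {
        IOException lastFailure = null;
        for (String endpoint : endpoints) {
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(endpoint).openConnection();
                conn.setConnectTimeout(5000);   // fail over quickly if the bus is down
                conn.setReadTimeout(15000);
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Content-Type", "text/xml");
                OutputStream out = conn.getOutputStream();
                out.write(payload);
                out.close();
                return conn.getResponseCode();  // delivered; stop trying fallbacks
            } catch (IOException e) {
                lastFailure = e;                // endpoint unreachable; try the next one
            }
        }
        throw lastFailure != null ? lastFailure
                : new IOException("no ESB endpoints configured");
    }
}

A commercial bus gives you that sort of thing (plus health checks, persistent queues, and so on) out of the box, which is why I'd weigh it against rolling your own.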
As far as the data model goes, the approach we've taken is to create schemas for the various entities we're moving across the ESB. For example, we have a canonical model for customer, WIP transaction, PO, Die Release, etc. There are still a lot of areas we don't have canonical models for, but we're adding new schemas as we need them. The schemas are independent of any of the database structures where the data may come from or where it may end up - what's important is that you capture all of the data that is available or could be required for the entity you are modeling. Take a customer, for example. System A may only need customer name and customer number, while system B needs those fields as well as address, city, state, country, etc. All of those elements need to be part of the schema that you define.

All of the work that we've done actually started from external B2B integration and has worked its way back into the rest of the organization. In our case, we participate in RosettaNet, which provides standard XML formats (and more) for transaction types that are specific to the electronics industry. These schemas represent the data format that our customers expect to receive from our systems. Our internal systems don't store data in the same format, so what we do is this: data from source systems is first converted into our canonical format, based on transaction type, and temporarily stored in an XML database. When it's time to send to a customer, we pull the data from the XML db and transform it into the RosettaNet-specific format (which is almost never truly standard), and then the data is sent on to the customer. We also run the same process in reverse: we get customer data in RN format, transform it into our canonical format before storing it in our XML db (most of the time), then transform it again for our internal systems, whether they're SAP or our manufacturing execution systems.

How you actually connect a DB to your ESB really depends. In some cases, our ESB will reach directly into a DB (usually staging tables) and extract the data it needs for a message. We prefer it when systems push data to the ESB, but that isn't always possible in our environment. In the cases where we pull data directly, it's done using JDBC. In our case, we created a generic POJO that does the extraction, and it simply becomes a step in a "sequence" or "flow" (there's a rough sketch of the idea toward the end of this note).

For governance, there are really two types that come into play - design time and runtime. What you're talking about regarding availability, etc. is handled by runtime governance. You don't want to build this yourself unless you have to. There are decent systems out now that handle it (Software AG has a good one, as does AmberPoint). Most governance solutions include UDDI, but the directory is really only a minor part of their overall functionality. It can be difficult to wrap your brain around all of SOA governance, but once the lightbulb goes on, it gets easier. Both SAG and AmberPoint regularly offer webcasts and sometimes local seminars on SOA governance. I suggest checking them out for some free info on exactly what they offer and the problems they're trying to solve. Eventually you'll get to the point where you realize that services alone are only part of the story, and that what's actually more important is treating them as manageable resources (sounds like this is what you're saying), where each service has an SLA, security, auditability, versioning, etc.

I hope this helps. If you have more questions, keep them coming.
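One more thing before I sign off, since the JDBC extraction piece is easier to show than to describe. Below is a very rough sketch of the shape of that generic extraction step. To be clear, this isn't our actual code - the class name, SQL, and XML wrapping are made up just to illustrate the idea:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

// Illustrative only: pulls everything from a staging table over JDBC and
// wraps it in simple row/column XML that a later step in the flow can
// transform into the canonical schema. Real code would escape the values,
// page through large tables, and take the table name from the flow config.
// Assumes the JDBC driver for the source DB is on the classpath.
public class StagingTableExtractor {

    public String extract(String jdbcUrl, String user, String password,
                          String stagingTable) throws SQLException {
        StringBuilder xml = new StringBuilder("<rows>");
        Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT * FROM " + stagingTable);
            ResultSetMetaData meta = rs.getMetaData();
            while (rs.next()) {
                xml.append("<row>");
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    String column = meta.getColumnName(i).toLowerCase();
                    String value = rs.getString(i);
                    xml.append("<").append(column).append(">")
                       .append(value == null ? "" : value)
                       .append("</").append(column).append(">");
                }
                xml.append("</row>");
            }
        } finally {
            conn.close();
        }
        return xml.append("</rows>").toString();
    }
}

In the flow, a step like this just gets pointed at a staging table, and the XML it produces is handed to the next step to be mapped into the canonical schema.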
This is fun ;-)

-Rob

On Apr 3, 10:23 am, "Bryan Hogan" <[EMAIL PROTECTED]> wrote:
> Hi Rob,
>
> How do you manage your federated ESB architecture? Is there built-in
> management support in Crossvision? I assume Crossvision manages the
> routing depending on location, or are these sites disaster recovery sites?
> I am currently working on a canonical data model for our endpoints;
> however, I am having trouble grasping how to connect databases. How do
> you present your data model?
>
> We have less of a need for connecting internal applications and more of
> a customer-centric approach. I believe it would be wise to provide
> access to each service via UDDI as we go. Governance will be critical
> to us. In a basic form, I was thinking of developing unit tests for each
> endpoint and build servers which polled each to log and report
> availability, usage, etc.
>
> My main sticking point is in understanding how to represent the data
> model for DB-to-DB needs, such as implementations where data is
> transformed from disparate methods into our data model. I'm not
> entirely sure this belongs in the ESB; however, it seems that it can
> fit.
>
> Look forward to your posts.
>
> Thanks,
> Bryan
>
> -----Original Message-----
> From: [email protected] [mailto:[EMAIL PROTECTED]] On Behalf Of Rob Brooks-Bilson
> Sent: Thursday, April 03, 2008 12:16 AM
> To: CFCDev
> Subject: [CFCDEV] Re: architecture question: communication between applications
>
> Hi Bryan,
>
> I'm working on the post, slowly ;-). In the meantime, right now, we
> have a federated ESB architecture with one ESB in the US and one in
> the Philippines (probably will be adding a third in Korea eventually).
> Our current ESB is Software AG's Crossvision Service Orchestrator.
> Since SAG recently bought WebMethods, the combined company will be
> combining their ESB/integration server products into a new offering
> later this year. We'll eventually upgrade to that product. We are
> using web services, message queues, JDBC connectors, FTP, email,
> directory watching, HTTP POST, and custom Java adapters for our
> endpoints. We have a canonical data model for most of our transaction
> types (this is one of the most important things you will need to do),
> and we're currently transforming and pushing millions of messages
> through the bus a month. Our ESB is used for both internal
> integration and for RosettaNet-based B2B integration.
>
> We've been at this for a little over two years now, and are just
> heading down the governance road, where we'll be putting the
> infrastructure, policies, etc. in place to better manage what we've
> built. UDDI will be a part of this. Right now, we're not exposing
> much in the way of services to our application developers, so it's
> been relatively easy to manage up until now (we've mostly been using
> the ESB to integrate existing systems). There are several governance
> products we've looked at, and there is a lot of promise, but there's
> also a lot of complexity (and cost) to most of them.
>
> -Rob
