About 15 years ago, Microsoft upgraded their internal system that manages the list of unique product offerings: our product catalog. The catalog is fairly complex. Microsoft sells products all around the world, in over a hundred different languages, using different mechanisms for licensing. It is not enough to say that Microsoft sells Word; more precisely, Microsoft sells Word 2003 Service Pack 1, Portuguese edition, in Brazil, as sold through the Open Value license program. The packaging is unique, as are many of the localized aspects of the application itself.
Of course, the data in this catalog is key to a lot of business processes, so literally hundreds of systems have been integrated, either directly or indirectly, with this source system. The categories of downstream systems include financial systems, order management, sales allocation, supply chain, marketing management, partner management, and more, with a dozen or more applications in each category.
The creation of this catalog source predates most of the ERP systems within Microsoft. Therefore, the ERP systems, including Dynamics AX, while capable of managing a large and complex catalog, are largely not being used for that purpose.
Most of the downstream systems that consume this catalog do so through an older Master Data Management (MDM) system that was developed around the same time. The MDM system provides a way to subscribe to large flat files and/or tables (remember: this catalog system predates XML) that contain the catalog data. If you want to write an application that consumes this catalog, your application can subscribe to get hourly updates directly to your database tables, and the MDM system manages the SQL integration at a fine-grained level.
It is basically a single table, so the feed is flat (no hierarchy). While it is managed using SQL, it is not dissimilar to managing flat files. This large flat feed accounts for over 90% of all of the systems that integrate with the product catalog system.
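To make the shape of that integration concrete, here is a minimal sketch of a flat, single-table feed and an hourly-update subscriber. The column names and the upsert rule are my own invention for illustration; the post does not describe the real schema.

```python
# Hypothetical sketch of the flat (single-table) catalog feed and an
# hourly-update subscriber. All column names are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class CatalogRow:
    product_id: int        # identity column assigned by the old catalog system
    name: str              # e.g. "Word 2003 SP1"
    language: str          # e.g. "pt-BR"
    country: str           # e.g. "Brazil"
    license_program: str   # e.g. "Open Value"

def apply_hourly_update(local_table: dict[int, CatalogRow],
                        feed_rows: list[CatalogRow]) -> None:
    """Upsert each feed row into the subscriber's local table, keyed on the
    catalog's identity column -- the fine-grained SQL integration that the
    MDM system manages on behalf of each downstream application."""
    for row in feed_rows:
        local_table[row.product_id] = row

local: dict[int, CatalogRow] = {}
apply_hourly_update(local, [
    CatalogRow(1001, "Word 2003 SP1", "pt-BR", "Brazil", "Open Value"),
])
```

Because every subscriber keys its local copy on the same identity column, the whole downstream estate is tightly bound to those IDs, which is exactly what makes replacing the source system hard.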
Here’s an illustration:
OK, so we have this heavily wired infrastructure and we want to change it. What do we change it to? Well, Microsoft Enterprise Architecture has stated, categorically, that it is better to leverage an existing system than buy a new one, so we are leveraging our existing ERP and CRM infrastructure. We will move the catalog to the ERP system.
Of course, ERP systems are known for being versatile. Every business group within Microsoft wants their own attributes attached to their own products, often in rather distinct ‘models.’ So a team of folks went through the business, for well over a year, interviewing all of the different business groups about what they want in their models. I’m confident that they will be able to implement their model, or something that represents the model, in the ERP system.
Interestingly enough, the ERP system is not new. We are just using it in a new way. One of those downstream systems that consumes the old catalog is our ERP system. (Really, it’s plural. There are many ERP systems in place, including Dynamics AX. We’ve picked one to source this data.)
Now, how do we change the integration? The obvious answer is right in front of us: have the new system produce the old data feed. That way, we get the benefit of the new data model, but all the downstream systems that rely on the old data feed can continue to operate, at least until our army of IT developers can crack the covers and either shut down the old apps or refactor them to use the new data structure.
So, here’s the logical view of what we would change:
Making the new system produce the old feed will not be easy. You see, with the addition of these different models and with the increased versatility of the ERP solution, we will have a rather different-looking catalog. The number of products will be different, as will their names and attributes. The end result will be similar in that we will still sell products, but some of the things we cannot do in the old system will be fairly easy in the new one.
In addition to this, the old catalog system used auto-number (identity) columns to create unique IDs. The ERP system would create altogether new numbers. If we are going to keep the feed functional for more than a day, we'd need the business event of 'add a new product' to produce the same data effects in the MDM system as before. Making the ERP system do this is difficult and byzantine.
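One way to see the ID problem is as a key cross-reference: the first time a product appears in the ERP system, a legacy identity-style ID is allocated for it, and every later update reuses that same ID so the feed stays stable. This is a hedged sketch of that idea, with all names invented; the post does not say how the real systems solve it.

```python
# Hypothetical key cross-reference between new ERP product numbers and the
# old catalog's identity-column IDs. All names here are illustrative.
class LegacyKeyMap:
    def __init__(self, next_identity: int = 1):
        self._map: dict[str, int] = {}   # ERP product number -> legacy ID
        self._next = next_identity       # mimics the old IDENTITY counter

    def legacy_id(self, erp_product_no: str) -> int:
        """Return the stable legacy ID for an ERP product, allocating a new
        identity-style value the first time the product is seen."""
        if erp_product_no not in self._map:
            self._map[erp_product_no] = self._next
            self._next += 1
        return self._map[erp_product_no]

keys = LegacyKeyMap(next_identity=5000)
first = keys.legacy_id("ERP-WORD-2003-PTBR")   # first sight: allocates 5000
again = keys.legacy_id("ERP-WORD-2003-PTBR")   # later updates reuse the same ID
```

The hard part, as the post notes, is making the ERP system itself maintain such a mapping as a side effect of the 'add a new product' business event.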
This is tricky. We need the ability to move a downstream system from the old data structure to the new one. We cannot move all of them at once. So, for some period of time, both the old and new data structures have to be available. But producing the old data structure from the new system is difficult.
The solution we are looking at is interesting. We don’t produce the old data structure from the new system. We produce it from the old one.
In this model, all changes to the product catalog happen in the ERP system. The ERP system has the new data structures in it. In the old model, data used to flow from the catalog system to the ERP system. In this model, we reverse the flow: data will flow from the ERP system to the old catalog system.
Now, we can't have users entering data in the old system as well, since that data wouldn't make it to the ERP system any more, so we cut off the GUI. All new data comes in through the ERP system. However, the old data structure continues to be generated by the old system, along with all of its hidden intricacies that the downstream systems are tightly bound to.
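The reversed flow amounts to a mapping component that translates each new-model ERP record into a write against the old catalog's tables, so the old system can keep generating its feed unchanged. A sketch of such a mapper follows; the record shapes and the flattening rule are assumptions made for illustration.

```python
# Sketch of the reverse-direction mapping: ERP (new, richer model) -> old
# catalog (flat model). Field names and structure are assumptions.
def erp_to_old_catalog(erp_record: dict) -> dict:
    """Flatten a hierarchical ERP product record into a single old-catalog
    row. Attributes the old model cannot represent are simply dropped."""
    return {
        "product_id": erp_record["legacy_id"],  # preserved key (see above)
        "name": f'{erp_record["family"]} {erp_record["version"]}',
        "language": erp_record["localization"]["language"],
        "country": erp_record["localization"]["market"],
        "license_program": erp_record["licensing"]["program"],
    }

row = erp_to_old_catalog({
    "legacy_id": 5000,
    "family": "Word",
    "version": "2003 SP1",
    "localization": {"language": "pt-BR", "market": "Brazil"},
    "licensing": {"program": "Open Value", "tier": "C"},  # 'tier' is dropped
})
```

Note that the mapping is lossy in one direction: anything the new model can express but the old one cannot simply never reaches the legacy feed, which is acceptable precisely because the downstream systems never depended on it.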
Because the old system itself still produces the files, they behave in exactly the same way: the tight dance that each of the downstream systems has learned, over the years, continues to play on.
Of course, a new data feed will move to the MDM system as well, this time using the new data structure. When a downstream system is ready to adopt the new model, a dev project is fired off to either refactor it to consume the new data feed or to retire it completely. In a sense, it’s a cleanup of the magnitude of Y2K in that we will have to examine each one of those systems to decide if it is worthy of the investment needed to fix it.
The advantages of this mechanism:
Data is only entered once, in the ERP system, and it is replicated in a definable manner downstream.
The logic needed to map data from the old catalog system to the ERP system can be reversed to feed the old catalog system, and tested to make sure it works, before the new data structure goes live in the ERP system.
There is no pressure for a “big bang” migration of downstream systems, or even a series of “little bang” migrations. They can be done in any order we want.
The disadvantage of this mechanism:
As changes occur in the new ERP data model, the data mapping components in the reversed feed will also have to be kept up to date. This adds to the cost of changing the ERP data model.
It’s an interesting problem.
There are other solutions, of course. I won’t go into details on why I think they are less appealing. This one is my favorite, but the decision will be made by a team of experts.