Our economic system works not because the central government controls things, but because it only controls the things that need controlling. Our IT systems need to find that same sweet spot.
I’m involved in a portfolio optimization process, so for the time being I’m spending a lot of time figuring out where a great many applications can be replaced with far fewer (hopefully, some of them will be commercial apps, and not all home-grown).
This tends to lead to a false sense of control. If we identify 100 ‘buckets’ of functionality and then look for overlaps within each one, it is easy to believe we have a ‘final’ list, when we will never have a final list.
In fact, we never should. Governments that control free speech have had a hard time with the Internet. Centrally planned economies have not been able to exploit many of the innovations of the software age. People who think they have a final list only delay things when the list inevitably must grow.
On the other hand, standards don’t come out of thin air. For decades, it was obvious to anyone who looked that the Canadian system for health insurance reimbursement had a serious advantage over ours… they had one standard data format for basic insurance transactions, so there was no need for expensive translation systems. We had competition and no clear standard; in fact, we had a couple of competing ones.
That changed only when the HIPAA law created a single standard that everyone must use. That doesn’t mean the government is regulating relationships, but it does mean that a huge source of inefficiency is removed from the system. Now, for the first time, it is economical for software running at a doctor’s office to format the transactions and send them to the insurance companies directly, without the need for many expensive middlemen. (You still have overhead, but it is smaller.)
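The economics here are worth spelling out. Without a shared format, every provider-insurer pairing potentially needs its own translator; with one mandated format, each party implements a single codec. A minimal sketch (the participant counts are made up for illustration):

```python
# Illustration: why one mandated data format shrinks integration cost.
# Without a shared standard, each provider-insurer pair may need its own
# translator (N * M); with one, each party implements the standard once (N + M).

def translators_needed(providers: int, insurers: int, shared_standard: bool) -> int:
    """Count the format converters the system as a whole must maintain."""
    if shared_standard:
        # Each party writes one codec for the single standard format.
        return providers + insurers
    # Every pairing potentially needs its own point-to-point translation.
    return providers * insurers

print(translators_needed(200, 50, shared_standard=False))  # 10000
print(translators_needed(200, 50, shared_standard=True))   # 250
```

The gap between N×M and N+M is the “huge source of inefficiency” the standard removes, and it widens as more parties join.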
I’d venture that the sweet spot between central control and freedom to innovate lies in the standards. Not just privacy and security standards, but also data integration standards, monitoring standards, and even hosting/deployment standards, so that the cost of deploying an application, and the difficulty of rolling one into production, become so small that we can scale applications up and down based on actual utilization, not on ‘limits’ and the overhead of management.
I’d venture that the Java world is ahead of the game here, and that we need to catch up. This cannot be a Microsoft-driven effort but it cannot be one that excludes Microsoft either. That road has been tried. It doesn’t work.
In that vein, I’d say that SOAP is going to have to win over REST (even though I love the basic concepts of REST). Why? The tools are there for SOAP. Making REST work would require pieces I haven’t seen. (If you know of a good REST library for .NET and want to convince me otherwise, please post a link.)
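To make the contrast concrete, here is the same hypothetical “get claim status” call sketched both ways. The operation name and claim id are invented; the point is that REST names the resource in the URL and needs little beyond HTTP, while SOAP wraps the operation in an XML envelope, which is exactly where generated tooling earns its keep:

```python
# Sketch: one call, two styles. Names are hypothetical, for illustration only.
from xml.etree.ElementTree import Element, SubElement, tostring

# REST style: the resource is identified by the URL; a plain HTTP GET suffices.
rest_request = "GET /claims/12345/status HTTP/1.1"

# SOAP style: the operation travels inside an XML envelope POSTed to one endpoint.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
envelope = Element(f"{{{SOAP_NS}}}Envelope")
body = SubElement(envelope, f"{{{SOAP_NS}}}Body")
call = SubElement(body, "GetClaimStatus")  # operation name is made up
SubElement(call, "ClaimId").text = "12345"

soap_request = tostring(envelope, encoding="unicode")
print(soap_request)
```

Hand-building envelopes like this is tedious and error-prone, which is why SOAP lives or dies by code generation from service descriptions; REST asks less of the toolchain, but in 2005 the libraries to make it routine weren’t there yet.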
I’d say that we need a bit more than just picking the protocol mechanisms. We need standardized definitions of business transactions. I know that OASIS has done some work there, as has RosettaNet. I haven’t dug as deeply as I could have. The end result has to be that we all leverage each other’s ideas and brains. This has to be approachable and easy to learn. The transactions have to make sense. Common sense.
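What a shared transaction definition buys you is that both sides can validate against the same canonical shape before exchanging anything. A minimal sketch, with invented field names (a real standard from a body like OASIS or RosettaNet would pin these down precisely):

```python
# Sketch of a canonical transaction definition shared by sender and receiver.
# Field names and types here are hypothetical, for illustration only.

REQUIRED_FIELDS = {
    "transaction_type": str,  # e.g. "PurchaseOrder"
    "sender_id": str,
    "receiver_id": str,
    "amount_cents": int,      # integer cents avoid floating-point money bugs
    "currency": str,          # ISO 4217 code, e.g. "USD"
}

def validate(txn: dict) -> list[str]:
    """Return a list of problems; an empty list means the transaction conforms."""
    problems = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in txn:
            problems.append(f"missing field: {field}")
        elif not isinstance(txn[field], ftype):
            problems.append(f"wrong type for {field}")
    return problems

good = {"transaction_type": "PurchaseOrder", "sender_id": "A",
        "receiver_id": "B", "amount_cents": 12500, "currency": "USD"}
print(validate(good))                # []
print(validate({"sender_id": "A"}))  # lists the four missing fields
```

When both parties check against the same definition, disagreement surfaces at the boundary as an explicit validation failure, instead of leaking into downstream systems as silently misinterpreted data.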
Then we need to step back. If a set of applications uses database-to-database communication to integrate… well, OK. SOA does not equal ‘correct’ and not-SOA does not equal ‘incorrect’. The rules don’t say ‘SOA’ but they do say ‘real-time integration using data standards.’
That’s where you find the sweet spot. That’s where, I think, today, we need to set up our oversight. Everywhere else, stay out of the developer’s hair.
They have enough to deal with.