Via Mark Baker I found an article in ACM Queue entitled The Rise and Fall of CORBA by Michi Henning. There's lots of good stuff in the article, some of which I've excerpted below.

the CCM (CORBA Component Model). A specification for CCM was finally published in late 1999 but turned out to be largely a nonevent:

  • The specification was large and complex and much of it had never been implemented, not even as a proof of concept. Reading the document made it clear that CCM was technically immature; sections of it were essentially unimplementable or, if they were implementable, did not provide portability.
  • No commercial CORBA vendor made a commitment to implement CCM, making it a stillborn child.
  • Even if implementations had been available by the time CCM was finally published, it was too late. The horse had already bolted: EJB had become entrenched in the industry to the point where another component technology had no chance of success.

The failure of CCM did little to boost the confidence of CORBA customers, who were still stuck with their complex technology.
...
What steps should we take to end up with a better standards process and better middleware? Seeing that procedural failures are the root cause of technical failures, I suggest at least the following:

  1. Standards consortia need iron-clad rules to ensure that they standardize existing best practice.
  2. No standard should be approved without a reference implementation.
  3. No standard should be approved without having been used to implement a few projects of realistic complexity.
  4. Open source innovation usually is subject to a Darwinian selection process.
  5. To create quality software, the ability to say “no” is usually far more important than the ability to say “yes.”

The lessons listed above seem rather self-evident, yet it's a sad fact of the software industry that the mistakes of CORBA keep getting made all over again. Core XML technologies like W3C XML Schema and XQuery are 'standards' without a reference implementation, whose features were invented by committee instead of being standardized from existing best practice. At least one of the guidelines is probably unrealistic though. It is hard to require that a standard not be approved until it has been used to solve real-world problems, since people solving real-world problems typically don't want to be used as guinea pigs.

Tuesday, 20 June 2006 04:29:11 (GMT Daylight Time, UTC+01:00)
> the ability to say “no” is usually far more important than the ability to say “yes.”

That's the killer. The problem as I see it is that it's much easier to get consensus to do more rather than less. This is how the DOM spec got the way it is: one group wanted to take one approach (e.g., the parent-child-sibling model of tree navigation) and another wanted a different one (e.g., the "lists within lists" approach used by Netscape's original API). The easy way out was to support both, thus bloating the API. Issue after issue was resolved that way, removing whatever conceptual integrity the spec might have had. I didn't follow the XSD discussions very closely, but the result has the same flavor of "you want 'A', I want 'B', so let's compromise on 'A and B'".
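To make that duplication concrete, here is a minimal TypeScript sketch; the function names are mine, but firstChild/nextSibling and childNodes are the two standard DOM mechanisms in question:

    // Style 1: pointer-style navigation (the parent-child-sibling model).
    function childNamesViaSiblings(node: Node): string[] {
      const names: string[] = [];
      for (let child = node.firstChild; child !== null; child = child.nextSibling) {
        names.push(child.nodeName);
      }
      return names;
    }

    // Style 2: list-style navigation via the childNodes NodeList.
    function childNamesViaList(node: Node): string[] {
      const names: string[] = [];
      for (let i = 0; i < node.childNodes.length; i++) {
        names.push(node.childNodes[i].nodeName);
      }
      return names;
    }

    // Both functions return identical results for any node: the API ships
    // two complete mechanisms for the same traversal.

And once both mechanisms were in the spec, neither could later be removed without breaking somebody's code, so the compromise became permanent.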

Tuesday, 20 June 2006 07:25:11 (GMT Daylight Time, UTC+01:00)
> It is hard to require that a standard not be approved until it has been used to solve real-world problems, since people solving real-world problems typically don't want to be used as guinea pigs.

Hmmm, I disagree. Are you really suggesting that we should publish a standard *before* we have built something realistic with it? That makes people guinea pigs *after* the standard is published, which I would think is very much worse than making them guinea pigs *before* the standard is published.

Let's face it: in general, for problems of any complexity, software engineering is not advanced enough to figure out whether something works until after we've built it. It follows that we shouldn't standardize until we know for sure. Successful standards (such as the Single UNIX Specification) are published only after we've had experience with the technology and have managed to wear off the rough edges, not before.

Cheers,

Michi.
Tuesday, 20 June 2006 12:04:01 (GMT Daylight Time, UTC+01:00)
I also disagree that guideline #3 is unrealistic. That guideline, together with some of the others, is essentially saying that de jure standards (decided by committee) should really just be acknowledgements of de facto standards (decided by natural selection). If a committee wants to propose something that might possibly be a good idea, that's fine. But the proposal should never be approved as a "standard" until it actually is a standard.
Tuesday, 20 June 2006 12:49:57 (GMT Daylight Time, UTC+01:00)
"No standard should be approved without having been used to implement a few projects of realistic complexity."

This is an interesting thing to say in the context of the ODF versus Open XML standards kerfuffle.

If Open XML gets recognition as a standard, it is, to the best of my knowledge, represented by exactly one implementation: MS Office 2007, plus a host of MS Office 2007 add-ons.

ODF, on the other hand, is represented by three different office suites, several document management systems, and a variety of other such systems, including one that manages Boeing's vast collection of CAD and related systems. "Realistic complexity" gets put in perspective by Boeing's development and adoption of ODF.
Wesley Parish