July 17, 2004
@ 02:40 AM

Dave Winer writes

Russ Beattie says we should be careful not to give the Republicans ammo to kill Kerry. I am sorry Russ, I'm not worried about that. I'm more worried that the Dems are too flustered by the hardball tactics of the Reps to fight back.

The only time I tend to watch regular TV that isn't TiVo is while working out in the morning at the health club. I've noticed that while John Kerry's ads tend to be about the qualities that make him a good candidate for president, George Bush's ads have mostly been negative ads attacking John Kerry. Personally, I would love it if Kerry's campaign continues to take the high ground and shows the Republican party up for the rabid attack dogs that they are. The problem is that negative ads work, and some people tend to see not hitting back as a sign of weakness, which is what Dave Winer seems to be doing.

Whatever happened to trying to change the tone in Washington and elevate the discourse? Just another case of "Do what I say, not what I do," I guess.

 


 

Categories: Ramblings

I was reading an XML-Deviant column on XML.com entitled Browser Boom when I came across the following excerpt

The inevitable association with Microsoft's CLI implementation is proving a source of difficulty for the Mono project. The principal author of Mono's XML support, Atsushi Eno, posted to the Mono mailing list on the problems of being conformant in Mono's XML parser implementation. More specifically, whose rules should Mono conform to: W3C or Microsoft?

MS XmlTextReader is buggy since it accepts XML declaration as element content (that violates W3C XML specification section 3 Logical Structures). ... However, there is another discussion that it is useful that new XmlTextReader (xmlText, XmlNodeType.Element, null) accepts XML declaration.

... that error-prone XmlTextReader might be useful (especially for people who already depends on that behavior)

... we did not always reject Microsoft badness; for example we are copying System.Xml.XmlCDataSection that violates W3C DOM interface hierarchy (!)

The root of the dilemma is similar to that which Mozilla and Opera are trying to manage in the browser world.

What I find interesting is that instead of pinging the MSFT XML folks (like myself) and filing a bug report, this spawned a dozen-message email discussion on whether Mono should be bug compatible with the .NET Framework. Of course, if the Mono folks decide to be bug compatible with this and other bugs in System.Xml, and we then fix them (causing breaking changes in some cases), will we see complaints about how Microsoft is out to get them by being backwards incompatible? Now that Microsoft has created the MSDN Product Feedback Center, they don't even have to track down the right newsgroup or the email address of a Microsoft employee to file the bug.

It's amazing to me how much work people create for themselves, and how many conspiracy theories they'd rather entertain, instead of just communicating with others.

Update: I talked to the developer responsible for the XmlTextReader class and she responded, "This is by design. We allow XML declaration in XML fragments because of the encoding attribute. Otherwise the encoding information would have to be transferred outside of the XML and manually set into XmlParserContext."
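
To make the scenario concrete, here is a minimal sketch of what that design enables: parsing an XML fragment that carries its own XML declaration via the XmlTextReader(string, XmlNodeType, XmlParserContext) constructor. The fragment text and variable names below are purely illustrative and not taken from the original discussion.

    using System;
    using System.Xml;

    class FragmentParsingSketch
    {
        static void Main()
        {
            // An XML fragment that carries its own XML declaration, so the
            // encoding information travels with the text instead of having
            // to be set separately on the XmlParserContext.
            string fragment =
                "<?xml version='1.0' encoding='utf-8'?><item>Hello</item>";

            // A minimal parser context for fragment parsing.
            XmlNamespaceManager nsMgr = new XmlNamespaceManager(new NameTable());
            XmlParserContext context =
                new XmlParserContext(null, nsMgr, null, XmlSpace.Default);

            // Per the explanation above, the reader accepts the declaration
            // at the start of the fragment rather than rejecting it.
            XmlTextReader reader =
                new XmlTextReader(fragment, XmlNodeType.Element, context);

            while (reader.Read())
            {
                Console.WriteLine("{0}: {1}", reader.NodeType, reader.Name);
            }
        }
    }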


 

Categories: Life in the B0rg Cube | XML

Taken from an article on TheServerSide.com entitled Microsoft Responds to Sun’s Web Service Benchmarks

In a paper published last month, Sun claimed that Java based web services outperform .NET based web services both in throughput and response times. Microsoft has released a paper on TheServerSide.NET responding to those claims stating that Sun’s representation of the .NET performance was understated by 2 to 3 times and that in many, but not all cases, .NET exceeded the Java benchmarks.
...
Read the Microsoft response on TheServerSide.NET: Web Services Performance: Comparing J2EE and .NET

Read Sun's original paper: J2EE claimed to have better Web Services performance than .NET

It should be noted that Sun did not publish source code for their benchmark, so Microsoft had to re-create Sun's benchmark based on the details in the original paper. The Microsoft response has the full source code for both the .NET Web Service implementation and the Java Web Service implementation using Sun's JWSDP 1.4, along with the test program used to benchmark both services. I've always believed the best way to verify a benchmark is to run it yourself. The performance of the .NET XML Web Service implementation should prove to be a lot better than what is implied by the original paper from Sun.


 

July 14, 2004
@ 09:14 PM

In the midst of a back-and-forth internal discussion on whether it is appropriate for folks to be griping about the recently announced MSFT benefit cuts on their work-related blogs, someone sent me a link to the Mini-Microsoft blog, which describes itself thusly

Let's slim down Microsoft into a lean, mean, efficient customer pleasing profit making machine! Mini-Microsoft, Mini-Microsoft, lean-and-mean!

Subscribed!!!


 

A little while ago some members of our team experimented with various ways to reduce the Relational<->Objects<->XML (ROX) impedance mismatch by adding concepts and operators from the relational and XML (specifically W3C XML Schema) worlds into an object oriented programming language. This effort was spearheaded by a number of smart folks on our team including Erik Meijer, Matt Warren, Chris Lovett and a bunch of others, all led by William Adams. The object oriented programming language which was used as a base for extension was C#. The new language was once called X# but eventually became known as Xen.

Erik Meijer presented Xen at XML 2003 and I blogged about his presentation after the conference. There have also been two papers published about the ideas behind Xen: Programming with Rectangles, Triangles, and Circles and Unifying Tables, Objects and Documents. It's a new year and the folks working on Xen have moved on to other endeavors related to future versions of Visual Studio and the .NET Framework.

However, Xen is not lost. It is now part of the Microsoft Research project Cω (pronounced C-Omega). Even better, you can download a preview of the Cω compiler from the Microsoft Research downloads page.


 

Categories: Technology | XML

Torsten has a blog post about an interesting bug in RSS Bandit. If you are subscribed to both Joe Gregorio's and Ian Hixie's blogs, then one of the entries in Ian Hixie's blog appears with the wrong date. The post that appears with the incorrect date is State of the WHAT from Ian Hixie's blog, which is linked to from Joe Gregorio's post 3270 Redux. Instead of being dated 2004-06-29, as it is in Ian's RSS feed, it is dated 2004-06-05, which is the same date as Joe's post.

The problem arises from a workaround we came up with to deal with feeds that don't provide dates. Many users dislike feeds without dates and prefer that we display some default date for such feeds. What we ended up doing was using the date the item was first seen in the feed as the date for each item. In many cases this date isn't accurate. The inaccuracy is particularly glaring when a post from a feed with dates links to one from a feed without dates, because it may look like a feed is linking to a post in the future. For example, Joe Gregorio's post dated 2004-06-05 links to a post made by Ian Hixie on 2004-06-29. In this case the link is valid because Joe Gregorio went back and edited his blog post but didn't update the date in his feed. However, RSS Bandit assumes the discrepancy in the dates is because we guessed the date for the entry in Ian's blog, and thus "corrects" it by aligning it with the date from Joe's entry. The rationale for this behavior is that guessing that an undated entry was posted on the same day someone linked to it is more accurate than guessing that it was posted when it was fetched. The bug is that when we apply this heuristic we don't check whether the entry whose date is being adjusted is actually an undated entry.

This has been fixed in the current codebase. The next question is whether we should be adjusting dates in this manner at all.
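
For the curious, here is a rough sketch of the corrected heuristic. The class and member names are hypothetical rather than RSS Bandit's actual internals; the point is the added check that the entry being adjusted really had no date in its feed.

    using System;

    class FeedItem
    {
        public DateTime Date;          // date shown to the user
        public bool HasPublishedDate;  // true if the feed itself supplied a date
    }

    class DateHeuristic
    {
        // If an item links to an entry that appears to be "in the future",
        // assume the target's date was guessed (e.g. set to the time it was
        // fetched) and pull it back to the linking item's date, but only when
        // the target really had no date in its feed. Skipping that last check
        // was the bug described above.
        static void AdjustDate(FeedItem linkingItem, FeedItem linkedItem)
        {
            bool looksLikeFutureLink = linkedItem.Date > linkingItem.Date;

            if (looksLikeFutureLink && !linkedItem.HasPublishedDate)
            {
                linkedItem.Date = linkingItem.Date;
            }
        }

        static void Main()
        {
            FeedItem joesPost = new FeedItem();
            joesPost.Date = new DateTime(2004, 6, 5);
            joesPost.HasPublishedDate = true;

            FeedItem iansPost = new FeedItem();
            iansPost.Date = new DateTime(2004, 6, 29);
            iansPost.HasPublishedDate = true;

            AdjustDate(joesPost, iansPost);

            // Ian's entry keeps its real date because his feed supplied one.
            Console.WriteLine(iansPost.Date.ToString("yyyy-MM-dd"));
        }
    }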


 

Categories: RSS Bandit

July 11, 2004
@ 12:50 AM

In a post entitled Dare Obasanjo is raining on the W3C's parade, Mike Dierken responds to my recent post which asks Is the W3C Becoming Irrelevant? by writing

Either way the primary mechanism the W3C uses to produce technology specs is to take a bunch of contradictory and conflicting proposals, then have a bunch of career bureaucrats try to find some compromise that is a union of all the submitted specs

Damn those career bureaucrats that built XML. Or is it the SOAP design process that caused the grief? And where did that technology come from anyway?

My original post already described the specs that have caused grief and show the W3C is losing its way. I assume that Mike is trying to use XML 1.0 and SOAP 1.1 as counterexamples to the trend I pointed out. Well, first of all, XML 1.0 was a proposal to design a subset of SGML, so by definition it could not suffer from the problems that face attempts to innovate by committee, which is what has hampered the W3C in recent times. Also, when XML 1.0 was created the W3C was much smaller and a majority of the participants in the subsetting of SGML had similar goals. As for SOAP 1.1, it isn't a W3C spec. SOAP 1.1 was created by Don Box, Dave Winer and a bunch of Microsoft and IBM folks, and then submitted to the W3C as a W3C Note.

Of course, the W3C has created iterations of both specs (XML 1.1 & SOAP 1.2) which in both cases are backwards incompatible with the previous versions. I leave it as an exercise to the reader to decide whether having backwards incompatible point releases of Web specifications is how one 'leads the Web to its full potential'.


 

Categories: XML

July 10, 2004
@ 06:09 PM

While reading Dave Winer's blog today I stumbled on a link to the New York Times editorial on the Senate Intelligence Committee's recent report. Below is an excerpt

In a season when candor and leadership are in short supply, the Senate Intelligence Committee's report on the prewar assessment of Iraqi weapons is a welcome demonstration of both. It is also disturbing, and not just because of what it says about the atrocious state of American intelligence. The report is a condemnation of how this administration has squandered the public trust it may sorely need for a real threat to national security.

The report was heavily censored by the administration and is too narrowly focused on the bungling of just the Central Intelligence Agency. But what comes through is thoroughly damning. Put simply, the Bush administration's intelligence analysts cooked the books to give Congress and the public the impression that Saddam Hussein had chemical and biological weapons and was developing nuclear arms, that he was plotting to give such weapons to terrorists, and that he was an imminent threat.

These assertions formed the basis of Mr. Bush's justifications for war. But the report said that they were wrong and were not a true picture of the intelligence, and that the intelligence itself was not worth much. The freshest information from human sources was more than four years old. The committee said the analysts who had produced that false apocalyptic vision had fallen into a "collective groupthink" in which evidence was hammered into a preconceived pattern. Their bosses did not intervene.

The report reaffirmed a finding by another panel investigating intelligence failures before the 9/11 attacks in saying that there was no "established formal relationship" between Saddam Hussein and Al Qaeda. It also said there was no evidence that Iraq had been complicit in any attack by Osama bin Laden, or that Saddam Hussein had ever tried to use Al Qaeda for an attack. Although the report said the C.I.A.'s conclusions had been "widely disseminated" in the government, Mr. Bush and Vice President Dick Cheney have repeatedly talked of an Iraq-Qaeda link.

Sadly, the investigation stopped without assessing how President Bush had used the incompetent intelligence reports to justify war.

It is now quite clear that GW Bush and his cronies started a war that has claimed the lives of hundreds of Americans and thousands of Iraqis, cost the US and Iraq billions of dollars, and has increased negative feelings towards the US across the world [especially in the Middle East] for no just cause. What I'd like to know is whether anyone is going to go to jail, or what the legal punishment for their transgressions will actually be.

Growing up in Nigeria, I saw firsthand what happens when the government commits crimes against the people with no fear of accountability. Lack of accountability seeps into the national fabric and varying degrees of corruption follow. Hopefully, America won't follow the example of the tin-pot dictatorships across the third world, where everyone knows the governments lie and are corrupt but shrugs it off as a way of life.

Bush and his cronies are destroying America and everything it stands for one day at a time. I pray we don't get four more years of this disaster.


 

Categories: Ramblings

For a long time I used to think the W3C held the future of the World Wide Web in its hands. However, I have come to realize that although this may have been true in the past, the W3C has become too much of a slow-moving bureaucratic machine to attract the kind of innovation that will create the next generation of the World Wide Web. From where I sit there are three major areas of growth for the next generation of the World Wide Web: the next generation of the dynamic Web, syndication, and distributed computing across the Web. With the recent decisions of Mozilla and Opera to form the WHAT working group and Atom's decision to go with the IETF, it seems the W3C will not be playing a dominant role in any of these three areas.

In recent times the way the W3C produces a spec is to either hold a workshop where different entities can submit proposals and then form a working group to unify the various proposals, or to form a working group to come up with a unification of various W3C Notes submitted by member companies. Either way the primary mechanism the W3C uses to produce technology specs is to take a bunch of contradictory and conflicting proposals, then have a bunch of career bureaucrats try to find some compromise that is a union of all the submitted specs. Two things fall out of this process. The first is that the process takes a long time; for example the XML Query workshop was in 1998 and six years later the XQuery spec is still a working draft. Similarly, the XInclude proposal was originally submitted to the W3C in 1999, but five years later it is just a candidate recommendation. Secondly, the specs that are produced tend to be too complex yet minimally functional, since they compromise between too many wildly differing proposals. For example, W3C XML Schema was created by unifying the ideas behind DCD, DDML, SOX, and XDR. This has led to a dysfunctional specification that is too complex for the simple scenarios and nigh impossible to use in defining complex XML vocabularies.

It seems many vendors and individuals are realizing that the way to produce an innovative technology is for the vendors most affected by the technology to come up with a specification that is satisfactory to the participants, as opposed to trying to innovate by committee. This is exactly what is happening with the next generation of the dynamic Web with the WHAT working group, with XML Web Services with WS-I, and in syndication with RSS & Atom.

The W3C still has a good brand name since many associate it with the success of the Web but it seems that it has become damage that vendors route around in their bid to create the next generation of the World Wide Web.


 

Categories: XML

I often think to myself that there is a lot of background racism in the United States. By background racism, I mean racism that is so steeped in the culture that it isn't even noticed unless pointed out by outsiders. One example sprang to mind after reading Robert Scoble's post Did China beat Christopher Columbus by decades? where he writes

Speaking of Chinese, I'm reading a book "1421 The Year China Discovered America" that makes a darn good case that Christopher Columbus didn't discover America. He's done a ton of work that shows that the Chinese were actually here 60 years prior and that Christopher Columbus actually had copies of their maps!

That basically throws out a whole ton of history I learned in elementary school.

What I find interesting is this concept of "discovering America". There were already people on the North American continent when Columbus [or the Chinese] showed up in the 15th century. So "discovered" really means "first European people to realize the American continent existed". Now every child in America is brought up to believe that Europeans showing up on some land that was already inhabited by natives is "discovering America" and introducing it to the world.

This makes me wonder how much the history lessons I received growing up in Nigeria differ from the version British kids got about the African colonies. Perhaps there is also some white guy celebrated for having "discovered Africa" and civilizing the black savages he met when he got there. At least whatever tribes welcomed whoever he was aren't extinct today; too bad you can't say the same for the tribes that greeted Columbus.


 

Categories: Ramblings