September 8, 2004
@ 03:23 PM

Roger Costello recently started a discussion thread on the XML-DEV mailing list about the common misconceptions people have about XML document validation and schemas. He summarized the thread in his post Fallacies of Validation, version #3. His post begins

The purpose of documenting the below "fallacies" is to identify erroneous common thought that many people have with regards to validation and its role in a system architecture.  Perhaps "assumptions" would be a better term to use than "fallacies".  In any case, the desire of this writeup (which is a compilation of discussions on the xml-dev list) is to provoke new ways of thinking about validation, and reject limiting and static views on validation. 

Fallacies of Validation

1. Fallacy of "THE Schema"

2. Fallacy of Schema Locality

3. Fallacy of Requisite Validation

4. Fallacy of Validation as a Pass/Fail Operation

5. Fallacy of a Universal Validation Language

6. Fallacy of Closed System Validation

7. Fallacy that Validation is Exclusively for Constraint Checking

I mostly agree with the fallacies as described in his post.

Fallacy #1 has been a favorite topic of Tim Ewald over the past year. It isn't necessarily true that there is one canonical schema for an XML vocabulary. Instead, the schema for the vocabulary may depend on the context the XML document is being used in. A classic example of this is XHTML, which has three schemas (the strict, transitional and frameset DTDs) for a single format.

I consider Fallacy #2 to be more of a common mistake than a fallacy. Many people create validation systems that work in a local environment, such as specific patterns or structures for addresses or telephone numbers, which work in a local system but break down when used in a global environment like the World Wide Web. This common mistake isn't limited to XML validation; it applies to every arena where user input is validated before being stored or processed.

Fallacy #3 is interesting to me because I wonder how often it occurs in the wild. Are there really that many people who believe they have to validate XML documents against a schema?

Fallacy #4 is definitely a good one. However, I disagree with the quotes he uses to buttress the main point for this fallacy. I especially don't like that he uses a generalization from Rick Jelliffe about bugs in a few schema validators as a core part of his argument. The important point is that schema validation should not always be viewed as a PASS/FAIL operation; in fact, schema languages like W3C XML Schema go out of their way to define how one can view an XML document as being part valid and part invalid.
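The idea that validation can produce a report rather than a binary verdict is easy to demonstrate outside of .NET as well. Here is a minimal sketch in Java using javax.xml.validation (the class name, inline schema and instance document are my own invented examples, not anything from the discussion): by supplying an ErrorHandler that records recoverable errors instead of throwing, the validator keeps going and hands back the full list of problems, leaving the application to decide how "valid enough" the document is.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.xml.sax.ErrorHandler;
import org.xml.sax.SAXParseException;

public class PartialValidation {
    // A toy schema, and an instance document with two problems:
    // 'qty' is not an int, and 'extra' is undeclared.
    static final String XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>" +
        "<xs:element name='order'><xs:complexType><xs:sequence>" +
        "<xs:element name='qty' type='xs:int'/>" +
        "</xs:sequence></xs:complexType></xs:element></xs:schema>";
    static final String XML = "<order><qty>abc</qty><extra/></order>";

    // Validate and return the full list of problems instead of
    // stopping at the first one.
    static List<String> validate(String xsd, String xml) throws Exception {
        SchemaFactory sf = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = sf.newSchema(new StreamSource(new StringReader(xsd)));
        Validator v = schema.newValidator();
        final List<String> problems = new ArrayList<String>();
        v.setErrorHandler(new ErrorHandler() {
            public void warning(SAXParseException e) { problems.add("warning: " + e.getMessage()); }
            public void error(SAXParseException e)   { problems.add("error: " + e.getMessage()); }
            // Only well-formedness failures abort the run.
            public void fatalError(SAXParseException e) throws SAXParseException { throw e; }
        });
        v.validate(new StreamSource(new StringReader(xml)));
        return problems;
    }

    public static void main(String[] args) throws Exception {
        for (String p : validate(XSD, XML)) {
            System.out.println(p);
        }
    }
}
```

Running this prints one line per constraint violation rather than aborting at the first, which is exactly the "validation as a report, not a verdict" view the fallacy argues for.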

The message of Fallacy #5 is that one size doesn't fit all, to which I heartily cheer "Hear! Hear!". I agree 100%: there is no single XML schema language that satisfies every validation scenario.

I don't really understand Fallacy #6 without seeing some examples so I won't comment on it. I'll see if I can dig up the discussion threads about this on XML-DEV later.

Fallacy #7 is another one where I agree with the message but mostly disagree with how he argues the point. All of his examples are variations on using schemas for constraint checking; they just differ in how the document is processed after constraint checking is done. To me, the prime example of the fact that schema validation is not just for constraint checking is that many technologies actually use schemas for creating typed XML documents or for mapping XML from one domain to another (e.g. Object<->XML, Relational<->XML).

Everything said, this was a good list. Excellent work from Roger as usual.


 

Categories: XML

Recently I found out that we no longer had office supplies on the floor of the building I work in. Now if you need to grab a pen, or get a marker after your last one runs out in the middle of a meeting, you have to go upstairs. Folks have given me the impression that this is due to the recent cost cutting drive across the company. At first, I couldn't figure out why disrupting people by making them go to another floor for office supplies would cut costs.

Then it hit me. When faced with having to go to another floor to find office supplies, the average geek desk jockey will probably say "forget it" and do without. The immediate saving is fewer office supplies used. But I suspect this is only phase one of the plan. Most people at MSFT believe that on average 50% - 75% of the projects and features an employee works on in his career in the b0rg cube never ship. This is all just wasted cash. The best way to nip this in the bud is to prevent people from being able to write down their ideas or whiteboard different ideas with coworkers, thus spreading the meme about new projects or features. The amount of money saved by not investing in new money-losing ventures like *** and **** would be immense. It all makes a weird kind of sense now.

Seriously though. I've been reading blog posts like Dangerous Transitions and Dangerous Thoughts, which call for Microsoft to start performing targeted layoffs instead of cost cutting, with skepticism. When I think of the ways Microsoft spends immense amounts of cash for little return, I don't worry about John Doe the tester who files fewer bugs on average than the other members of his team, or Jane Doe the developer who writes buggier code than the rest of her team. I think about things like MSN, Xbox, the uncertainty around MBF after purchasing Great Plains for billions, embarking on an overambitious attempt to rewrite most of the APIs in Windows in an effort that spans three product units, spending years working on ObjectSpaces then canning it because of potential overlap with WinFS, and various other white elephant projects.

All of the above cost anywhere from millions to billions of dollars, and they are the result of decisions by folks in middle and upper management. I'm glad that Microsoft has decided not to punish rank-and-file employees for what are basically missteps by upper management, in contrast to the typical way corporate America does business.

Ideally, we'd see our upper management address how they plan to avoid missteps like this in the future, instead of looking for minor bumps in the stock price and our paychecks by sacrificing low-level employees and coworkers.


 

Categories: Life in the B0rg Cube

September 6, 2004
@ 01:02 AM

According to HipHopGame.com Young Buck To 'Stomp' Out Luda/ T.I. Beef On Debut Album

If you have a mixtape featuring "Stomp," Young Buck's posse cut with T.I. and Ludacris, hold onto it. It's a collector's item. The track as we know it, with Cris and Tip battling each other, isn't going to be included on Buck's upcoming Straight Outta Cashville LP. Instead, a remix is going on the album, with newcomer D-Tay replacing T.I.
...
Buck says he asked 50 Cent to reach out to T.I. for a collaboration for Straight Outta Cashville. 50 obliged, and the track was sent to Atlanta for T.I. to rhyme on. Buck said he was surprised when the song came back with the line "And me getting beat down, that's ludicrous," because he didn't know if it was a dis or not.

"I was hearing on the streets that [T.I.] and Luda be having problems with each other, and I know I just did a song with Luda's group about a week or two before," Buck elaborated. "Me and Luda are cool. To be all the way honest, I'd known Luda before I knew T.I., so I couldn't just jump on this record and have them having differences with each other, and then [have Luda] be like, 'Yo, Buck, what's up?' "

Staying diplomatic, Buck talked the situation over with Cris and even played T.I.'s verse for him. Ludacris confirmed that the two had been going back and forth, and he wanted to get on the song and speak his piece.

"I even got at T.I. like, 'Yo, Luda heard this record. He wanna jump on the record,' " Buck explained, "just to make sure all the feelings and everything would stay the same way. And he was like, 'Oh, I'm cool. I'm cool with it.' "

So Ludacris laced "Stomp" with his own battle raps, and the streets have been talking ever since.

T.I. and Cris have apparently now squashed their beef, Buck said, but controversy still surrounds the song. According to Buck, T.I.'s camp requested that Ludacris change his verse before they clear Tip to be on the album. (A T.I. spokesperson had no comment on that.) The G-Unit soldier said Cris has refused.

"Even throughout the song, you don't hear either one talking about killing each other," Buck lamented.

I'm not surprised Ludacris didn't want his rhymes removed. He totally schooled T.I. on that track. Given the standoff between the two rappers, it also says something about who the bigger star is that Luda's verses stay while T.I.'s will be removed. The track is hot; too bad it won't be making it onto the album.

By the way, Young Buck is wrong about them not talking about killing each other. T.I.'s verse ends with "When the choppers hit you, bitch, you'll wish you got your ass stomped."


 

If you use RSS Bandit and recently installed .NET Framework 1.1 SP1 you may have noticed that you started getting errors of the form

Refresh feed 'SomeCategory\SomeFeed' failed with error: The underlying connection was closed: The server committed an HTTP protocol violation.

This is due to changes made to the System.Net.HttpWebRequest class to make it more compliant with the HTTP specification. For example, it now errors when fetching the Microsoft Research feeds because the web server returns the Content-Location header as "Content Location", with a space. The fix is straightforward: place the following element as a child of the configuration element within the rssbandit.exe.config file in the C:\Program Files\RssBandit folder.

<system.net>
 <settings>
  <httpWebRequest useUnsafeHeaderParsing="true" />
 </settings>
</system.net>

This is also taken care of by v1.2.0.117 of RSS Bandit. On startup it detects whether this option is available and enables it automatically, so you don't have to mess around with XML configuration files.


 

Categories: RSS Bandit

In my recent post entitled The MSDN Camp vs. The Raymond Chen Camp I wrote

Our team [and myself directly] has gone through a process of rethinking a number of decisions we made in this light. Up until very recently we were planning to ship the System.Xml.XPath.XPathDocument class as a replacement for the System.Xml.XmlDocument class
...
The problem was that the XPathDocument had a radically different programming model than the XmlDocument meaning that anyone who'd written code using the XmlDocument against our v1.0/v1.1 bits would have to radically rewrite their code to get performance improvements and new features. Additionally any developers migrating to the .NET Framework from native code (MSXML) or from the Java world would already be familiar with the XML DOM API but not the cursor-based model used by the XPathDocument. This was really an untenable situation. For this reason we've reverted the XPathDocument to what it was in v1.1 while new functionality and perf improvements will be made to the XmlDocument. Similarly we will keep the new and improved XPathNavigator class (formerly the XPathEditableNavigator), which will be the API for programming against XML data sources where one wants to abstract away what the underlying store actually is. We've shown the power of this model with examples such as the ObjectXPathNavigator and the DataSetNavigator.

I've seen some concerned statements about this post from XML developers who use System.Xml, such as Oleg Tkachenko, Fumiaki Yoshimatsu and Tomas Restrepo, so it seems I should clarify some of the decisions we made and why we made them.

In version 1.0 of the .NET Framework we provided two primary classes for interacting with XML: the XmlDocument and the XmlReader. The XmlReader provided an abstract interface for interacting with a stream of XML. One can create an XmlReader over textual XML using the XmlTextReader, or over virtual XML data sources as is done with the XmlCsvReader. On the other hand, with the XmlDocument we decided to eschew the interface-based approach favored by the Java world. Instead we created a single concrete implementation. This turned out to be a bad idea. It tied the interface for programming against XML in a random access manner to a concrete implementation of an XML store. This made it difficult for developers who wanted to expose their data sources as XML stores and led to inefficient solutions such as the XmlDataDocument.
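For readers coming from the Java world, the streaming XmlReader model described above has a rough Java analogue in StAX's XMLStreamReader (the class name and sample document below are my own illustration, not anything from the System.Xml design): the consumer pulls tokens one at a time from a cursor over the stream, rather than receiving SAX-style callbacks or materializing a tree.

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class PullReaderDemo {
    // Count elements by pulling tokens one at a time;
    // no in-memory tree is ever built.
    static int countElements(String xml) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        int elements = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT) {
                elements++;
            }
        }
        r.close();
        return elements;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(countElements("<a><b/><b/></a>")); // prints 3
    }
}
```

Because the reader only ever exposes the current token, the same consuming code works whether the underlying source is a text file or a virtual data source exposed as XML, which is the point of the abstract reader design.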

To rectify this we needed to separate the programming model for accessing XML data sources from our concrete implementation of the XmlDocument. We chose to do this by extending the cursor based programming model we introduced in v1 with the XPathNavigator instead of moving to an interface based approach with XmlDocument. The reason for choosing to go with a cursor based model over a tree based model is summed up in this quote from my article Can One Size Fit All?

In A Survey of APIs and Techniques for Processing XML, I pointed out that cursor-model APIs could be used to traverse in-memory XML documents just as well as tree-model APIs. Cursor-model APIs have an added advantage over tree-model APIs in that an XML cursor need not require the heavyweight interface of a traditional tree-model API where every significant token in the underlying XML must map to an object.

So in Whidbey, the XPathNavigator will be the programming model for working with XML data sources when one wants to abstract away from the underlying source. The XPathNavigator will be changed from the v1.0 model in two ways: (i) it will be editable and (ii) it will expose the post schema validation infoset. I've already worked with Krzysztof Cwalina on updating the Design Guidelines for Exposing XML data in WinFX to account for this change.

As for the XPathDocument, it is what it always has been: a class optimized for use in XPath and XSLT. If you need 10% - 25% better performance [depending on your scenario] when running XPath over an XML document or running XSLT over in-memory XML, then this class should be preferred to the XmlDocument.


 

Categories: Life in the B0rg Cube | XML

September 3, 2004
@ 03:54 AM

Joe Beda is leaving his position as a dev lead on Avalon to go work at Google. I learned about this a couple of days ago but was waiting for him to blog the news first. I suspect there'll be a lot of attrition across various product teams at Microsoft given the recent news about Longhorn.


 

Categories: Life in the B0rg Cube

Yesterday I installed .NET Framework v1.1 service pack 1 and it messed up my ASP.NET permissions. I decided to use this opportunity to kill two birds with one stone. My weblog is currently hosted on my Windows XP machine using IIS, which means there are several limitations on the web server. The limit on the number of connections means that several times a day people get "Too Many Users" errors when connecting to this website.

I decided to install Apache and try out Movable Type 3.1. That led to a wasted morning trying to install various Perl modules. I tried some more when I got back from work and eventually gave up. Torsten gave me some tips this morning which fixed my ASP.NET permissions and my weblog is back up.

In the meantime, it turns out that the v1.2.0.114 SP1 installer for RSS Bandit had a number of issues. If you're an RSS Bandit user, please upgrade to v1.2.0.117.


 

Categories: Ramblings | RSS Bandit

August 31, 2004
@ 05:52 AM

This release fixes a number of bugs in v1.2.0.114. Major features will show up in the next release as described in the RSS Bandit product roadmap.

Download the installer from here.  Differences between v1.2.0.117 and v1.2.0.114 below.

UPDATE: v1.2.0.117 replaces v1.2.0.114 SP1

  • FEATURE: French translation added.

  • FEATURE: Double clicking on an item in the list view opens the link in a new browser tab.

  • FIXED: In certain situations RSS Bandit crashes with the following error message 'StartMainGui() exiting main event loop on exception.'

  • FIXED: Mouse wheel now supported in embedded Web browser pane.

  • FIXED: Changing the maximum item age for a feed does not refresh the feed and download old items still available in the feed.

  • FIXED: Synchronizing feed lists caused exceptions if categories were renamed, moved or deleted.

  • FIXED: Feeds that use 302 redirects with relative URIs as the location no longer result in 'Unsupported URI format' errors

  • FIXED: Fetching feeds behind a proxy no longer results in 'The remote server returned an error: (407) Proxy Authentication Required.' when the feed references a DTD

  • FIXED: Duplicate entries when the title of an item changes

  • FIXED: Relative URIs in images and links show up with about:blank as their base URI for certain XSLT templates.

  • FIXED: Space bar does not always move to the next unread item

  • FIXED: Environment variables are not accepted within Options|Remote Storage configuration protocol "File Share"

  • FIXED: Toolbars don't remember their visible state on restart when RSS Bandit is closed via the system tray context menu

  • FIXED: Feeds returning HTTP 410 status code are now disabled automatically and report the error "The requested resource is no longer available"


 

Categories: RSS Bandit

August 30, 2004
@ 04:21 PM

Today I was going to release service pack 1 for RSS Bandit v1.2.0.114. However, I won't be able to: even though everything worked fine while testing the application, the moment I built the installer and ran it for the first time on my machine, it crashed. The only difference I could find was that while testing I created 'Debug' builds, but for the installer I created a 'Release' build. Even more annoying is the exception that occurs. It seems an empty static constructor is causing a TypeInitializationException, but only in 'Release' builds.

I hate this shit. I'll have to look into this when I get back from work this evening.

On the positive side, it looks like the RSS Bandit road map is on schedule. SP1 for v1.2.0.114 is basically done once I can track down the TypeInitializationException issue, and an installer should show up shortly afterwards. We started the refactoring needed for NNTP support this weekend and should be done later this week. It's funny, I used to think the code was well factored because the infrastructure for supporting multiple syndication formats was straightforward and flexible. Then supporting NNTP comes along and a lot of the assumptions we made in the code need to be thrown out the window.


 

Categories: Ramblings | Technology

August 28, 2004
@ 03:52 PM

According to the Microsoft press release Microsoft Announces 2006 Target Date for Broad Availability Of Windows "Longhorn" Client Operating System

Microsoft will deliver a Windows storage subsystem, code-named "WinFS," after the "Longhorn" release. The new storage system provides advanced data organization and management capabilities and will be in beta testing when the "Longhorn" client becomes available.

"We’ve heard loud and clear from customers that they want improved productivity, easier deployment, increased reliability and enhanced security, as well as the many innovations we’ve been working on. We’ve had to make some trade-offs to deliver the features corporate customers, consumers and OEMs are asking for in a reasonable time frame," said Jim Allchin, group vice president of the Platforms Group at Microsoft. "Our long-term vision for the Windows platform remains the same."
...
At a meeting today with several hundred of the company’s top developer evangelists from around the world, Microsoft also announced that the Windows WinFX developer technologies, including the new presentation subsystem code-named "Avalon" and the new communication subsystem code-named Indigo, will be made available for Microsoft® Windows XP and Windows Server 2003 in 2006. This availability will expand the scope of opportunity for developers by enabling them to write applications that can run on hundreds of millions of PCs, resulting in enhanced experiences for users of those operating systems.

I'm happy about this decision, but the cynic in me is thinking "better late than never". This has been an instructive experience in learning how long it takes information to go from the front lines to the ones pulling the strings in the B0rg cube, and how long it takes folks to act on that information.


 

Categories: Life in the B0rg Cube