Saturday was museum day for me. I visited both the Science Fiction Museum and Hall of Fame in Seattle and the Museum of Glass in Tacoma. The entrance fee for the Science Fiction Museum was $12.95, which is a bit overpriced considering what one gets out of the tour. The museum is primarily a collection of old science fiction books and magazines as well as props from various movies. Some of the props are quite cool, such as the alien queen and Ripley's construction suit from Aliens, while others, such as the collection of phasers from the various Star Trek movies, failed to light my fire. At the end it seemed more like I'd been shown some geek's private hoard of science fiction memorabilia than an actual museum or hall of fame. I probably would have felt less ripped off if the cover fee had been $5 instead of almost $13.

The Museum of Glass was more satisfying, although it also felt like it was over too soon. The Einar and Jamex de la Torre: Intersecting Time and Place exhibit was amazing, although a bit gory. The artwork by the de la Torre brothers had demons, exposed human organs and Catholic religious relics as recurring themes. It made for some interesting pieces, which I could see some parents having difficulty explaining to their younger children. The Solid Cinema: Sculpture by Gregory Barsamian exhibit was also impressive. The works were mechanical animated pieces illuminated by strobe lights. There were only three of them, but they were extremely well done and I spent some time scratching my head trying to figure out how they worked. The Museum of Glass was definitely worth the $10.

The next local museum on my list is the Museum of Flight.


 

Categories: Ramblings

Michael Gartenberg has a blog posting entitled Is Google doing what Microsoft couldn't with their new search bar? where he writes

As Yogi would say, "it's deja vu, all over again". When Google introduced the newest version of their toolbar, it seems they added a feature that sounds very similar to what Microsoft wanted to do with SmartTags. Apparently the new software will create links in web text that will send you back to Google sites or sites of their choosing. If I recall correctly, there was a huge outcry over the SmartTag feature. Even petitions. How come there is no outcry here? Is it because Google does no evil?

Like I said yesterday, who needs a new browser to do stuff like this when you can co-opt IE with a toolbar?

This is one of the key differences between Google and Microsoft: perception. I am glad to see Google imitating one of Microsoft's innovations from a few years ago. After all, imitation is the sincerest form of flattery. As can be expected, Dave Winer is already on the offensive.

Personally, I can't wait to see how much cognitive dissonance this causes the Slashdot crowd.


 

Categories: Technology

February 16, 2005
@ 04:17 PM

From the first day we added the Search Folder feature to RSS Bandit, I've wanted to be able to create a folder containing unread messages that are no more than a week old, have 'Microsoft' in the title, and come from the Slashdot, InfoWorld or Microsoft-Watch feeds. However, the existing Search Folder feature did not allow you to restrict the targets of a search to particular feeds or categories. The back end code to do this has existed for a while, but Torsten never got around to adding the UI for it until yesterday. It's looking like the Wolverine release is shaping up to be the best release yet.
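To make the kind of query concrete, here is a rough sketch (in TypeScript rather than RSS Bandit's actual C#, with invented field and feed names) of the predicate such a scoped search folder evaluates against each item:

// A rough sketch, not RSS Bandit's implementation; names and the one-week window are assumptions.
interface FeedItem {
  feedName: string;
  title: string;
  published: Date;
  unread: boolean;
}

const targetFeeds = new Set(["Slashdot", "InfoWorld", "Microsoft-Watch"]);
const ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;

// True for unread items from the chosen feeds, at most a week old,
// whose title mentions "Microsoft".
function matchesSearchFolder(item: FeedItem, now: Date = new Date()): boolean {
  return item.unread
    && targetFeeds.has(item.feedName)
    && now.getTime() - item.published.getTime() <= ONE_WEEK_MS
    && item.title.toLowerCase().includes("microsoft");
}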


 

Categories: RSS Bandit

This morning I saw a post by Tim Bray entitled Another Loyal Oppositionist where he pointed to a post by James Governor stating that Microsoft is ignoring the demand for toolkits that support plain old XML over HTTP and instead focusing on SOAP-based XML Web Services and the WS-* family of specifications. Before I could respond I saw that Mike Champion had already beaten me to the punch with his post MS Ignoring developer demand for REST tools? where he writes

The long running PR battle is heating up again between those who advocate the implementation of service architectures [1] "RESTfully" using XML transferred via HTTP, vs those who work with SOAP, WSDL, and the specs built on them known as "WS-*".  The item that finally got me to blog about this painful [2] subject was James Governor's SOAP is boring, wake up Big Vendors or get niched

Evidence continues to mount that developers can't be bothered with SOAP and the learning requirements associated with use of the standard for information interchange. ...Developers are turning their backs on the standard. Folks that is, building interesting information splicing apps--semantically rich platforms like flickr and Amazon are being accessed by RESTful methods, not IBM/MS defined "XML Web Services" calls.

Yesterday my partner Stephen issued a wake up call for middleware and tools vendors - give developers what they want, not what you think they should have. ...

One big question is why haven't IBM and Microsoft responded? The obvious answer is vested interest. When you have "bet the company" on a technology stack it's kind of a drag to have to respond to something else.

A few thoughts: For one thing, I'd note that Microsoft has responded to this need ... in about 1999. The XMLHTTPRequest object implemented since the second version of the MSXML library provides such a powerful and convenient way to use XML and HTTP together that this API has been implemented in competing platforms, and was featured in an XML.com article just last week. Forgive us for not touting Microsoft's support for this newfangled "RESTful" approach, I guess people here thought it was old news :-)

When I read James Governor's post, I also wondered what he was expecting from a toolkit for supporting XML Web Services that just used POX (plain old XML). We've shipped multiple XML APIs and have libraries for working with HTTP in the .NET Framework. In MSXML, we shipped XMLHTTP, which recently started making headlines again because Google has been using it heavily in recent web applications such as Google Suggest. Microsoft has shipped, and will continue to ship, libraries and tools that make it easier for developers to work with plain old XML over HTTP.
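For anyone who hasn't seen it, here's a minimal sketch of what that looks like from script; the /items.xml endpoint and the element names are made up, and modern browsers expose XMLHttpRequest natively where IE originally required the MSXML ActiveX object:

// Minimal "plain old XML over HTTP" sketch; the URL and element names are assumptions.
function fetchTitles(url: string, onDone: (titles: string[]) => void): void {
  const req = new XMLHttpRequest(); // originally new ActiveXObject("Microsoft.XMLHTTP") in MSXML
  req.open("GET", url, true);
  req.onreadystatechange = () => {
    if (req.readyState === 4 && req.status === 200 && req.responseXML) {
      const titles = Array.from(req.responseXML.getElementsByTagName("title"))
        .map(el => el.textContent ?? "");
      onDone(titles);
    }
  };
  req.send();
}

fetchTitles("/items.xml", titles => console.log(titles));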

SOAP and the WS-* family of specifications target developer problems with more sophisticated requirements and needs than can be satisfied by simply using POX. As I stated in my recent post On Interoperability and Tim Ewald's 3 Web Services Stacks:

However, context is everything. Replacing a world of distributed applications written with DCOM, CORBA, Java RMI, etc. with one where they are written using the WS-* protocols is a step in the right direction. I completely agree that it leads to better interoperability across technologies produced by the big vendors who are using incompatible technologies today.

But when your plan is to reach as many parties as possible, one should favor simpler Web services technologies like plain old SOAP or just plain old XML (aka POX). Plain old XML doesn't necessarily mean following the principles of REST either. After all, a number of successful plain old XML services on the Web don't follow these principles to the letter. For example, Joe Gregorio pointed out in his article Amazon's Simple Queue Service that Amazon's queueing service violated these principles. In his post entitled Vacant Space, Sam Ruby points out that the plain old XML web services exposed by Bloglines and del.icio.us aren't RESTful either.

Given that Don Box linked to the above post and commented favorably on it, I believe this makes it clear that we at Microsoft understand that POX, SOAP, and WS-* each have a part to play in the world of XML Web services.
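To make the POX-but-not-REST distinction concrete, here is a hedged sketch in the general style of the Flickr and del.icio.us APIs of the day (the endpoint and parameters are invented): the operation name travels in the query string and everything goes over GET, which is plain old XML over HTTP but not REST in any strict sense.

// Hypothetical POX-style call; not any real service's API.
const poxReq = new XMLHttpRequest();
poxReq.open("GET", "/services/rest/?method=photos.search&tags=glass&count=10", true);
poxReq.onreadystatechange = () => {
  if (poxReq.readyState === 4 && poxReq.status === 200 && poxReq.responseXML) {
    // The whole contract is "you get an XML document back"; no resources, no hypermedia.
    console.log(poxReq.responseXML.documentElement.nodeName);
  }
};
poxReq.send();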


 

Categories: XML Web Services

For Her and For Him. Too bad I found out about these after I got my gifts. I guess there's always next year.


 

February 11, 2005
@ 04:57 AM

Steve Vinoski has a blog posting entitled Focus on the contract where he writes

Tim offers some extremely excellent advice (as usual) regarding what really matters when you write your services. If I may paraphrase what he says and perhaps embellish it a bit, starting from the implementation language and generating your contracts from it is just plain wrong, wrong, wrong, at least for systems of any appreciable magnitude, reach, or longevity. Instead, focusing on the contracts first is the way to go. I've written about this for many years now, starting well over a decade ago.

When you start with the code rather than the contract, you are almost certainly going to slip up and allow style or notions or idioms particular to that programming language into your service contract. You might not notice it, or you might notice it but not care. However, the guy on the other side trying to consume your service from a different implementation language for which your style or notions or idioms don't work so well will care.

Although Steve Vinoski's argument sounds convincing, there is one problem with it. It is actually much easier to build a non-interoperable Web service if one starts with the service contract instead of with object oriented code. The reason for this is quite simple, and one I've harped on several times in the past: the impedance mismatch between XSD and objects is quite significant. There are several constructs in W3C XML Schema which simply have no counterpart in traditional object oriented languages, and which cause current XML Web Service toolkits to barf when consuming them. For example, the XmlSerializer class in the .NET Framework supports about half the constructs in W3C XML Schema. Most XML Web Service toolkits support a similar number [but different set] of features of W3C XML Schema.
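For a concrete, if simplified, illustration of the mismatch (the schema fragment and the types below are invented for this example): a repeating xs:choice constrains the order and interleaving of child elements, but the typical generated object model flattens it into independent collections and silently loses that information.

// The schema says "any interleaving of <comment> and <rating> elements, order preserved":
//
//   <xs:choice maxOccurs="unbounded">
//     <xs:element name="comment" type="xs:string"/>
//     <xs:element name="rating"  type="xs:int"/>
//   </xs:choice>
//
// A typical generated object model can only offer something like this,
// which round-trips the data but not the document structure:
interface Review {
  comments: string[];  // relative order of comments and ratings is lost
  ratings: number[];
}

const example: Review = { comments: ["nice glasswork"], ratings: [5] };
// Constructs like substitution groups, mixed content or pattern facets fare even worse.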

This isn't theoretical. More than once while I was the program manager for XML Schema technologies in the .NET Framework, I had to take conference calls with customers who'd been converted to the 'contract first' religion only to find out that toolkits simply couldn't handle a lot of the constructs they were putting in their schemas. Those conversations were never easy.

The main thing people fail to realize when they go down the 'contract first' route is that it is quite likely they have also gone down the 'XML first' route, which most of them don't actually want to take. Folks like Tim Ewald don't mind that going 'contract first' may sometimes mean they can't use traditional XML Web Service toolkits and instead have to resort to SAX, DOM and XSLT. However, for many XML Web Service developers this is actually a problem instead of a solution.


 

Categories: XML | XML Web Services

Tim Ewald has a post entitled My 3 Web Services Stacks where he writes

The point Chris was making was that when you created a COM class, your choice of data type determined who could use your component. In other words, your choice affected how far your code could reach. There will be a similar split in the Web services world, hence the title of this post. This time though the spin will be a little different. It won't focus on data format so much (it's all just XML), but on behavior. To wit, “My 3 Web Service Stacks“:

First, there is basic SOAP over HTTP described with WSDL and XSD. It hits a sweet spot for developers who want tools to hide XML details but is still simple enough to implement everywhere. Then there is WS-*, which offers richer functionality but is harder to implement and will not be supported by all products, especially legacy systems. Finally, there are toolkit specific features like BEA's idiom for async messaging over HTTP using a header to convey a callback URL, Iona's Artix support for a CORBA binding, or Microsoft's Indigo support for binary messaging.

In this world, your choice of feature will decide how far you can reach: to everyone, to everyone using WS-* enabled tools and infrastructure, or only to others using the same toolkit you are.

I completely agree with this. In fact, back when I was on the XML team I used to be frustrated that, given all the reams of text coming out of the XML Web Services Developer Center on MSDN, Microsoft never did a good job of explaining this to developers. Of course, this would require admitting that many developers don't need the functionality of the WS-* specs and that, due to their complexity, in certain cases using them would actually hamper interoperability compared to plain old SOAP (aka POS). To some this may contradict Bill Gates's statements on interoperability in his recent executive memo Building Software That Is Interoperable By Design, where he wrote

The XML-based architecture for Web services, known as WS-* ("WS-Star"), is being developed in close collaboration with dozens of other companies in the industry including IBM, Sun, Oracle and BEA. This standard set of protocols significantly reduces the cost and complexity of connecting disparate systems, and it enables interoperability not just within the four walls of an organization, but also across the globe. In mid-2003, Forrester Research said that up to a "ten-fold improvement in integration costs will come from service-oriented architectures that use standard software plumbing." 

However, context is everything. Replacing a world of distributed applications written with DCOM, CORBA, Java RMI, etc. with one where they are written using the WS-* protocols is a step in the right direction. I completely agree that it leads to better interoperability across technologies produced by the big vendors who are using incompatible technologies today.

But when your plan is to reach as many parties as possible, one should favor simpler Web services technologies like plain old SOAP or just plain old XML (aka POX). Plain old XML doesn't necessarily mean following the principles of REST either. After all, a number of successful plain old XML services on the Web don't follow these principles to the letter. For example, Joe Gregorio pointed out in his article Amazon's Simple Queue Service that Amazon's queueing service violated these principles. In his post entitled Vacant Space, Sam Ruby points out that the plain old XML web services exposed by Bloglines and del.icio.us aren't RESTful either.

When building distributed applications using XML, one should always keep in mind the 3 web service stacks. My day job now involves designing XML web services for both internal and external consumption, so I'll probably start writing more about web services in the coming months.


 

Categories: XML Web Services

February 10, 2005
@ 02:01 PM

Just saw Iran Promises 'Burning Hell' for Any Aggressor and N.Korea Says It Has Nuclear Arms, Spurns Talks. Looks like the world woke up on the wrong side of bed.


 

February 9, 2005
@ 03:05 PM

David Megginson (the creator of SAX) has a post entitled The complexity of XML parsing APIs where he writes

Dare Obasanjo recently posted a message to the xml-dev mailing list as part of the ancient and venerable binary XML permathread (just a bit down the list from attributes vs. elements, DOM vs. SAX, and why use CDATA?). His message included the following:

I don't understand this obsession with SAX and DOM. As APIs go they both suck[0,1]. Why anyone would come up with a simplified binary format and then decide to cruft it up by layering a crufty XML API on it is beyond me.

[0] http://www.megginson.com/blogs/quoderat/archives/2005/01/31/sax-the-bad-the-good-and-the-controversial/

[1] http://www.artima.com/intv/dom.html

I supposed that I should rush to SAX’s defense. I can at least point to my related posting about SAX’s good points, but to be fair, I have to admit that Dare is absolutely right – building complex applications that use SAX and DOM is very difficult and usually results in messy, hard-to-maintain code.

I think this is a pivotal part of the binary XML debate. The primary argument for binary serializations of XML is that certain parties want to get the benefit of the wide array of technologies for processing XML yet retain the benefits of a binary format, such as reduced size on the wire and faster processing. Basically, having one's cake and eating it too.

For me, the problem is that XML is already being pulled in too many directions as it is. In retrospect I realize it was foolish for me to think that the XML team could come up with a single API that would satisfy people processing business documents written in WordprocessingML, people building distributed computing applications using SOAP, and developers reading and writing application configuration files. All of these scenarios use intersecting subsets of the full functionality of the XML specification. The SOAP specs go as far as banning some features of XML, while others are simply frowned upon because the average SOAP toolkit doesn't know what to do with them. One man's meat (e.g. mixed content) is another man's poison.

What has ended up happening is that we have all these XML APIs that either expose a lot of XML's cruft that most developers don't need or, even worse, make things difficult in the common scenarios because they want to support all the functionality of XML. This is the major failing of APIs such as the .NET Framework's pull model parser class, System.Xml.XmlReader, DOM and SAX. The DOM also has issues with the fact that it tries to support conflicting data models (DOM vs. XPath) and serialization formats (XML 1.0 & XML 1.0 + XML namespaces). At the other extreme we have APIs that try to simplify XML by only supporting specific subsets of its expressivity, such as the System.Data.DataSet and the System.Xml.XmlSerializer classes in the .NET Framework. The problem with these APIs is that the developer is dropped off a cliff once they reach the limits of the API's XML support and have to either use a different API or resort to gross hacks to get what they need done.
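As a hedged illustration of that cliff (the config document below is made up), compare reading the same trivial document through the full-fidelity DOM, where node types and stray text nodes are your problem, with a simplified name/value mapping that is pleasant right up until the XML uses repeated elements, mixed content or namespaces:

// Illustrative only; uses the browser's DOMParser.
const xml = "<config><timeout>30</timeout><retries>3</retries></config>";
const doc = new DOMParser().parseFromString(xml, "application/xml");

// Full-fidelity route: every detail XML can express is visible, including the
// whitespace text nodes and comments you usually don't care about.
let timeout = 0;
for (const node of Array.from(doc.documentElement.childNodes)) {
  if (node.nodeType === Node.ELEMENT_NODE && node.nodeName === "timeout") {
    timeout = parseInt(node.textContent ?? "0", 10);
  }
}

// Simplified route: convenient here, but it falls off a cliff as soon as the
// document stops looking like a flat bag of name/value pairs.
const simple = Object.fromEntries(
  Array.from(doc.documentElement.children).map(el => [el.tagName, el.textContent ?? ""])
);
console.log(timeout, simple["retries"]);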

Unfortunately, one of the problems we had to deal with when I was on the XML team was that we already had too many XML APIs as it was. Introducing more would create developer confusion, but trying to change the existing ones would break backwards compatibility. Personally, I'd rather see effort go into creating better toolkits and APIs for the various factions that use XML, to make it easier for them to get work done, than into constantly churning the underlying format and thus fragmenting it.


 

Categories: XML