Over the holidays I had a chance to talk to some of my old compadres from the XML team at Microsoft, and we got to talking about JSON as an alternative to XML. I concluded that there are a small number of key reasons that JSON is now more attractive than XML for the kinds of data interchange that power Web-based mashups and gadgets. This is the second in a series of posts on what these key reasons are.

In my previous post, I mentioned that getting around the limitations on cross-domain requests imposed by modern browsers has been a key reason for the increased adoption of JSON. However, this is only part of the story.

Early on in the adoption of AJAX techniques across various Windows Live services, I noticed that even when building pages with no cross-domain requirements, our Web developers favored JSON over XML. One reason that kept coming up was that processing JSON responses on the client offers an easier programming model than processing XML. I'll illustrate this difference in ease of use with JScript code that shows how to process a sample document, taken from the JSON website, in both XML and JSON formats. Below is the code sample:

var json_menu = '{"menu": {' + '\n' +
'"id": "file",' + '\n' +
'"value": "File",' + '\n' +
'"popup": {' + '\n' +
'"menuitem": [' + '\n' +
'{"value": "New", "onclick": "CreateNewDoc()"},' + '\n' +
'{"value": "Open", "onclick": "OpenDoc()"},' + '\n' +
'{"value": "Close", "onclick": "CloseDoc()"}' + '\n' +
']' + '\n' +
'}' + '\n' +
'}}';


var xml_menu = '<menu id="file" value="File">' + '\n' +
'<popup>' + '\n' +
'<menuitem value="New" onclick="CreateNewDoc()" />' + '\n' +
'<menuitem value="Open" onclick="OpenDoc()" />' + '\n' +
'<menuitem value="Close" onclick="CloseDoc()" />' + '\n' +
'</popup>' + '\n' +
'</menu>';

WhatHappensWhenYouClick_Xml(xml_menu);
WhatHappensWhenYouClick_Json(json_menu);

function WhatHappensWhenYouClick_Json(data){

  var j = eval("(" + data + ")");

  WScript.Echo("When you click the " + j.menu.value + " menu, you get the following options");

  for(var i = 0; i < j.menu.popup.menuitem.length; i++){
   WScript.Echo((i + 1) + "." + j.menu.popup.menuitem[i].value
    + " aka " + j.menu.popup.menuitem[i].onclick);
  }

}

function WhatHappensWhenYouClick_Xml(data){

  var x = new ActiveXObject( "Microsoft.XMLDOM" );
  x.loadXML(data);

  WScript.Echo("When you click the " + x.documentElement.getAttribute("value")
                + " menu, you get the following options");

  var nodes = x.documentElement.selectNodes("//menuitem");

  for(var i = 0; i < nodes.length; i++){
   WScript.Echo((i + 1) + "." + nodes[i].getAttribute("value") + " aka " + nodes[i].getAttribute("onclick"));
  }
}

When comparing both sample functions, it seems clear that the XML version takes more code and requires a layer of mental indirection, since the developer has to be knowledgeable about XML APIs and their idiosyncrasies. We should dig a little deeper into this.

A couple of people have already replied to my previous post to point out that any good Web application should validate JSON responses to ensure they are not malicious. This means my usage of eval() in the code sample should be replaced with a JSON parser that only accepts 'safe' JSON responses. Given that there are JSON parsers available that weigh in at under 2KB, that particular security issue is not a deal breaker.
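As a minimal sketch of what the safer approach looks like, here is the same menu document handled with JSON.parse, the interface exposed by Crockford's json2.js parser and now built into JavaScript engines (the assertion-style comments reflect my own testing, not anything from the original samples):

```javascript
// The same menu document used in the samples above, as a single string.
var json_menu = '{"menu": {"id": "file", "value": "File", "popup": ' +
    '{"menuitem": [{"value": "New", "onclick": "CreateNewDoc()"}]}}}';

// JSON.parse rejects input that is not valid JSON instead of executing it,
// which is what makes it safe to use on untrusted responses.
var j = JSON.parse(json_menu);
// j.menu.value is "File"; j.menu.popup.menuitem[0].onclick is "CreateNewDoc()"

// By contrast, eval() will happily run arbitrary script: eval("alert('x')")
// executes code, whereas JSON.parse("alert('x')") simply throws an error.
```

The programming model after the parse is identical to the eval() version, so switching costs nothing beyond the small parser download.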

On the XML front, there is no off-the-shelf way to get a programming model as straightforward and as flexible as parsing JSON directly into objects using eval(). One light on the horizon would be if E4X became widely implemented in Web browsers. With E4X, the code for processing the XML version of the menu document above would be:

function WhatHappensWhenYouClick_E4x(data){

  // The parsed root is the <menu> element itself, so its attributes
  // are accessed directly with the E4X @ syntax.
  var e = new XML(data);

  WScript.Echo("When you click the " + e.@value + " menu, you get the following options");

  for each (var m in e.popup.menuitem){
   WScript.Echo(m.@value + " aka " + m.@onclick);
  }

}

However, as cool as the language seems to be, it is unclear whether E4X will ever see mainstream adoption. There is an initial implementation of E4X in the engine that powers the Firefox browser, but it seems to be incomplete. Meanwhile, there is no indication that either Opera or Internet Explorer will support E4X in the future.

Another option for getting a simpler, object-centric programming model out of XML data would be to adopt a simple XML serialization format such as XML-RPC and provide off-the-shelf JavaScript parsers for that format. A trivial implementation could convert the XML-RPC to JSON using XSLT and then eval() the result. However, it is unlikely that people would go through that trouble when they can just use JSON.
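To give a flavor of what such a stylesheet might look like, here is a hypothetical fragment that handles only XML-RPC structs whose members are strings; a real converter would also need to cover ints, booleans, arrays, nested structs, and string escaping:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>

  <!-- Emit each <struct> as a JSON object literal. -->
  <xsl:template match="struct">
    <xsl:text>{</xsl:text>
    <xsl:for-each select="member">
      <xsl:if test="position() &gt; 1">, </xsl:if>
      <xsl:text>"</xsl:text><xsl:value-of select="name"/>
      <xsl:text>": "</xsl:text><xsl:value-of select="value/string"/>
      <xsl:text>"</xsl:text>
    </xsl:for-each>
    <xsl:text>}</xsl:text>
  </xsl:template>
</xsl:stylesheet>
```

The output of the transform could then be handed to eval() or, better, to a safe JSON parser on the client.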

This may be another nail in the coffin of XML on the Web. 


 

Categories: Web Development | XML | XML Web Services

Tuesday, January 02, 2007 9:26:21 PM (GMT Standard Time, UTC+00:00)
I'll further point out that your XML example only works on Windows, and I think only in IE as well. You'd need to write even more code to make it work cross-platform and cross-browser. In contrast, the JSON example works pretty much everywhere.

As for E4X, Flash 9 supports it fully, and since the Tamarin javascript engine is going to be part of the Mozilla engine, Firefox will have full E4X support someday too. In the meantime, anyone can use Flash 9 to process E4X for their webpages today, in most any browser. Maybe Microsoft should try and catch up with the rest of the web and support it too.
Wednesday, January 03, 2007 12:11:54 AM (GMT Standard Time, UTC+00:00)
As Andrew pointed out, Tamarin will make E4X a very real reality, and I have to say that I personally prefer that direction. I think my bias comes from the conventional usage pattern of just eval'ing potentially executable code as a method of data interchange. Perhaps JSON parsers completely answer that complaint, but I'll need to spend some time using both over the next few months to see if my feelings on the matter will change.
Wednesday, January 03, 2007 3:01:27 PM (GMT Standard Time, UTC+00:00)
As Andrew Shebanow points out, "selectNodes" is an MSXML-specific extension function; using standard DOM methods to extract the desired information is considerably more verbose.

The XSLT approach is overkill for new development when, as you point out, one can just start out with JSON. However, it can be of some value with legacy data that is already available in XML. This is something I have done when adding a new front end to an existing system; transcoding on the server using XSLT is a quick and easy way to maintain access for XML-consuming clients while allowing new clients to work with JSON.
Comments are closed.