One of the biggest problems facing designers of distributed applications is making sure their applications are resistant to change (i.e. versioning). Making sure services are designed with forwards and backwards compatibility in mind is especially challenging when one has no control over the various parties that will invoke the service.

In traditional applications, enumerated types (aka enums) are particularly problematic when it comes to versioning. The problem case is when new values are added to an enumerated type in a later version. The .NET Framework Design Guidelines on adding new values to enumerated types show how insidious this problem can actually be. The original guidelines stated that it was OK to add values to enumerated types, but this was later surrounded with lots of warnings as to why it is a bad idea. The original guideline states

It is acceptable to add values to enums

If a caller receives an enum as an out or return value (or as a parameter to a virtual method), and switches over every valid value of the enum and throws unconditionally in the default case, then added values will cause the caller to perform the default case, and throw

If a caller receives an enum as an out or return value, and performs some default behavior in the default case, then added values will behave as if they were default values

If you receive compatibility data for your application which indicates returning the new values from the existing API will cause issues for callers, consider adding a new API which returns the new (and old) values, and deprecate the old API. This will ensure your existing code remains compatible.

The following addendum was later added

Adding a value to an enum has a very real possibility of breaking a client. Before the addition of the new enum value, a client who was throwing unconditionally in the default case presumably never actually threw the exception, and the corresponding catch path is likely untested. Now that the new enum value can pop up, the client will throw and likely fold.

The biggest concern with adding values to enums is that you don't know whether clients perform an exhaustive switch over an enum or a progressive case analysis across wider-spread code. Even with the FxCop rules above in place, and even when it is assumed that client apps pass FxCop without warnings, we still would not know about code that performs things like if (myEnum == someValue) ... in various places.

Clients may instead perform point-wise case analyses across their code, resulting in fragility under enum versioning. It is important to provide specific guidelines to developers of enum client code detailing what they need to do to survive the addition of new elements to enums they use. Developing with the suspected future versioning of an enum in mind is the required attitude.
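The failure mode the addendum describes is easy to reproduce. Below is a minimal sketch of my own (reusing the hypothetical SyndicationFormat enum from this post, with ATOM standing in for a value added in v2): a client compiled against v1 switches exhaustively and throws in the default case, so the dead default branch comes alive the moment a new value appears.

```java
// A sketch of the failure mode described in the addendum above. The enum and
// its values mirror the hypothetical SyndicationFormat example from this post;
// ATOM stands in for a value added in v2 of a library or service.
enum SyndicationFormat { RSS10, RSS20, CDF, ATOM }

public class EnumVersioning {
    // Client code written against v1, which only knew RSS10, RSS20 and CDF.
    static String describe(SyndicationFormat format) {
        switch (format) {
            case RSS10: return "RSS 1.0";
            case RSS20: return "RSS 2.0";
            case CDF:   return "Channel Definition Format";
            default:
                // Dead (and therefore untested) code in v1; it starts firing
                // the moment v2 introduces ATOM.
                throw new IllegalArgumentException("Unknown format: " + format);
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(SyndicationFormat.RSS20));
        try {
            describe(SyndicationFormat.ATOM); // the new value breaks the v1 switch
        } catch (IllegalArgumentException e) {
            System.out.println("v1 client broke: " + e.getMessage());
        }
    }
}
```

Note that the v1 author did nothing obviously wrong here; exhaustive switching with a throwing default is usually considered good defensive style, which is exactly what makes the break so insidious.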

There is an additional wrinkle when adding values to an enumerated type in XML Web Services especially if the calling application is built using the .NET Framework. Let's say we have the following enumerated type declaration in the schema for v1 of our service

<xsd:simpleType name="SyndicationFormat">
  <xsd:restriction base="xsd:string">
    <xsd:enumeration value="RSS10"/>
    <xsd:enumeration value="RSS20"/>
    <xsd:enumeration value="CDF"/>
  </xsd:restriction>
</xsd:simpleType>

and in a later version modify it in the following way

<xsd:simpleType name="SyndicationFormat">
  <xsd:restriction base="xsd:string">
    <xsd:enumeration value="RSS10"/>
    <xsd:enumeration value="RSS20"/>
    <xsd:enumeration value="CDF"/>
    <xsd:enumeration value="Atom"/>
  </xsd:restriction>
</xsd:simpleType>

Of course, as mentioned in the amended discussion on adding values to enumerated types in the .NET Framework design guidelines, this is a forwards incompatible change because new messages will very likely not be properly processed by old clients. However, when the consuming applications are built using the XML Web services capabilities of the .NET Framework, we don't even get that far. Instead you will most likely get an exception that looks like the following

Unhandled Exception: System.InvalidOperationException: There is an error in XML document (1, 1022). ---> System.InvalidOperationException: 'Atom' is not a valid value for SyndicationFormat.
   at Microsoft.Xml.Serialization.GeneratedAssembly.XmlSerializationReaderSyndicationService.Read4_SyndicationFormat(String s)

This is because the client code will have been generated from the v1 version of the WSDL for the service where "Atom" was not a valid value for the SyndicationFormat enumerated type. So adding a value to an enumerated type in an existing XML Web service end point is pretty much guaranteed to break client applications.
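One defensive pattern, not discussed in the guidelines but worth sketching, is for the client to parse the wire value leniently and map anything unrecognized to an UNKNOWN sentinel rather than throwing. The class and method names below are made up for illustration:

```java
// A hedged sketch of one mitigation: treat the wire value as a string and
// map anything unrecognized to an UNKNOWN sentinel instead of throwing.
// Names here are hypothetical, for illustration only.
public class LenientEnumParsing {
    enum SyndicationFormat { RSS10, RSS20, CDF, UNKNOWN }

    static SyndicationFormat fromWire(String value) {
        try {
            return SyndicationFormat.valueOf(value);
        } catch (IllegalArgumentException e) {
            // A v2 value like "Atom" degrades gracefully in a v1 client.
            return SyndicationFormat.UNKNOWN;
        }
    }

    public static void main(String[] args) {
        System.out.println(fromWire("RSS20"));
        System.out.println(fromWire("Atom"));
    }
}
```

This only helps if the client controls its own deserialization; generated proxy code that maps the schema enum directly to a language enum, as in the stack trace above, fails before application code ever runs.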

I love my day job. ;)

PS: This example is made up but the problem is real.


Categories: XML Web Services

From the Microsoft press release Microsoft Acquires Teleo, Innovative VoIP Technology Company we learn

REDMOND, Wash. — Aug. 30, 2005 — Microsoft Corp. today announced it has acquired Teleo Inc., a provider of voice over Internet protocol (VoIP) software and services that enable the placement of phone calls from PCs to traditional phones and that deliver this technology in unique ways through a variety of software and Web applications. Microsoft expects to combine the technology and expertise of Teleo with the existing VoIP investments of MSN to further develop products and services that connect consumers to the people and information that most matter to them. Financial details were not disclosed.

Founded in 2003 and headquartered in San Francisco, Teleo is a privately held company whose initial planned service offering, also called Teleo, was designed to allow customers to use their PC to make phone calls to cell phones, regular phones or other PCs. Through its integration with Microsoft® Outlook® and Microsoft Internet Explorer, the Teleo service was designed to facilitate click-to-call dialing of any telephone number that appears on-screen, for example through a Web site or via search results or e-mail.

VoIP technology already is prominently featured in MSN® Messenger as well as other Microsoft products and services. Microsoft plans to incorporate and expand upon Teleo technologies, integrating them into the infrastructure that supports MSN and ultimately projects delivering new VoIP consumer applications in future releases of MSN services.

This is pretty good news for MSN Messenger users. Instant messaging is more than just sending text from one computer to another. Voice and video conversations are also valid ways to communicate using our instant messaging client, and the ability to communicate with people on devices besides their computers is another thing we think is important. In fact, while I was in Nigeria I made heavy use of MSN Messenger's SMS-to-IM conversation capabilities to send messages to my girlfriend's phone while she was at work back here in Seattle.

It's all about communication and the addition of the Teleo folks to our fold will only increase and improve the communication capabilities of MSN Messenger. Excellent.


Categories: MSN

Torsten just posted the following announcement about the RSS Bandit support forums

Because of the amount of spam from fake user/member accounts we changed the rules related to how new members are approved to use/post to our forums.

Each new user will be approved by one of the forum administrators manually. You will have to provide a VALID e-mail address while registering. So if you don't get approved within 10 minutes, please don't try again with other/changed e-mail addresses.
We are humans and need to sleep some minutes every day

If you get access, you will receive a mail from forumadmin at soon...

We were getting almost a dozen spam posts a day which made the forum RSS feed completely useless. We shut down accepting new members for a few days until Torsten figured out that we could add an approval process for new member requests.

Having to deal with approving new member requests is a hassle but a lot less than having to delete posts and member accounts created by spambots. Spammers suck. 


Categories: RSS Bandit

Since I've been in the process of adding support for synchronization between RSS Bandit and Newsgator Online, I've been trying to eat my own dogfood and use both applications. A ready opportunity presented itself when I travelled to Nigeria a few weeks ago and wanted to keep up with my RSS feeds. While in Nigeria I was always on a dial-up connection and used about four different PCs and one Mac. It made sense to favor a web-based RSS reader over trying to install RSS Bandit, and most likely the .NET Framework, on all these machines, some of which I didn't have administrator access to anyway.

After unsuccessfully trying to use Newsgator Online, I ended up settling on Bloglines instead for a number of reasons. The first is that Bloglines is a lot faster than Newsgator Online, whose interface seems to move at a snail's pace over dial-up. The second is that a basic feature like "Mark All Items As Read" seems to be missing from Newsgator Online. Trying to visit every feed individually to mark its items as read became such an ordeal that I simply gave up.

I'd rather not think that I've wasted the time I've spent working on implementing synchronization between RSS Bandit and Newsgator Online since the current user experience of the latter service leaves much to be desired. I sincerely hope there are some changes in the works for the service.


Lots of folks I've talked to at work have had mixed feelings about the recently announced Google Talk. The feelings are usually relief and disappointment in equal portions. Relief because Google hasn't shipped yet another application that redraws the lines of what that class of application should look like (e.g. GMail and Google Maps), meaning we have to play catch-up. Disappointment because we actually expect better from Google.

You can see some of these sentiments from various folks at MSN such as Mike Torres in his post Competition & Google Talk, Sanaz Ahari in her posts Google Talk review and Google Talk pt II, and Richard Chung in his post Google Talk Blows.

Of course, Microsoft employees aren't the only ones underwhelmed by Google Talk. Doing a quick search of the blogosphere for comments about Google Talk leads me to lots of bloggers expressing ambivalence about the application.

The most interesting reaction I noticed was from Robert X. Cringely, who was inspired to ask Has Google Peaked? in his most recent column. In the article he not only asks whether Google's best products are already behind it but also points out that the company has become expert at Fire and Motion. Below is the relevant excerpt from his column

Google plays on its technical reputation even though, if you look closely, it isn't always deserved. Many Google products haven't been revved since they were introduced. And while some Google products are excellent, some aren't, too.

Google likes to play the Black Box game. What are they DOING in all those buildings with all those PhDs? I'm sure they are doing a lot that will change the world, but just as much that will never even be seen by the world. For the moment, though, it doesn't matter because Google can play the spoiler. They offered a gigabyte of e-mail storage, for example, at a time when they had perhaps one percent the number of e-mail users as a Hotmail or Yahoo. And by limiting the Gmail beta, they avoided the suffering of both those other companies when they, too, had to increase their storage allocations, but for tens of millions of real users.

Now Google will do something similar for chat and VoIP with Gtalk, pushing the others toward an interoperability that undermines the hold each company thinks it has on its users.

In my original post about Google Talk I mentioned that using Jabber/XMPP was an attempt at a disruptive move by Google. Of course, it is only disruptive if its competitors like AOL, MSN and Yahoo! react by breaking down the walls of the walled gardens they have created in their various IM products.

I find it interesting that instead of trying to push the envelope with regards to the user experience in instant messaging, Google chose to 'punk' its competitors instead. I guess we'll just have to see how MSN, Yahoo & AOL end up reacting. 


Categories: MSN

Sean Lyndersay has posted about an update to the Simple List Extensions specification. The update fixes some of the issues that were pointed out by members of the RSS community, such as the problem described in Phil Ringnalda's post MS Embraces RSS, where RSS elements were being reused outside their original context. The cf:listinfo element now has the following structure

<cf:listinfo>
  <cf:sort ns="namespace URI" element="element name" data-type="date|number|text"
           label="User-readable name for the sort field" default="yes|no" />
  <cf:group ns="namespace URI" element="element name"
            label="User-readable name for the grouping" />
</cf:listinfo>


This is a lot better than the original spec*, which instead of naming the element being sorted on via attributes of the cf:sort element actually included it as a child element. The only problem I have with the spec is that I don't see where it states the date format that is expected when the data type is date. I guess this is problematic since different syndication formats use different date formats: RSS 2.0 uses the RFC 822 format, Atom 1.0 uses the RFC 3339 format, while Dublin Core [which is what RSS 1.0 dates typically are] uses the format from the W3C Note on Date and Time Formats. So an extension element really can't define what the date format will be in the feed it is embedded in, since it may be embedded in either an RSS or an Atom feed.
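The date-format mismatch is easy to see in code. A small sketch of my own (the timestamp is made up): java.time parses the RFC 822-style dates RSS 2.0 uses via its RFC_1123 formatter (the four-digit-year variant of the same format), while Atom's RFC 3339 timestamps parse directly as offset date-times.

```java
import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class FeedDates {
    // RSS 2.0 carries RFC 822-style dates; java.time's RFC_1123 formatter
    // parses the four-digit-year variant of that format.
    static Instant parseRssDate(String s) {
        return ZonedDateTime.parse(s, DateTimeFormatter.RFC_1123_DATE_TIME).toInstant();
    }

    // Atom 1.0 carries RFC 3339 timestamps, which OffsetDateTime parses directly.
    static Instant parseAtomDate(String s) {
        return OffsetDateTime.parse(s).toInstant();
    }

    public static void main(String[] args) {
        // The same instant, spelled in each feed format's convention.
        Instant rss  = parseRssDate("Thu, 25 Aug 2005 15:28:00 GMT");
        Instant atom = parseAtomDate("2005-08-25T15:28:00Z");
        System.out.println(rss.equals(atom));
    }
}
```

An aggregator consuming a cf:sort on a date field would need to pick the parser based on the host format, which is exactly the ambiguity the spec leaves open.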

That is a potential gotcha for implementers of this spec. I think I'll pass on implementing support for the Simple List Extensions spec in RSS Bandit this time around since my plate is full for this release. I'll add it to the list of features that will show up in the following release [tentatively codenamed Jubilee].

* I would have linked to the spec but as usual MSDN has broken permalinks such as . Someone really needs to force everybody that works there to read Cool URIs don't change.


August 25, 2005
@ 03:28 PM

David Card of Jupiter Research has a blog post entitled Pre-emptive IM Strike from MSN where he describes one of the reasons I love working at MSN

Finally, MSN wants to remind everyone that it's got six years of experience in this stuff -- hear that, Sergey? -- and is sticking to its promise of thrice-yearly upgrades, so watch for more goodies in November.

The upgrades are all fine, but I was actually more impressed by Irving's crisp articulation of the IM Big Picture. MSN is trying to move the conversation away from IM (defined as "real-time text messaging," how dull) to "contacts." I think they downplay presence management, but that's okay, presence sounds too much like AOL-friendly talk. As does Buddy Lists, but I can't break the habit.

What's critical about IM isn't real-time text messaging but the Buddy List as a communications/presence management hub. (Link is ancient history for geek/vision cred.) You manage your buddies and buddy groups and their relationships to you (and each other), shifting those according to what persona you're inhabiting (work, home, fun, shopping, etc.) and what communications are available to you or you want to make available to them. Then broadcast that selectively. The company that can teach consumers how to do this, and own that management tool, is in a very powerful position. The portals will be duking it out with the mobile carriers for this, I suspect.

MSN's vision is pretty parallel to the one above. Irving claims Microsoft has an "ABCH" -- Address Book Clearing House -- that is a repository for all those contacts, relationships and permissions that come from Messenger and Hotmail. You can imagine how powerful that might be -- we're not just talking "gleams" and sharing playlists here -- and how much grief Microsoft will get for playing Big Brother.

Anyway, MSN gets it.

Besides the fact that I'm one of the program managers for ABCH, and thus it's kind of cool to get a shout-out from our VP, there are some other things about this excerpt that have me smiling. When I first came to MSN I thought I'd have to beat people over the head with the message that social software is the platform of the future, but I didn't have to because everybody gets it. Everyone I've talked to, from vice presidents and general managers to developers and testers working directly on individual features, has their mind wrapped around our social software vision. It's really simple: the #1 goal of social software should be improving the ways I communicate and interact with the people I know, and a secondary goal is giving me avenues to communicate and interact with people I don't. It's really that simple.

We definitely have some interesting stuff coming down the road in the next couple of months. I can only hope our users have as much fun using our software as we had building it.

On an unrelated note, I have updated the track of the week from my high school days on my space. Just click on the play button on the Windows Media player module to hear this week's track. Listening to some of this stuff I can't help thinking, "I could have been a contender". :)


Categories: MSN

As I mentioned a few weeks ago, we plan to ship an alpha of the next version of RSS Bandit [codenamed Nightcrawler] next week. So far we've been making some progress towards that goal. I checked in the changes to enable marking items as read or flagging them from the newspaper view and I've already found it to be quite useful. A screenshot is below

I've also been hard at work implementing support for synchronization with Newsgator Online via the Newsgator API. It's been more difficult than I expected, but I'm sure our users will love being able to move between a web-based aggregator and a desktop aggregator as the need arises while having their subscriptions and read items stay consistent between both places. Ideally the experience should be the same as moving back and forth between Web mail (e.g. Hotmail) and a desktop mail reader (e.g. Outlook Express).

I've also completed our support for Atom 1.0 and tested against a number of known Atom 1.0 feeds in the wild.

Torsten is working on the new subscription wizard and the podcasting support which should be finally checked in by next week. 

By the way, Torsten has started an RSS Bandit new logo design contest and we'd appreciate your comments. It seems a lot of our users who use RSS Bandit at their place of work feel our current smiley face icon and logo are unprofessional. I don't mind changing our application icon but would probably like to keep the smiley bandit in the logo.

So it looks like we are on track for having the installer for the Nightcrawler alpha available next week. Hope you guys like the new features.


Categories: RSS Bandit

Today I stumbled on the post "Sign up for Gmail" on the Google blog and was stunned by the following excerpt

For the last 16 months, a lot of people have been asking us how they can sign up for Gmail, and today we're happy to be able to say, "Just go to" From there, you can get an invitation code sent to your mobile phone, and with this code, you can create a Gmail account. Once you have Gmail, you can try out our brand new IM and voice service, Google Talk.

Why use mobile phones? It's a way to help us verify that an account is being created by a real person, and that one person isn't creating thousands of accounts. We want to keep our system as spam-free as possible, and making sure accounts are used by real people is one way to do that.

The privacy implications of having a company collect people's verified mobile phone numbers just for free email accounts boggle the mind. It is common knowledge that web surfers often give websites information they consider private, so I'm sure lots of people will take them up on their offer. Looking at the Gmail SMS sign-up page, it boldly states they plan to store the phone number indefinitely and then points to a privacy policy that doesn't say anything about what they plan to do with our phone numbers. Is their legal team asleep at the wheel or something?

I guess once they ship whatever mobile services that emerge from their purchase of Dodgeball and Android, they'll have a ready pool of phone numbers to launch the service with. That's just genius. Almost evil genius.


There were two instant messaging releases shipped yesterday from two of the major online players.

  1. Google Talk Beta: Google has finally shipped an instant messaging application as we all expected. You can get the scoop from Joe Beda's post Welcome To Google Talk. It seems Joe is one of the folks at Google who helped ship this. From his post we learn they provide
    • Instant messaging server based on the XMPP/Jabber protocol.  This is an IETF approved protocol.  Check out for more info.
    • Out of the box support for many third party clients.  Choose iChat, Gaim, Trillian or a host of others.  We support them from day one.
    • Our own client available for download from  We've concentrated on a simple to use and clean interface.  We've tried to strip IM down to its essence.
    • Support for voice calls between clients that just work.  We've worked hard to support all sorts of network topologies.  We are also using first class industry leading audio codecs.
    • A commitment to openness moving forward.  Choose your platform (Windows, Mac, Linux, etc.), choose your client (ours or others) and choose your service provider (we are committed to federation).  We want to make IM as open as the web or email.
  2. MSN Messenger 7.5: MSN shipped an update to its instant messaging client yesterday. You can get the scoop in Leah Pearlman's post MSN Messenger 7.5 - I Can't Believe It's Not Beta. From her post we learn that some of the new features are
    • Dynamic Backgrounds: Backgrounds that subtly animate in cool ways.

    • Voice Clip: Press and hold the Voice Clip button (or F2) and record up to 15 seconds of your voice or anything. When you release, it goes to your buddies just like an IM, and they can hear it instantly upon receipt.
    • Freaking Awesome Audio Quality Improvements: Yes, that’s the technical name. Our new audio technology makes free Voice (Voip) calls super-clear. Mostly thanks to much improved echo-cancellation. It’s now just like using a phone except: "Look ma! No hands!"
    • Patching: Due to the plethora of features in the Messenger client, the download size has grown. In the future, instead of having to download the entire client each time a particular release is updated, we can download a small patch, on the order of 100K, onto the user's machine instead.

Google's entrance into the instant messaging landscape is interesting, although unsurprising. As usual Google has entered the space with a disruptive move, but instead of the move being the feature set of its IM client, it is the decision not to treat its IM network as a walled garden the way AOL, MSN and Yahoo! have done. People aren't restricted to the Google Talk client, and anyone can write a client application to connect people within the network. I'm not sure this is a smart move but it definitely is a disruptive one.


Last week I had lunch with Joshua Allen and mentioned that I was planning to write a blog post about the game-changing effect of some entity adding generally accessible offline support to the AJAX capabilities of traditional web browsers. It seems Jason Kottke has beaten me to it with his post GoogleOS? YahooOS? MozillaOS? WebOS?; he even has a roll call of the usual suspects who might build this and why.

He writes

So who's going to build these WebOS applications? Hopefully anyone with XHTML/JavaScript/CSS skills, but that depends on how open the platform is. And that depends on whose platform it is. Right now, there are five organizations who are or could be moving in this direction:

  • Google. If Google is not thinking in terms of the above, I will eat danah's furriest hat. They've already shifted the focus of Google Desktop with the addition of Sidebar and changing the name of the application (it used to be called Google Desktop Search...and the tagline changed from "Search your own computer" to the more general "Info when you want it, right on your desktop"). To do it properly, I think they need their own browser (with bundled Web server, of course) and they need to start writing their applications to work on OS X and Linux (Google is still a Windows company)[4]. Many of the moves they've made in the last two years have been to outflank Microsoft, and if they don't use Google Desktop's "insert local code into remote sites" trick to make whatever OS comes with people's computers increasingly irrelevant, they're stupid, stupid, stupid. Baby step: make Gmail readable offline.
  • Yahoo. I'm pretty sure Yahoo is thinking in these terms as well. That's why they bought Konfabulator: desktop presence. And Yahoo has tons of content and apps that they would like to offer on a WebOS-like platform: mail, IM, news, Yahoo360, etc. Challenge for Yahoo: widgets aren't enough...many of these applications are going to need to run in Web browsers. Advantages: Yahoo seems to be more aggressive in opening up APIs than Google...chances are if Yahoo develops a WebOS platform, we'll all get to play.
  • Microsoft. They're going to build a WebOS right into their operating system. It's likely that with Vista, you sometimes won't be able to tell when you're using desktop applications or when you're at They'll never develop anything for OS X or for Linux (or for browsers other than IE), so its impact will be limited. (Well, limited to most of the personal computers in the world, but still.)
  • Apple. Apple has all the makings of a WebOS system right now. They've got the browser, a Web server that's installed on every machine with OS X, Dashboard, iTMS, .Mac, Spotlight, etc. All they're missing is the applications (aside from the Dashboard widgets). But like Microsoft, it's unlikely that they'll write anything for Windows or Linux, although if OS X is going to run on cheapo Intel boxes, their market share may be heading in a positive direction soon.
  • The Mozilla Foundation. This is the most unlikely option, but also the most interesting one. If Mozilla could leverage the rapidly increasing user base of Firefox and start bundling a small Web server with it, then you've got the beginnings of a WebOS that's open source and for which anyone, including Microsoft, Google, Yahoo, and anyone with JavaScript chops, could write applications. To market it, they could refer to the whole shebang as a new kind of Web browser, something that sets it apart from IE, a true "next generation" browser capable of running applications no matter where you are or what computer (or portable device) you're using.

So yeah, that's the idea of the WebOS (as I see it developing) in a gigantic nutshell.

I disagree with some of his post; I think desktop web servers are a bad idea and also that the claims of the end of Microsoft's operating system dominance are premature. He is also mistaken about MSN not building stuff for browsers other than IE. Of course, overestimating Microsoft's stupidity is a common trait among web developer types.

However the rest of his post does jibe with a lot of thinking I did while on vacation in Nigeria. I'd suggest that anyone interested in current and future trends in web application development should check it out.



Categories: Web Development

My post on Why I Prefer SOA to REST got some interesting commentary yesterday which indicates that I should probably clarify what I was talking about. The most interesting feedback actually came in email from some evangelists at Microsoft, whose criticisms ranged from the fact that I dared to use Wikipedia as a definitive reference to my pointing out that SOA is a meaningless buzzword. So I'll try this again without using links to Wikipedia or the acronym "SOA".

My day job is designing services that will be used within MSN by a number of internal properties (Hotmail, MSN Spaces, MSN Messenger, and a lot more) as well as figuring out what our external web services story will be for interacting with MSN Spaces. This means I straddle the fence between building distributed applications in a primarily homogeneous intranet environment and on the heterogeneous World Wide Web. When I talk about "distributed applications" I mean both scenarios, not just Web service development or enterprise service development.

Now let's talk about REST. In Chapter 5 of Roy Fielding's dissertation, where he introduces the Representational State Transfer (REST) architectural style, he writes

5.1.5 Uniform Interface

The central feature that distinguishes the REST architectural style from other network-based styles is its emphasis on a uniform interface between components (Figure 5-6). By applying the software engineering principle of generality to the component interface, the overall system architecture is simplified and the visibility of interactions is improved. Implementations are decoupled from the services they provide, which encourages independent evolvability. The trade-off, though, is that a uniform interface degrades efficiency, since information is transferred in a standardized form rather than one which is specific to an application's needs. The REST interface is designed to be efficient for large-grain hypermedia data transfer, optimizing for the common case of the Web, but resulting in an interface that is not optimal for other forms of architectural interaction.

In order to obtain a uniform interface, multiple architectural constraints are needed to guide the behavior of components. REST is defined by four interface constraints: identification of resources; manipulation of resources through representations; self-descriptive messages; and, hypermedia as the engine of application state. These constraints will be discussed in Section 5.2.
... Resources and Resource Identifiers

The key abstraction of information in REST is a resource. Any information that can be named can be a resource: a document or image, a temporal service (e.g. "today's weather in Los Angeles"), a collection of other resources, a non-virtual object (e.g. a person), and so on. In other words, any concept that might be the target of an author's hypertext reference must fit within the definition of a resource. A resource is a conceptual mapping to a set of entities, not the entity that corresponds to the mapping at any particular point in time.

The REST architectural style describes how a large interlinked web of hypermedia works, which is what the World Wide Web is. It describes a way to build a certain class of distributed application, specifically one where you are primarily interested in manipulating linked representations of resources, where those representations are hypermedia.
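Fielding's uniform-interface constraint can be caricatured in a few lines of code. This toy sketch of my own (not from the dissertation) shows one generic contract serving arbitrary resources identified by URIs, rather than a bespoke interface per service:

```java
import java.util.HashMap;
import java.util.Map;

// A toy caricature of the uniform interface: one generic contract (get/put a
// representation at a URI) serves every kind of resource, instead of a
// bespoke, service-specific interface per application.
public class UniformInterface {
    private final Map<String, String> representations = new HashMap<>();

    // The same two operations apply whether the resource is a weather
    // report, a person, or a collection of other resources.
    String get(String uri) {
        return representations.get(uri);
    }

    void put(String uri, String representation) {
        representations.put(uri, representation);
    }

    public static void main(String[] args) {
        UniformInterface server = new UniformInterface();
        server.put("/weather/los-angeles", "sunny, 28C");
        server.put("/people/dare", "program manager, MSN");
        // A generic client can dereference either without resource-specific code.
        System.out.println(server.get("/weather/los-angeles"));
    }
}
```

The trade-off Fielding names is visible even here: the generic contract works for everything but is optimal for nothing, which is precisely the efficiency cost he attributes to uniformity.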

On to service orientation. The canon of service orientation is the four tenets from the article A Guide to Developing and Running Connected Systems with Indigo by Don Box, where he wrote

In Indigo, a service is simply a program that one interacts with via message exchanges. A set of deployed services is a system. Individual services are built to last; the availability and stability of a given service is critical. The aggregate system of services is built to allow for change; the system must adapt to the presence of new services that appear a long time after the original services and clients have been deployed, and these must not break functionality.

Service-oriented development is based on the four fundamental tenets that follow:

Boundaries are explicit A service-oriented application often consists of services that are spread over large geographical distances, multiple trust authorities, and distinct execution environments...Object-oriented programs tend to be deployed as a unit...Service-oriented development departs from object-orientation by assuming that atomic deployment of an application is the exception, not the rule. While individual services are almost always deployed atomically, the aggregate deployment state of the overall system/application rarely stands still.

Services are autonomous Service-orientation mirrors the real world in that it does not assume the presence of an omniscient or omnipotent oracle that has awareness and control over all parts of a running system.

Services share schema and contract, not class Object-oriented programming encourages developers to create new abstractions in the form of classes...Services do not deal in types or classes per se; rather, only with machine readable and verifiable descriptions of the legal "ins and outs" the service supports. The emphasis on machine verifiability and validation is important given the inherently distributed nature of how a service-oriented application is developed and deployed.

Service compatibility is determined based on policy Object-oriented designs often confuse structural compatibility with semantic compatibility. Service-orientation deals with these two axes separately. Structural compatibility is based on contract and schema and can be validated (if not enforced) by machine-based techniques (such as packet-sniffing, validating firewalls). Semantic compatibility is based on explicit statements of capabilities and requirements in the form of policy.

One thing I want to point out at this point is that neither REST nor service orientation is a technology. They are approaches to building distributed applications. However, there are technologies typically associated with both approaches: REST has Plain Old XML over HTTP (POX/HTTP) and service orientation has SOAP.

My point from yesterday was that, as far as approaches go, I prefer to think of building distributed applications from a service oriented perspective rather than from a REST perspective. This is completely different from endorsing SOAP over POX/HTTP as the technology for building distributed applications. That is a discussion for another day.


Categories: XML Web Services

Can't write lyrics worth a damn, can't rap to save his life but can make phat beats and has an ego the size of a small planet. I guess we now know what it takes to be a successful rapper and it has nothing to do with rapping.


Categories: Music

Omar Shahine has a post where he talks about FireAnt. FireAnt is the communications part of the AJAX framework shared by Hotmail, MyMSN and MSN Spaces, which Steve Rider alluded to in his post Spaces, Hotmail and Start (oh my!).

Omar writes

Last summer we spent a lot of time at the white-board evaluating a number of ways to deliver a new architecture for Hotmail. We considered a number of things:

  1. Modification of the current C/C++ ISAPI architecture to support a hybrid ASP model.
  2. .NET rewrite for the DataLayer and BusinessLayer and XML/XSLT for the PresentationLayer
  3. Same as #2 but the Presentation layer would be JavaScript, XMLHTTP, and DHTML/CSS. This now has the fancy name, AJAX.

After much deliberating, we chose #3, and started running. For 4 weeks basically 1 PM, a developer and an intern built a prototype, and then the real thing (in case you are in college I’d note how cool it is that we put an intern on the most important technology we were building). As more people started to move over to the FireAnt project, we got more and more excited about what was happening. You see, writing AJAX code can be a pain, and we didn’t want to spend our days and nights writing a lot of JavaScript and debugging client side script. Instead we built an infrastructure that dynamically takes server side objects (classes and methods) and automatically generates client side JavaScript stubs. The end result is that the client side object model looked exactly like the server side object model. Information was transported across the wire using XMLHTTP and the whole thing happened asynchronously.

We extended .NET Attributes to mark classes and methods as FireAnt classes/methods and at build time the script is generated. If you think of SOAP support in the .NET Framework, it’s basically similar. As a developer you do not worry about generating SOAP messages, or building a SOAP parser. All you do is mark your method as [WebMethod] and your classes as [Serializable] and the .NET framework takes care of proxying, class generation etc. That’s what we were shooting for.

This was a big deal for us as it allows us to be incredibly productive. Since last summer, we have built a ton of features using FireAnt and the JavaScript Frameworks from Scott Isaacs. Late last fall we went up to Redmond and showed FireAnt to a number of folks in MSN, one of those folks was Steve Rider. It was really exciting to see the looks on folks faces when Walter (our FireAnt “architect”) setup his “Hello World” demo. You could just see that people realized that doing AJAX style development any other way was crazy.

We’ve since showed our stuff to a number of teams inside Microsoft. As a result of our work, Walter and Scott have spent a considerable amount of time with the Whidbey/ASP.NET folks and it’s pretty exciting to see ATLAS come together. If you want to learn more, Walter will be giving a talk at the PDC on what we’ve built. It’s great to see collaboration between our team and the Developer Division as the end result will be a better, more scalable version of the .NET Framework for you.

Trying to build a complex AJAX website with traditional Visual Studio.NET development tools is quite painful, which is why the various teams at MSN have collaborated and built a unified framework. As Omar points out, one of the good things that has come out of this is that the various MSN folks went to the Microsoft developer division and pointed out that it was missing key infrastructure needed for AJAX development. This feedback was one of the factors that resulted in the recently announced Atlas project.

A key point Omar touches on is that development became much easier when they built a framework for handling serialization and deserialization of objects to be transmitted using XMLHTTP. The trick here is that the framework handles both serialization and deserialization on both the server (ASP.NET code) and the client (Javascript code). Of course, this is AJAX development 101 and anyone who's used AJAX frameworks like AJAX.NET is familiar with these techniques. One of the interesting things that falls out of using a framework like this is that the serialization format becomes less interesting; one could just as easily use JavaScript Object Notation (JSON) as opposed to some flavor of XML.
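Since the FireAnt code itself was never published, here is a hypothetical sketch of the stub-generation idea in JavaScript. A factory binds a server-side class/method pair to a transport function, so calling code never touches the serialization; all names here (makeStub, the Mailbox class, the wire format) are invented for illustration.

```javascript
// Hypothetical sketch: generate a client-side stub for a server method.
// The transport is whatever ships bytes to the server (XMLHTTP in 2005);
// it is injected here so the sketch stays testable outside a browser.
function makeStub(className, methodName, transport) {
  return function (args, callback) {
    // Serialize the call; JSON is used here, but any format (e.g. XML) works.
    var payload = JSON.stringify({ cls: className, method: methodName, args: args });
    transport(payload, function (responseText) {
      // Deserialize the reply back into plain objects for the caller.
      callback(JSON.parse(responseText));
    });
  };
}
```

A generated stub such as `var getFolders = makeStub("Mailbox", "GetFolders", xmlHttpTransport)` would then be called like a local function, which is the "client object model mirrors the server object model" effect Omar describes.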

If you're going to be at the Microsoft Professional Developer's Conference (PDC) and are interested in professional AJAX development you should definitely make your way to the various presentations by the MSN folks. Also, we're always looking for developers so if building AJAX applications that will be utilized by millions of people on a daily basis sounds like your cup of tea give us your resume.


Categories: MSN | Web Development

August 21, 2005
@ 12:56 AM

The good folks at Google, Yahoo and MSN announced some sweet stuff for Web search aficionados this week.

  1. MSN: From the MSN Search blog post entitled Extending the MSN Search Toolbar we learn that the first of many add-ins for the MSN Search toolbar is now available. The weather add-in for MSN Toolbar is something I've wanted in a browser toolbar for a while. There is also information for developers interested in building their own add-ins.

  2. Google: From the Google Weblog post entitled The linguasphere at large we learn that Google has launched the Google Search engine in 11 more languages, bringing the total number of supported languages to 116. The coolest part of this announcement is that I can now search Google in Yoruba which is my dad's native tongue. Not bad at all.

  3. Yahoo: Last but not least is the recent announcement about the Next Generation of Yahoo! Local on the Yahoo! Search blog. The launch showcases the integration of ratings and reviews by Yahoo! users with mapping and local information. This is kind of like Yahoo! Maps meets CitySearch. Freaking awesome. Yahoo! continues to impress.



August 21, 2005
@ 12:47 AM

Yesterday I was chatting with Matt after he reviewed the paper I plan to submit for the next Bill Gates Think Week and he pointed out something that had been nagging me about using Representational State Transfer (REST) as a model for building distributed applications.

In the current Wikipedia entry on REST, it states

An important concept in REST is the existence of resources (pieces of information), each of which can be referred to using a global identifier (a URL). In order to manipulate these resources, components of the network (clients and servers) communicate via a standardised interface (HTTP) and exchange representations of these resources (the actual files uploaded and downloaded) -- it is a matter of debate, however, whether the distinction between resources and their representations is too Platonic for practical use on the web, though it is popular in the RDF community.

Any number of connectors (e.g., clients, servers, caches, tunnels, etc.) can mediate the request, but each does so without "seeing past" its own request (referred to as "layering", another constraint of REST and a common principle in many other parts of information and networking architecture). Thus an application can interact with a resource by knowing two things: the identifier of the resource, and the action required -- it does not need to know whether there are caches, proxies, gateways, firewalls, tunnels, or anything else between it and the server actually holding the information. The application does, however, need to understand the format of the information (representation) returned, which is typically an HTML or XML document of some kind, although it may be an image or any other content.

Compare the above to the typical notion of service orientation such as that espoused in the article A Guide to Developing and Running Connected Systems with Indigo by Don Box where he wrote

In Indigo, a service is simply a program that one interacts with via message exchanges. A set of deployed services is a system. Individual services are built to last—the availability and stability of a given service is critical. The aggregate system of services is built to allow for change—the system must adapt to the presence of new services that appear a long time after the original services and clients have been deployed, and these must not break functionality.

Service-oriented development is based on the four fundamental tenets that follow:

Boundaries are explicit   A service-oriented application often consists of services that are spread over large geographical distances, multiple trust authorities, and distinct execution environments...Object-oriented programs tend to be deployed as a unit...Service-oriented development departs from object-orientation by assuming that atomic deployment of an application is the exception, not the rule. While individual services are almost always deployed atomically, the aggregate deployment state of the overall system/application rarely stands still.

Services are autonomous   Service-orientation mirrors the real world in that it does not assume the presence of an omniscient or omnipotent oracle that has awareness and control over all parts of a running system.

Services share schema and contract, not class   Object-oriented programming encourages developers to create new abstractions in the form of classes...Services do not deal in types or classes per se; rather, only with machine readable and verifiable descriptions of the legal "ins and outs" the service supports. The emphasis on machine verifiability and validation is important given the inherently distributed nature of how a service-oriented application is developed and deployed.

Service compatibility is determined based on policy   Object-oriented designs often confuse structural compatibility with semantic compatibility. Service-orientation deals with these two axes separately. Structural compatibility is based on contract and schema and can be validated (if not enforced) by machine-based techniques (such as packet-sniffing, validating firewalls). Semantic compatibility is based on explicit statements of capabilities and requirements in the form of policy.

The key thing to note here is that REST is all about performing a limited set of operations on an object (i.e. a resource) while SOA is all about making requests where objects are input and/or output.

To see what this difference means in practice, I again refer to the Wikipedia entry on REST which has the following example

A REST web application requires a different design approach than an RPC application. In RPC, the emphasis is on the diversity of protocol operations, or verbs; for example, an RPC application might define operations such as the following:


With REST, on the other hand, the emphasis is on the diversity of resources, or nouns; for example, a REST application might define the following two resource types

 User {}
 Location {}

Each resource would have its own location. Clients work with those resources through the standard HTTP operations, such as GET to download a copy of the resource, PUT to upload a changed copy, or DELETE to remove all representations of that resource. Note how each object has its own URL and can easily be cached, copied, and bookmarked. POST is generally used for actions with side-effects, such as placing a purchase order, or adding some data to a collection.

The problem is that although it is easy to model resources as services, as shown in the example above, in many cases it is quite difficult to model a service as a resource. For example, a service that validates a credit card number maps naturally to a validateCreditCardNumber(string cardNumber) operation. On the other hand, it is unintuitive how one would model that operation as a resource. For this reason I prefer to think about distributed applications in terms of services as opposed to resources.
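To make that contrast concrete, here is a toy in-memory sketch in JavaScript. The function names, the resource store and the simplistic 16-digit check are all invented for illustration; this is not a real validation algorithm.

```javascript
// RPC style: the operation itself is the verb the client invokes.
function validateCreditCardNumber(cardNumber) {
  return /^\d{16}$/.test(cardNumber); // toy check, stands in for real logic
}

// REST style: the same behavior recast as a resource. The client creates
// a "validation" resource (POST) and then reads its representation (GET).
var validations = {}; // id -> resource representation
var nextId = 1;

function postValidation(cardNumber) {
  var id = nextId++;
  validations[id] = { cardNumber: cardNumber, valid: /^\d{16}$/.test(cardNumber) };
  return id; // identifier of the newly created resource
}

function getValidation(id) {
  return validations[id]; // GET returns the current representation
}
```

The REST version works, but the noun ("a validation") is clearly a contortion of what is naturally a verb, which is the awkwardness described above.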

This doesn't mean I don't think there is value in several aspects of REST. However I don't think it is the right model when thinking about building distributed applications.


Categories: XML Web Services

August 20, 2005
@ 11:32 PM
I've posted in the past about not understanding why people continue to use Technorati. It seems more people have realized that the service has been in bad shape for months and are moving on. Jason Kottke has a blog post entitled So Long, Technorati where he writes

That's it. I've had it. No more Technorati. I've used the site for, what, a couple of years now to keep track of what people were saying about posts on my site and searching blogs for keywords or current events. During that time, it's been down at least a quarter of the time (although it's been better recently), results are often unavailable for queries with large result sets (i.e. this is only going to become a bigger problem as time goes on), and most of the rest of the time it's slow as molasses.

When it does return results in a timely fashion for links to my site, the results often include old links that I've seen before in the results set, sometimes from months ago. And that's to say nothing of the links Technorati doesn't even display. The smart list in my newsreader picks up stuff that Technorati never seems to get, and that's only pulling results from the ~200 blogs I read, most of which are not what you'd call obscure. What good is keeping track of 14 million blogs if you're missing 200 well-known ones? (And trackbacks perform even better...this post got 159 trackbacks but only 93 sites linking to it on Technorati.)

Over the past few months, I've been comparing the results from PubSub to those of Technorati and PS is kicking ass. Technorati currently says that 19 sites have linked to me in the past 6 days (and at least four of those are old and/or from last September, fer chrissakes) while PubSub has returned 38 fresh, unrepeated results during that same time. (Not that PubSub is all roses and sunshine either...the overlap between the result sets is surprisingly small.)

While their search of the live web (the site's primary goal) has been desperately in need of a serious overhaul, Technorati has branched out into all sorts of PR-getting endeavors, including soundbiting the DNC on CNN, tags (careful, don't burn yourself on the hot buzzword), and all sorts of XML-ish stuff for developers. Which is all great, but get the fricking search working first! As Jason Fried says, better to build half a product than a half-assed product. I know it's a terrifically hard problem, but Figure. It. Out.

Jason Kottke recommends IceRocket's blog search at the end of his post. I've been using the Bloglines Citations feature for the past couple of months and love it. That in combination with RSS feeds of search results via PubSub have replaced Technorati for all my ego searching needs.


Tim Bray has a recent post entitled The Real Problem that opens up the quarterly debate on the biggest usability problem facing XML syndication technologies like RSS and Atom: there is no easy way for end users to discover or subscribe to a website's feed.

Tim writes

One-Click Subscription First of all, most people don’t know about feeds, and most that do don’t subscribe to them. Check out the comments to Dwight Silverman’s What’s Wrong with RSS? (By the way, if there were any doubt that the blogging phenomenon has legs, the fact that so many people read them even without the benefits of RSS should clear that up).

Here’s the truth: an orange “XML” sticker that produces gibberish when you click on it does not win friends and influence people. The notion that the general public is going to grok that you copy the URI and paste it into your feed-reader is just ridiculous.

But, as you may have noticed, the Web has a built-in solution for this. When you click on a link to a picture, it figures out what kind of picture and displays it. When you click on a link to a movie, it pops up your favorite movie player and shows it. When you click on a link to a PDF, you get a PDF viewer.

RSS should work like this; it never has, but it can, and it won’t be very hard. First, you have to twiddle your server so RSS is served up correctly, for example as application/rss+xml or application/atom+xml. If you don’t know what this means, don’t worry, the person who runs your web server can do it in five minutes.

Second, you either need to switch to Atom 1.0 or start using <atom:link rel="self"> in RSS. If our thought leaders actually stepped up and started shouting about this, pretty well the whole world could have one-click subscriptions by next summer, using well-established, highly-interoperable, wide-open standards.
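To make Tim's second suggestion concrete, an RSS 2.0 feed can advertise its own address with an atom:link element, as sketched below; the feed URL shown is a made-up placeholder.

```xml
<rss version="2.0" xmlns:atom="">
  <channel>
    <title>Example Weblog</title>
    <link></link>
    <description>A feed that advertises its own address</description>
    <!-- the feed points at itself so a client can subscribe in one click -->
    <atom:link rel="self"
               type="application/rss+xml"
               href="" />
  </channel>
</rss>
```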

As long as people expect one click subscription to depend on websites using the right icons, the right HTML and the right MIME types for their documents it won't become widespread. On the other hand, this debate is about to become moot anyway because every major web browser is going to have a [Subscribe to this website] button on it in a year or so. Firefox already has Live Bookmarks, there's Safari RSS for Mac OS X users and Internet Explorer 7 will have Web Feeds.

As far as I'm concerned, the one click subscription problem has already been solved. I guess that's why Dave Winer is now arguing about what to name the feature across different Web browsers. After all, RSS geeks must always have something to argue about. :)


For the few folks that have asked, I have uploaded 50 photos from my Nigeria trip to the photo album in my MSN Space. The photos are from all over Nigeria, specifically Lagos, Abuja and Abeokuta. They are somewhat crappy since I used a disposable camera I bought at the airport, but they do capture the spirit of the vacation.

I guess I should start thinking about investing in a digital camera.

Update: Even though no one asked I've also decided to start rotating a song of the week on my space from the rap group I formed with Big Lo and a couple of other guys back in high school. Just click on the play button on the Windows Media player module to hear this week's track.


Categories: Trip Report

August 16, 2005
@ 06:40 PM

Joe Wilcox, an analyst for Jupiter Research, recently posted his changed impressions on MSN Spaces in his blog post Making Room for My Space. He writes

I have started using MSN Spaces as the place where I keep my personal Weblog. During 2004 and part of 2005, I used TypePad's blogging service, and more recently moved one of my domains to a bloghoster. While a domain offers great search engine exposure, using the hosted blogging software requires some knowledge of HTML/CSS coding and other techniques; it's more work and trouble than I have time for. TypePad is a good alternative that's fairly easy to use, but it's by no means point and click.

To Microsoft's credit, MSN Spaces is remarkably easy to use, or so I am discovering as I give the service a hard second look. Sure, there were glitches at beta launch, but the service seems solid now. Some established bloggers balked at the lack of control, meaning Microsoft tools took most of it, when the service launched as beta. But Microsoft never meant the service for them, but the masses of people that hadn't yet started blogging, and maybe folks like me too busy to become an amateur blogsite designer.

The simplicity and beauty of Microsoft's approach foreshadows possible future product changes competitors and partners shouldn't ignore...MSN Spaces takes that approach, of providing easy tools for doing the most common blogsite tasks. The user doesn't have as much control, but he or she can get the most common tasks quickly done. Over time, Microsoft has increased the amount of control and customization that power users would want, such as Friday's release of three MSN Spaces PowerToys, for layout control, custom (sandbox) modules and Windows Media content.
I would encourage Microsoft competitors and partners to closely watch MSN Spaces' progress. Apple blindsided Microsoft with iPod and the iTunes Music Store, a circumstance well understood by Microsoft product managers. Simplicity is one cornerstone of the products' success. Synching iPod to iTunes is no more complicated than connecting the device to the computer. There are settings to do more, but the baseline functionality that is suitable to most users is plug and synch. Microsoft has embarked on a similar, simpler approach with MSN Spaces.

It is interesting seeing how geeks adore complexity in the software and hardware that they use. I can still remember Robert Scoble's complaints about Spaces in his post MSN Spaces isn't the blogging service for me, or even CmdrTaco's comments when Apple released the iPod: "No wireless. Less space than a nomad. Lame". Despite both products being dissed by A-list geeks, they have become widely adopted by millions of people.

More proof that designing for regular people is a lot different from designing for geeks.


Categories: MSN

I was recently re-reading Jesse James Garrett's article Ajax: A New Approach to Web Applications and it struck me that the article was very much an introductory text on building web applications which skirted a number of real world issues. The core idea behind the article is that using DHTML and server callbacks results in web sites that are more responsive [from the end user's perspective] than traditional web applications. This is very true.

However if you are building a large scale web application there is more to consider when using AJAX than how to create a function that hides the differences between the XMLHttpRequest object in IE and Firefox. Problems that have to be solved [or at the very least considered] include

  1. How to abstract away browser detection from each page in the application
  2. How to make the site accessible or at least work on non-Javascript enabled browsers
  3. How to efficiently manage the number of connections to the server created by the client given the "chattiness" of AJAX applications compared to traditional web applications
  4. How to reduce the amount of time spent downloading large script files
  5. How to create permalinks to portions of the application
  6. How to preserve or at least simulate the behavior of the browser's 'Back' button
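As a small illustration of problem #1, the browser differences of the era usually get hidden behind a single factory. The helper below is hypothetical; it takes the host environment as a parameter purely so it can be exercised outside a browser, and a real page would pass window.

```javascript
// Hypothetical cross-browser XMLHttpRequest factory, circa 2005.
function createXhr(env) {
  if (env.XMLHttpRequest) {
    // Native object: Firefox, Safari, Opera (and later IE7)
    return new env.XMLHttpRequest();
  }
  if (env.ActiveXObject) {
    // IE5/IE6 expose the request object through ActiveX
    return new env.ActiveXObject("Microsoft.XMLHTTP");
  }
  throw new Error("No XMLHttpRequest support in this environment");
}
```

Every page then calls createXhr(window) instead of sniffing the browser itself, which is the kind of abstraction item 1 in the list asks for.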

At MSN we've had to come up with solutions to a number of these problems while working on MSN Spaces, the next version of Hotmail, and the next version of MyMSN. We have built our own AJAX framework and come up with a number of best practices for building large scale applications using AJAX.

Much to my surprise, Scott Isaacs (one of the inventors of DHTML and now an architect at MSN) has started a series on the problems that face web sites that plan to utilize AJAX and how we solved them at MSN. The first two articles in the series are Why Ajax is so 1999? Part 1 and Why Ajax is so 1999? Part 2. Scott will also be giving a talk at the Microsoft Professional Developer's Conference (PDC) about some of the cool AJAX stuff we've done.

I actually expected this would be the kind of information we'd keep close to our chest since it gives us a competitive advantage, so it is quite a welcome surprise to see us sharing knowledge with the Web developer community this way. I've already started nagging Scott to write a book about this stuff or at least update his Inside Dynamic HTML for the new millennium.


Categories: MSN | Web Development

Since leaving the XML team last year I haven't paid much attention to the various betas and CTPs of Visual Studio.NET 2005 that have been made available over the past year. Thus I don't have a position on the article Developers seek third beta release for Visual Studio 2005 from InfoWorld which states

After having been stalled several times already, it would seem that the last thing developers would want for the Visual Studio 2005 toolset is another delay. Nonetheless, a request from some developers for a new beta release would, if granted, potentially hold back the product set yet again.
In launching an effort for a third beta release, developers are citing bugs and performance issues with existing prereleases. A suggestion posted on the MSDN Product Feedback Center seeks support for a third beta release of Visual Studio 2005 and Visual Studio 2005 Team System in late September.

"Push back RTM (release to manufacturing) if you have to," the online suggestion states. "RTM December 31st or push it to 2006 (just keep the 2005 name then, no big deal)."

The release-to-manufacturing date signifies the product’s impending general availability for customers.

"There are still way too many bugs and performance issues. Too many issues get resolved as 'postponed,'" the online request continued. "Developers won't care about when the RTM date was a few months after RTM if the product is full of bugs."

Seventy-two people had voted on the suggestion as of Friday afternoon.

"I would much rather that Microsoft push this release back and get things right," according to one person who commented.

"A Beta 3 is absolutely required," stated another person who signed the petition. "There are so many outstanding bugs and issues that a Beta 3 is required to ensure stability of the final release."

Microsoft released a prepared statement via e-mail Friday noting the planned November 7 release date.

"Microsoft appreciates feedback from all users. For this version of Visual Studio, Microsoft has continually solicited product feedback by issuing multiple betas and Community Technology Previews (CTPs) and encouraging the community to provide feedback via the MSDN Product Feedback Center. The community of 6 million Visual Studio developers and more than 240 Visual Studio Industry Partners (VSIP) have been providing a great deal of valuable feedback and telling Microsoft that they are very excited [about] the November 7 launch."

Interesting feedback. The number of votes on the issue had doubled to about 143 as of a few minutes ago when I checked on the issue entitled Suggestion Details: Release .Net 2.0 Beta 3 on the MSDN Product Feedback Center.

Despite how negative this seems, it is great that customers can give such direct feedback to Microsoft product teams in such a transparent manner. The developer division faces a tough challenge if the claims being made by the commenters are valid.  I wish them luck.


August 15, 2005
@ 06:20 PM

It seems there is an open call for participation for the 2006 edition of the O'Reilly Emerging Technology Conference (ETech). Although I'm not in the right demographic to be an ETech speaker, since I don't work at a sexy Silicon Valley startup, I'm not a VP at a major software company and I don't consider myself a Friend of O'Reilly, I plan to toss my hat in the ring and send in two talk proposals anyway.

  1. What's Next for RSS, Atom and Content Aggregation: Currently the primary usage of content syndication technology like RSS has been consuming news and blog postings in desktop or web-based RSS readers. However the opportunities created by syndication technologies go much further than enabling us to keep up with Slashdot and Boing Boing in our RSS reader of choice. Podcasting is one manifestation of the new opportunities that arise once the concept of content syndication and aggregation is applied to domains outside of news sites and blogs. This talk will focus on problem areas and scenarios outside of blogs, news sites and traditional RSS readers that can benefit from the application of syndication technologies like RSS and Atom.

  2. Bringing MSN into Web 2.0: The essence of Web 2.0 is moving from a Web consisting of Web pages and Web sites (Web 1.0) to a Web consisting of Web applications based on open data that are built on Web platforms. MSN hosts a number of properties, from social software applications like MSN Spaces, Hotmail, MSN Groups and MSN Messenger, which are used by hundreds of millions of people to communicate, to software that enables people to find the information they need, such as MSN Search and MSN Virtual Earth. A number of these web sites are making the transition to becoming web platforms: MSN Virtual Earth has a Javascript API, MSN Search exposes search results as RSS feeds, and MSN Spaces will support the MetaWeblog API, which uses XML-RPC. This talk will focus on the current and future API offerings coming from MSN and give a behind-the-scenes look at how some of these APIs came about, from conception and getting sign-off from the bean counters to technical details of building web services.

These are my first drafts of both proposals, criticisms are welcome. If they don't get accepted, I'll survive. Now that I've actually written the abstracts I can just keep submitting them to various conferences I'm interested in until they get accepted somewhere. In my experience, it usually takes about 2 or 3 submissions to get a talk accepted anyway.


Categories: Web Development

August 15, 2005
@ 03:07 AM

The MSN Mobile team dropped two excellent betas last week. The first was the mobile version of MSN Spaces, which is mentioned in Mike Torres's post on Mobile Spaces (Beta), where we learn

you can:

  1. Create a space from a mobile device.  Pocket PCs, Palms, and most popular mobile phones are supported.  Just browse over to the mobile site from your mobile device (or go to the regular site and you will be redirected to the mobile version)
  2. See a list of your contacts' recently updated spaces.  This feature is really useful for a mobile device and great for catching up with people!  Just "click" on a contact to get to their space and start exploring.
  3. Add blog entries, view your archives, email a link to your space, and even change your settings - all from your itty bitty mobile device.
  4. Read and add new comments (my favorite feature!)  You are now able to stay on top of discussions from wherever you happen to be - in school, on a bus, in a meeting, or in line at Starbucks.

The second beta brings local search to your mobile device. This is mentioned in the blog post Get Local Search with Maps and Directions on your phone! from the MSN Search blog, where we learn

So what does it do? You can search for a restaurant, store, school, dentist, museum – basically, anything listed in the Yellow Pages and White Pages. Just enter your search term (i.e. "coffee" or "Victrola" ) and location (zip code, city/state or full street address) and hit the Search button. Your recently used locations are even stored and easily accessible the next time you use the service. We’ll return the first handful of results, including name, address, distance from your current location and phone number – which you can dial by clicking!  Select the result name and you’ll see a page with more detail, including a color map. Select "get directions" and we’ll provide turn-by-turn driving directions between your starting location and result address (both editable). All of these features have been specially designed to work on your phone, requiring minimal interaction and optimized for speed and ease of use.

The MSN Mobile crew is definitely shipping some good stuff. Props go out to Michael Smuga and the rest of the gang.


Categories: MSN

Today I was working on completing the support for Atom 1.0 in the next version of RSS Bandit and decided to make the changes for parsing out enclosure/podcast elements while I was in that part of the code. RSS 2.0 is pretty straightforward: there is an <enclosure> element that is a child of the <item> element.

On the other hand, the Atom 1.0 specification has two completely different mechanisms for creating podcasts. Both mechanisms are described in the article by James Snell entitled An overview of the Atom 1.0 Syndication Format. From the article

Support for enclosures

Listing 4. Atom 1.0 podcasting example

<feed xmlns="">
  <title>My Podcast Feed</title>
    <name>James M Snell</name>
  <link href="" />
  <link rel="self" href="" />
    <title>Atom 1.0</title>
    <link href="" />
    <summary>An overview of Atom 1.0</summary>
    <link rel="enclosure" 
          length="1234" />
														  <link rel="enclosure"
          length="1234" />
    <content type="xhtml">
      <div xmlns="">
        <h1>Show Notes</h1>
          <li>00:01:00 -- Introduction</li>
          <li>00:15:00 -- Talking about Atom 1.0</li>
          <li>00:30:00 -- Wrapping up</li>

Atom enclosures allow you to do more than just distribute audio content. Enclosure links can reference any type of resource. Listing 5, for instance, uses multiple enclosures within a single entry to reference translated versions of a single PDF document that's accessible through FTP. The hreflang attribute identifies the language that each PDF document has been translated into.


In addition to support for links and enclosures, Atom introduces the ability to reference entry content by URI. Listing 6, for instance, illustrates how an Atom feed for a photo weblog might appear. The content element references each individual photograph in the blog. The summary element provides a caption for the image.

Listing 6. A simple list of images using Atom 1.0

<feed xmlns="http://www.w3.org/2005/Atom">
  <title>My Picture Gallery</title>
  <author>
    <name>James M Snell</name>
  </author>
  <entry>
    <title>Trip to San Francisco</title>
    <link href="/entries/1" />
    <summary>A picture of my hotel room in San Francisco</summary>
    <content type="image/png" src="/mypng1.png" />
  </entry>
  <entry>
    <title>My new car</title>
    <link href="/entries/2" />
    <summary>A picture of my new car</summary>
    <content type="image/png" src="/mypng2.png" />
  </entry>
</feed>

This content-by-reference mechanism provides a very flexible means of expanding the types of content that one can syndicate through Atom.

After looking at this from all angles for about 30 minutes, the only conclusion I can come to is that Atom provides two completely different mechanisms for achieving the same goal. This is a potential gotcha for aggregator authors, who might end up supporting only one of the mechanisms instead of both.
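For what it's worth, here's a rough sketch (in Python, which is not what RSS Bandit is written in; the function name and return shape are made up for illustration) of what covering the RSS 2.0 <enclosure> element plus both Atom mechanisms entails:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"


def extract_enclosures(feed_xml):
    """Return (url, mime_type) pairs for every enclosure-like reference
    found in an RSS 2.0 or Atom 1.0 document."""
    root = ET.fromstring(feed_xml)
    found = []

    if root.tag == "rss":
        # RSS 2.0: a single <enclosure url="..." type="..."/> per <item>
        for enc in root.iter("enclosure"):
            found.append((enc.get("url"), enc.get("type")))
    elif root.tag == "{%s}feed" % ATOM_NS:
        for entry in root.iter("{%s}entry" % ATOM_NS):
            # Atom mechanism 1: <link rel="enclosure" href="..."/>
            for link in entry.findall("{%s}link" % ATOM_NS):
                if link.get("rel") == "enclosure":
                    found.append((link.get("href"), link.get("type")))
            # Atom mechanism 2: content-by-reference via <content src="..."/>
            content = entry.find("{%s}content" % ATOM_NS)
            if content is not None and content.get("src"):
                found.append((content.get("src"), content.get("type")))
    return found
```

The point of the sketch is simply that an aggregator has to look in three different places to find what is conceptually the same thing.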

After this, I still have to add some code to support Yahoo! Media RSS and then track down some feeds that actually use all the various enclosure techniques so I can test my code against real world scenarios. I'd appreciate any pointers to test feeds, especially for the Yahoo! Media extensions to RSS [which I'm considering not supporting if there aren't that many feeds that use it].

No rest for the wicked. ;)


In recent weeks there have been a number of blog postings critical of the Technorati Top 100 list of popular weblogs. The criticisms have primarily been of two flavors: some posts have been critical of the idea of blogging as a popularity contest, which such lists encourage, while others have criticized the actual mechanism Technorati uses to calculate popularity. I agree with both criticisms, especially the former. There have been a number of excellent posts arguing both points which I think are worth sharing.

Mary Hodder, in her post Link Love Lost or How Social Gestures within Topic Groups are More Interesting Than Link, argues that more metrics besides link count should be used for calculating popularity and influence. Some of the additional metrics she suggests include comment counts and the number of subscribers to the site's RSS feed. She also suggests creating topic-specific lists instead of one über list for the entire blogosphere. It seems a primary motivation for encouraging this approach is to increase the pool of bloggers that are targeted by PR agencies and the like. Specifically, Mary writes

However, I'm beginning to see many reports prepared by PR people, communications consultants etc. that make assessments of 'influential bloggers' for particular clients. These reports 'score' bloggers by some random number based on something: maybe inbound links or the number of bloglines subscribers or some such single figure called out next to each blog's name.

Shelley Powers has a different perspective in her post Technology is neither good nor evil. In arguing against the popularity contests inherent in creating competing A-lists or even just B-lists to complement the A-lists she writes 

Even if we tried to analyze a person's links to another, we can't derive from this anything other than that person A has linked to person B several times. If we use these to define a community to which we belong, and then seek to rank ourselves within these communities, all we've done is create a bunch of little Technorati 100s and communities that are going to form barriers to entry. We see this communal behavior all too often: a small group of people who know each other link to each other frequently and to outsiders infrequently; basically shutting down the discussion outside of the community.
I think Mary should stop with "I hate rankism". I understand the motivations behind this work, but ultimately, whatever algorithm is derived will eventually end up replicating the existing patterns of authority rather than replacing them. This pattern repeated itself within the links to Jay Rosen's post; it repeated itself within the speaker list that Mary started for women ("where are the women speakers"), but had its first man within a few hours, and whose purpose was redefined within a day to include both men and women.

Rankings are based on competition. Those who seek to compete will always dominate within a ranking, no matter how carefully we try to 'route' around their own particular form of 'damage'. What we need to challenge is the pattern, not the tools, or the tool results. 

I agree with Shelley that attempts to right the so-called "imbalance" created by lists such as the Technorati Top 100 will encourage competition and stratification within certain blogging circles. I also agree that no matter what algorithms are used, a lot of the same names will still end up on the lists for a variety of reasons. A major one is that a number of the so-called A-list blogs actually work very hard to be "popular", and changing the metrics by which their popularity is judged won't change this fact.

So Shelley has given us some of the social arguments for why popularity lists such as the Technorati Top 100 aren't a good idea. But are the technical flaws in Technorati's approach to calculating weblog popularity really so bad? Yes, they are.

Danah Boyd has a post entitled The biases of links where she did some research to show that simply counting links on web pages isn't an accurate way to calculate popularity or influence. There are a lot of excellent points in Danah's post and the entire thing is worth reading multiple times. Below are some key excerpts from Danah's post

I decided to do the same for non-group blogs in the Technorati Top 100. I hadn't looked at the Top 100 in a while and was floored to realize that most of those blogs are group blogs and/or professional blogs (with "editors" and clear financial backing). Most are covered in advertisements and other things meant to make them money. It's very clear that their creators have worked hard to reach many eyes (for fame, power or money?).

  • All MSNSpaces users have a list of "Updated Spaces" that looks like a blogroll. It's not. It's a random list of 10 blogs on MSNSpaces that have been recently updated. As a result, without special code (like in Technorati), search engines get to see MSNSpace bloggers as connecting to lots of other blogs. This would create the impression of high network density between MSNSpaces which is inaccurate.
  • Few LiveJournals have a blogroll but almost all have a list of friends one click away. This is not considered by search tools that look only at the front page.
  • Blogrolls seem to be very common on politically-oriented blogs and always connect to blogs with similar political views (or to mainstream media).
  • Blogrolls by group blogging companies (like Weblogs, Inc.) always link to other blogs in the domain, using collective link power to help all.
  • Male bloggers who write about technology (particularly social software) seem to be the most likely to keep blogrolls. Their blogrolls tend to be dominantly male, even when few of the blogs they link to are about technology. I haven't found one with >25% female bloggers (and most seem to be closer to 10%).
  • On LJ (even though it doesn't count) and Xanga, there's a gender division in blogrolls whereby female bloggers have mostly female "friends" and vice versa.
  • I was also fascinated that most of the mommy bloggers that i met at Blogher link to Dooce (in Top 100) but Dooce links to no one. This seems to be true of a lot of topical sites - there's a consensus on who is in the "top" and everyone links to them but they link to no one.

Linking patterns:

  • The Top 100 tend to link to mainstream media, companies or websites (like Wikipedia, IMDB) more than to other blogs (Boing Boing is an exception).
  • Blogs on blogging services rarely link to blogs in the posts (even when they are talking about other friends who are in their blogroll or friends' list). It looks like there's a gender split in tool use; Mena said that LJ is like 75% female, while Typepad and Moveable Type have far fewer women.
  • Bloggers often talk about other people without linking to their blog (as though the audience would know the blog based on the person). For example, a blogger might talk about Halley Suitt's presence or comments at Blogher but never link to her. This is much rarer in the Top 100 who tend to link to people when they reference them.
  • Content type is correlated with link structure (personal blogs contain few links, politics blogs contain lots of links). There's a gender split in content type.
  • When bloggers link to another blog, it is more likely to be same gender.

I began this investigation curious about gender differences. There are a few things that we know in social networks. First, our social networks are frequently split by gender (from childhood on). Second, men tend to have large numbers of weak ties and women tend to have fewer, but stronger ties. This means that in traditional social networks, men tend to know far more people but not nearly as intimately as those women know. (This is a huge advantage for men in professional spheres but tends to wreak havoc when social support becomes more necessary and is often attributed to depression later in life.)

While blog linking tends to be gender-dependent, the number of links seems to be primarily correlated with content type and service. Of course, since content type and service are correlated by gender, gender is likely a secondary effect.
These services are definitely measuring something but what they're measuring is what their algorithms are designed to do, not necessarily influence or prestige or anything else. They're very effectively measuring the available link structure. The difficulty is that there is nothing consistent whatsoever with that link structure. There are disparate norms, varied uses of links and linking artifacts controlled by external sources (like the hosting company). There is power in defining the norms, but one should question whether companies or collectives should define them. By squishing everyone into the same rule set so that something can be measured, the people behind an algorithm are exerting authority and power, not of the collective, but of their biased view of what should be. This is inherently why there's nothing neutral about an algorithm.

There is a lot of good stuff in the excerpts above and it would take an entire post or maybe a full article to go over all the gems in Danah's entry. One random but interesting point is that LiveJournal bloggers are penalized by systems such as the Technorati Top 100. For example, Jamie Zawinski has over 1900 people who link to him from their Friends pages in LiveJournal but he somehow doesn't make the cut for the Technorati Top 100. Maybe the fact that most of his popularity is within the LiveJournal community makes his "authority" less valid than that of others with fewer incoming links who are in the Technorati Top 100 list.

Yeah, right.


August 13, 2005
@ 05:11 AM

Robert Scoble has a blog post entitled Filtering Out MSN's Filter which seems like a good enough opportunity to state what I think of the newest addition to MSN's family of offerings. Robert wrote

MSN Filter sure is getting some people upset (hi Ross Mayfield).

Personally I wanted to give MSN Filter a few weeks before giving my opinion, but Ross goaded me into it.

Boring. Boring. Boring.

First, what is it? MSN hired five people to do a blog each. There's one on sports. Another on tech. Music. TV. Lifestyle.

I have to agree with Robert: I think the MSN Filter sites are pretty boring. More importantly, as an MSFT shareholder and someone who works at MSN, I think it is a bad business investment in its current incarnation. Precedents for professional blogging such as the Gawker Media (e.g. Gizmodo) and Weblogs, Inc. (e.g. Engadget) families of sites are supported by topic-specific ads, including some from Google's AdSense program. On the other hand, if you look at MSN's Technology Filter you don't see any such ads.

I think it is pretty cool that MSN is letting folks experiment with ventures like MSN Filter. However, my personal opinion is that in its current incarnation it's a lame knockoff of the stuff coming out of folks like Nick Denton and Jason Calacanis, and it doesn't have a chance of making much [if any] money for us since they are eschewing targeted ads.

Lame. Lame. Lame.


Categories: MSN

In every release of RSS Bandit, I spend some time working on performance. Making the application use less memory and less CPU time while adding more features is a challenge. Recently I read a post by Mitch Denny entitled RSS Bandit Performance Problems where after some investigation he found a key culprit for some of our memory consumption issues in the last release. Mitch wrote

Last weekend I was subscribed to about 1000 RSS feeds and coincidentally last weekend RSSBandit also became unusable. Obviously I had reached some kind of threshold that the architecture of RSSBandit wasn’t designed to cope with.

My first instinct was to ditch and go and find something a bit faster – after all it is a .NET application and we know how much of a memory hog those things are! Errr – hang on. Don’t I encourage our customers to go out and use .NET to build mission critical enterprise applications every day? I really needed to take a closer look at what was going on.

In idle RSSBandit takes up around 120–170MB of RAM on my laptop. That's more than Outlook and SQL Server, and often more than Visual Studio (except when it's in full flight) but to be honest I'm not that surprised because in order for it to give me the unread items count it has to process quite a few files containing lots of unique strings – that means relatively large chunks of memory being allocated just for data.

I decided to look a bit deeper and run the CLR Allocation Profiler over the code to see where all the memory (and by extension, good performance) was going. I remembered this article by Rico Mariani which included the sage words that “space is king” and while I waited for the profiler to download I tried to guess what the problem would be based on my previous experience.

What I imagined was buckets of string allocations to store posts in their entirety and a significant number of XML related object allocations but when I looked at the allocation graph I saw something interesting.

... [see]

As you can see there is a huge amount of traffic between this native function and the NativeWindow class. It was at this point that I started to suspect what the actual problem was and had to giggle at how many times this same problem pops up in smart client applications.

From what I can tell, the problem is that an excessive amount of marshalling to the UI thread is going on. This is causing threads to synchronise (tell tale DynamicInvoke calls are in there) and quite a bit of short term memory to be allocated over the lifetime of the application. Notice that there is 610MB of traffic between the native function and NativeWindow so obviously that memory isn’t hanging around.

The fix? I don’t know - but I suspect if I went in to the RSSBandit source and unplugged the UI updates from the UpdatedFeed event the UI responsiveness would increase significantly (the background thread isn’t continually breaking into the main loop to update an unread count on a tree node).

It seems most of the memory usage while running RSS Bandit on Mitch's computer came from callbacks in the RSS parsing code that update the unread item count in the tree view within the GUI whenever a request to fetch an RSS feed is completed. Wow!!!

This is the last place I would have considered looking to optimize our code and yet another example of why "measurement is king" when it comes to figuring out how to improve the performance of an application. Given that a lot of the time a feed is requested there is no need to update the UI since no new items are fetched, there is a lot of improvement that we can gain here.
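To illustrate the idea (this is not RSS Bandit's actual code; the class and callback names here are invented, and it's sketched in Python rather than C#), coalescing the updates so the UI thread is only bothered when an unread count actually changes might look like this:

```python
class FeedTreeUpdater:
    """Sketch: fire the (expensive, cross-thread) UI callback only when a
    feed's unread count actually changes, instead of on every fetch."""

    def __init__(self, notify_ui):
        self._notify_ui = notify_ui  # e.g. a wrapper around Control.Invoke in WinForms
        self._unread = {}            # feed url -> last unread count shown in the tree view

    def on_feed_fetched(self, feed_url, unread_count):
        # Most fetches return no new items, so the count is unchanged
        # and no marshalling to the UI thread is needed at all.
        if self._unread.get(feed_url) != unread_count:
            self._unread[feed_url] = unread_count
            self._notify_ui(feed_url, unread_count)
```

The design point is simply to make the cheap comparison on the background thread and pay the cross-thread marshalling cost only when the displayed value would actually change.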

Yet again I am reminded that writing a rich client application like RSS Bandit using the .NET Framework means that you have to be familiar with a bunch of Win32/COM interop crap even if all you want to do is code in C#. I guess programming wouldn't be so challenging if not for gotchas like this lying around all over the place. :)


Categories: RSS Bandit | Technology

The Mini-Microsoft blog has an entry entitled 6% raise? I want to work for Dilbert's company! where he writes

Holy whatsa, Alice got a 6% raise ? I'd seriously consider hanging out in the bushes near Google-Kirkland with my aluminum bat to totally Tonya-Hardin-up some delicate competitive fingers just to get a 6% raise.

If you're a lead, you can bring up the manager review tool and check-in on how your reports are doing within The Model. Maybe some bits and pieces will move around, but the review model is pretty much done now and set to go into effect the 1st of September, with the mid-September paycheck showing any benefits.

One thing I've noticed kvetching with other managers is that once again, pay raises are minimal. I'm talking 2%-ish for a 3.5 review. That's barely keeping up with cost-of-living / inflation for doing more than is expected of you. And of course, 3.0s, for the most part, get nothing. That's right: you're losing buying power for making a 3.0 - doing what's expected of you.

The poor quality of yearly raises was one of the reasons I decided to leave the XML team last year. I realized that no matter how hard I worked, I wouldn't be significantly financially compensated for going above and beyond what I was required to do. Given that I'm an "above and beyond" kinda guy, I saw two choices: I could be underpaid or I could be underpaid and work on stuff I was passionate about. So I moved to MSN.

After doing some thinking during my vacation I concluded that MSFT isn't the kind of place I can see myself working at in 5 years. One of the repercussions of this conclusion is that I'm going to start working on getting an MBA so I can broaden my options whenever I decide that the time has come for me to leave the B0rg cube. Of course, the nagging from my folks about when I'm going to grow up (get married, finish my education, etc.) helped in coming to this decision.

Bah! Going back to school is going to suck.


Categories: Life in the B0rg Cube

August 13, 2005
@ 03:30 AM

I'm back in Seattle and may have already beaten jet lag by having never switched my watch from west coast time. It feels good to be back in my apartment. The five flights back were pretty uneventful. The only noteworthy event was that I saw Forest Whitaker in the upper class lounge of Virgin Atlantic at Heathrow airport. I was going to walk up to him and tell him how much I loved The Crying Game and Waiting To Exhale until I realized that would have made me sound like a jerk. I doubt that people in the movie business like being told their stuff rocked...a decade ago.

PS: If you are ever in the UK and you hear someone described as being Asian, it means they are from India not East Asia as is the case in the US.


Categories: Ramblings

I'm on the way back from my trip and this is the part of the vacation that sucks. It's going to take a total of 5 flights to get from Abuja back to Seattle, as well as about half a day of sitting around in airports. Below are a bunch of last minute impressions about Nigeria and London (where I'm currently posting this from).
  • All the male restrooms in Heathrow airport have condom dispensers. This really has me scratching my head since the only place I usually see them is in night club restrooms, which makes sense since a bunch of hooking up goes on at night clubs. So now I have this impression that somewhere in Heathrow there is a bunch of debauchery going on and I'm not a part of it. It must be the first class lounges...

  • If you ask a British bartender for a 'Long Island Iced Tea', don't be surprised if he responds "We don't serve tea at the bar, twit!"

  • It seems I've picked up homophobia by osmosis while in the United States. I kept finding it weird that men could be seen holding hands together either for emphasis in a conversation or while walking without being seen as 'gay' in Nigeria. Similarly having guys sleep in the same bed also gave me a similar vibe. I can't believe I'm getting culture shock from my home country.

  • Do you know who cleans the streets of Lagos & Abuja? The street sweepers, literally. I was freaked out to see people with brooms sweeping the sides of the roads in both Abuja and Lagos without the luxury of safety cones. My memory fails me as to whether this is an improvement from not having street sweepers from a few years ago or this was just the status quo.

  • Soft drinks sold in plastic bottles seem to be gaining popularity in Lagos & Abuja. Back in the day it was all about the glass bottles, which were always redeemed by people. In fact, the price of a bottle of beer or a soft drink always assumed you'd be returning a bottle as well. It took me a while to get used to the 'wastefulness' in the United States where people just threw away the bottles. Of course, there were other places where the wastefulness surprised me as well when I first got here such as using paper towels instead of wash rags or styrofoam silverware & plates instead of reusable plastic ones at fast food places. Now it's the other way around. After doing the dishes at my mom's I was confused to not find paper towels nearby. I am becoming so American...

  • Thanks to a ban on external imports of various consumer goods we now get Heineken and Five Alive brewed locally.  Awesome!!!


Categories: Trip Report

In his post entitled Google News and RSS Dave Winer writes

It's the same reason I'm not giddy with delight that Microsoft decided to call their support of RSS "web feeds"

Considering that the support for XML syndication technologies in IE 7 includes both flavors of RSS (1.0 & 0.91/2.0) and Atom, I personally don't think it is a good idea to call the feature 'RSS'.

Then there's the fact that RSS does sound a bit geeky, after all most people call them web pages and web sites not HTML documents and domains.

Internet Explorer is used by hundreds of millions of regular folks not just geeks. The IE team is simply trying to make the feature approachable to end users.


Categories: Life in the B0rg Cube

Over the past couple of months the MSN Spaces team has gotten a bunch of feedback about features users would like to see in the service. Common requests include more flexibility in customizing the look of the space, the ability to play videos or music in a module and the ability to add modules containing custom HTML.

The team has been listening and all of those features were released yesterday as Powertoys. As Powertoys they aren't fully supported features and are only available in English. They are basically cool hacks by some of the developers on the Spaces team which are a prelude to what this functionality might look like in a future release of Spaces.

If you are an MSN Spaces user you should read Mike Torres's posts about how to enable the HTML Module, Windows Media Player module and the Tweak UI Powertoy. They totally jazz up your Space.

Great work from Ryan for being the man with the plan on getting these out.



Categories: MSN

August 8, 2005
@ 01:47 PM

In response to my post Using XML on the Web is Evil, Since When? Tantek updated his post Avoiding Plain XML and Presentational Markup. Since I'm the kind of person who can't avoid a good debate even when I'm on vacation I've decided to post a response to Tantek's response. Tantek wrote

The sad thing is that while namespaces theoretically addressed one of the problems I pointed out (calling different things by the same name), it actually WORSENED the other problem: calling the same thing by different names. XML Namespaces encouraged document/data silos, with little or no reuse, probably because every person/political body defining their elements wanted "control" over the definition of any particular thing in their documents. The <svg:a> tag is the perfect example of needless duplication.

And if something was theoretically supposed to have solved something but effectively hasn't 6-7 years later, then in our internet-time-frame, it has failed.

This is a valid problem in the real world. For example, for all intents and purposes an <atom:entry> element in an Atom feed is semantically equivalent to an <item> element in an RSS feed to every feed reader that supports both. However we have two names for what is effectively the same thing as far as an aggregator developer or end user is concerned.

The XML solution to this problem has been that it is OK to have myriad formats as long as we have technologies for performing syntactic translations between XML vocabularies, such as XSLT. The RDF solution is for us to agree on the semantics of the data in the format (i.e. a canonical data model for that problem space), in which case alternative syntaxes are fine and we perform translations using RDF-based mapping technologies like DAML+OIL or OWL. The microformat solution which Tantek espouses is that we all agree on a canonical data model and a canonical syntax (typically some subset of [X]HTML).
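As a toy illustration of such a syntactic translation (the field mapping below is an assumption for illustration, not a complete or standard one), turning an Atom <entry> into an RSS <item> is mostly a renaming exercise:

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"


def atom_entry_to_rss_item(entry):
    """Translate an Atom <entry> element into an RSS 2.0 <item> element,
    mapping the roughly equivalent fields."""
    item = ET.Element("item")
    # atom:title -> title, atom:summary -> description
    for atom_name, rss_name in ((ATOM + "title", "title"),
                                (ATOM + "summary", "description")):
        src = entry.find(atom_name)
        if src is not None:
            ET.SubElement(item, rss_name).text = src.text
    # atom:link/@href -> link (element content rather than an attribute)
    link = entry.find(ATOM + "link")
    if link is not None:
        ET.SubElement(item, "link").text = link.get("href")
    return item
```

In practice this kind of mapping is exactly what an XSLT stylesheet would do declaratively; the point is that the translation operates purely on syntax, with the shared semantics living in the developer's head rather than in the formats themselves.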

So far the approach that has gotten the most traction in the real world is XML. From my perspective, the reason for this is obvious; it doesn't require that everyone has to agree on a single data model or a single format for that problem space.  

Microformats don't solve the problem of different entities coming up with the different names for the same concept. Instead its proponents are ignoring the reasons why the problem exists in the first place and then offering microformats as a panacea when they are not.

I personally haven't seen a good explanation of why <strong> is better than <b>...

A statement like that begs some homework. The accessibility, media independence, alternative devices, and web design communities have all figured this out years ago. This is Semantic (X)HTML 101. Please read any modern web design book like those on my SXSW Required Reading List, and we'll continue the discussion afterwards.

I can see the reasons for a number of the semantic markup guidelines in the case of HTML. What I don't agree with is jumping to the conclusion that markup languages should never have presentational markup. This is basically arguing that every markup language that may be used as a presentation format should use CSS or invent a CSS equivalent. I think that is a stretch.

Finally, one has to seriously cast doubt on XML opinions on a page that is INVALID markup. I suppose following the XML-way, I should have simply stopped reading Dare's post as soon as I ran into the first well-formedness error. Only 1/2 ;)

The original permalink to Tantek's article was broken after he made the edit. I guess since I couldn't find it, it doesn't exist. ;)


Categories: Web Development | XML

August 7, 2005
@ 05:16 PM

I've been doing a bit more travelling around the country this week. The travel high point was a trip by helicopter today to a number of places including a local chief's palace and the village where my dad was born. I took a couple of pics from the helicopter as well as on the ground and hope at least a few of them come out OK.

Below are a couple more random thoughts that have crossed my mind during this trip since my previous post

  • The proliferation of mobile phones is even more significant than I thought. I had assumed it was a city thing since the phones I saw folks with were in Abuja (current capital) and Lagos (former capital). However, visiting less developed areas has also shown a high proliferation of mobile technology. In my dad's village I saw both a pay-as-you-go booth for MTN, a local mobile service provider, as well as a kiosk where enterprising local entrepreneurs were renting out use of their phones at 20 naira a call (about $0.15)

  • When I was growing up it was common practice for local businessmen to sell products that had been unsafe for public use in developed countries. It seems we now have a new government body called NAFDAC whose job is to act as the Nigerian version of the FDA. NAFDAC has been so effective that there have been multiple attempts on the life of the head of the organization by pissed off business owners whose products she's taken off the market.

  • The only thing scarier than being in a speeding car in typical Lagos or Abuja traffic is being driven in a speeding car in Lagos or Abuja traffic with an in-dashboard DVD player which is showing hip hop videos with half naked chicks dancing seductively. I kept wondering if the driver could keep his eyes on the road. That's it. Next time I come here, I'm walking everywhere.  

  • As I expected the common questions from family and extended family were when I'm going to show up with a future spouse and when I'm going back to school. What I didn't expect was so many people asking when I became such a fat ass. In hindsight, I should have expected it given that I haven't seen some of these folks in almost a decade and I've put on dozens of pounds since then. I definitely need to get back in shape. 



Categories: Trip Report

Tim Bray has a blog post entitled Not 2.0 where he writes

I just wanted to say how much I’ve come to dislike this “Web 2.0” faux-meme. It’s not only vacuous marketing hype, it can’t possibly be right. In terms of qualitative changes of everyone’s experience of the Web, the first happened when Google hit its stride and suddenly search was useful for, and used by, everyone every day. The second—syndication and blogging turning the Web from a library into an event stream—is in the middle of happening. So a lot of us are already on 3.0. Anyhow, I think Usenet might have been the real 1.0. But most times, the whole thing still feels like a shaky early beta to me.

I also dislike the Web 2.0 meme but not for the reasons Tim Bray states. Like the buzzword "SOA" that came before it, "Web 2.0" is ill-defined and means different things to different people. Like art, folks can tell you "I know it when I see it" but ask them to define it and you get a bunch of inconsistent answers. For a while even Wikipedia had a poor definition of the term Web 2.0. The meat of the description there is still crap but the introduction is now one that doesn't make me roll my eyes. The Wikipedia entry currently begins

Web 2.0 is a term often applied to a perceived ongoing transition of the World Wide Web from a collection of websites to a full-fledged computing platform serving web applications to end users. Ultimately Web 2.0 services are expected to replace desktop computing applications for many purposes.

This is a definition that resonates with me and one that has gotten me jazzed enough to have written my first Bill Gates Thinkweek paper as well as give Powerpoint pitches to lots of folks across MSN from VPs & GMs to fellow rank and file PMs & devs.  

The problem with "Web 2.0" and other overhyped buzzwords like "SOA" is that 90% of the stuff you hear or read about it is crap. Even worse are posts like Tim O'Reilly's Not 2.0? which hype it as something that will change the world but don't give you a good idea why. Reading stuff like Tim O'Reilly's

There's a set of "Web 2.0 design patterns" -- architecting systems so that they get smarter the more people use them, monetizing the long tail via a combination of customer-self service and algorithmic management, lightweight business models made possible by cooperating internet services and data syndication, data as the "intel inside", and so on.

just leaves me scratching my head. On the other hand I completely grok the simple concept that folks like me at MSN are no longer just in the business of building web sites, we are building web platforms. Our users are no longer just people interacting with our web sites via Firefox or IE. They are folks reading our content from their favorite RSS reader which may be a desktop app, web-based or even integrated into their web browser. They are folks who want to create content on our sites without being limited to a web-based interface or at least not the one created by us. They are folks who want to integrate our services into their applications or use our services from their favorite applications or sites. To me, that is Web 2.0.

You folks should probably just ignore me though since I am quite the hypocrite. I may pan the writings of folks like Tim O'Reilly and call 90% of the stuff written about "Web 2.0" crap, but I did give up being a speaker on two panels at PDC to sit in the audience at the Web 2.0 conference. Due to a variety of reasons I could only attend one, and based on how much value I got out of ETech I decided to pick Web 2.0.


Categories: Web Development

According to the RSS Bandit roadmap the time draws nigh for the next release of RSS Bandit codenamed Nightcrawler. As with the previous release we will have an alpha version which will be mostly feature complete, followed by a beta version which will be feature complete and then the final release. Last week, Torsten and I agreed on the following plan for the alpha version of Nightcrawler.

Release Date: August 31, 2005

New Features:

  • NNTP Newsgroups support
  • Downloading of Enclosures/Podcasts
  • Subscription Wizard replaces Add New Feed dialog
  • Fast Mode (shutting off comment threading which uses a lot of CPU)
  • Synchronization with Newsgator Online
  • Atom 1.0 Support
  • Extensibility Framework to Enable Richer Plugins
  • Item Manipulation from Newspaper Views (e.g. Mark As Read, Flagging)
  • Tip of the Day on Startup

There's also a persistent bug that has been bothering some of our users where posts from different feeds end up being mixed up. We haven't located the source of this bug but have added some tracing to the build which will be enabled in the alpha. Users who end up with mixed up feeds after the alpha can send us the trace files which should help us narrow down the source of the problem.

There are a couple of features I'd like to see in the final version such as "Comment Watching" so I can tell when a post I am interested in gets new comments. However we need to start locking down for the next release so that feature isn't likely to make it in unless I can sneak it in before the beta.  If there are other small, nice to have features you'd like to see in Nightcrawler please file a feature request in SourceForge and we'll see what we can get to before the final release.

Any comments or other feedback would be greatly appreciated. 


Categories: RSS Bandit

Nick Bradbury has a post in which he talks about a new non-profit entity that has been formed by Steve Gillmor, Seth Goldstein and a few others. Nick writes

In a nutshell, the idea is that your attention data - that is, data that describes what you're paying attention to - has value, and because it has value, when you give someone your attention you should expect to be given something in return. And just because you give someone your attention, it doesn't mean that they own it. You should expect to get it back.

I know that sounds a little weird - it took me a while to grok it, too. So I'll use an example that's familiar to many of us: Netflix ratings and recommendations. By telling Netflix how you rate a specific movie you're telling them what you're paying attention to, and in return they can recommend additional DVDs to you based on how other people rated the same movie. In return for giving them your attention data - which is of great value to them - they provide you features such as recommendations that they hope will be valuable to you. In my mind, this is a fair trade.

But what if Netflix collected this information without your knowledge, and rather than using it to give you added value they sold it to another service instead? I imagine that many people wouldn't like that idea - chances are, you'd want to be given the opportunity to decide who this information can be shared with. This is one of the goals: to leave you in charge of what's done with your attention data.

But what about this whole idea of mobility, as mentioned on the site? What's the benefit of making this stuff mobile? Dave Winer provides a nice example: suppose you could share your Netflix attention data with a dating site, so you could find possible partners who like the same movies as you? For that sort of thing to be possible, you'd need to be able to get your attention data back from any service which collects it. (As an aside, this also means you could share your Netflix queue with any new DVD rental service that comes down the pike - so my guess is that smaller, up-and-coming sites will be more willing to share attention data than the more entrenched sites will.)

The attention data is what separates the giants in the Web world like Amazon & Netflix from their competitors. It is in their best interests to collect as much data as possible about what users are interested in so they can target their users better. The fact that [for example] fans of G-Unit also like 50 Cent is data that makes Amazon a bunch of money since they can offer bundle deals and recommendations which lead to more sales. Additionally, record labels and concert organizers are interested customers for the aggregate data on where people's musical interests lie. It is arguable that this is also beneficial to customers since it makes it more likely that their favorite artists will appear in concert together (for example). Similar concepts exist in the physical world, such as supermarket loyalty cards.

How much data websites can store about users varies widely depending on what jurisdiction they are in. Working at MSN, I know first hand some of the legal and privacy hurdles we have to clear in various markets before we can collect data and how we must make users aware of the data we collect. All this is documented in the MSN Privacy policy. To better target users we'd love to collect as much data as possible, but instead we adhere to strict policies informed by laws from various countries and guidelines from various privacy bureaus.

It currently isn't clear to me whether the organization plans to become another privacy body like TRUSTe or whether they plan to be a grassroots evangelization body like the WaSP. Either approach can be effective although they require different skill sets. I'll be interested in seeing how much impact they'll have on online retailers.

As to why I called this the "Return of Hailstorm" in the title of this blog post? It's all in the 2001 Microsoft press release entitled "Hailstorm" on the Horizon which among other things stated

"HailStorm" is designed to place individuals at the center of their computing experience and take control over the technology in their lives and better protect the privacy of their personal information. "HailStorm" services will allow unprecedented collaboration and integration between the users' devices, their software and their personal data. With "HailStorm", users will have even greater and more specific control over what people, businesses and technologies have access to their personal information.

Of course we all know how that turned out. The notion of mobile attention data basically requires Web companies like Netflix & Amazon to give up what is for them a key competitive advantage. It makes no business sense for them to want to do that. I wish Steve Gillmor and company luck with their new endeavors, but unless they plan to lobby lawmakers I don't see them getting much traction with some of their ideas.


Categories: Technology

August 3, 2005
@ 02:13 AM

I recently stumbled upon a blog post entitled Why MSN is lost again... from Guillaume Belfiore which claimed that MSN is lost because we copy features from competitors without having a roadmap for where we want to go. He uses a specific example of the recent announcement that MSN Spaces will have a social networking feature as proof and claims that we are simply copying Yahoo! 360.

I was going to write a response but then realized that Jeremy Zawodny of Yahoo! had written a post about this topic which is a generic answer to posts like Guillaume's. In his post Secrets of Product Development and What Journalists Write Jeremy wrote

Before I came out to California to work at Yahoo, I watched the business and culture of Silicon Valley from a distance. I read lots of the trade rags, tech web sites, and books about early Internet companies (the Netscape era).

One of the things that amazed me about Internet companies (usually the portals) was how quickly they built things and were able to react to each others moves with frightening speed. Company X would do something amazing and new only to be leapfrogged by Company Y just a few weeks later.

They were putting on one hell of a show and it was all amplified by the crazy bubble of the late 90s. I loved it.

The tech and business press would say things like "in response to Company X, Company Y has just..." or "in an effort to defend their business from Company Y, Company X today launched a new..."

I saw headlines like that all the time and still see them today.

Today there's one important difference: I'm on the inside now. For the last five and a half years, I have had a front row seat to the inner workings of what I used to imagine (with the help of a small army of journalists and reports).

Now I see it first hand and hear about it from coworkers and friends at other companies. And you know what? It's even more insane than it looked from the outside.

So I'm going to let you in on a little secret about how products are developed at large companies--even large Internet companies that some people think are fast on their feet.

Larger companies rarely can respond that quickly to each other. It almost never happens. Sure, they may talk a good game, but it's just talk. Building things on the scale that Microsoft, Google, AOL, or Yahoo do is a complex process. It takes time.

Journalists like to paint this as a rapidly moving chess game in which we're all waiting for the next move so that we can quickly respond. But the truth is that most product development goes on in parallel. Usually there are people at several companies who all have the same idea, or at least very similar ones. The real race is to see who can build it faster and better than the others.

Think about this the next time a news story makes it sound like Yahoo is trying to one-up Google. Or MSN is "responding" to last week's launch of a new AOL service.

It's easy to get caught up in the drama of it all. But reality is often quite different than what you read.

Just because the media likes to paint it as if web companies respond to each other's development efforts in the twinkling of an eye as part of an eternal game of one-upmanship doesn't mean this is the case. Although folks like to paint Web development as simply tweaking HTML pages, as Jeremy points out it takes a lot longer than one would expect to build and deploy services that will be utilized by millions of people.

The social networking aspects of Spaces have always been part of the vision. In fact, when I was hired at MSN my boss told me that I'd be working on three things: a blogging platform, a social networking platform and an RSS platform. At the time, it wasn't clear my team would own the RSS piece so my [future] boss was worried that I'd be upset if I started on the team and the RSS piece moved elsewhere. Of course, since I already work on RSS Bandit in my free time I didn't mind if I didn't get to work on RSS as part of my day job. It turned out he was right, and the RSS pieces ended up being driven by other teams.

Don't believe the hype.


Categories: MSN

August 1, 2005
@ 07:03 PM

I've been in Nigeria for almost a week and so far it's been great. I've spent a bunch of time with family and friends, eaten a bunch of stuff I haven't had in years and decided I like MTV in Africa better than what we get in the United States. I've also been taking pictures of everyday life which I'll post to the photo album on my Space once I get back.

Below is a random grab bag of impressions I've had during my trip

  • The traffic scares me. A lot. When being driven in Lagos & Abuja I tend to clench my fists while expecting we'll be in an accident at any minute. I can't get over the fact that as a teenager I used to be able to drive in this chaos and never had an accident. :)

  • The proliferation of mobile phones is insane. There seem to be about half a dozen mobile phone carriers and almost everyone on the streets is carrying one. I was talking to my dad and he said the Nigerian mobile phone market is the second fastest growing in the world after China. About two years ago when I was last here I saw more people downloading ringtones and texting than I'd seen in Seattle & Redmond, and the trend has only continued. I have a bunch of pics of mobile phone ads on the sides of buildings and street hawkers selling pay-as-you-go recharge cards which I'll post once I get back.

  • There is now a large local movie & hip hop scene. The movie scene was blowing up just before I left for college but it now seems to have matured quite a bit. It seems we export movies all over Africa. Folks have started calling the Nigerian movie scene "Nollywood". There are also a ton of local hip hop acts, including one of my high school friends, who is now a rapper called Big Lo. About a decade ago he and I were part of a rap group called R.I.G. and I still have some of our tracks on my iPod. It's great to see that at least one of us is living our teenage dream of being a famous rap star.

  • The newspaper headlines seem to focus exclusively on the goings on of the government & politicians or on tragedies involving loss of life. The contrast between that and the kind of stuff I usually see on the cover of USA Today is striking.

  • Inflation is crazy especially in Abuja. Everything seems to cost a couple of hundred or thousands of naira. I still remember when you could get a bottle of Coke or a newspaper for under one naira. Then again, that was about two decades ago.

  • There are a lot of billboards about HIV/AIDS prevention in the capital city of Abuja but almost none in Lagos (the former capital and commercial center). I'll try and get some pics of the billboards before I get back.

  • Almost every PC I've used so far has been infested with spyware. Except for the Powerbook...

  • The London bombings are on people's minds in my social circle. One of my mom's friends lost her only son in the July 7th attacks. My sister and dad were in London during the first bombing and I was pretty rattled when it happened. It's good to see the British police have caught all the suspects from the second attack. 

  • The local airline business seems to be thriving as well. Here's another place where there seem to be at least half a dozen competitors driving down prices. It looks like the government airline, Nigeria Airways, is finally out of commission. Good riddance.

  • I miss Nigerian food.


Categories: Trip Report