Kurt Cagle has a post entitled Open Standards and Organic Foods which begins

A question was posed to me recently concerning exactly what I meant when I talked about open standards, and how they differed from open source. In reviewing some of my previous postings, one of the things that I realized was that while I had offered up a number of definitions in passing, there really wasn't any single, stock answer that I or others had seen for what exactly open standards mean. Moreover, a lot of people tend to look at open standards with a somewhat jaundiced eye, as if it was simply one more marketing label in a field that is already way oversaturated with marketing buzzwords - they didn't understand why open standards were important, or they didn't understand the distinction between open source and open standards.

The software industry is now full of buzzwords and buzz phrases so ambiguous that if you ask five people what they mean you are likely to get ten different definitions. The problem this causes is that people often talk past each other even when they use the same words, or worse, miscommunication occurs because their basic assumptions about the conversation are incorrect. Examples of such ambiguous buzz phrases include: Web 2.0, service oriented architecture, and standards.

Some people I've talked to about this are surprised that I add 'standards' to this list. However, the definition of what constitutes a 'standard' is in the eye of the beholder. About a year and a half ago, I wrote a blog post entitled Are Standards in the Software Industry a Chimera? which stated

The word "standard" when it comes to software and computer technology is usually meaningless. Is something standard if it is produced by a standards body but has no conformance tests (e.g. SQL)? What if it has conformance testing requirements but is owned by a single entity (e.g. Java)? What if it is just widely supported with no formal body behind it (e.g. RSS)?

For every one of the technologies mentioned above (RSS, Java, and SQL) you'll find people who will argue that they are standards and people who will argue that they aren't. SQL is produced by a standards body and has a number of formal specifications, but since there are no conformance requirements most database vendors have embraced and extended it. It is difficult to write non-trivial SQL queries that will work across Microsoft's SQL Server, MySQL, Oracle's databases and IBM's DB2. The Java programming language and platform is supported by a number of vendors and has rigid conformance tests which make the statement "write once, run anywhere" true for the most part; however, it is a proprietary technology primarily controlled by Sun Microsystems. Content syndication using RSS 0.9x/RSS 2.0 feeds is the most popular web service on the planet, but the specifications were primarily authored and controlled by a single individual and have no formal standards body or corporation backing them to this day. In each case, the technology is 'standard' enough for there to be thriving markets around them with multiple vendors providing customers with valuable services.
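To make the embrace-and-extend point concrete, here is roughly how the same query, fetching the ten most recent rows from a hypothetical orders table, has to be written for each of the databases mentioned (table and column names are invented for illustration):

```sql
-- Ten most recent orders, per dialect (illustrative):
SELECT TOP 10 * FROM orders ORDER BY placed_on DESC;       -- SQL Server

SELECT * FROM orders ORDER BY placed_on DESC LIMIT 10;     -- MySQL

SELECT * FROM                                              -- Oracle
  (SELECT * FROM orders ORDER BY placed_on DESC)
WHERE ROWNUM <= 10;

SELECT * FROM orders ORDER BY placed_on DESC
  FETCH FIRST 10 ROWS ONLY;                                -- DB2
```

The SQL standard says nothing about row limiting, so each vendor invented its own syntax, which is exactly why non-trivial queries rarely port cleanly between these databases.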

From a customer perspective, standards are a means to an end, and in this case the end is preventing vendor lock-in. As long as users can choose between multiple RSS readers or developers can choose between multiple Java implementations, there is enough standardization for them. Where things become contentious is that there are multiple ways to arrive at the same outcome (lack of lock-in).

"Open standards" are even more ambiguous since [as an industry] we don't even have a clear idea of what constitutes a standard. I read through Kurt Cagle's post and he never actually ends up defining "Open Standard" beyond providing analogies and rationales for why he believes in them. An interesting statement that Kurt makes in his post is the following

I suspect that in many ways the open standards movement is, at its core, a reaction to the rather virulent degenerate capitalism that exists today, in which a person can profit far out of proportion to the amount of work that they do, usually at the expense of many others who lose disproportionately to their work load.

The notion of 'profiting in proportion to your work' is pretty bogus and foreign to capitalism. Capitalism is all about the value of your work to others, not how much work you put in. A minor league baseball player doesn't work an order of magnitude less than a major league baseball player, yet he makes that much less. A multiplatinum recording artist doesn't work an order of magnitude harder than local bands trying to get big but makes that much more. It may not sound fair, but that's capitalism. In recent centuries humans have experimented with other socio-economic movements that are more 'fair' but so far capitalism is what has stuck.

Anyway, my point is that buzz phrases like "standards", "service oriented architecture" and "web 2.0" have such diluted and ambiguous definitions as to be effectively meaningless in technical discourse. People who've been in the industry for a while eventually learn to filter out these phrases [and often the people speaking them as well]. If you are a technical person you should probably be more explicit about what you mean by using phrases such as "freely implementable and patent unencumbered", "SOAP-based web services" and "AJAX powered website" in place of the aforementioned buzz phrases. Oh, and if those don't match up with what you mean when you use the buzz phrases, then that just proves my point about their ambiguity.


Categories: Technology

I've been surprised by how often movies with similar themes end up being released at roughly the same time by Hollywood studios. Since it takes several months to shoot a movie this means that somewhere along the line some Hollywood exec hears about a rival studio producing a movie and decides to produce a movie with a similar theme. Some examples that come to mind are

I'm sure there are dozens of examples like this from across the years. What I wonder is whether I'm right that Hollywood execs just have a "follow the leader" mentality and decide that if a competitor is shooting a disaster movie for summer of next year that sounds like a hit then they need to shoot one as well. Or is there some more sophisticated reasoning at work?


Categories: Ramblings

December 27, 2005
@ 07:15 PM

About a month ago we released the first cut of the Nightcrawler edition of RSS Bandit and promised that there'd be a bug fix release within a month. So exactly 30 days and about 100,000 downloads later, we're shipping the bug fix version of the Nightcrawler release.

Download the installer from here. Differences between v1.3.0.38 and v1.3.0.42 below.


  • This release is available in the following languages: German, English, Brazilian Portuguese, Traditional Chinese, Polish, Serbian, Russian, Japanese, Italian, Spanish, Bulgarian, and Turkish.


  • The application can now be installed on computers that only have v2.0 of the .NET Framework available.

  • A mini-application for editing the keyboard shortcuts used by RSS Bandit has now been added. To locate the application, look in the RSS Bandit application folder, which is usually "C:\Program Files\RssBandit", for the application named ShortcutsEditor.exe.


  • Fixed issue where the subscription wizard picks the first feed found on a web page in auto-discovery mode instead of displaying the list of feeds found so users can choose which feed to add to their subscriptions.

  • Feed autodiscovery now uses the Firefox feed icon, which will also be used by IE 7, Outlook 12 and FeedDemon, to indicate when feeds have been found on a web page within RSS Bandit.

  • The '+' that indicates that inline comments are available for a post is now controlled by the Tools->Options->Display->Display Related Items as discussion threads checkbox. Note that this feature is now off by default.

  • Fixed issue where feed validation/autodiscovery/search in the subscription wizard could not be cancelled.

  • Fixed problems with processing titles in Atom feeds containing markup characters. We now pass all of Phil Ringnalda's Atom title test cases.

  • Fixed issue where the title of the first post in the feed is used as the feed title in some RSS 1.0 feeds.

  • Fixed issue where some comments aren't shown inline for some feeds that support <wfw:commentRss>.

  • Fixed issue where the blog post is repeated as the first comment in some feeds that support <wfw:commentRss>.
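The Atom title fixes above are subtler than they look. In the Atom format, a title's type attribute determines whether its content is literal text or escaped HTML, so a reader has to unescape conditionally. A minimal sketch of the idea in Python (illustrative only, not RSS Bandit's actual code):

```python
# Minimal sketch of Atom <title> handling (illustrative, not RSS
# Bandit's actual code). The type attribute decides whether the
# title's content is literal text or escaped HTML markup.
import html
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def plain_text_title(entry_xml: str) -> str:
    """Return the display text for an Atom entry's <title>."""
    title = ET.fromstring(entry_xml).find(ATOM_NS + "title")
    text = title.text or ""
    if title.get("type") == "html":
        # type="html": content is escaped markup and must be unescaped
        return html.unescape(text)
    return text  # type="text" (or no type): content is already literal

entry = ('<entry xmlns="http://www.w3.org/2005/Atom">'
         '<title type="html">1 &amp;lt; 2</title></entry>')
print(plain_text_title(entry))  # → 1 < 2
```

A naive reader that treats every title the same either shows raw entity references or mangles legitimate angle brackets, which is the class of bug such title test cases catch.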


  • Added ability to create new posts in a newsgroup.
  • Subscribing to a newsgroup no longer asks for username and password per subscription since this information is specified when the news server was first added.

  • Subscribing to multiple newsgroups at once now supported.

  • Fixed bug where subscribing to newsgroups via Tools->Newsgroups results in 'Input string was not in a correct format' error on attempting to download posts from the newsgroup.

  • Fixed issue where downloading messages from newsgroups times out. This was fixed by setting the maximum number of items to download to 1000.

  • Fixed issue where subscribing to password protected newsgroups results in "Credentials property not an instance of NetworkCredential" error.

Synchronization with NewsGator Online

  • Fixed issue where downloading the feed list from NewsGator Online results in the following error: "Feedlist download failed with error: Object reference not set to an instance of an object".

  • Fixed issue where downloading feed lists from NewsGator Online never stops.


Categories: RSS Bandit

December 27, 2005
@ 05:40 PM

Niall Kennedy has a blog post entitled Exclusive: Google to offer feed API where he reveals

Google plans to offer a feed reader API to allow third-party developers to build new views of feed data on top of Google's backend. The new APIs will include synchronization, feed-level and item-level tagging, per-item read and unread status, as well as rich media enclosure and metadata handling. Google Reader PM Jason Shellen and engineer Chris Wetherell both confirmed Google's plans after I posted my reverse-engineering analysis of the Google Reader backend.

The new APIs will allow aggregator developers to build new views and interactions on top of Google's data. Google currently has at least two additional Google Reader views running on current development builds.

Google may offer public access to the feed API as early as next month. Shellen said the team wants to nail a few more bugs before publicly making the service available to the world.
Google's new offering is direct competition to NewsGator's synchronization APIs but is easier to code against (no SOAP required). Google currently does not have the same reach across devices as NewsGator, but an easy-to-use API from the guys who brought you the Blogger API and "Blog This!" might really shake up the feed aggregator ecosystem.

As someone who's been thinking about synchronization between RSS readers for a few years, I definitely see this as a welcome development. The Bloglines sync API is too limited in its functionality to be useful, while the NewsGator API is both complex and designed with too many assumptions to be widely usable. However, unlike Niall, I blame the complexity of the NewsGator API more on its data model and expected data flow than on whether it uses SOAP or Plain Old XML (POX) as the wire format.
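To make the data-model point concrete, the heart of read-state synchronization is tiny: a feed URL, some item IDs, and a flag. A hypothetical plain-old-XML request for it might look like the sketch below (every element name is invented for illustration; this is not the NewsGator or Google API):

```python
# Hypothetical POX "mark items read" request. All element names are
# invented for illustration; the point is that the hard part of sync
# is the data model and data flow, not the wire format.
import xml.etree.ElementTree as ET

def build_mark_read(feed_url: str, item_ids: list[str]) -> str:
    """Serialize a request marking the given items in a feed as read."""
    root = ET.Element("markRead")
    ET.SubElement(root, "feed").text = feed_url
    for item_id in item_ids:
        ET.SubElement(root, "item").text = item_id
    return ET.tostring(root, encoding="unicode")

payload = build_mark_read("http://example.org/blog/rss.xml",
                          ["urn:item:1", "urn:item:2"])
print(payload)
```

Whether a payload like this travels as bare POX or wrapped in a SOAP envelope barely changes the client code; what makes a sync API hard to use is the assumptions baked into its model, such as which side owns the canonical read state.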

Once the Google Reader API ships, I'll definitely investigate the feasibility of adding support for it to the Jubilee release of RSS Bandit.


December 22, 2005
@ 08:05 AM

Finding myself with a few hours to kill this evening, I decided to update my Seattle Movie Finder mashup which provides information on show times for movies currently playing in the Seattle area. The update was to use the new map control that is being used by Windows Live Local instead of the old Virtual Earth map control.

So far there isn't a lot of accurate information out there about working with the new control. The best guide I found was the article Creating Your First Virtual Earth v2 Page; almost every other article or reference document seems to be outdated. As usual, Chandu Thota was a fountain of wisdom when it came to getting info about the API, as was Chris Pendleton.

Note: Originally this version only worked in Internet Explorer, and I pinged the Windows Live Local folks to complain about the lack of Firefox support. It now works in both Firefox and Internet Explorer.


Categories: Windows Live

The Reuters article AOL, Google ad pact to include video, instant msgs states

America Online said Google had agreed to invest $1 billion to take a 5 percent stake in AOL, as part of an enhanced pact where Google will move beyond text-based advertising to allow AOL to sell graphical ads to Google's fast-growing ad network.

The stake effectively values AOL at $20 billion, a key benchmark should Time Warner elect to spinoff or sell a part of its Internet unit in response to dissident shareholder Carl Icahn's proxy campaign to break up the company.

Terms of the deal call for AOL to make more of its Web sites searchable via Google search, including plans to feature AOL's premium video services within Google Video, a way of searching for Web-based video programming.

They also said they had agreed, under certain unspecified conditions, to allow users of Google's recently introduced instant messaging system Google Talk to communicate with users of AOL's market-leading AIM instant messaging service.

This is a very interesting development when combined with the recent release of the Libjingle library which allows developers to use the Google Talk API. Does this mean it'll soon be possible for any developer who grabs Libjingle off of SourceForge to integrate the ability to instant message with any AOL Instant Messenger (AIM) user into their applications free of charge? That is definitely game changing. I haven't looked at Libjingle [for obvious reasons] but I am interested in comments on whether my analysis is on the mark from people who've tried it.

I'll definitely be watching the Google Talk blog and Joe Beda's blog to keep on top of the developments in this space. Interesting stuff indeed. Perhaps I'll soon be able to integrate chatting with your AIM buddies into RSS Bandit?


Categories: Web Development

December 21, 2005
@ 01:00 AM

I just found out about Yahoo! Open Shortcuts. This is SICK!!! Yahoo! has done it again. The excerpt below is taken from the FAQ

  1. How do I use it?
  2. Do you have some shortcuts that are ready to use?
  3. How do I create my own?
  4. How do I manage my Open Shortcuts?
  5. Is there a search meta word trigger?
  6. Can I name my shortcut the same name as a default shortcut?
  7. I've got feedback about Yahoo! Open Shortcuts. Who do I tell?
  1. How do I use it?

    Type an ! (exclamation point) followed by the name of the shortcut in the Yahoo! Search box.



    !my to navigate to "http://my.yahoo.com"
    !wsf to search "weather san francisco" on Yahoo!

    For shortcuts that search favorite sites or start an Internet application, you can also type optional term(s) after the shortcut name.

    !shortcut_name term(s)


    !ebay lamps to search for "lamps" on Ebay
    !mail bill@yahoo.com to send email to "bill@yahoo.com"

  2. Do you have some shortcuts that are ready to use?

    These popular shortcuts are available for immediate use:

    !clist http://www.craigslist.org/
    !my http://my.yahoo.com/
    !mysp http://www.myspace.com/
    !amazon http://www.amazon.com/exec/obidos/external-search?mode=blended&keyword=%s
    !audio http://audio.search.yahoo.com/search/audio?p=%s
    !ebay http://search.ebay.com/search/search.dll?satitle=%s
    !flickr http://www.flickr.com/photos/tags/%s/
    !image http://images.search.yahoo.com/search/images?p=%s
    !mw http://myweb2.search.yahoo.com/myresults/ourresults?p=%s
    !news http://news.search.yahoo.com/news/search?p=%s
    !php http://www.php.net/%s
    !shop http://search.shopping.yahoo.com/search?cop=mss&p=%s
    !video http://video.search.yahoo.com/search/video?p=%s
    !wiki http://en.wikipedia.org?search=%s
    !note http://notepad.yahoo.com/?v=161&DESC=%s
    !mail http://compose.mail.yahoo.com/ym/Compose?login=1&To=%s
  3. How do I create my personal shortcuts?

    To create a navigational shortcut, use the keyword: !set with your shortcut name and destination URL.

    !set shortcut_name URL


    To create a shortcut named "ff" to go to Yahoo! Sports Fantasy Football,
    Type: !set ff http://football.fantasysports.yahoo.com/f2

    To create a common search shortcut, use the keyword: !set with your shortcut name and the query.

    !set shortcut_name query


    To create a shortcut named "watx" to search Yahoo! for "weather austin texas",
    Type: !set watx weather austin texas

    To create a shortcut that searches a favorite site or starts an Internet application, use the keyword: !set with your shortcut name and the destination URL. Within the URL, substitute the query term(s) with the "%s" placeholder.

    !set shortcut_name URL_%s


    To create a shortcut named "yf" to search stock quotes on Yahoo! Finance,
    Type: !set yf http://finance.yahoo.com/q?s=%s

This is extremely useful. I've been wanting a shortcut for searching Wikipedia from popular search engines for quite a while. For this feature alone I can see my usage of other search engines dropping.
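The `%s` substitution that powers these shortcuts is simple enough to sketch. The following toy Python version is purely illustrative (Yahoo!'s real implementation is server-side and not public): a shortcut maps a name to a URL template, and any terms typed after the name are URL-encoded into the `%s` placeholder.

```python
# Toy re-implementation of the "%s" substitution behind Yahoo! Open
# Shortcuts (illustrative only). A shortcut maps a name to a URL
# template; terms typed after the name are URL-encoded into "%s".
from urllib.parse import quote_plus

SHORTCUTS = {
    "ebay": "http://search.ebay.com/search/search.dll?satitle=%s",
    "flickr": "http://www.flickr.com/photos/tags/%s/",
    "my": "http://my.yahoo.com/",  # navigational: no placeholder
}

def expand(command: str) -> str:
    """Expand '!name terms' into the destination URL."""
    name, _, terms = command.lstrip("!").partition(" ")
    template = SHORTCUTS[name]
    if "%s" in template:
        return template.replace("%s", quote_plus(terms))
    return template  # navigational shortcut ignores extra terms

print(expand("!ebay table lamps"))
# → http://search.ebay.com/search/search.dll?satitle=table+lamps
```

Navigational shortcuts simply redirect, while search shortcuts substitute the trailing terms into the template, which is why the same `!set` keyword can define both kinds.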


Categories: Web Development

We were planning to ship a bugfix release of RSS Bandit before Christmas which fixed all the major issues reported in the recently released Nightcrawler edition of RSS Bandit.

Unfortunately, it seems that either due to complexity or bugginess I simply can't get the NewsGator API to perform the straightforward task of marking an item as read in NewsGator Online once it has been viewed in RSS Bandit. I spent all of yesterday afternoon plus a couple of hours this morning working on it and I've finally given up. This feature simply won't work in the bugfix release shipping later this week. Maybe I'll have better luck when we ship the Jubilee release.

To make myself feel better, I'll work on fixing some of the Atom parsing bugs reported by Phil Ringnalda and the issues with password protected newsgroups. Nothing like having your self-worth defined by how many bugs you close in a database on SourceForge.

Update: So not only have I already fixed the newsgroup issues and the problems with parsing Atom feeds pointed out by Phil Ringnalda but I just got pinged by Gordon Weakliem who is the developer of the NewsGator API. Perhaps my Christmas can be salvaged after all.


I've seen a number of interesting posts today in response to the news that Microsoft will end support for Internet Explorer for the Mac this year. The best posts have been from people who used to work on the product.

Jorg Brown has a comment on Slashdot entitled I was on the MacIE 6 team when it got canned... which contains the following excerpt

MacIE had one of the strangest and saddest histories I've seen, of any product.

MacIE 5 was an awesome release, critically acclaimed and everything, with a good development team and a strong testing team that included daily performance measurement.

And yet, almost immediately after 5.0 was released, the MacIE team was redeployed to work on a set-top DVR box. The notion at the time was that the team would continue to do MacIE work in their spare time, since IE 5 was the leader among Mac browsers and no longer needed a full-time team.

The problem with that notion was that WebTV, the team's new bosses, had no reason to actually schedule any time for real IE work. So later, when that particular set-top box got cancelled, the IE team got redeployed for other WebTV work, and since this was now out of MacBU's control, nothing could really be done.

3 or 4 years went by before enough people in the Mac division wanted to resume work on IE, and when it looked like we might actually need the technology, as a base for MSN-for-Mac, the IE 6 team was formed. It got a firm OS X-only foundation, a new even more compliant browser base, and then suddenly it became apparent that Apple was doing their own browser, because, well, there were lots of small clues, but the big clue was that Apple had started calling the old Mac IE team offering them jobs.

By that time the Mac division had formally committed to MSN-for-Mac-OSX, so it's not like we were completely going to stop work. But a meeting was held internally, the outcome of which was that it didn't make sense to build our own browser if Apple was going to bundle one, because the marketshare and mindshare of the distant-second-place browser, on the distant-second-place platform, wasn't worth pursuing. A week later we had a meeting with high-up people at Apple, where they told us they were doing a browser. And the week after that, after confirming it with Bill Gates, who was reportedly sad but understanding of the decision, MacIE was officially shut down.

MSN-for-MacOSX went ahead, and was also critically acclaimed, but once released, indications were that the number of users was about the same as the number of developers. After that, MacBU concentrated once again on the next Office release, and MacIE has been well and truly and permanently dead ever since.

Over the whole sad journey, the single most surprising thing I ever discovered was from a small conversation that went:

Me: "Look, if it makes sense to devote dozens of people to WinIE, then surely it makes sense to devote half a dozen to MacIE!"

Higher-up: <confused look> "There aren't dozens of people on WinIE. WinIE had some great people on it! We need those great people on products that make money!"

Me: "Then why on earth did we pursue IE in the first place? Just so that the DOJ would sue us?"

Higher-up: <confused look>

Some day I hope to get a proper answer on our motivation to do WinIE and MacIE in the first place. It seems to be that we were scared of not having control of the HTML standard. And indeed, now that Firefox is gaining traction, Microsoft has added more people to WinIE again.

Jimmy Grewal also has a blog post about this entitled End of an era: Mac Internet Explorer where he writes

This announcement has sparked some debate on Slashdot, which was inevitable. Omar pointed me to a comment to this by our former co-worker Jorg Brown, who now works for Google, which I’ll quote below:
... [see above excerpt]
A lot of what he says is true; but the story is more complex than this and there were many other factors that came into play. Issues which he doesn’t cover…primarily because he wasn’t working on the product much until the last few months of development:

  • Mac IE was the first real browser running on Mac OS X. We had it running on Developer Preview 2 and it shipped on the Public Beta CD-ROM. That was a great engineering achievement but it came at a very high price. Developing for OS X in those early days was a nightmare and we spent so much time struggling with OS bugs and changing APIs that precious time that could have been used to improve the product was wasted just trying to maintain compatibility with each new beta release of OS X.

  • Apple was a pain in the ass sometimes. For a company with such great PR, they really were very unprofessional and treated developers poorly. I know that the OS X transition was tough, but there are so many stories I could tell of stupidity at Apple and policies which made no sense…but I won’t. I’ll just say that Apple had a lot more involvement in the development of Mac IE and its eventual end than Jorg gives them credit for. There were times during the last two years of working at Microsoft that I really hated Apple’s management…which was very difficult for me being such a loyal fan of their products and having so many friends who worked there.

  • No clear direction from our management was the last major factor, which Jorg touched upon but is important to mention again. Towards the end, we had some major changes in management at the MacBU and the new team was inexperienced both with the products they were managing and how to deal with Apple. They were further handicapped by lack of clear direction from our execs, who were too busy worrying about AOL, the DOJ, and our stock price.

The common thread in both perspectives is that management at Microsoft didn't see much value in continuing with IE on the Mac. Jorg doesn't seem to understand why, but the reason seems clearer to me. Microsoft is a platform company. We have built the most popular software platforms on the planet: Microsoft Windows and Microsoft Office. In the 1990s, two technologies/products attempted to take the place of Windows as the world's #1 developer platform. These were the Java platform produced by Sun Microsystems and the Netscape Navigator web browser produced by Netscape. Microsoft met both challenges in a variety of ways, including making sure that Windows (a) was the best platform to run Java applications and (b) had the best Web browser on any platform. The goal was simple: if Java or the web browser became the platform, then that platform would, in the end, still be a Windows platform. Of course, some other decisions Microsoft made with regards to competing with Sun and Netscape landed the company in court with billions of dollars in fines and settlements.

Fast forward to the early 2000s: the browser wars are over and IE is the world's dominant Web browser. In an almost textbook example of how monopolies work, Microsoft abandoned innovation in IE, a move that showed that at this point IE was considered a cost center, not a revenue generator. It simply doesn't make business sense for Microsoft to invest in a technology that disintermediates its most popular platform, the Windows operating system. This should sound familiar to you if you've read The Innovator's Dilemma.

It's now the mid-2000s and the Web browser landscape has changed. Technologies such as DHTML and IXMLHttpRequest, which were invented by Microsoft to make IE the best developer platform on the Web, have been adopted by competitors like Google and rival Web browsers like Mozilla. Despite our best efforts, the Windows platform is being routed around, and even worse, by technologies we invented. In this case Microsoft has been hoisted by its own petard.

These developments have caused renewed interest in IE [at least on Windows] at Microsoft, which is why I went from two years of being a Microsoft employee who didn't believe an IE team existed to reading the IE blog, which makes it seem that there is now a veritable army of developers working on IE. The only problem is that I expect history will repeat itself. What happens when IE reaches feature parity with Mozilla? Will we have to wait until Windows Blackcomb to see Internet Explorer 8? Given how Microsoft [and specifically the Windows division] works, this isn't as crazy an idea as it sounds.

I can think of two ways to prevent history from repeating itself. The first is that Microsoft officially disbands the IE team after IE 7. The second is that Microsoft transfers the IE team to a product group that actually depends on browser innovation to make money, such as MSN/Windows Live. We haven't innovated in the browser for almost a decade; IE 5 was the last truly innovative release. Ex-IE team members like Scott Berkun, who wrote the classic How to build a better web browser, show exactly how stagnant the world of Web browser innovation has been this century. Given that Microsoft views IE as a defensive option to make Windows an enticing product, there is less incentive to make it the ultimate browsing experience than there is for products whose bread and butter is the Web browser. Why do you think there are so many Google employees working on Mozilla?

Microsoft should either cede innovation in the Web browser to Mozilla/Google or make IE more than just "icing on the Windows user experience cake" by transferring the product to a team whose bottom line depends on browser innovation. Of course, I doubt that my words will be taken seriously by folks at Microsoft [except as a reason to send my boss or his boss angry mail] but this needs to be said.


Categories: Life in the B0rg Cube

December 19, 2005
@ 05:56 PM

Robert Scoble has a post entitled Riya not recognized by Google where he recommends that Microsoft look into purchasing Riya.com. He writes

I’ve heard many rumors about Riya over the past few weeks. One strong rumor, reported by Om Malik, among others, was that Riya was getting purchased by Google.

I know our M&A guys had met with Riya too and had passed on the deal after negotiations got too expensive (translation: someone else had bid more than we were willing to pay). So, I was surprised that during the past few days I had heard that Riya’s deal with Google wasn’t going to happen.

Today Munjal, Riya’s CEO, said on his blog that they were going to continue on as an independent firm and that the rumors are incorrect.

This is actually very good for Microsoft and Yahoo. Why? Cause this team is high quality and the technology is great (I’ve been using the alpha recently and like it a lot).

Now, why doesn’t Microsoft purchase them? Well, I’ve been in contact with our M&A folks. We have a lot of NIH syndrome here cause we have similar technology that our research teams have developed. I’ve seen our photo/face recognition capabilities and they are pretty cool too and, indeed, are better in some areas and not as good in others.

I have a couple of opinions here, but mostly this is advice for Robert given that I've recently been involved in acquisition-related discussions as part of my day job. My main thought about Google passing on Riya is that I expected this given that they demoed their in-house image recognition software at the Web 2.0 conference. Thus purchasing Riya would primarily be about augmenting their internal development team, which reduces the value of the deal to Google.

From a Microsoft perspective, I'd expect to see a bunch of NIH as well. We have folks doing quite a lot in the area of improving digital photography at Microsoft Research. You can read about some of the work in articles like MSR's Life of a Digital Photo or view demos about interesting work in Object class recognition from the Cambridge arm of Microsoft Research. I don't think I've seen anything as cool as the stuff demoed by Riya and Google, but we do have some interesting stuff cooking at Microsoft Research nonetheless.

The problem for Scoble is finding a product team that actually thinks what Riya is doing is valuable enough to make it worth however many tens of millions of dollars the VCs think the company is worth. Simply saying "What they are doing is cool" isn't enough. The folks at MSR face the same problems and the fact that there aren't lots of transitions from cool research demo to Microsoft product shows just how difficult this process can be. 


Categories: Technology

December 17, 2005
@ 05:15 PM

A friend of mine called me yesterday to ask for my opinions on the fact that Meebo just got a few million dollars in funding. For those not in the know, Meebo is an AJAX version of Trillian. And what is Trillian? It's an instant messaging client that supports AIM, ICQ, MSN, Yahoo Messenger, and IRC.

In his post How Much Did Meebo Get? Om Malik asks

Here is the rub: Since the company basically aggregates all four major IM networks in a browser, all the four major IM owners - AMYG are out of the acquisition game. One of them buys the company, the others shut down access to their respective networks. The very quality that makes Meebo attractive to end-users will make it difficult for them to be acquired. But there is one option: eBay. When all fails, you know who to call. Skype did. Interactive Corp is another long shot, but they are bargain hunters not premium payers.

Regarding acquisitions, there are three reasons why one of the major players would buy a company: users, technology and people. Unless the startup is in a brand new market that the major player isn't in, buying a company for its users is rare. This is because big players like Google, Yahoo! and Microsoft usually either have orders of magnitude more users than the average 'popular' startup or could get just as many or more users when they ship a rival service. The more common reason for a big player like Microsoft or Yahoo! buying a company is exclusive technology/IP and the development team. Yahoo! buying del.icio.us or Flickr isn't about getting access to the 250,000 - 300,000 users of these services given that they have fewer users than the least popular services on Yahoo!'s network. Instead it's about getting people like Joshua Schachter, Caterina Fake and Stewart Butterfield building the next generation of Yahoo!'s products. Don Dodge covers this in slightly more detail in his post Microsoft will acquire my company.

Let's talk about Meebo specifically. The user base is too small to be a factor, so the interesting things are the technology and the people. First, the technology. An AJAX instant messaging client isn't novel; companies like Microsoft have been providing one for years. A framework built on reverse engineering IM protocols is cool but not that useful. As Om Malik points out, the major players tolerate companies like Meebo & Trillian because it is counterproductive for, say, AOL to sue a tiny company like Trillian for misusing its network. On the other hand, they wouldn't tolerate it from a major player like Microsoft, primarily because that becomes a significant amount of traffic on their network and licensing access becomes a valid revenue generating scenario. Thus, the technology is probably not worth a great deal to one of the big players. That leaves the people. According to the Meebo team page there are three people: a server dev, a DHTML/AJAX dev and a business guy (likely to be useless overhead in an acquisition). The question then is how many millions of dollars Google, Yahoo! or Microsoft would think the skills of those two [most likely excellent] developers are worth. Then you have to factor in the markup because the company got VC funding...

You can probably tell that I agree with Om Malik that it is unlikely that this company would be of interest to any of the four major IM players.

If you are building a Web startup with the intention of flipping it to one of the majors, only three things matter: technology/IP, users and the quality of your technical team. Repeatedly ask yourself: would Microsoft want our users? Would Google want our technology? Would Yahoo! want our people?

It's as simple as that.


Categories: Technology

Clemens Vasters has written two interesting articles on building RESTful services using Indigo Windows Communication Foundation entitled Teaching Indigo to do REST/POX, Part 1 and Teaching Indigo to do REST/POX, Part 2. His first article begins

A little bit more than half a year ago I got invited to a meeting at Microsoft in Redmond and discussed with Steve Swartz, Yasser Shohoud and Eugene Osovetsky how to implement POX and REST support for Indigo. ... I witnessed the definition of a special capability for the HTTP transport that I am exploiting with a set of Indigo extensions that I’ll present in this series of blog posts. The consensus in the meeting was that the requirements for building POX/REST support into the product weren’t generally clear enough in the sense that when you ask 100 people in the community you get 154 ever-changing opinions about how to write such apps. As a consequence it would not really be possible to define a complete programming model surface that everyone would be happy with, but that a simple set of hooks could be put into the product that people could use to build programming models rather easily.

And so they did, and so I did. This new capability of the HTTP transport first appeared in the September CTP of Indigo/WCF and surfaces to the developer as properties in the Message class Properties collection or the OperationContext.Incoming/OutgoingMessageProperties.

If you are using the Indigo HTTP transport on the server, the transport will always stick a HttpRequestMessageProperty instance into the incoming message properties, which provides access to the HTTP headers, the HTTP method (GET, POST, PUT, etc.) and the full query string. On the client, you can create an instance of this property class yourself and stick it into any outbound message’s properties collection  and, with that, control how the transport performs the request. For sending replies from the server, you can put a HttpResponseMessageProperty into the message properties (or, again, into the OperationContext) and set the HTTP status code and description and of course the HTTP reply headers.  

And since I have nothing better to do, I wanted to know whether this rather simple control feature for the HTTP transport would indeed be enough to build a POX/REST programming model and application when combined with the rest of the Indigo extensibility features. Executive Summary: Yes.

One of the first work items I was assigned when I joined my current team at MSN Windows Live was responsibility for our Indigo migration. This was pretty ambiguous, and it turned out to be a placeholder work item for "Figure out why we should move to Indigo and when". So I talked to our devs for a while and, after thinking about where I saw our services evolving, it seemed there were two main things we wanted from Indigo: better performance and support for a more diverse view of what it means to be a Web service.

Why we need better performance is pretty straightforward. We provide services that are depended on by hundreds of millions of users across a variety of MSN properties including Hotmail, MSN Messenger, and MSN Spaces as well as newer Windows Live properties like Windows Live Favorites and Windows Live Fremont. The more performance we can squeeze out of our services stack, the better things look on our bottom line. 

Then there is support for a broader view of what it means to be a Web service. When I worked on the XML team, I used to interact regularly with the Indigo folks. At the time, I got the impression that they had two clear goals: (i) build the world's best Web services framework on SOAP & WS-* and (ii) unify the diverse distributed computing offerings produced by Microsoft. As I spent time at my new job I realized that the first goal of the Indigo folks didn't jibe with the reality of how we built services. Despite how much various evangelists and marketing folks have tried to make it seem otherwise, SOAP based Web services aren't the only Web services on the planet. Technically they aren't even the most popular. If anything, the most popular Web service is RSS, which for all intents and purposes is a RESTful Web service. Today, across our division we have services that talk SOAP, RSS, JSON, XML-RPC and even WebDAV. The probability of all of these services being replaced by SOAP-based services is 0. I remember sending mail to a number of folks on the Indigo team about this disconnect including Doug Purdy, Don Box, Steve Swartz and Omri Gazitt. I remember there being some initial resistance to my feedback but eventually opinions started to thaw and today I'm glad to read posts where Doug Purdy calls himself one of the original REST-heads in the big house.

Anyway, the point is that there is more than one way to build services on the Web. Services frameworks like Indigo Windows Communication Foundation need to support this diverse world instead of trying to shove one solution down the throats of their customers. We at Microsoft now understand this, I hope eventually the rest of the industry does as well. 


Categories: XML Web Services

Alan Kleymeyer has a post entitled NewsGator Online where he writes

I've switched my online news aggregator from Bloglines to NewsGator.  First, I wanted to try it out and compare it to Bloglines.  I like the interface better, especially in how you mark things as read.  I've switched for good.  I mainly switched so that I can continue using RSS Bandit and get the benefit of syncing between it and an online news aggregator (supported in the latest RSS Bandit release)

Alan's post describes exactly why creating APIs for your online service and treating it as a Web platform and not just a web site is important. What would you rather use: a web-based aggregator which provides limited integration with a few desktop aggregators (i.e. Bloglines) OR a web-based aggregator which provides full integration with a variety of free and payware aggregators including RSS Bandit, NetNewsWire and FeedDemon? Building out a Web platform is about giving users choice, which is what the NewsGator guys have done by providing the NewsGator API.

The canonical example of the power of Web APIs and Web platforms is RSS. Providing an RSS feed liberates your readers from the limitations of using one application (the Web browser) and one user interface (your HTML website) to view your content. They can consume it on their own terms using the applications that best fit their needs. Blogging wouldn't be as popular as it is today if not for this most fundamental of web services.
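Part of what makes RSS such a low-barrier web service is that any client can consume it with a few lines of stock XML handling, no SOAP toolkit required. Here's a minimal sketch; the feed snippet is invented for illustration, not from any real site.

```python
import xml.etree.ElementTree as ET

# A tiny invented RSS 2.0 feed, standing in for any blog's feed.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def list_items(feed_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.findall("item")]

for title, link in list_items(FEED):
    print(title, link)
```

Any aggregator, desktop or web-based, does essentially this, which is why publishing a feed instantly makes your content consumable by every client the user prefers.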

The topic of my ThinkWeek paper was turning Web sites into Web platforms and I was hoping to give a presentation about it at next year's O'Reilly Emerging Technology Conference, but it got rejected. I guess I'll just have to keep shopping it around. Perhaps I can get it into Gnomedex or Mix '06. :)


December 15, 2005
@ 06:25 PM

Don Demsak has a post entitled XSLT 2.0, Microsoft, and the future of System.Xml which has some insightful perspectives on the future of XML in the .NET Framework

Oleg accidentally restarted the XSLT 2.0 on .Net firestorm by trying to start an informal survey.  Dare chimed in with his view of how to get XSLT 2.0 in .Net.  M. David (the guy behind Saxon.Net, which lets .Net developers use Saxon on .Net) jumped in with his opinion.

One of the things that I’ve struggled with in System.Xml is how hard it is sometimes to extend the core library.  The XML MVPs have done a good job with some things, but other things (like implementing XSLT 2.0 on top of the XSLT 1.0 stuff) are impossible because so much of the library is buried in internal classes.  When building a complex library like System.Xml, there are 2 competing schools of thought:

  1. Make the library easy to use and create a very small public facing surface area.
  2. Make the library more of a core library with most classes and attributes public, and let others build easy (and very specific) object models on top of it.

The upside of the first methodology is that it is much easier to test, and the library just works out of the box.  The downside is that it is very hard to extend the library, so it can only be used in very specific ways.

The upside of the second methodology is that you don’t have to try to envision all the ways the library should be used.  Over time others will extend it to accomplish things that the original developers never thought of.  The downside is that you have a much larger surface area to test, and you are totally reliant on other projects to make your library useful.  This goes for both projects internal to Microsoft and external projects like the Mvp.Xml lib.

The System.Xml team has tended to use the first methodology, where the ASP.Net team tends to build their core stuff according to the second methodology, and then have a sub-team create another library using the first methodology, so developers have something to use right out of the box (think of System.Web.UI.HtmlControls as the low level API and System.Web.UI.WebControls as the higher level API).  The ASP.Net team builds their API this way because, from the beginning, they have always envisioned 3rd parties extending their library.  At the moment, this is not the case for the System.Xml library.  But the question is, should System.Xml be revamped and become a lower level API, and then rely on 3rd parties (like the Mvp.Xml project) to create more specific and easier to use APIs?  Obviously this is not something to be taken lightly.  It will be more costly to expose more of the internals of System.Xml.  But, if only the lower level API was part of the core .Net framework, it may then be possible to roll out newer, higher level, APIs on a release schedule different than the .Net framework schedule.  This way projects like XSLT 2.0 could be released without having to wait for the next version of the framework.

I’ve always been of the opinion that XSLT 2.0 does not need to be part of the core .Net framework.  Oleg doesn’t believe that the .Net open source community is as passionate as some of the other communities, so he would like to see Microsoft build XSLT 2.0.  I’d rather see the transformation of the System.Xml team into more of an ASP.Net like team.  If .Net is the future of development on the Windows platform, and XML is the future of Microsoft, then the System.Xml team needs to grow beyond its legacy as just an offshoot of the SQL Server team.  The System.Xml team still resides in the SQL Server building.  Back before .Net, the System.Xml team was known as the SQL Web Data team, and unfortunately, still carries some of that mentality.  Folks like Joshua Allen and Dare (who are both not on the team anymore) fought to bring the team out from the shadows of SQL Server.  With new XML related groups, like XLinq and Windows Communication Framework, popping up within the company the System.Xml group is at a major crossroads.  They will either grow (in status and budget) and become more like the ASP.Net team or they will get absorbed into one of the new groups.

I’d prefer to see the System.Xml team grow and become full partners with teams like ASP.Net and the CLR team.  I’d like to see the XML based languages become first class programming languages within the Visual Studio IDE.  That means not only using things like XSLT and XML Schema as dynamic languages, but also being able to compile them down to IL and have them compiled with the other .Net languages.  I want to be able to create projects that contain not only VB or C#, but also XSLT and XML Schema (to name a couple), and have them compile into one executable.  Then developers can use things like XSLT 2.0, or the next in vogue XML based language, and take advantage of that language’s unique benefits, without having to choose between a compiled procedural language (like C# or VB) and dynamic functional languages like XSLT.  Linq is starting to bring more of the functional programming style to the average procedural programmer, so I can start to see the rise of public awareness of functional programming.  It is only a matter of time before the average programmer feels as comfortable with functional programming as they do with procedural programming, so we need to look towards including these languages within the Visual Studio IDE (which then leads into my discussion about evolving Visual Studio into more of an IDE Framework, extended with add-ins.)

There is a lot of stuff I agree with in Don's post, which is why I forwarded it to some of the folks on the XML team. I'll be having lunch over there today to talk about some of the topics it raised.

Don does gloss over something when it comes to the decision between whether Microsoft should implement a technology like XSLT 2.0 or whether we should just make it easy for third parties to do so. The truth is that Microsoft now has a large number of products which utilize XML-related technologies. For example, implementing something like XSLT 2.0 isn't just about providing a new version of the System.Xml.Xsl.XslCompiledTransform class in the .NET Framework. It's also about deciding whether to update the XSLT engine used by Internet Explorer to support XSLT 2.0 (which is an entirely different code base), it's about adding XSLT 2.0 support to the XSLT debugger in Visual Studio, and maybe even updating the BizTalk Mapper. Users of Microsoft products expect a cohesive and comprehensive experience. In many cases, only Microsoft can provide that experience (e.g. supporting XSLT 2.0 across our entire breadth of technologies and products that use XSLT). When I was on the XML team, it was a really tough balance deciding what we should make extensible and what was too expensive to make extensible, since we'd probably be the only ones who could take proper advantage of it. I'm glad to see that some of our MVPs understand how delicate a balancing act shipping platform technologies can be.


Categories: Life in the B0rg Cube | XML

I'm a day late to blog this but it looks like we announced releases in both the consumer and business instant messaging space yesterday.

From the InfoWorld article Microsoft uses Ajax to Web-enable corporate IM we learn

Microsoft Corp. Tuesday released a Web-based version of its corporate instant-messaging software that gives users access when they are working remotely or from non-Windows computers. Gurdeep Singh Pall, a Microsoft corporate vice president, unveiled the product, Office Communicator Web Access, in a keynote at the Interop New York 2005 show.

Office Communicator Web Access includes support for Ajax (Asynchronous Javascript and XML), a programming technology that enables developers to build applications that can be altered dynamically on a browser page without changing what happens on the server. The product provides a Web front end to Microsoft's Office Communicator desktop application, and is available to customers of Live Communications Server 2005 for immediate download at www.microsoft.com/rtc, said Paul Duffy, a senior product manager at Microsoft.

I'm confused as to why InfoWorld feels the need to mention AJAX in their story. It's not as if stories announcing other products trumpet the fact that they were built using C++ or ASP.NET. The AJAX hype is definitely getting ridiculous.

From the blog post Windows Live Messenger Beta - Released from the Windows Live Messenger team's blog we learn

 Windows Live Messenger Beta is now available for use and testing to a limited set of users in the US, UK, Japan, Australia, Canada, China, France, Germany, Brazil, Korea, Netherlands, and Spain. More and more of you will be invited to join over the coming weeks/months.

They also have a blog post on the Official Feature List for Windows Live Messenger. Unfortunately, none of the features I'm working on are in this release. I can't wait until the features I'm working on finally get out to the public. :)


Categories: Social Software | Windows Live

December 14, 2005
@ 05:44 PM

An artist's transition from gangsta rapper to pop star is always a weird one for fans. For example, there was a joke on this week's episode of the Boondocks about how Ice Cube "the guy who makes family movies" used to be a hard core gangsta rapper. I've personally been amused by how the subject matter of their songs changes as they realize that their fan base is dominated by prepubescent and teenage suburbanites as opposed to hip hop heads from the 'hood. 

On the album Blueprint 2: The Gift & The Curse Jay-Z has a song called Poppin' Tags which is about going to the mall and shopping. The subject matter of the song is the kind of thing you'd expect from Hilary Duff not Jigga. 

However 50 Cent has Jay-Z beat when it comes to songs targeted at the teenage mallrat crowd. On the soundtrack to his movie Get Rich or Die Tryin' 50 has two songs that belie his status as a gangsta rapper. There's the poorly crooned Window Shopper about how 50 Cent gets to go to the mall to buy stuff you can't afford. Then there's Best Friend where he begs to be some girl's "best friend" if the other guy in her life is "just a friend".

But it gets worse.

Mike Torres sent me a link to a post entitled 50 Cent Caught Red Handed which is excerpted below

Remember that story about 50 Cent performing at some little girl's bat mitzvah? Yeah, you wish it didn't really happen. Nothing says hardcore gangster rapper like a teenie-bop white girl dancing to your music with two hundred of her closest white teenie-bop friends.

More pictures from the $500,000 bat mitzvah after the jump.

UPDATE: You can see all the photos from the bat mitzvah here.

Keep it real, Fiddy.


Categories: Music

December 14, 2005
@ 02:21 AM

One of the interesting things I've noticed in the discussion about Yahoo!'s purchase of del.icio.us is how differently success is judged in the post-dotbomb technology startup scene. Specifically, I'll focus on two posts that gave me pause this afternoon.

In his post Learning from mistakes Anil Dash writes

Best post I've seen today: Ari Paparo talks about the differences between del.icio.us and Blink. Blink was Ari's startup during the bubble, which raised $13 million (!) to build an online bookmarking service, but didn't take off with users.

The only way any of us gets to be a successful entrepreneur is by learning from others' mistakes, yet a lot of business culture focuses around never admitting that errors are ever made. So kudos to Ari, not just for being brave enough to be self-critical, but for helping a lot of new aspiring entrepreneurs to succeed.

The only quibble I'd have is that Ari presents del.icio.us as having succeeded already. Josh and his team at del.icio.us have built a great app, but for as popular as they are with geeks, the hard work is to bring the concept of social bookmarking (or, if you prefer, a shared recollection tool) to a larger audience.

In his post Getting it right Ari Paparo writes

Congratulations to Josh on the del.ico.us acquisition. Yahoo will make a great partner for the bookmarking service.

Now a little part of me is cringing as I write this. Having founded a bookmarking company in 1999 with pretty much the exact same vision as the new crop of services, I’ve got to feel, well, a little stupid. (or angry, or depressed, or whatever). Maybe writing about it will make me feel better and maybe even help me make a point or two about product development.

When we founded Blink.com (no link love, it’s a crappy search site now) the founders and I imagined a self-reinforcing product cycle:

1. Consumers needed portable bookmarks so they wouldn’t lose them, would be able to access them from any computer, and could share them with friends or coworkers;

2. As part of the process of bookmarking sites and organizing them into "folders" users would be indicating a measure of quality and connectedness among the URLs;

3. Profit!

OK, step 3 was a little more complicated. But the essence was that we would use the personal-backup product attributes to create a public search engine and "discovery engine" (I believe the marketing folks wanted to use that phrase!) based on user bookmarks.

This really shouldn’t sound too different from what del.ico.us was able to do, and we had something like $13 million to play with to make it happen. Not to mention that there were others with the same idea. Remember Backflip? So (besides the money), why did we fail and del.ico.us and the other Web 2.0 companies succeed?

I don’t think it was that we were "too early" or that we got killed when the bubble burst. I believe it all came down to product design, and to some very slight differences in approach.

To start, we launched Blink with a bevy of marketing dollars and a message very much focused on the individual storage benefits. We were very successful at attracting users (at its height Blink had 1.5 million members; del.ico.us currently has 300,000) and getting them to import their bookmarks into our system.

What I find interesting about this pair of posts is the thought that a company that had 5 times the user base of del.icio.us could be considered a failure while del.icio.us is not. This makes me wonder what defines success here... That the VCs made a profit? I assume that must have been the case with the del.icio.us sale while it clearly was not with the original Blink.com service. Perhaps it's that the founders end up as millionaires? Whatever it is, it definitely doesn't seem to be about users.

I tend to agree with Anil Dash, del.icio.us isn't yet a success except for being successful at making the founders and VCs a good return on their investment. If a service can grow to be 5 times as large and still be considered a failure then I think it is safe to say that calling del.icio.us a success is at best premature.

My question for all you budding entrepreneurs out there, what are your definitions of success and failure?


Categories: Social Software

For the developers out there who'd like to ask questions about or report bugs in our implementation of the MetaWeblog API for MSN Spaces, there is now a place to turn.

The MSN Spaces Development forum is where to go to ask questions about the MetaWeblog API for MSN Spaces, file bug reports, and discuss with members of our developer community or the Spaces team what you'd like to see us open up next.

There is even an RSS feed so I can keep up to date with recent postings using my favorite RSS reader. If you are interested in our API story, you should subscribe.


Categories: Windows Live

December 13, 2005
@ 06:27 PM

Nicholas Carr has a post entitled Sun and the data center meltdown which has an insightful excerpt on the kind of problems that sites facing scalability issues have to deal with. He writes

a recent paper on electricity use by Google engineer Luiz André Barroso. Barroso's paper, which appeared in September in ACM Queue, is well worth reading. He shows that while Google has been able to achieve great leaps in server performance with each successive generation of technology it's rolled out, it has not been able to achieve similar gains in energy efficiency: "Performance per watt has remained roughly flat over time, even after significant efforts to design for power efficiency. In other words, every gain in performance has been accompanied by a proportional inflation in overall platform power consumption. The result of these trends is that power-related costs are an increasing fraction of the TCO [total cost of ownership]."

He then gets more specific:

A typical low-end x86-based server today can cost about $3,000 and consume an average of 200 watts (peak consumption can reach over 300 watts). Typical power delivery inefficiencies and cooling overheads will easily double that energy budget. If we assume a base energy cost of nine cents per kilowatt hour and a four-year server lifecycle, the energy costs of that system today would already be more than 40 percent of the hardware costs.

And it gets worse. If performance per watt is to remain constant over the next few years, power costs could easily overtake hardware costs, possibly by a large margin ... For the most aggressive scenario (50 percent annual growth rates), power costs by the end of the decade would dwarf server prices (note that this doesn’t account for the likely increases in energy costs over the next few years). In this extreme situation, in which keeping machines powered up costs significantly more than the machines themselves, one could envision bizarre business models in which the power company will provide you with free hardware if you sign a long-term power contract.

The possibility of computer equipment power consumption spiraling out of control could have serious consequences for the overall affordability of computing, not to mention the overall health of the planet.

If energy consumption is a problem for Google, arguably the most sophisticated builder of data centers in the world today, imagine where that leaves your run-of-the-mill company. As businesses move to more densely packed computing infrastructures, incorporating racks of energy-gobbling blade servers, cooling and electricity become ever greater problems. In fact, many companies' existing data centers simply can't deliver the kind of power and cooling necessary to run modern systems. That's led to a shortage of quality data-center space, which in turn (I hear) is pushing up per-square-foot prices for hosting facilities dramatically. It costs so much to retrofit old space to the required specifications, or to build new space to those specs, that this shortage is not going to go away any time soon.

When you are providing a service that becomes popular enough to attract millions of users, your worries begin to multiply. Instead of just worrying about efficient code and optimal database schemas, things like power consumption of your servers and data center capacity become just as important.
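Barroso's "more than 40 percent" figure is easy to verify from the numbers in the excerpt: a $3,000 server, a 200-watt average draw doubled to 400 watts by power-delivery and cooling overhead, nine cents per kilowatt-hour, and a four-year lifecycle.

```python
server_cost = 3000.0      # dollars, typical low-end x86 server
avg_draw_watts = 200.0    # average consumption
overhead_factor = 2.0     # power delivery + cooling roughly double the energy budget
rate_per_kwh = 0.09       # dollars per kilowatt-hour
years = 4                 # server lifecycle

hours = years * 365 * 24
kwh = avg_draw_watts * overhead_factor * hours / 1000.0
energy_cost = kwh * rate_per_kwh

print(round(energy_cost, 2))                 # ~1261.44 dollars over four years
print(round(energy_cost / server_cost, 2))   # ~0.42, i.e. just over 40% of hardware cost
```

Note how sensitive the result is to the electricity rate and the overhead factor; at higher rates or denser racks the energy bill overtakes the hardware cost well before the end of the decade, which is exactly the scenario Barroso describes.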

Building online services requires more than the ability to sling code and hack databases. Lots of stuff gets written about the more trivial aspects of building an online service (e.g. switch to sexy, new platforms like Ruby on Rails) but the real hard work is often unheralded and rarely discussed.


From the press release Microsoft and MCI Join to Deliver Consumer PC-to-Phone Calling we learn

REDMOND, Wash., and ASHBURN, Va. — Dec. 12, 2005 — Microsoft Corp. and MCI Inc. (NASDAQ: MCIP) today announced a global, multiyear partnership to provide software and services that enable customers to place calls from a personal computer to virtually any phone. The solution, MCI Web Calling for Windows Live™ Call, will be available through Windows Live Messenger, the upcoming successor to MSN® Messenger, which has more than 185 million active accounts around the world. The solution combines Windows Live software, advanced voice over Internet Protocol (VoIP) capabilities and the strengths of MCI’s expansive global network to give consumers an easy-to-use, convenient and cost-effective way to stay connected.

MCI and Microsoft are testing the service as part of a Windows Live Messenger limited beta with subscriptions initially available in the United States, and expect to jointly deliver the PC-to-phone calling capabilities to France, Germany, Spain and the United Kingdom in the coming weeks. Once subscribed to the service, customers can place calls to and from more than 220 countries with rates starting at $.023 per minute to the U.S., Canada, the U.K. and Western Europe during the beta testing period. Upon sign-up, MCI Web Calling customers will receive up to one hour of free calls. Final pricing will be determined when the product officially launches in 2006.

Another sweet Windows Live offering already in beta. You can find a screenshot of the upcoming functionality in the blog post Windows Live Call & MCI (Part II). I'll definitely be interested in trying out this feature once it ships. I use the PC-to-SMS feature all the time to send text messages to my girlfriend, especially when I'm out of town. Extending this to phone calls would be great for calling family overseas.


Categories: Windows Live

I've been surprised to see several weblogs report that MSN Spaces has 27 million blogs with over 7.6 million active bloggers. What I found surprising wasn't the inaccurate data on the number of weblogs or active users that we have. The surprise was that these accurate-sounding numbers were 'interpreted' from an offhand comment I made in my blog. The source of this information seems to be this post in the Blog Herald entitled MSN Spaces now has 27 million blogs and over 7.6 million active users: Microsoft which states

Microsoft’s blogging service has grown from an estimated 18 million blogs in October to 27 million blogs and at least 7.6 million active bloggers, according to Dare Obasanjo from Microsoft in a post discussing server issues.

The service still remains in third position amongst blog providers, with Xanga and MySpace both believed to be hosting 40 million blogs each.

(note: calculations based on this line: "I never expected [Spaces] that we'd grow to be three times as big [as Live Journal] and three times as active within a year.")

I've been pretty surprised at the number of blogs I've seen quoting these numbers as facts when they are based on such fuzzy techniques. For the record, we don't have 27 million blogs; the number is higher. As for our number of active users, that depends on your definition of active. Using one definition, we are over three times as active as LiveJournal. That's what I meant.


Categories: Ramblings

I bumped into Irwin Dolobowsky a few weeks ago and he told me that he now worked on Windows Live Favorites. Irwin used to work on the XML team at Microsoft with me and in fact he took over http://msdn.microsoft.com/xml when I left the team last year. I'm glad to see that I'll be working closely with a couple more familiar faces.

Yesterday he let me know that they've started a team blog at http://spaces.msn.com/members/livefavorites. He's already started addressing some of the feedback from their early adopters such as his post on the Number of Favorites Limit. Check it out.


Categories: Windows Live

Our implementation of the MetaWeblog API for MSN Spaces is now publicly available. You can use the API to create, edit and delete blog posts on your space. The following blogging applications either currently work with our implementation of the MetaWeblog API or will in their next release:

  1. W.Bloggar
  2. Blogjet
  3. Ecto
  4. Zoundry
  5. Qumana
  6. Onfolio
  7. Elicit
  8. PostXING
  9. Pocket Blogger
  10. Diarist - PocketPC

I have also provided a pair of tutorials for managing your MSN Spaces blog using desktop blogging tools, one on using Blogjet to manage your MSN Spaces blog and the other on using W.Bloggar to manage your blog on MSN Spaces.

The following information is for developers who would like to build applications that programmatically interact with MSN Spaces.

Supported Blogger and MetaWeblog API methods:

  • metaWeblog.newPost (blogid, username, password, struct, publish) returns string
  • metaWeblog.editPost (postid, username, password, struct, publish) returns boolean
  • metaWeblog.getPost (postid, username, password) returns struct
  • metaWeblog.getCategories (blogid, username, password) returns array of structs
  • metaWeblog.getRecentPosts (blogid, username, password, numberOfPosts) returns array of structs
  • blogger.deletePost(appkey, postid, username, password, publish) returns boolean
  • blogger.getUsersBlogs(appkey, username, password) returns array of structs
  • blogger.getUserInfo(appkey, username, password) returns struct

Unsupported MetaWeblog API methods:

  • metaWeblog.newMediaObject (blogid, username, password, struct) returns struct

NOTE: The appkey parameter used by the deletePost, getUsersBlogs and getUserInfo methods is ignored. MSN Spaces will not require an application key to utilize its APIs.
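To make the method list above concrete, here is a minimal sketch of calling the API with Python's built-in XML-RPC client. The endpoint URL is taken from the W.Bloggar and Blogjet tutorials below (Host=storage.msn.com, Page=/storageservice/MetaWeblog.rpc, HTTPS); using the space name as the blogid is my assumption for illustration, and in practice you could fetch the real blogid via blogger.getUsersBlogs.

```python
import xmlrpc.client

# Endpoint taken from the W.Bloggar/Blogjet setup instructions:
# Host=storage.msn.com, Page=/storageservice/MetaWeblog.rpc, HTTPS on.
ENDPOINT = "https://storage.msn.com/storageservice/MetaWeblog.rpc"

def build_post(title, description, categories=None):
    """Build the struct parameter that metaWeblog.newPost expects.
    Per MetaWeblog convention, the field names reuse RSS 2.0 item
    element names (title, description, categories)."""
    post = {"title": title, "description": description}
    if categories:
        post["categories"] = categories
    return post

def new_post(space_name, secret_word, post, publish=True):
    """Create a post on a space. The username is the name of your space
    and the password is the Email Publishing secret word, as described
    in the tutorials below. Using the space name as the blogid is an
    assumption made for this sketch."""
    server = xmlrpc.client.ServerProxy(ENDPOINT)
    # metaWeblog.newPost returns the new post's ID as a string.
    return server.metaWeblog.newPost(space_name, space_name,
                                     secret_word, post, publish)
```

The same ServerProxy object can be used for the other supported methods, e.g. `server.metaWeblog.getRecentPosts(blogid, username, password, 10)`.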

Expect to see more information about the MetaWeblog API for MSN Spaces on http://msdn.microsoft.com/msn shortly. We also will be providing a forum to discuss the APIs for MSN Spaces at http://forums.microsoft.com/msdn in the next few days. If you have questions about using the API or suggestions about other APIs you would like to see, either respond to this blog entry or send me mail at dareo AT microsoft DOT com. 


Categories: Windows Live

The following is a tutorial on posting to your blog on MSN Spaces using the W.Bloggar desktop blogging application.

  1. Create a Space on http://spaces.msn.com if you don't have one

  2. Go to 'Edit Your Space->Settings->Email Publishing'

  3. Turn on Email Publishing (screenshot below)

  4. Choose a secret word (screenshot below)

  5. Download and install the latest version of W.Bloggar from http://www.wbloggar.com

  6. Go to File->Add Account

  7. On the next screen, answer "Yes, I want to add it as a new account" when asked whether you already have a blog

  8. Select 'Custom' as your blog tool and choose an alias for this account (screenshot below)

  9. Select your Custom Blog Tool Settings as shown (screenshot below)

  10. Specify your provider information as follows: Host=storage.msn.com, Page=/storageservice/MetaWeblog.rpc, Port=443, HTTPS=checked (screenshot below)

  11. Enter your username and password. Your username is the name of your space (e.g. I use 'carnage4life' because the URL of my space is http://spaces.msn.com/members/carnage4life). The password is the secret word you selected when you turned on Email-Publishing on your space. (screenshot below)

  12. Click Finish.

  13. Go ahead and create, edit or delete blog posts on your blog using W.Bloggar


Categories: Windows Live

The following is a tutorial on posting to your blog on MSN Spaces using the Blogjet desktop blogging application.

  1. Create a Space on http://spaces.msn.com if you don't have one

  2. Go to 'Edit Your Space->Settings->Email Publishing'

  3. Turn on Email Publishing (screenshot below)

  4. Choose a secret word (screenshot below)

  5. Download and install the latest version of Blogjet from http://www.blogjet.com

  6. Go to Tools->Manage Accounts

  7. Create a new account where the user name is the name of your space (e.g. I use 'carnage4life' because the URL of my space is http://spaces.msn.com/members/carnage4life). The password is the secret word you selected when you turned on Email Publishing on your space.

  8. On the next screen select "I already have a blog"

  9. Specify your provider information as shown below. Host=storage.msn.com, Page=/storageservice/MetaWeblog.rpc, Port=443, Use SSL=checked (screenshot below)

  10. Keep clicking Next until you are done

  11. Go ahead and create, edit or delete blog posts on your space using Blogjet


Categories: Windows Live

The number one problem that faces developers of feed readers is how to identify posts. How does a feed reader tell a new post from an old one whose title or permalink changed? In general, you do this by picking a unique identifier from the metadata of the feed item and using it to tell the item apart from others. If you are using the Atom 0.3 & 1.0 syndication formats the identifier is the <atom:id> element, for RSS 1.0 it is the rdf:about attribute and for RSS 0.9x & RSS 2.0 it is the <guid> element.

The problem is that many RSS 0.9x & 2.0 feeds do not have a <guid> element, which usually means a feed reader has to come up with its own custom mechanism for identifying items. In many cases, using the <link> element is enough because most items in a feed map to a single web resource with a permalink URL. In some pathological cases, a feed may have neither <guid> nor <link>, or even worse may use the same value in the <link> element for each item in the feed. In such cases, feed readers usually resort to heuristics which are guaranteed to be wrong at least some of the time.
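The fallback chain described above can be sketched in a few lines. The dict-based item representation here is a simplification for illustration, not how any particular feed reader models items:

```python
import hashlib

def item_id(item):
    """Return a stable identifier for an RSS 0.9x/2.0 item, trying the
    strategies in order of reliability. `item` maps child element names
    to their text content (a simplification for illustration)."""
    if item.get("guid"):
        return item["guid"]
    if item.get("link"):
        return item["link"]
    # Pathological case: no usable identifier, so fall back to hashing
    # the content. This heuristic breaks whenever the title or
    # description is edited, which is exactly the failure mode noted
    # above: the edited item will be treated as a brand-new post.
    raw = (item.get("title", "") + item.get("description", "")).encode("utf-8")
    return hashlib.sha1(raw).hexdigest()
```

Note that the hash fallback also fails in the other pathological case mentioned above, where every item shares the same <link> value but `item_id` is never reached because `link` is present.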

So what does this have to do with the Newsgator API? Users of recent versions of RSS Bandit can synchronize the state of their RSS feeds with Newsgator Online using the Newsgator API. Where things get tricky is that both RSS Bandit and Newsgator Online either need to use the same techniques for identifying posts OR have a common way to map between their identification mechanisms. When I first used the API, I noticed that Newsgator has its own notion of a "Newsgator ID" which it expects clients to use. In fact, it's worse than that. Newsgator Online assumes that clients that synchronize with it actually just fetch all their data from Newsgator Online, including feed content. This is a pretty big assumption to make but I'm sure it made it easier to solve a bunch of tricky development problems for their various products. Instead of worrying about keeping data and algorithms on the clients in sync with the server, they just replace all the data on the client with the server data as part of the 'synchronization' process.

Now that I've built an application that deviates from this fundamental assumption I've been having all sorts of interesting problems. The most recent was that some users complained that read/unread state wasn't being synced via the Newsgator API. When I investigated, it turned out that this is because I use <guid> elements to identify posts in RSS Bandit while the Newsgator API uses the "Newsgator ID". Even worse, they don't even expose the original <guid> element in the returned feed items. So now it looks like fixing the bug where read/unread state isn't synced involves bigger and more fundamental changes than I expected. More than likely I'll have to switch to using <link> elements as unique identifiers since it looks like the Newsgator API doesn't throw those away.
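Since the Newsgator API preserves <link> but discards <guid>, the only workable mapping between the two identifier schemes looks something like the following sketch. Field names such as "newsgatorId" and "read" are illustrative placeholders, not the API's actual property names:

```python
def match_synced_state(local_items, newsgator_items):
    """Map items returned by the Newsgator API back to local items by
    permalink, since <link> is the only identifier both sides retain.
    Returns a dict from local <guid> to the read state reported by the
    server. All field names here are illustrative, not the real API's.

    Note the failure mode this inherits: any item without a distinct
    <link> simply cannot have its state synced this way."""
    by_link = {i["link"]: i for i in local_items if i.get("link")}
    synced = {}
    for remote in newsgator_items:
        local = by_link.get(remote.get("link"))
        if local is not None:
            synced[local["guid"]] = remote["read"]
    return synced
```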



Tim Ewald has an astute post entitled PaulD's new XSD data binding WG where he discusses a recently chartered W3C working group. He writes

Paul responded to yesterday's post to explain the need for the new W3C XML Schema Patterns for Databinding Working Group, which he chairs. He points out that the move by the WS-I to deprecate encoding in favor of literal schema was based on a reasonable argument (that there is no spec for how to translate an XSD in a WSDL - which describes a tree of named structural types - into an input to SOAP encoding - which acts on a graph of unnamed structural types) but that the end result made interop harder because it opened up the door to using all of XSD. I disagree. The WSDL spec opened the door to using all of XSD for both encoded and literal bindings. The work that SOAPbuilders did provided a set of test cases for mapping common types and structures. It did not, however, address questions like “how do you map substitution groups to code using an encoded binding”, something that is completely legal according to WSDL. In other words, the shift from encoding to literal in no way widened the number of databinding cases we had to be concerned about. That's a red herring. The real problem has been the lack of SOAPbuilders-style test suites to cover more of XSD or the lack of a formal specification that narrows XSD to a more easily supported subset (an option that the WS-I discarded).

This is one of those issues I used to blame on the complexity of XSD, but I've adjusted to also blaming the vendors of XML Web services toolkits. The core problem is that every vendor of XML Web Services toolkits pretends they are selling a toolkit for programming with distributed objects and tries their best to make their tool hide the XML-ness of the wire protocols (SOAP), interface description language (WSDL) and data types (XSD). Of course, these toolkits are all leaky abstractions made even leakier than usual by the impedance mismatch between XSD and the typical statically typed, object oriented programming language that is popular with the enterprise Web services crowd (i.e. Java or C#).

The W3C forming a working group to standardize the collection of hacks and kludges that various toolkits use when mapping XSD<->objects is an attempt to standardize the wrongheaded thinking of the majority of platform vendors selling XML Web Services toolkits.  

Reading the charter of the working group is even more disturbing because not only do they want to legitimize bad practices but they also plan to solve problems like how to version classes across programming languages and coming up with XML representations of common data structures for use across different programming languages. Thus the working group plans to invent as well as standardize common practice. Sounds like the kind of conflicting goals which brought us XSD in the first place. I wish them luck.


Categories: XML Web Services

December 11, 2005
@ 05:45 PM

I've been following a series of posts on Oleg Tkachenko's blog with some bemusement. In his post A business case for XSLT 2.0? he writes

If you are using XSLT and you think that XSLT 2.0 would provide you some real benefits, please drop a line of comment with a short explanation pleeeease. I'm collecting some arguments for XSLT 2.0, some real world scenarios that are hard with XSLT 1.0, some business cases when XSLT 2.0 would provide an additional value. That's really important if we want to have more than a single XSLT 2.0 implementation...

PS. Of course I've read Kurt's "The Business Case for XSLT 2.0" already.

Update: I failed to stress it enough that it's not me who needs such kind of arguments. We have sort of unique chance to persuade one of software giants (guess which one) to support XSLT 2.0 now.

In a follow up post entitled XSLT 2.0 and Microsoft Unofficial Survey he reveals which of the software giants he is trying to convince to implement XSLT 2.0 where he writes

Moving along business cases Microsoft seeks to implement XSLT 2.0 I'm trying to gather some opinion statistics amongs developers working with XML and XSLT. So I'm holding this survey at the XML Lab site:

Would you like to have XSLT 2.0 implementation in the .NET Framework?

The possible answers are:

  • Yes, I need XSLT 2.0
  • Yes, that would be nice to have
  • No, continue improving XSLT 1.0 impl instead
  • No, XSLT 1.0 is enough for me


Take your chance to influence Microsoft's decision on XSLT 2.0 and win XSLT 2.0 book!

My advice to Oleg: if you want to see XSLT 2.0 in the .NET Framework then gather some like-minded souls and build it yourself. Efforts like the MVP.XML library for the .NET Framework show that there are a bunch of talented developers building cool enhancements to the basic XML story Microsoft provides in the .NET Framework.

I'm not sure how an informal survey in a blog would convince Microsoft one way or the other about implementing a technology. A business case to convince a product team to do something usually involves showing them that they will lose or gain significant marketshare or revenue by making a technology choice. A handful of XML geeks who want to see the latest and greatest XML specs implemented by Microsoft does not a business case make. Unfortunately, this means that Microsoft will tend to be a follower and not a leader in such cases because customer demand and competitive pressure don't occur until other people have implemented and are using the technology. Thus if you want Microsoft to implement XSLT 2.0, your best bet is to actually have people using it on other platforms or on Microsoft platforms who will clamor for better support, instead of relying on informal surveys and comments in your blog.

Just my $0.02 as someone who used to work on the XML team at Microsoft.


Categories: XML

December 9, 2005
@ 08:22 PM

Yahoo! continues to make all the right moves. From the post y.ah.oo! on the del.icio.us weblog we learn

We're proud to announce that del.icio.us has joined the Yahoo! family.  Together we'll continue to improve how people discover, remember and share on the Internet, with a big emphasis on the power of community.  We're excited to be working with the Yahoo! Search team - they definitely get social systems and their potential to change the web. (We're also excited to be joining our fraternal twin Flickr!)

We want to thank everyone who has helped us along the way - our employees, our great investors and advisors, and especially our users.  We still want to get your feedback, and we look forward to bringing you new features and more servers in the future.

I look forward to continuing my vision of social and community memory, and taking it to the next level with the del.icio.us community and Yahoo!

Congrats to Joshua Schachter and the Yahoo! folks. This is definitely a great match.


Categories: Social Software

Thanks to Miguel De Icaza, I found an interesting speech by Harold Pinter which is reprinted in the article Art, Truth and Politics. Parts of the speech ramble at times but there is a particularly potent message which has been excerpted below

The tragedy of Nicaragua was a highly significant case. I choose to offer it here as a potent example of America's view of its role in the world, both then and now.

I was present at a meeting at the US embassy in London in the late 1980s.

The United States Congress was about to decide whether to give more money to the Contras in their campaign against the state of Nicaragua. I was a member of a delegation speaking on behalf of Nicaragua but the most important member of this delegation was a Father John Metcalf. The leader of the US body was Raymond Seitz (then number two to the ambassador, later ambassador himself). Father Metcalf said: 'Sir, I am in charge of a parish in the north of Nicaragua. My parishioners built a school, a health centre, a cultural centre. We have lived in peace. A few months ago a Contra force attacked the parish. They destroyed everything: the school, the health centre, the cultural centre. They raped nurses and teachers, slaughtered doctors, in the most brutal manner. They behaved like savages. Please demand that the US government withdraw its support from this shocking terrorist activity.'

Raymond Seitz had a very good reputation as a rational, responsible and highly sophisticated man. He was greatly respected in diplomatic circles. He listened, paused and then spoke with some gravity. 'Father,' he said, 'let me tell you something. In war, innocent people always suffer.' There was a frozen silence. We stared at him. He did not flinch.

Innocent people, indeed, always suffer.

Finally somebody said: 'But in this case "innocent people" were the victims of a gruesome atrocity subsidised by your government, one among many. If Congress allows the Contras more money further atrocities of this kind will take place. Is this not the case? Is your government not therefore guilty of supporting acts of murder and destruction upon the citizens of a sovereign state?'

Seitz was imperturbable. 'I don't agree that the facts as presented support your assertions,' he said.

As we were leaving the Embassy a US aide told me that he enjoyed my plays. I did not reply.

I should remind you that at the time President Reagan made the following statement: 'The Contras are the moral equivalent of our Founding Fathers.'

The United States supported the brutal Somoza dictatorship in Nicaragua for over 40 years. The Nicaraguan people, led by the Sandinistas, overthrew this regime in 1979, a breathtaking popular revolution.

The Sandinistas weren't perfect. They possessed their fair share of arrogance and their political philosophy contained a number of contradictory elements. But they were intelligent, rational and civilised. They set out to establish a stable, decent, pluralistic society. The death penalty was abolished. Hundreds of thousands of poverty-stricken peasants were brought back from the dead. Over 100,000 families were given title to land. Two thousand schools were built. A quite remarkable literacy campaign reduced illiteracy in the country to less than one seventh. Free education was established and a free health service. Infant mortality was reduced by a third. Polio was eradicated.

The United States denounced these achievements as Marxist/Leninist subversion. In the view of the US government, a dangerous example was being set. If Nicaragua was allowed to establish basic norms of social and economic justice, if it was allowed to raise the standards of health care and education and achieve social unity and national self respect, neighbouring countries would ask the same questions and do the same things. There was of course at the time fierce resistance to the status quo in El Salvador.

I spoke earlier about 'a tapestry of lies' which surrounds us. President Reagan commonly described Nicaragua as a 'totalitarian dungeon'. This was taken generally by the media, and certainly by the British government, as accurate and fair comment. But there was in fact no record of death squads under the Sandinista government. There was no record of torture. There was no record of systematic or official military brutality. No priests were ever murdered in Nicaragua. There were in fact three priests in the government, two Jesuits and a Maryknoll missionary. The totalitarian dungeons were actually next door, in El Salvador and Guatemala. The United States had brought down the democratically elected government of Guatemala in 1954 and it is estimated that over 200,000 people had been victims of successive military dictatorships.

Six of the most distinguished Jesuits in the world were viciously murdered at the Central American University in San Salvador in 1989 by a battalion of the Alcatl regiment trained at Fort Benning, Georgia, USA. That extremely brave man Archbishop Romero was assassinated while saying mass. It is estimated that 75,000 people died. Why were they killed? They were killed because they believed a better life was possible and should be achieved. That belief immediately qualified them as communists. They died because they dared to question the status quo, the endless plateau of poverty, disease, degradation and oppression, which had been their birthright.

The United States finally brought down the Sandinista government. It took some years and considerable resistance but relentless economic persecution and 30,000 dead finally undermined the spirit of the Nicaraguan people. They were exhausted and poverty stricken once again. The casinos moved back into the country. Free health and free education were over. Big business returned with a vengeance. 'Democracy' had prevailed.

But this 'policy' was by no means restricted to Central America. It was conducted throughout the world. It was never-ending. And it is as if it never happened.

The United States supported and in many cases engendered every right wing military dictatorship in the world after the end of the Second World War. I refer to Indonesia, Greece, Uruguay, Brazil, Paraguay, Haiti, Turkey, the Philippines, Guatemala, El Salvador, and, of course, Chile. The horror the United States inflicted upon Chile in 1973 can never be purged and can never be forgiven.

Hundreds of thousands of deaths took place throughout these countries. Did they take place? And are they in all cases attributable to US foreign policy? The answer is yes they did take place and they are attributable to American foreign policy. But you wouldn't know it.

It never happened. Nothing ever happened. Even while it was happening it wasn't happening. It didn't matter. It was of no interest. The crimes of the United States have been systematic, constant, vicious, remorseless, but very few people have actually talked about them. You have to hand it to America. It has exercised a quite clinical manipulation of power worldwide while masquerading as a force for universal good. It's a brilliant, even witty, highly successful act of hypnosis.

I put to you that the United States is without doubt the greatest show on the road. Brutal, indifferent, scornful and ruthless it may be but it is also very clever. As a salesman it is out on its own and its most saleable commodity is self love. It's a winner. Listen to all American presidents on television say the words, 'the American people', as in the sentence, 'I say to the American people it is time to pray and to defend the rights of the American people and I ask the American people to trust their president in the action he is about to take on behalf of the American people.'

It's a scintillating stratagem. Language is actually employed to keep thought at bay. The words 'the American people' provide a truly voluptuous cushion of reassurance. You don't need to think. Just lie back on the cushion. The cushion may be suffocating your intelligence and your critical faculties but it's very comfortable. This does not apply of course to the 40 million people living below the poverty line and the 2 million men and women imprisoned in the vast gulag of prisons, which extends across the US.

The United States no longer bothers about low intensity conflict. It no longer sees any point in being reticent or even devious. It puts its cards on the table without fear or favour. It quite simply doesn't give a damn about the United Nations, international law or critical dissent, which it regards as impotent and irrelevant.

The winners get to write the history books. I wonder what they'll say about this era in a hundred or a thousand years from now.


Categories: Current Affairs

It seems I missed that we launched the Windows Live Favorites for MSN Search Toolbar plugin earlier this week. I was harassing some members of the Live Favorites team about shipping a toolbar plugin only to find out that they already had. Below is an excerpt from the download page

With Windows Live Favorites for MSN Search Toolbar (Beta) you'll be able to access your favorites from any PC, add new sites easily, find your favorites quickly, and manage them even when you are away from your home PC.

What you can do with it:    

  • Easily import your current favorites from Internet Explorer and MSN Explorer to Windows Live Favorites and use them right away
  • Find favorites quickly based on name, address, or keyword - so finding what you've already discovered on the Web is simple 
  • Add new sites to your Windows Live Favorites with a single click, and access them from anywhere
  • Collect and save new links even when you're away from your home computer

Another Windows Live product marches down the beta path on the road to going gold.


Categories: Windows Live

December 8, 2005
@ 11:17 PM

I just spent some time browsing the questions on the Yahoo! Answers site and as usual Yahoo! shows that they get it. They may build unspectacular user interfaces but they know how to build useful software for users of the World Wide Web.

I find this to be an interesting experiment in providing alternatives to traditional search engines for answering people's questions. Unlike Google Answers, there is no payment required from the person whose question is answered. This raises the question of why anyone would want to answer questions on the site. Whuffie?


Categories: Social Software

December 8, 2005
@ 06:37 PM

From the press release New Windows Live Local Service Delivers State-of-the-Art Advances for Web-Based Mapping and Local Search we learn

REDMOND, Wash. — Dec. 7, 2005 — Microsoft Corp. will introduce a beta version of Windows Live™ Local, an online local search and mapping service that combines unique bird’s-eye imagery with advanced driving directions, Yellow Pages and other local search tools tomorrow, Dec. 8, 2005, at 9:01 a.m. PST. Powered by Virtual Earth™ mapping and location platform, these features give users useful new ways to map and find directions to various locations and better visualize their surroundings from multiple aerial vantage points.

“We believe Windows Live Local sets a new standard for what people can do with maps, directions and local search,” said Christopher Payne, corporate vice president of MSN Search at Microsoft. “The combination of immersive aerial imagery, customizable map annotations, innovative driving directions and the ability to share local search information with others gives users an incredibly powerful and easy way to find what they want and get where they want to go.”

The new service, which will be located at http://local.live.com, contains a range of new capabilities that will be exciting to search and mapping users. The most visible of these features is a new 45-degree bird’s-eye view of major U.S. cities such as New York, Los Angeles, San Francisco, Boston, Seattle and Las Vegas. Covering about 25 percent of the U.S. by population, these bird’s-eye images are captured by Pictometry International Corp. via low-flying airplanes and then integrated with road and satellite maps to simulate 360-degree panoramas that can be viewed from four compass directions. On-screen navigational tools and preview tiles enable navigation between directional views or zooming in closer to a destination. Now people will be able to experience what it’s like to be there, whether they are evaluating a new house to buy, choosing the exact location to meet someone, or just taking a virtual vacation. Over the next couple of years, Microsoft plans to continuously update bird’s-eye, aerial, and road map data and imagery as well as local listings and information.

This is hot. Bird's eye view, integrated driving directions and best of all support for places outside the United States. The Virtual Earth guys have outdone themselves. Now if only we could get some better integration with other Windows Live services. ;)


Categories: Windows Live

December 7, 2005
@ 07:48 PM

The folks at 37 Signals have an insightful blog post entitled Don’t scale: 99.999% uptime is for Wal-Mart which states

Jeremy Wright purports a common misconception about new companies doing business online: That you need 99.999% uptime or you’re toast. Not so. Basecamp doesn’t have that. I think our uptime is more like 98% or 99%. Guess what, we’re still here!

Wright correctly states that those final last percent are incredibly expensive. To go from 98% to 99% can cost thousands of dollars. To go from 99% to 99.9% tens of thousands more. Now contrast that with the value. What kind of service are you providing? Does the world end if you’re down for 30 minutes?

If you’re Wal-Mart and your credit card processing pipeline stops for 30 minutes during prime time, yes, the world does end. Someone might very well be fired. The business loses millions of dollars. Wal-Mart gets in the news and loses millions more on the goodwill account.

Now what if Delicious, Feedster, or Technorati goes down for 30 minutes? How big is the inconvenience of not being able to get to your tagged bookmarks or do yet another ego-search with Feedster or Technorati for 30 minutes? Not that high. The world does not come to an end. Nobody gets fired.

Scalability issues are probably the most difficult to anticipate and mitigate when building a web application. When we first shipped MSN Spaces last year, I assumed that we'd be lucky if we became as big as LiveJournal; I never expected that we'd grow to be three times as big and three times as active within a year. We've had our growing pains and it's definitely been surprising at times finding out which parts of the service are getting the most use and thus needed the most optimizations.

The fact is that everyone has scalability issues, no one can deal with their service going from zero to a few million users without revisiting almost every aspect of their design and architecture. Even the much vaunted Google has had these problems, just look at the reviews of Google Reader that called it excruciatingly slow or the complaints that Google Analytics was so slow as to be unusable.

If you are a startup, don't waste your time and money worrying about what happens when you have millions of users. Premature optimization is the root of all evil and in certain cases will lead you to being more conservative than you should be when designing features. Remember, even the big guys deal with scalability issues.


Categories: Web Development

Two interesting things have been confirmed in the blog post Next Version of Virtual Earth is indeed around the corner. The first is that MSN Virtual Earth is getting renamed to Windows Live Local. The second is that the version I've been using internally, which has better driving directions and bird's eye view imagery, will be shipping shortly. The post states

We've been working to address a lot of feature requests from our users, and personally I'm really happy with how the application has shaped up. Here is a blurb in The Kelsey Group's Local Media Blog about the forthcoming release. Greg Sterling correctly reports that Virtual Earth has become part of the Windows Live Family and will be known as Windows Live Local (WLL). WLL was first shown at the press launch event for Windows Live last month. Greg's comments are based on a presentation MSN Local Search General Manager Erik Jorgensen gave at Kelsey's ILM Conference last week. As part of Erik's presentation he demoed the release build of Windows Live Local. Keep your eyes open - you should be able to start enjoying features like Birds Eye imagery and User pushpins in just a few days.

The screenshots show the bird's eye imagery and as you can see, it is quite sweet. I can't wait for this to ship.


Categories: Windows Live

December 6, 2005
@ 01:59 AM

Brady Forrest has a post entitled Two Weeks of MSN on MSDN where he lists a bunch of upcoming webcasts about various APIs in the MSN Windows Live platform. Below is an excerpt of his blog post with the upcoming webcasts

MSDN Webcast: The MSN Search Toolbar: Building Windows Desktop Search into Your Applications (Level 200)
Monday, December 5, 2005
1:00 P.M.–2:00 P.M. Pacific Time

MSDN Webcast: Extending Start.com Using Startlets (Level 200)
Wednesday, December 7, 2005
1:00 P.M.–2:00 P.M. Pacific Time

MSDN Webcast: The MSN Search Toolbar: Tips, Tricks, and Hacks to the MSN Search and Windows Desktop Search Platforms (Level 200)
Friday, December 9, 2005
1:00 P.M.–2:00 P.M. Pacific Time

MSDN Webcast: The MSN Search APIs: Building Web Search into Your Applications (Level 200)
Monday, December 12, 2005
1:00 P.M.–2:00 P.M. Pacific Time

MSDN Webcast: Virtual Earth Tips, Tricks, and Hacks (Level 200)
Wednesday, December 14, 2005
1:00 P.M.–2:00 P.M. Pacific Time

MSDN Webcast: MSN Messenger: Extending MSN Messenger with Multi-Person Instant Messaging Applications (Level 200)
Friday, December 16, 2005
1:00 P.M.–2:00 P.M. Pacific Time

You can find more information on MSDN.

If you are interested in building applications that integrate with MSN Windows Live services and applications, you should definitely check out these webcasts. I'll most likely be participating in a webcast when we finally ship the MetaWeblog API implementation for MSN Spaces.
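For those unfamiliar with it, the MetaWeblog API is an XML-RPC interface whose central method is metaWeblog.newPost(blogid, username, password, struct, publish). As a rough sketch of what a client would send (the blog ID and credentials below are placeholders, and this is generic XML-RPC, not our unshipped MSN Spaces implementation), here is how you'd build the request payload in Python:

```python
import xmlrpc.client

# Build the XML-RPC payload for a metaWeblog.newPost call without sending it.
# "blog-id", "user" and "secret" are placeholder values for illustration.
post = {"title": "Hello", "description": "<p>First post via the API.</p>"}
payload = xmlrpc.client.dumps(
    ("blog-id", "user", "secret", post, True),  # True == publish immediately
    methodname="metaWeblog.newPost",
)
print("metaWeblog.newPost" in payload)  # → True
```

In a real client you'd hand the endpoint URL to xmlrpc.client.ServerProxy and call server.metaWeblog.newPost(...) directly instead of marshalling the payload by hand.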


Categories: Windows Live

December 5, 2005
@ 08:06 PM

The time has come again for me to share the top five crunk tracks getting heavy rotation on my iPod. If you like Lil' Jon or the Ying Yang Twins but don't know what else from the dirty south might tickle your fancy, try some of these tracks.

    1. Grillz - Nelly feat. Paul Wall & St. Lunatics
    2. What U Drankin' On - Jim Jones feat. P.Diddy, Paul Wall & Jha' Jha
    3. Stay Fly - Three Six Mafia feat. Young Buck, Eightball & MJG
    4. Stay Fly (Still Fly Remix) - Three Six Mafia feat. Slim Thug, Trick Daddy & Project Pat
    5. Turn It Up - Chamillionaire feat. Lil Flip

During the intro for Three Six Mafia's Most Known Unknown, they complain that they influenced a lot of what is popular in hip hop today but don't get the credit. This seems very true to me. I remember listening to tracks like Tear the Club Up and Hit a Muthafucka back in the late '90s in Atlanta before any of the ATL rappers really became hot down south. I'd definitely recommend Most Known Unknown to any fan of the genre.

PS: I don't like D4L's Laffy Taffy but it is the first song that I've seen have about a dozen people doing synchronized dance steps in a club. So it gets an honorary mention.


Categories: Music

December 5, 2005
@ 06:12 PM

Since the release of the RSS Bandit Nightcrawler alpha last week, I've read a number of posts like Scott Reynolds's complaint in his post I Love Jason Haley In A Totally Manly Non-Threatening Way. Serious. Totally Platonic where he wrote

Also, RSSBandit devs: What the F? I am using Nightcrawler Alpha, which I love, but ctrl+m no longer marks posts are read. It’s ctrl+q. Ghey. That’s a UI no-no if I ever saw one. Worse than going agains expected behavior, changing previously existant behavior arbitrarily just makes users mad. Still a great aggregator, but I’d like to hear a why on that decision.

A number of people have complained that we changed the keyboard shortcut for marking items as read from Ctrl+M to Ctrl+Q. We made the change because of a bug report we got which pointed out that (i) Outlook Express and Outlook use Ctrl+Q for marking items as read and (ii) Ctrl+Q is easier to hit than Ctrl+M on most keyboards.

Due to all the feedback we've gotten, we'll be including a keyboard shortcut editor [originally developed by Phil Haack] in the bug fix version of RSS Bandit scheduled to ship later this month.


Categories: RSS Bandit

December 2, 2005
@ 06:03 PM

James Robertson has a blog post entitled The $100 notebook where he writes

Here's another breathless story on the $100 (actually, looks like it will be $200) notebook. There's some cool things about this, including the fact that it can be powered by a hand crank. However, there are a number of simple problems too -

  • For the truly poor, access to laptops isn't a solution. Access to clean water is way, way higher on the scale
  • Tech support. Ok - you hand out a few hundred in some remote village. What the heck do the new users do when there are problems?

This is a pie in the sky solution, IMHO. It's like deciding to hand out cheap cars, and only later noticing that there are no gas stations for the recipients to use. I understand that the people behind this are well intentioned - but laptops are only useful when there's a hell of a lot of other infrastructure supporting them. The well intentioned folks behind this plan need to aim a lot lower.

Attitudes like this really, really irritate me. The same way that there are rich people and poor people in the United States, there are also parts of Africa that are less well off than others. It isn't all one freaking desert with bony kids surrounded by flies from South Africa to Algeria. For example, in Nigeria there are probably more cell phones per capita in the major cities than in most parts of the United States. The fact that some people get to use the latest iPods and iBooks in the U.S. doesn't mean there aren't homeless bums eating less than three square meals and sleeping on the streets in the same zip codes. Yet I don't see folks like James Robertson posting about how every homeless person has to be housed and every orphan found foster parents before we can enjoy iPods and laptop PCs.

If the plight of people in Africa bothers you so much, then instead of "armchair quarterbacking" and criticizing those who are making an honest attempt to help, why not contribute some of your time and dollars? Put your money where your mouth is.


Categories: Technology

There's been a lot of recent buzz about Windows Live Fremont in various blogs and news sites including TechCrunch, the Seattle Post-Intelligencer and C|NET News.com. Fremont is the code name for a social market place in the same vein as classifieds sites such as Craigslist. It seems like just yesterday when it all began...

A few months ago, Kurt started a series of meetings to pitch various folks at work on an idea he had for an online marketplace that harnessed the power of one's social networks. At the time Kurt was a PM on MSN Windows Live Messenger and he had codenamed the project "Casbah". The value proposition of 'Casbah' was straightforward: most people are more comfortable selling or buying stuff from people they know directly or indirectly, and the typical online classifieds site does a poor job of supporting this scenario. On more than one occasion when moving apartments, I've wanted to sell stuff that I wouldn't have minded selling to a friend or coworker. However, listing the items for sale on eBay and trying to offload my stuff to strangers didn't appeal to me. 'Casbah' was optimized around casual sales between people who knew each other directly or indirectly.

I was involved in the early design meetings and although I was enthusiastic about the idea I assumed that like several other meetings about good ideas I'd sat in on at Microsoft, it would go nowhere. To my surprise, Kurt kept at it and eventually a team was put together to ship 'Casbah' which has been re-christened 'Fremont' after a neighborhood in the Seattle area which has an open market every Sunday.

Enough history; let's talk about what makes Fremont so special. About a year ago, I had my Social Software is the Platform of the Future epiphany. One key aspect of this epiphany was the realization that a lot of interesting scenarios can be enabled if the software I used knew who I cared about and who I was interested in. Powerful social applications like Flickr and del.icio.us are successful partly because of this key functionality. Windows Live Fremont does this for classifieds sites. As a user, you can make Fremont a marketplace for just your social circle. This is enabled by harnessing two social circles: your IM buddies and your email tribe. You can specify that your listings are public, only visible to your IM buddies and/or only visible to people in your email tribe (i.e. people hosted on the same email domain such as '@microsoft.com' or '@gatech.edu'). Similarly, you can specify the same on listings that you view. Basically, no matter how many millions of people use the service, my college friends and I can use it as an improved version of the bulletin boards in our dorm hallways without having to deal with awkward sales situations involving people we don't know.
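To make the scoping idea concrete, here is a minimal sketch of how a listing's visibility could be checked against a viewer. To be clear, this is entirely my own toy model for illustration, not Fremont's actual code or data model:

```python
# Toy model of Fremont-style listing visibility (invented names, not the
# real service): a listing is public, limited to the seller's IM buddies,
# or limited to people on the seller's email domain.
def can_see(listing, viewer):
    seller = listing["seller"]
    scope = listing["scope"]  # "public", "buddies", or "domain"
    if scope == "public":
        return True
    if scope == "buddies":
        return viewer["id"] in seller["buddies"]
    if scope == "domain":
        # Same-domain check, e.g. everyone '@microsoft.com'.
        return viewer["email"].split("@")[1] == seller["email"].split("@")[1]
    return False

seller = {"id": "kurt", "email": "kurt@microsoft.com", "buddies": {"dare"}}
listing = {"seller": seller, "scope": "domain"}
print(can_see(listing, {"id": "dare", "email": "dare@microsoft.com"}))  # → True
print(can_see(listing, {"id": "bob", "email": "bob@gatech.edu"}))       # → False
```

The same check would run on the browse/search side too, so a scoped listing simply never shows up for viewers outside the chosen circle.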

Of course, this just scratches the surface. This is part of Windows Live, which means you can expect a cohesive, integrated experience with other Windows Live properties and perhaps even an API in the future. It's going to be a fun ride.

I've enjoyed working with the Fremont team so far. It's been great helping them to bring their vision to fruition.


Categories: Windows Live

Charlene Li of Forrester Research has a post entitled Why Microsoft’s classifieds service will be better than Google Base where she writes

I spent some time a week ago with Microsoft discussing their new online classifieds service, code named "Fremont", which is in internal testing now. While the news is out there, I thought I’d provide my take on how this differs from - and in my opinion, is better than -- Google Base. I do this with one HUGE caveat - both of these services are brand new and beta, with Fremont not even available yet.

First, a quick description of Fremont. It looks and acts like a classic online classifieds site. A list of linked categories is on the front page and users can browse or search through the listings. A key difference though is that the listings are turbo-charged - as the poster, you can control who can see them, from everyone to just a select group of people on your MSN Messenger buddy list. If you choose the latter, the next time one of your privileged buddies signs into Messenger, they’ll see a little alert that says you have a set of golf clubs for sale. The categories include the usual suspects - jobs, homes, apartments, cars, and one thing that caught my eye, tickets.

That’s because one of my favorite uses of Craig’s List is to find last minute event tickets to hot shows. I also sometimes find myself in that seller situation - and I would highly prefer to sell or even give away tickets I can’t use to friends than to strangers from Craig’s List. The same goes with clothing - I don’t want to go through the hassle of selling some of used but still very nice clothes online, but I wouldn’t mind organizing an online clothing swap with my girlfriends.

The Microsoft approach reminds me of what Tribe.Net was (is?) trying to do in their effort to socialize classifieds but with one major difference - Microsoft leverages the social network that already exists in a user’s buddy list and address book.

So I look at Fremont and I see a really nice service shaping up. The classifieds interface is familiar - each category has the expected search fields (number of bedrooms in housing, make and year in autos, etc.) and the opening page lays out all of the options in a simple manner similar to Craig’s List’s austere list of links.

Now compare that to Google Base. Honestly, can you imagine your average user trying to make heads or tails out of it? Don’t get me wrong - I love Google Base because of the audacious potential it represents in terms of creating new content for the Web. But in terms of a classifieds service, it will take a lot of application development to get it to the point where the average Joe will be able to use it.

One last point about Fremont - it’s being built on top of the new Windows Live platform, which has as one of its core tenants giving developers the ability to build their own applications. Now this is one of the potential benefits of Google Base as well, but I’d put my chips down in favor of Microsoft actually pulling this one off. Microsoft has a well supported developer network and has come a long way in winning their trust through efforts like Channel 9. Granted, that trust is far from universal but it’s a start.

Unsurprisingly I agree with everything Charlene has to say about Windows Live Fremont. I've been involved with the project to some degree from concept to completion and will be posting some details about the project in my next post.


Categories: Windows Live

In the post Update to Windows Live Mail Beta, Imran Qureshi of the Hotmail Windows Live Mail team discusses some of the new features that were just added to the beta. The list of improvements from Imran's post is excerpted below.

1. Safety Improvements

We're laser focused in the area of spam and safety with Windows Live Mail and have already made major improvements over other webmail services. Never one to rest on our laurels, we simplified and consolidated the safety experience even more in M4. 

We automatically calculate a safety level for each mail using over eight checks.  The three safety levels are: "Known sender", "Unknown sender" or "Unsafe". You can always click on "Why?" to find out why a mail was marked as such and what you can do to change the safety level of this sender. How much simpler can it be…

(Screenshots: Known sender  |  Known sender (after clicking Why)  |  Unknown sender  |  Unsafe)

Kahuna has already been helping identify Phishing mails to help protect our customers -- now we make them more noticeable so you won’t be duped into clicking on them.

(Screenshots:  Real PayPal Mail  | Phishing Mail pretending to be PayPal)

Oh and if you were one of those people who didn’t like having message  text shown in mails in the Junk Mail Folder, now the default is that message content is not rendered in Junk Mail Folder until you say you want to see it.

(Screenshots: JMF Folder)

2. Fast Search

Vroom, vroom! The new indexed search is fast and it searches message bodies. The UI is the same as M3 but the engine underneath is brand spanking new. We’re rolling it out slowly - not every user will get it right away so be patient.

How you tell if you have the new search engine: If the infobar in your search results does NOT say the following then you have the new search engine:

"Note: At this time, the mail beta searches only the subject and addresses."

3. Spell Check as you Type

Ok this is one feature that turned out better than we thought.  Just start typing in Compose and we’ll check spelling in the background and put red squiggles under words that are misspelled. You can then right-click them and choose from our suggestions, tell us to ignore that word or add the word into the dictionary. 

I know what you’re saying. Big deal, Outlook already does that. Well, we’re the first webmail service I know that does this on the web without installing any software! 

(Screenshots: Spell Check As You Type)

4. Scrollable message and contact list. 

We know you want to see more than 14 messages at a time in the message list. Well now you can see 50 messages at a time.

Why should contacts be any different? In contacts now, you can see all your contacts in one list.

(Screenshots: Scrollable message list   | Scrollable contact list)

5. Configurable reading pane

Now you and I know that reading pane is the best thing to ever happen to webmail. But for some strange reason a few people don’t like it. Well, if you happen to be one of those people you can now turn off the reading pane.

(Screenshots: Configuration options  | reading pane turned off)

6. Resizable panes

Your folder names are long or you like the message list to be wider? Just grab the edges of the panes and resize them to how you like it. Your customization is maintained the next time you login (on the same machine).

(Screenshots: resized panes)

7. Improved Error Message Discoverability

We’ve also made our error message easier to notice by moving them closer to where your eye is, adding icons and changing their color to a more visible color.

(Screenshots: Error message in Contacts)

8. Easier to send mail when you don’t know email address

Admit it. This is how you send mail: You find a mail from that person. You reply to it and then delete the original content of the mail. 

Well if this is you, then Kahuna makes this easy. Find the mail and click on the From address. We start a new mail to that person.

(Screenshots: Clickable sender email address)

Or let’s say you were in contacts to find the email address of some friend. Well, normally you’d copy the email address, go back to mail, click New and then paste in the email address. Well, now you can just drag the contact to the Mail tab and voila, we start an email to that person.

9. Support for browsers other than IE6 & higher

Testers can access all Mail Beta functionality using Internet Explorer version 6.0.   But we know some of you like to use other browsers. With M4 we now also support additional browsers including Firefox, Netscape or Opera. We're still a bit of a work in progress here, so apologies if there are still some glitches (we are focused on making sure core email functionality is solid, but some of the bells and whistles work better with IE6+).

10. Empty Junk Mail Folder or Deleted Items Folder with one click

Lots of users asked for the ability to empty these folders easily. Now you can right-click on them and choose Empty.

(Screenshots: Right-click menu on Junk Mail Folder)

11. Print Messages

Ok, we had this in M3 also but it was hidden in the Actions menu. Now it’s on the main toolbar so you can easily find and click it.

(Screenshots: Print button)

All of these are great improvements, especially the Firefox support. I'm trying it out right now and so far so good.
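As an aside, the "safety level" idea in point 1 boils down to a simple decision rule over a bunch of per-message checks. The sketch below is a drastic simplification I made up for illustration (the real service combines over eight checks, none of which are public):

```python
# Invented, simplified version of a three-tier mail safety classifier.
# Real systems combine many signals; this collapses them to two booleans.
def safety_level(sender_in_contacts, passed_auth_checks):
    if sender_in_contacts:
        return "Known sender"    # you've corresponded with or saved this sender
    if passed_auth_checks:
        return "Unknown sender"  # nothing suspicious, but not in your contacts
    return "Unsafe"              # failed one or more checks, e.g. likely phishing

print(safety_level(False, False))  # → Unsafe
```

The nice touch in the actual UI is the "Why?" link, which explains which checks drove the classification instead of leaving the user to guess.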


Categories: Windows Live