Mike Arrington has a blog post on TechCrunch entitled Amazon’s War on Statsaholic where he writes

Statsaholic (formerly Alexaholic) launched a year ago and provided much easier access to Alexa traffic data than the Alexa site itself. Statsaholic also had other features Alexa didn’t offer, like embeddable graphs and data smoothing. Others agreed, and soon started linking to Statsaholic instead of Alexa when doing traffic comparisons. At one point, Statsaholic was the no. 3 search result on Google for “Alexa.”

Statsaholic was not using the Alexa web service to get the data, because Alexa doesn’t offer the graph data via their web service. Amazon, which owns Alexa, could have complained or simply shut them down when it launched, but they didn’t. They actually complimented the service in a post on the Alexa blog last April.
What bothers me about the situation is that Amazon sat on it for a year, complimenting the service along the way (and copying it). Then, just when the service started getting really popular, they took drastic measures to shut it down.

I'm totally perplexed by Arrington's position here. Statsaholic is screenscraping Alexa and building a business on top of that. It seems like a pretty open and shut issue to me. The fact that Amazon didn't bother prosecuting them until they got a lot of traffic just points out that there is little point harassing folks who are abusing your service unless they are consuming a lot of your resources or are taking money out of your pocket. It seems Statsaholic was doing both. 

You'd think a smart startup founder would know better than to build a business model on hotlinking and bandwidth theft. You'd also expect a Web savvy dude like Mike Arrington to know better than to blame the victim in such situations. Next thing you know, he'll be flaming websites that block hotlinking to their images via htaccess. Weird.
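For the record, the htaccess fix is trivial. Here's a minimal sketch, assuming Apache with mod_rewrite enabled and with example.com standing in for your own domain:

```apache
RewriteEngine On
# Let through requests with no Referer header (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Let through requests referred from your own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Any other request for an image gets a 403 Forbidden
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

Sites that want to be friendlier sometimes serve a "hotlinking is not allowed" placeholder image instead of the 403, but the effect on the freeloader's bandwidth bill is the same.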

PS: Someone in the comments wondered how Mike Arrington would feel if someone created a mashup that showed all of the TechCrunch content minus the obnoxious ads (e.g. http://techcrunchminusads.com). I wonder if Mike would sue if such a site started stealing a bunch of his traffic because it loaded fewer ads, and was thus faster, and perhaps included more useful info (e.g. crosslinking TechCrunch posts with PodTech interviews)?


The Yahoo! Developer blog has an entry entitled Introducing the Yahoo! Mail Web Service which states

While we are certainly proud of the success of Yahoo! Mail, today we are announcing how we are going beyond the boundaries of the Yahoo! network and enabling developers to build new tools or applications around Yahoo! Mail. We are thrilled to announce the open availability of the Yahoo! Mail Web Service, a web service for Yahoo! Mail (accessible via SOAP or JSON-RPC) that we previewed to Yahoo! Hack Day attendees. With the Yahoo! Mail Web Service, you can connect to the core mail platform to perform typical mailbox tasks for premium users such as list messages and folders, and compose and send messages (you can also build mail preview tools for free users with limited Web Service functionality). In other words, developers outside of Yahoo! can now build mail tools or applications on the same infrastructure we use to build the highly-scaled Yahoo! Mail service that serves nearly 250 million Yahoo! Mail users today -- users who might want to help you make some cash with your application.
The Yahoo! Mail Web Service is a big release for Yahoo! and the Internet, and it's only the beginning of what you'll be seeing from Yahoo!. Jump into our code samples for Java, .NET, PHP, Perl and Python, and build your dream mail app today, then be sure to give us feedback on your experience so we can continue to make the API even better. Be sure to leverage the online Yahoo! Mail Web Service support group where you can get help from the Yahoo! Mail Web Service team and your fellow developers. We can't wait to see what applications you will build when you add your imagination to the platform. Maybe you want to build an application that backs up Yahoo! mail targeted at a large number of Yahoo! users, or maybe you just want to add a niche feature that makes Yahoo! Mail better for your mom. For inspiration, we've gathered a few applications:

First things first, this is an unprecedented and very welcome move on the part of Yahoo! This is another example of why I think of the three major Web portals, Yahoo! has done the best job of turning their services into a developer platform. I like the fact that the Web service is exposed over multiple protocols and the code samples are in multiple programming languages that run the gamut from enterprise developer fare (Java/.NET) to Web geek fare (Perl/Python/PHP). Mad props to the developer platform folks at Yahoo!, good work.

With that out of the way, there is some stuff that has me scratching my head after taking a look at the Yahoo! Mail Web Service User Guide and API Reference. The first thing that is weird is that although Yahoo! provides SOAP and RESTful JSON web services for accessing one's mail, I still can't get POP access to my Yahoo! mail without shelling out $20/year. After all, GMail has POP access for free and users of the free versions of Windows Live Hotmail can get POP-like access if they use Windows Live Mail desktop although they are restricted to using one mail client.

So I decided to see if the Yahoo! Mail Web Service provides a way around this restriction but found out from the section on "Determining the User's Account Capabilities" that

The Yahoo! Mail Web Service limits the functionality available to free accounts. Premium accounts have no such limits. First call the GetUserData method to get the user’s account type from the web service. Calls to other APIs will return an error.

So it looks like I actually can't build an application that can be used to read the actual mail messages from my Yahoo! Mail account with the API unless I'm a premium user. Otherwise, all I can do is list the messages but not actually get their content. That makes the APIs a lot less cool than I initially thought. 
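To make the capability check concrete, here's roughly what a client's first call might look like. Only the GetUserData method name comes from the documentation quoted above; the JSON-RPC envelope shown here is a simplified assumption (the real service also requires a developer token and user sign-in), so this sketch just builds and inspects the request body rather than sending it:

```python
import json

def jsonrpc_request(method, params, request_id=1):
    """Build a JSON-RPC style request body.

    GetUserData is the documented first call; the exact wire format
    of the envelope is simplified here for illustration."""
    return json.dumps({"method": method, "params": params, "id": request_id})

# Per the user guide, call GetUserData first to learn the account type,
# since most other methods return an error for free accounts.
body = jsonrpc_request("GetUserData", [{}])
print(body)
```

A real client would POST this to the service endpoint and then branch on the returned account type before attempting calls like listing folders or, for premium users, fetching message bodies.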

Like Sam Ruby and Paul Downey I initially wondered about the choice of exposing a SOAP API but then realized that it may be that they already use SOAP internally so this wasn't that much work for them in that case. I also wondered about the lack of a RESTful non-SOAP XML interface as well but after looking at the complex object models I can see why they went with data formats that are pretty much serialized object models (i.e. Javascript OBJECT Notation & Simple OBJECT Access Protocol) instead of expecting developers to write a bunch of gnarly XML parsing code for processing over a dozen different response formats from the 23 methods in the Yahoo! Mail Web Service.

I suspect that Yahoo! won't get as much traction as they expect with the API until they remove some of the restrictions on non-premium accounts. Even then it does look like there is enough to build Windows and Windows Live gadgets for Yahoo! Mail that show your messages. Except that there is no way to read the mail contents and not even a way to link to the message in a way that sends the user to Yahoo! Mail to read its contents. I bet if Yahoo! fixed the latter and perhaps had a rev-share with people who drove clicks back to the Web-based mail client, things would get very interesting. I wonder if Jeremy is listening?


March 28, 2007
@ 05:50 PM

Dave Winer has a post entitled How basic is Twitter? where he writes

So inevitably, a query about the value of namespaces leads you to wonder if there will be TwitterClones, web-based services that emulate the Twitter API, that keep internal data structures similar to Twitter, and most important, peer with Twitter, the same way Twitter peers with IM and SMS systems.

This is as far as I got in my thinking when last night I decided to ask Les Orchard, a developer I've known for quite a few years, and who I've worked with on a couple of projects, both of which use the kind of technology that would be required for such a project --

What if there were an open source implementation of Twitter?

Nik Cubrilovic happened to be online at the moment and he jumped in with an idea. Les confessed that he was thinking of doing such a project. I thought to myself that there must be a lot of developers thinking about this right about now. We agreed it was an interesting question, and I said I'd write it up on Scripting News, which is what I'm doing right now.

What do you think? Is Twitter important, like web servers, or blogging software, so important that we should have an open source implementation of something that works like Twitter and can connect up to Twitter? Where are the tough sub-projects, and how much does it depend on the willingness of the developers of Twitter #1 to support systems that connect to theirs?

The problem I see here is that Twitter isn't like web servers or a blogging engine because Twitter is social software. Specifically, the value of Twitter to its users is less about its functionality and more about the fact that their friends use it. This is the same as it is for other kinds of social/communications software like Facebook or Windows Live Messenger. Features are what gets the initial users in the door but it's the social network that keeps them there. This is a classic example of how social software is the new vendor lock-in.

So what does this have to do with Open Source? Lots. One of the primary benefits to customers of using Open Source software is that it prevents vendor lock-in because the source code is available and freely redistributable. This is a strong benefit when the source code is physically distributed to the user either as desktop software or as server software that the user installs. In both these cases, any shabby behavior on the part of the vendor can lead to a code fork or at the very least users can take matters into their own hands and improve the software to their liking.

Things are different in the "Web 2.0" world of social software for two reasons. The obvious one is that the software isn't physically distributed to the users; the less obvious one is that social software depends on network effects. The more users you have, the more valuable the site is to each user. Having access to Slashcode didn't cause the social lock-in that Slashdot had on geek news sites to end. That site was only overtaken when a new service that harnessed network effects better than they did showed up on the scene (i.e. Digg). Similarly, how much value do you think there is to be had from a snapshot of the source code for eBay or Facebook being made available? This is one area where Open Source offers no solution to the problem of vendor lock-in. In addition, the fact that we are increasingly moving to a Web-based world means that Open Source will be less and less effective as a mechanism for preventing vendor lock-in in the software industry. This is why Open Source is dead, as it will cease to be relevant in a world where most consumers of software actually use services as opposed to installing and maintaining software that is "distributed" to them.

Granted, I have no idea why Dave Winer would like to build an Open Source competitor to Twitter. The main thing that will come out of it is that it will make it easier for people to build Twitter knock offs. However given how easy it is to roll your own knock off of popular "Web 2.0" social sites (e.g. 23, SuperGu, Uncut, etc) this doesn't seem like a lofty goal in itself. I'm curious as to where Dave is going with this since he often has good insights that aren't obvious at first blush.


Categories: Technology

March 28, 2007
@ 03:34 PM

This post was originally meant to be a response to Mini-Microsoft's blog post entitled Mini, a Devil, and Fine Whine where he seems to imply that there is some sort of class struggle going on at Microsoft and also made some calls for radical transparency. However this morning Mini linked to a blog post entitled For want of a shoe, or time for a new rider? on the MSFTextrememakeover blog which is just fire and has distracted me. If you're a Microsoft watcher [or even better a Microsoft exec] you should go ahead and read it, twice even. Key excerpts that lit my fire


MSFT does not appear to have a clear, honestly customer-focused mission that is understood at all levels. Importantly - and perhaps as a result - employees seemingly aren't in total accord or fully bought into it. If MSFT truly believes in "Your potential. Our passion", then it needs to do more than just pay lip-service to it. It needs to open itself to all that that entails (cross-platform support, not playing lock-in games, etc.) and deliver against it.

I see two concerns here. First, the need to move from a culture of "good enough" to one of "excellence" and "insanely great". I've posted about this before. MSFT has a long-standing approach, ingrained via Gates, of getting something - anything - out to market and then fixing it over time. That worked well for a long time when "free" alternatives weren't prevalent, and when competitors/markets weren't moving as quickly as they are today. Now, it's a lot less successful, and yet MSFT continues to do it and be surprised when it fails.

Stop fighting major wars on multiple fronts simultaneously. It is simply ridiculous for current management to assume that MSFT can fight the biggest and best companies on earth, across a dozen or more battlegrounds, and still hope to prevail. Just take a look at some of the folks MSFT is going up against: SONY (and Nintendo) in gaming, Nokia and many others in mobile, GOOG and YHOO in Search, Everyone from Alcatel to Siemens in IPTV, IBM/Oracle/SAP (and smaller players Salesforce.com. Rightnow, etc.) in ERP and CRM, IBM/Adobe/FOSS in middleware and development, AAPL and most of MSFT's former partners in mobile media, AAPL and GNU/Linux in Operating Systems, and FOSS in personal productivity. Worse, these battles are spreading MSFT too thin, and leaving its core cash cows increasingly vulnerable (would Vista have taken 5 years to develop if management hadn't been distracted with a dozen other battles?).
Public Face

I am sick and tired of MSFT executives "trash" talking competitors in public. This is such a fundamental business tenet that it's an embarrassment to have to even list it.

Like I said, the entire post is really good. As for my response to Mini's Mini, a Devil, and Fine Whine post, here it goes. The kind of people who focus on what the top X% at Microsoft are making are probably not the kind of employees you want to keep around anyway so it seems weird to be catering to them. The concerns that the Microsoft employees whose opinions I value have are all eloquently described in MSFTextrememakeover's post excerpted above. The kind of people who get in a tizzy because some VPs got to attend an expensive award ceremony that they didn't are the kind of whiners and losers you can never make happy so why bother? It's not like they can argue that they are underpaid or that their employment benefits suck. Instead I see it as part of the age of entitlement in America where lots of people believe they deserve to be balling out of control and then get pissed off when they aren't. The best thing you can do with those kind of people is to show them the door.


Categories: Life in the B0rg Cube

I'm traveling to Nigeria next week to belatedly celebrate my dad's seventieth birthday and I'm looking for suggestions on what I should read on the trip. It usually takes about 24 hours of traveling for me to get back home: 8 hours flying to London, a 10-hour layover, and another 6 hours to Abuja. I usually go through 2 or 3 of Terry Pratchett's Discworld books on my trip but that often isn't enough. The last time I was back home, I also read Malcolm Gladwell's The Tipping Point and the time before that I read Neil Gaiman's American Gods. Both books were interesting and I'm considering reading their sequels (i.e. Blink and Anansi Boys) on this trip.

However I recently stumbled on a list of the 50 most significant Science Fiction and Fantasy works of the last fifty years and I'm considering getting one or two books from that list. Oh yeah, and then there's Jeff Atwood's recommended reading list of books about software development which has a few entries that caught my eye as well.

Given that you now know my taste in in-flight reading material, what books would you recommend gentle reader?


Categories: Personal

Back in the day, before Microsoft realized the power of bloggers to create positive buzz about a product, I often had to chime in on discussions about why it was important to take into consideration the opinions of the vocal minority in the blogosphere. One of the things folks like Robert Scoble and myself would point out is that mainstream reporters often get their stories from blogs these days. This means that if a product is getting a lot of good press among a few key blogs it eventually makes it into the mainstream media.

The best example of this I've seen in recent memory is the hype storm around Twitter. The tipping point in mainstream coverage of the service seems to have been when a bunch of bloggers who attended the SXSW conference started using the service. According to Google Trends interest has been increasing steadily in the service but mainstream coverage was lacking until after SXSW. Since then we've had articles like 

Business Week: Twitter: All Trivia, All The Time
Wall Street Journal: Friends Swap Twitters, and Frustration
San Francisco Chronicle: Austin's SXSW festival atwitter over Twitter

With results like this, it is no wonder PR flacks of all sorts are circling blogs and other antisocial media like groupies at a rock concert hoping to infect us with their message.


Categories: Social Software

Raymond Chen has a blog post entitled You don't know what you do until you know what you don't do where he writes

I've seen a lot of software projects, and one thing I've learned is that you don't have a product until you start saying "No".

In the early phases of product design, you're all giddy with excitement. This new product will be so awesome. It will slice bread. It will solve world hunger. It's designed for everybody, from the technology-averse grandmother who wants to see picture of her grandkids to the IT manager who is in charge of 10,000 computers. It'll run equally well on a handheld device as in a data center.

When I see a product with an all-encompassing description like this, I say to myself, "They have no idea what their product is." You don't know what you do until you know what you don't do. And the sooner you figure out what you don't do the better, because a product that promises to do everything will never ship.

In my five years at Microsoft, I've seen a bunch of projects fail. Some were public flame outs that are still embarrassing to mention today while others are private mistakes that you'll never hear anyone outside the b0rg cube mention. A few months ago I wrote a blog post entitled Top 5 Signs Your Project is Doomed and since then I've considered a few more entries that should be on the list bringing the total to 10. The list below contains common signs that a software project is doomed. Meeting one or two of these criteria isn't necessarily the kiss of death but three or more and you might as well start circulating your resume.

  1. Trying to do too much in the first version. See Raymond's point above.

  2. Taking a major dependency on unproven technology.

  3. Competing with an existing internal project that is either a cash cow or has backers who are highly placed in the corporate hierarchy.

  4. The team is understaffed. If you have fewer people than can handle the amount of work you have to do then the right thing to do is to scale back the project. Practically every other choice leads to failure.

  5. Complexity is one of the goals of the project because "complex problems require complex solutions".

  6. Schedule Chicken

  7. Scope Creep

  8. Second System Syndrome

  9. No Entrance Strategy. When a project can't articulate how it goes from a demo or prototype to being in the hands of end users, there's a problem. This is particularly relevant in the "Web 2.0" world where many startups' only strategy for success is getting a mention on TechCrunch and the fact that their service has "viral" features.

  10. Tackling a problem you don't know how to solve. It's pretty amazing how often I've seen this occur.


Categories: Programming

According to the Infoworld article entitled Microsoft names leaders for search-and-ad unit

Microsoft Wednesday named Satya Nadella to lead the newly formed Search and Ad Platform group, the software giant's effort to optimize the advertising revenue-raising potential of its search business.

Nadella, previously corporate vice president for Microsoft's Business Solutions group, will report to Kevin Johnson, president of the Platform and Services Division, the company said in a statement.

I'm not sure this information is accurate since I haven't seen any sign of it on Microsoft Presspass nor has Satya Nadella's corporate profile been updated. However, if the Infoworld article is accurate, it would create three VPs under Kevin Johnson who are in charge of Microsoft's three Web brands: Windows Live, MSN, and Live Search, and the org chart for the folks who run Microsoft's online businesses would reflect that structure.

The only relevance this has to people who read my blog is that it gives a nice visual of where I fit in the org chart. I'm in Blake Irving's group, working on aspects of the Windows Live Platform that powers services used by the Windows Live Experience group. 


Categories: Life in the B0rg Cube

Via Joe Gregorio I found a post entitled Transactionless by Martin Fowler. Martin Fowler writes

A couple of years ago I was talking to a couple of friends of mine who were doing some work at eBay. It's always interesting to hear about the techniques people use on high volume sites, but perhaps one of the most interesting tidbits was that eBay does not use database transactions.

The rationale for not using transactions was that they harm performance at the sort of scale that eBay deals with. This effect is exacerbated by the fact that eBay heavily partitions its data into many, many physical databases. As a result using transactions would mean using distributed transactions, which is a common thing to be wary of.

This heavy partitioning, and the database's central role in performance issues, means that eBay doesn't use many other database facilities. Referential integrity and sorting are done in application code. There's hardly any triggers or stored procedures.

My immediate follow-up to the news of transactionless was to ask what the consequences were for the application programmer, in particular the overall feeling about transactionlessness. The reply was that it was odd at first, but ended up not being a big deal - much less of a problem than you might think. You have to pay attention to the order of your commits, getting the more important ones in first. At each commit you have to check that it succeeded and decide what to do if it fails.

I suspect that this is one of those topics like replacing the operations team with the application developers which the CxOs and architects think is a great idea but is completely hated by the actual developers. We follow similar practices in some aspects of the Windows Live platform and I've heard developers complain about the fact that the error recovery you get for free with transactions is left in the hands of application developers. The biggest gripes are always around rolling back complex batch operations. I'm definitely interested in learning more about how eBay makes transactionless development as easy as they claim, I wonder if Dan Pritchett's talk is somewhere online?
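To make the discipline Fowler describes concrete, here is a minimal sketch of commit ordering with hand-rolled compensation. Everything in it is illustrative (an in-memory dict stands in for eBay's partitioned databases, and all the names are invented); the point is just the shape: most important write first, check each step, and undo earlier commits yourself when a later one fails, since there's no transaction to roll back:

```python
class CommitError(Exception):
    """Stands in for a failed write to one physical partition."""
    pass

def create_listing(store, listing_id, seller_id):
    # Most important commit goes first: the listing itself.
    store.setdefault("listings", {})[listing_id] = {"seller": seller_id}
    # Secondary commit: the seller's index, on a "different partition".
    try:
        index = store.setdefault("seller_index", {})
        index.setdefault(seller_id, []).append(listing_id)
    except CommitError:
        # Compensating action: undo the first commit by hand, then re-raise
        # so the caller knows the overall operation failed.
        del store["listings"][listing_id]
        raise

store = {}
create_listing(store, "item-42", "seller-7")
print(sorted(store))  # ['listings', 'seller_index']
```

The gripe about complex batch operations is easy to see from this shape: with three or four dependent writes, the compensation logic branches for every partial-failure point, which is exactly the bookkeeping a transaction would have done for you.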

The QCon conference wasn't even on my radar but if Dan Pritchett's talk is indicative of the kind of content that was presented, then it looks like I missed out. Looking at the list of speakers it looks like a conference I wouldn't have minded attending and submitting a paper for. I wonder if there'll be a U.S. version of the conference in the future? 


Categories: Web Development

danah boyd wrote two really interesting posts this weekend that gave an interesting perspective on a couple of headlines I've seen in blogs and mainstream news. Her post on narcissism gave an interesting perspective on stories such as CNN's Study: Vanity on the rise among college students which had me curious for more details when I read it originally. The post on Twitter gives a practical perspective I hadn't considered or seen mentioned in all the blogosphere ravings about the service since the hype storm started after the SXSW conference.

Interesting excerpts from danah boyd's post entitled fame, narcissism and MySpace

For those who are into pop science coverage of academic work, i'd encourage you to start with Jake Halpern's "Fame Junkies" (tx Anastasia). For simplicity sake, let's list a few of the key findings that have emerged over the years concerning narcissism.

  • While many personality traits stay stable across time, it appears as though levels of narcissism (as tested by the NPI) decrease as people grow older. In other words, while adolescents are more narcissistic than adults, you were also more narcissistic when you were younger than you are now.
  • The scores of adolescents on the NPI continue to rise. In other words, it appears as though young people today are more narcissistic than older people were when they were younger.

My view is that we have trained our children to be narcissistic and that this is having all sorts of terrifying repercussions; to deal with this, we're blaming the manifestations instead of addressing the root causes and the mythmaking that we do to maintain social hierarchies. Let's unpack that for a moment.

American individualism (and self-esteem education) have allowed us to uphold a myth of meritocracy. We sell young people the idea that anyone can succeed, anyone can be president. We ignore the fact that working class kids get working class jobs. This, of course, has been exacerbated in recent years. There used to be meaningful working class labor that young people were excited to be a part of. It was primarily masculine labor and it was rewarded through set hierarchies and unions helped maintain that structure. The unions crumpled in the 1980s and by the time the 1987 recession hit, there was a teenage wasteland. No longer were young people being socialized into meaningful working class labor; the only path out was the "lottery" (aka becoming a famous rock star, athlete, etc.).

Interesting excerpts from danah boyd's post entitled Tweet Tweet (some thoughts on Twitter)

Of course, the population whose social world is most like the tech geeks is the teens. This is why they have no problems with MySpace bulletins (which are quite similar to Twitter in many ways). The biggest challenge with teens is that they do not have all-you-can-eat phone plans. Over and over, the topic of number of text messages in one's plan comes up. And my favorite pissed off bullying act that teens do involves ganging up to collectively spam someone so that they'll go over their limit and get into trouble with their parents (phone companies don't seem to let you block texts from particular numbers and of course you have to pay 10c per text you receive). This is particularly common when a nasty breakup occurs and i was surprised when i found out that switching phone numbers is the only real solution to this. Because most teens are not permanently attached to a computer and because they typically share their computers with other members of the family, Twitterific-like apps wouldn't really work so well. And Twitter is not a strong enough app to replace IM time.

Read both posts, they are really good. And if you aren't subscribed to her blog, you should be.


The Australian iTWire has a rather biased and misinformed article entitled Official: Microsoft ‘bribes’ companies to use Live Search which contains the following excerpt

Microsoft’s new program is called “Microsoft Service Credits for Web Search” and has been unveiled by John Battelle’s ‘SearchBlog’. The money on offer is significant, especially when multiplied across thousands of PCs. The deal means that companies can earn between US $2 and US $10 per computer on an annual basis, plus a US $25,000 “enrollment credit” which is a nice big wad of cash that will likely need a large-ish, strong and sturdy brown paper bag to hold securely while being passed under the table.

For companies that have thousands of computers, this could translate into anywhere from US $100,000 to $200,000 per year, which is money that could be put to good use in the IT department or elsewhere in the company.

Former Microsoft employee and blogger Robert Scoble who served as the online face of Microsoft during his three years at the company is not impressed with Microsoft’s moves in deciding to offer companies money to use search. His arguments are quite valid and boil down to Microsoft really needing to create better products, rather than needing to pay companies to get more traction for Windows Live. After all, Scoble isn’t the first to observe that Google doesn’t need to pay anyone to use its search services – people use them voluntarily because of the quality of the results.

The amount of bias in this article is pretty amazing considering that Microsoft is primarily reacting to industry practices created by Google [which have also been adopted by Yahoo!]. Let me count the ways Google bribes companies and individuals to use their search engine:

  1. Google pays AdSense publishers for each user they convince to install Firefox with the Google Toolbar installed. Details are in the documentation for the AdSense Referrals Feature. Speculation on Slashdot was that they pay $1 per user who switches to Firefox + Google Toolbar.

  2. Google paid Dell $1 billion dollars to ensure that Google products are preinstalled in all the computers they sell and the default search engine/home page is set to Google. Details of this deal were even published in iTWire.

  3. Google paid Adobe an undisclosed amount to bundle Google Toolbar [which converts your default search engine in your browser to theirs] with all Adobe products.

  4. Google entered a distribution deal with Sun Microsystems to bundle Google Toolbar [which converts your default search engine in your browser to theirs] with all new installations of the Java runtime.

  5. Google products which convert your default search engine in your browser to theirs are bundled with the WinZip archiving utility. Financial details of the deal were undisclosed.

  6. Google is the default search engine for both the Opera and Firefox browsers. Both vendors get a cut of the search revenue generated from user searches which runs in the millions of dollars.

I could go on but my girlfriend just told me it's time for breakfast and I'm already in trouble for blogging on a Sunday morning. However the above links should be enough to debunk the inaccurate statements in the iTWire article. I guess iTWire's "journalism" is further validation of the saying that you should never let the facts get in the way of a good flame.


Whenever I talk to folks at work about branding and some of our products I usually get two kinds of responses. On the one hand, there are those who think branding is important and we could be doing a better job. Then there are others who believe we should focus on shipping quality products and the rest will fall into place. The second position is somewhat hard to argue with because I end up sounding like I advocate that marketing is more important than shipping quality products. Luckily, I now have two real world examples of the importance of getting branding right for your software even if you do have a quality product.

EXHIBIT A: Topix.net

In a blog post entitled Kafka-esque! Rich Skrenta writes

I'm in the Wall Street Journal today, with a story about our purchase of Topix.com for $1M and the SEO issues related to moving the domain.
Back in 2003 when we were looking for a name, we came across Topix.net. The name 'topix' really fit what we were trying to do, it was a heck of a lot better than the other names we'd come up with. It turned out we could buy the name from a South Korean squatter for $800. So we took it.  Of course I knew we were breaking one of the rules of domain names, which is never get anything besides the .com. But I thought that advice might be outmoded.
Surely, the advice that you had to have a .com wasn't as relevant anymore?

Well, we got our answer when our very first press story came out. This was in March 2004 when we got a front page business section launch story in the Mercury News. They gave us sweet coverage since we were the only startup to come out of palo alto in months (this was just as the dot-com crash was beginning to thaw). Unfortunately, while the article clearly spelled "Topix.net", the caption under our photo -- the most visible part of the story after the headline -- called us Topix.com. Someone had transcribed the name and mistakenly changed the .net to .com, out of habit, I suppose.

Since that time we've built up quite a bit of usage, and much of it return visitors who have bookmarked one of our pages, or become active in our local forums. But still, we continued to have issues where someone will assume a .com ending for the name. Mail gets sent to the wrong address, links to us are wrong, stories incorrectly mention our URL.

Beyond that, as part of some frank self-evaluations we were doing around our site and how we could make it better, and the brand stronger, we ran some user surveys and focus groups. "What do you think of the name?" was one of the questions we asked. The news was good & bad; people actually really liked the name 'topix', but the '.net' was a serious turn-off. It confused users, it made the name seem technical rather than friendly, and it communicated to the world that "we didn't own our own name."

EXHIBIT B: Democracy Player

In a blog post entitled A name change Nicholas Reville writes

This is some big news for us. We are planning to change the name of Democracy Player.

We chose the name ‘Democracy’ almost two years ago when we were first setting up PCF. We knew it was an ambitious name, but we thought that it made a clear statement about how important it is that an open internet TV platform is for our culture.
And, even though I’m about to explain why we need to change it, I’m glad we’ve had this name for the past year. It’s funny that a name like ‘Democracy’ can become a name for software– I think it turned out to be less odd than we expected. When people hear a name, they tend to accept it. And it helped us assert our mission clearly: free, open, and dedicated to democratizing video online. I think conveying that mission so strongly was crucial for us.

But the name also confused a huge number of potential users. In all our debates about whether you could call something ‘Democracy’ and how people would react to the name, we hadn’t realized that so many people would simply assume that the software was for politicians and videos about politics. We hear this response over and over, and it’s a real limitation to our user base.

So we’re changing the name to Miro.


Categories: Windows Live

A number of blogs I read have been talking about Amazon's S3 service a lot recently. I've seen posts from Jeff Atwood, Shelley Powers and most recently Dave Winer. I find it interesting that S3 is turning into a classic long tail service that works both for startups spending hundreds of thousands to millions of dollars a year to serve millions of users (like SmugMug) and for bloggers who need some additional hosting for their cat pictures. One reason I find this interesting is that it is unclear to me that S3 is a business that will be profitable in the long term by itself.

My initial assumption was that S3 was a way for Amazon to turn lemons into lemonade with regard to bandwidth costs. Big companies like Amazon are usually billed for bandwidth using 95th percentile billing, which is explained below:

With 95th percentile billing, buyers are billed each month at a fixed price multiplied by the peak traffic level, regardless of how much is used the rest of the time. Thus with the same nominal price, the effective price is higher for buyers with burstier traffic patterns.
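The billing rule in that excerpt can be sketched in a few lines. This is a minimal illustration; the sampling interval, the rate, and the traffic numbers are all made up:

```python
# Sketch of 95th percentile billing. Bandwidth is sampled (typically
# every 5 minutes), the samples are sorted, the top 5% are thrown away,
# and the highest remaining sample sets the bill for the whole month.
# The rate and traffic numbers below are made up for illustration.

def p95(samples_mbps):
    ordered = sorted(samples_mbps)
    idx = len(ordered) * 95 // 100 - 1   # last sample below the top 5%
    return ordered[max(idx, 0)]

def monthly_bill(samples_mbps, dollars_per_mbps):
    return p95(samples_mbps) * dollars_per_mbps

# Bursty: mostly 10 Mbps, but 10% of samples spike to 100 Mbps.
bursty = [10] * 90 + [100] * 10
# Flat: the same total traffic spread evenly.
flat = [19] * 100

print(monthly_bill(bursty, 20))  # → 2000 (billed at the 100 Mbps peak)
print(monthly_bill(flat, 20))    # → 380 (same total traffic, much cheaper)
```

This is why the effective price is higher for buyers with burstier traffic: both patterns above carry the same total traffic, but the bursty one is billed at more than five times the flat one.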

So my assumption was that S3 lets Amazon make money from bandwidth it was already being charged for but not using. As for storage, my guess is that they are either making a minuscule profit or selling it at cost. Where this gets tricky is that if S3 gets popular enough, it stops being a way to make money from bandwidth they are billed for but aren't using and instead starts raising their peak traffic levels, which changes the profit equation for the service. Without any data on Amazon's cost structure it is unclear whether this would make the service unprofitable or whether it is already factored into their pricing.

On the other hand, Amazon's Elastic Compute Cloud (EC2) isn't something I've seen a lot of bloggers rave about. However it seems to be the service that shows Amazon is making a big play to be the world's operating system in the sky, as opposed to dabbling in providing some of its internal services to external folks as a cost savings measure. With EC2 you create virtual servers in their system and load each one with an Amazon Machine Image (AMI). An AMI is basically a server operating system plus the platform components you need on it. Typical AMIs are instances of a LAMP stack (Linux/Apache/MySQL/PHP/Perl/Python), although I did see one AMI that was an instance of Windows Server 2003. You can create as many or as few server instances as you need and are billed only for what you use.

I suspect the combination of EC2 and S3 is intended to be very attractive to startups. Instead of spending hundreds of thousands of dollars building out clusters of servers, you just pay as you go when you get your monthly bill. There are only two problems with this strategy that I can see. The first is that if I were building the next Digg, Flickr or del.icio.us, I'm not sure I'd want to place myself completely at the mercy of Amazon, especially since there doesn't seem to be any SLA published on the site. According to the CEO of SmugMug in his post Amazon S3: Outages, slowdowns, and problems, they've had four major problems with S3 in the past year, which has made them rely less on the service for critical needs. The second issue is that VC money is really, really, really easy to come by these days, judging from the kind of companies that get profiled on TechCrunch and Mashable. If that should change, it isn't hard to imagine dozens of enterprising folks with a couple of thousand dollars in their pockets deciding to go with S3 + EC2 instead of seeking VC funding. But for now, I doubt that this will be the case.
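To make the pay-as-you-go argument concrete, here is a back-of-the-envelope cost model. The rates are assumptions based on my recollection of the advertised 2007 pricing (roughly $0.10 per instance-hour, $0.15 per GB-month of storage, $0.20 per GB transferred) and the workload numbers are invented:

```python
# Back-of-the-envelope monthly cost for a startup running on EC2 + S3.
# These rates are assumptions based on the advertised 2007 pricing;
# check Amazon's current price list before trusting them.
EC2_PER_INSTANCE_HOUR = 0.10  # dollars
S3_PER_GB_MONTH = 0.15
TRANSFER_PER_GB = 0.20

def monthly_cost(instances, storage_gb, transfer_gb, hours=720):
    compute = instances * hours * EC2_PER_INSTANCE_HOUR
    storage = storage_gb * S3_PER_GB_MONTH
    transfer = transfer_gb * TRANSFER_PER_GB
    return compute + storage + transfer

# e.g. four always-on instances, 500 GB stored, 1 TB transferred
print(round(monthly_cost(4, 500, 1000), 2))  # → 563.0
```

At a few hundred dollars a month, that is a far cry from the hundreds of thousands a self-hosted cluster would cost up front, which is the crux of the pitch to startups.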

What I suspect is that without some catalyst (e.g. the next YouTube being built on S3 + EC2) these services will not reach their full potential. This would be unfortunate because I think that in much the same way we moved from everyone rolling their own software to shrinkwrapped software, we will need to move to shrinkwrapped Web platforms in the future instead of everyone running their own ad-hoc cluster of Windows or LAMP servers and solving the same problems that others have solved thousands of times already.

I wonder if Amazon has considered tapping the long tail by going up against GoDaddy's hosting services with S3 + EC2. They have the major pieces already, although it seems their prices would need to come down to compete with what GoDaddy charges for bandwidth. I suspect Amazon's quality of service would be better, though.


March 15, 2007
@ 03:38 PM

My blog has been slow all day due to an unending flood of trackback spam. I've set up my IIS rules to reject requests from the IP address ranges the attacks are coming from but it seems that this hasn't been enough to prevent the trackback spam from making my blog unusable.
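The application-level check described above amounts to something like the following sketch. The CIDR ranges here are placeholders, not the actual attack sources:

```python
# Reject requests whose source IP falls inside a blocked CIDR range.
# The ranges below are placeholders, not the actual attack sources.
from ipaddress import ip_address, ip_network

BLOCKED_RANGES = [
    ip_network("203.0.113.0/24"),
    ip_network("198.51.100.0/24"),
]

def is_blocked(remote_ip):
    addr = ip_address(remote_ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(is_blocked("203.0.113.42"))  # → True (inside a blocked range)
print(is_blocked("192.0.2.1"))     # → False
```

The limitation is visible in the sketch itself: the request still reaches the web server before being rejected, so the connection and bandwidth are consumed either way. Dropping the traffic at a router or firewall in front of the server avoids that.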

It looks like I should invest in a router with a built in firewall as my next step. Any ideas to prevent this from happening again are welcome.


Categories: Personal

From the press release entitled Microsoft Unites Xbox and PC Gamers With Debut of Games for Windows — LIVE we learn

REDMOND, Wash. — March 14, 2007 — Microsoft Corp. today announced the extension of the Xbox LIVE® games and entertainment network to the Windows® platform, bringing together the most popular online console game service with the most popular games platform in the world. Debuting on May 8, 2007, with the launch of the Windows Vista™ version of the Xbox® blockbuster “Halo® 2,” Games for Windows — LIVE will connect Windows gamers to over six million gamers already in the Xbox LIVE community. Then, launching in June, “Shadowrun™” will for the first time connect Windows gamers with Xbox 360™ players in cross-platform matches using a single service. “UNO®,” releasing later in 2007, will also support cross-platform play between Windows and Xbox 360.

This is pretty cool and I saw some of the demos when I was at CES in January. The funny thing is that one of my coworkers told me that we were announcing this soon but I thought he said "Games for Windows Live" so I thought he meant we were rebranding MSN Games. I didn't realize it was actually "Games for Windows — LIVE". This might get a tad confusing.


Categories: Video Games

Brendan Eich has a post on the Mozilla roadmap blog entitled The Open Web and Its Adversaries which references one of my posts on whether AJAX will remain as the technology of choice for building Rich Internet Applications. He writes

open standards and open source both empower user-driven innovation. This is old news to the Mozilla user community, who have been building and feeding back innovations for the life of the project, increasing over time to include Firefox add-ons and GreaseMonkey user scripts. (BTW, I am pushing to make add-on installation not require a restart in Firefox 3, and I intend to help improve and promote GreaseMonkey security in the Firefox 3 timeframe too.) Without forking, even to make private-label Firefoxes or FlashPlayers, users can innovate ahead of the vendor's ability to understand, codify, and ship the needed innovations.

Consider just the open standards that make up the major web content languages: HTML, CSS, DOM, JS. These mix in powerful ways that do not have correspondences in something like a Flash SWF. There is no DOM built inside the FlashPlayer for a SWF; there's just a display list. There's no eval in ActionScript, and ActionScript features a strict mode that implements a static type checker (with a few big loopholes for explicit dynamic typing). You can't override default methods or mutate state as freely as you can in the browser content model. Making a SWF is more like making an ASIC -- it's "hardware", as Steve Yegge argues.

This is not necessarily a bad thing; it's certainly different from the Open Web.
Dare Obasanjo argues that developers crave single-vendor control because it yields interoperation and compatibility, even forced single-version support. Yet this is obviously not the case for anyone who has wasted time getting a moderately complex .ppt or .doc file working on both Mac and Windows. It's true for some Adobe and Microsoft products, but not all, so something else is going on. And HTML, CSS, DOM and JS interoperation is better over time, not worse. TCP/IP, NFS, and SMB interoperation is great by now. The assertion fails, and the question becomes: why are some single-vendor solutions more attractive to some developers? The answers are particular, not general and implied simply by the single-vendor condition.

I'm surprised to see Brendan Eich conflating "openness" with the features of a particular technology. I'll start with Brendan's assertion that open standards and open source enable user-driven innovation. Open source allows people to modify the software they've been distributed however they like. Open standards like HTTP, FTP and NNTP allow people to build applications that utilize these technologies without being beholden to any corporate or government entity. It's hard for me to see how open standards enable user-driven innovation in the same way that open source does. I guess the argument could be made that open source applications built on proprietary technologies aren't as "free" as open source applications that implement open standards. I can buy that. I guess.

The examples of Firefox add-ons and GreaseMonkey user scripts don't seem to be examples of open source and open standards enabling user-driven innovation. They seem to be examples of why building an application as a platform with a well-designed plugin model works. After all, we have plugins for Internet Explorer, Gadgets for Google Personalized Homepage and Add-ins for Visual Studio, all of which are examples of user-driven innovation via plugins for applications built on a proprietary platform, often using proprietary technologies. My point is

open_source + open_standards != user_driven_innovations;

Being open helps, but it doesn't necessarily lead to user-driven innovation, or vice versa. The rest of Brendan's post is even weirder because he presents the features of Flash's ActionScript versus AJAX (i.e. [X]HTML/CSS/JavaScript/DOM/XML/XMLHttpRequest) as a conflict between proprietary and open technologies. Separating content from presentation, dynamic programming languages and rich object models are not exclusively the purview of "open" technologies, and it is disingenuous for Brendan to suggest that.

After all, what happens when Adobe and Microsoft make their RIA platforms more "Web-like"? Will the debate devolve into the kind of semantic hairsplitting we've seen in the OpenXML vs. ODF debate, where Microsoft detractors are now attacking Microsoft for opening up and standardizing its XML file formats when their original arguments against the file formats were that they weren't open?

Personally, I'd like to see technical discussions on the best way to move the Web forward instead of the red herring of "openness" being thrown into the discussion. For instance, what are the considerations Web developers should make when they come to the crossroads where Adobe is offering Flash/Flex, Microsoft is offering WPF/E and Mozilla & co. are offering their extensions to the AJAX model (i.e. HTML 5) as the one true way? I've already stated what I think in my post What Comes After AJAX? and so far Adobe looks like they have the most compelling offering for developers, but it is still early in the game and neither Microsoft nor Mozilla have fully shown their hands.


Categories: Web Development

March 13, 2007
@ 06:18 PM

Tim Bray has an excellent post entitled OpenID which attempts to separate hype from fact when it comes to the technorati's newest darling, OpenID. He writes

The buzz around OpenID is becoming impossible to ignore. If you don't know why, check out How To Use OpenID, a screencast by Simon Willison. As it's used now (unless I'm missing something) OpenID seems pretty useless, but with only a little work (unless I'm missing something) it could be very useful indeed.

Problem: TLS · The first problem is that OpenID doesn't require the use of TLS (what's behind URIs that begin with https:).

Problem: What's It Mean? · Another problem with OpenID is that, well, having one doesn't mean very much; just that you can verify that some server somewhere says it believes that the person operating the browser owns that ID.

Problem: Phishing · This is going to be a problem, but I don't think it's fair to hang it on OpenID, because it's going to be equally a problem with any browser-based authentication. Since browser-based authentication is What The People Want, we're just going to have to fight through this with a combination of browser engineering and (more important) educating the general public.

The Real Problem · Of course, out there in the enterprise space where most of Sun's customers live, they think about identity problems at an entirely different level. Single-sign-on seems like a little and not terribly interesting piece of the problem. They lose sleep at night over "Attribute Exchange"; once you have an identity, who is allowed to hold what pieces of information about you, and what are the right protocols by which they may be requested, authorized, and delivered? The technology is tough, but the policy issues are mind-boggling. So at the moment I suspect that OpenID isn't that interesting to those people.

I've been thinking about OpenID in the context of authorization and sharing across multiple social networks. Until recently I worked on the authorization platform for a lot of MSN and Windows Live properties (i.e. the platform that enables setting permissions on who can view your Windows Live Space, MSN Calendar, or Friends list from Windows Live Messenger). One of the problems I see us facing in the future is lack of interoperability across multiple social networks. This is a problem when your users have created their friend lists (i.e. virtual address books) on sites like Facebook, Flickr or MySpace. One of the things you notice about these services is that they all allow you to set permissions on who can view your profile or content. More importantly, if your profile/content is non-public then they all require that the people who can view it have an account with their service. We do the same thing across Windows Live, so it isn't a knock on them.

What I find interesting is this: what if on Flickr I could add http://mike.spaces.live.com as a contact and then give Mike Torres permission to view my photos without him having to get a Yahoo! account? Sounds interesting, doesn't it? Now let's go back to the issues with OpenID raised by Tim Bray.

The first thing to do is to make sure we all have the same general understanding of how OpenID works. It's basically the same model as Microsoft Passport (now Windows Live ID), Google Account Authentication for Web-Based Applications and Yahoo! Browser Based Authentication. A website redirects you to your identity provider, you authenticate yourself (i.e. login) on your identity provider's site and then are redirected back to the referring site along with your authentication ticket. The ticket contains some information that can be used to uniquely identify you as well as some user data that may be of interest to the referring site (e.g. username). Now that we have a high level understanding of how it all works, we can talk about Tim Bray's criticisms.
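The redirect dance described above can be sketched as a toy in a few lines. To be clear, this is not the actual OpenID wire protocol; the parameter names, the shared secret, and the HMAC "ticket" are simplified stand-ins for illustration:

```python
# Toy version of the redirect-based login flow described above. This is
# NOT the actual OpenID wire protocol; the parameter names, the shared
# secret, and the HMAC "ticket" are simplified stand-ins.
import hashlib
import hmac
from urllib.parse import parse_qs, urlencode, urlparse

SHARED_SECRET = b"secret-shared-by-provider-and-site"  # placeholder

def provider_redirect(username, return_to):
    # The identity provider authenticates the user, then sends the
    # browser back to the referring site with a signed ticket.
    sig = hmac.new(SHARED_SECRET, username.encode(), hashlib.sha1).hexdigest()
    return return_to + "?" + urlencode({"user": username, "sig": sig})

def verify_ticket(url):
    # The referring site checks that the ticket really came from the
    # identity provider before trusting the username in it.
    params = parse_qs(urlparse(url).query)
    user, sig = params["user"][0], params["sig"][0]
    expected = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha1).hexdigest()
    return user if hmac.compare_digest(sig, expected) else None

url = provider_redirect("carnage4life", "https://example.org/login")
print(verify_ticket(url))  # → carnage4life
```

The real protocol replaces the pre-shared secret with an association step or a direct verification request from the referring site to the provider, but the shape of the flow (redirect out, authenticate, redirect back with a verifiable claim) is the same.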

On the surface it makes sense that identity providers should use SSL when you login to your account after being redirected there by a service that supports OpenID. However, as papers like TrustBar: Protecting (even Naïve) Web Users from Spoofing and Phishing Attacks point out, SSL/TLS does little to prevent the real security problems on the Web today, namely Web page spoofing (i.e. phishing) and the large amount of malware on user PCs which could be running key loggers. This isn't to say that using SSL/TLS isn't important, just that on its own it's like putting bars on your windows while leaving the front door open. Thus I can understand why it isn't currently required that identity providers support SSL/TLS. However, a little security is better than no security at all.

What Does It Mean?
I agree with Tim Bray that since OpenID is completely decentralized, websites that support it will likely end up creating whitelists of sites they want to talk to, otherwise they risk their systems being polluted by malicious or inconsiderate OpenID providers. See Tim Bray's example of creating http://www.tbray.org/silly-id/, which when queried about any OpenID beginning with that URI instantly provides a positive response without authenticating the user. This allows multiple people to claim http://www.tbray.org/silly-id/BillGates, for example. Although this may be valid if one were creating the OpenID version of BugMeNot, it is mostly a nuisance to service providers that want to accept OpenID.
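Such a whitelist could be as simple as checking the claimed identifier's host against a set of providers the site trusts. The hosts below are made up for illustration:

```python
# Sketch of whitelisting OpenID providers by host; the trusted hosts
# here are made up for illustration.
from urllib.parse import urlparse

TRUSTED_PROVIDERS = {"openid.example.com", "id.bigportal.example"}

def provider_allowed(claimed_id):
    host = urlparse(claimed_id).hostname
    return host in TRUSTED_PROVIDERS

print(provider_allowed("https://openid.example.com/users/alice"))   # → True
print(provider_allowed("http://www.tbray.org/silly-id/BillGates"))  # → False
```

Of course, a whitelist gives up exactly the decentralization that makes OpenID appealing in the first place, which is the tension Tim Bray is pointing at.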

Using susceptibility to phishing as an argument not to use OpenID seems like closing the barn door after the horse has bolted. The problem is that security conscious folks don't want users getting used to the idea of providing their username and password for one service whenever prompted by another service. After all, the main lesson we've been trying to teach users about preventing phishing is to enter their username and password on their primary sites only when they have typed the address themselves, not when they follow links. OpenID runs counter to this teaching. However, the problem with that teaching is that users are already used to doing this several times a day. Here are three situations from just this morning where I've been asked to enter my username and password from one site on another:

  1. Connected Desktop Apps: Google Toolbar prompts me for my Gmail username and password when I try to view my bookmarks. The goal of Google Account Authentication is to create a world where random apps asking me for my Gmail username and password by redirecting me to the Google login page is commonplace. The same goes for the various Flickr uploader tools and Yahoo! Browser Based Authentication.
  2. Importing Contacts: On Facebook, there is an option to import contacts from Yahoo! Mail, Hotmail, AOL and Gmail which requires me to enter my username and password from these services into their site. Every time I login to Yahoo! Mail there is a notice that asks me to import my contacts from other email services which requires me to give them my credentials from these services as well.
  3. Single Sign-On: Whenever I go to the Expedia sign-in page I'm given the option of signing in with my .NET Passport which happens to be the same username and password I use for all Windows Live and MSN services as well as company health site that has information about any medical conditions I may have.

Given the proliferation of this technique in various contexts on the Web today, it seems partisan to single out OpenID as having problems with phishing. If anything, THE WEB has a problem with phishing which needs to be solved by the browser vendors and the W3C who got us in this mess in the first place.

Attribute Exchange
This usually goes hand in hand with any sort of decentralized/federated identity play. So let's say I can now use my Windows Live ID to login to Flickr. What information should Flickr be able to find out about me from talking to Windows Live besides my username? Should I be able to control that, or should it be something that Flickr and Windows Live agree on as part of their policies? How is the user educated that the information they entered in one context (i.e. in Windows Live) may be used in a totally different context on another site? As Tim Bray mentioned in his post, this is less a technology issue and more a policy issue that will likely differ for enterprises versus "Web 2.0" sites. That said, I'm glad to see that Dick Hardt of Sxip Identity has submitted a proposal for OpenID Attribute Exchange 1.0, which should handle the technology aspect of the problem.

Disclaimer: This is not an endorsement of OpenID by Microsoft or an indication of the future direction of authentication and authorization in Windows Live. This is me brainstorming some ideas in my blog and seeing whether the smart folks in my reader base think these ideas make sense or not. 


Categories: Web Development

March 13, 2007
@ 04:33 PM

One of the links referenced in my recent posting about Wikipedia led me to reread the Wikipedia entry for "Dare Obasanjo". It seems there is still an outstanding issue with my entry according to folks on the Talk page because there isn't a non-blog source (i.e. mainstream media) that verifies that my dad is Olusegun Obasanjo.

For some reason it irritates me that I have a Wikipedia entry with a giant banner that claims I'm lying about my parentage. Given that I'll be back home in a few weeks to belatedly celebrate my dad's seventieth birthday, I wonder if any Wikipedia savvy folks can point out what kind of "evidence" usually satisfies the bureaucrats on that site. Will a photograph of us together do the trick (if so, I already have a few at home I can scan and upload to Flickr)? Will it have to be a photograph printed in a newspaper? Or does the banner only come off if there is a Nigerian newspaper webpage on the Internet that says he's my dad?

I need to see what strings I have to pull to get my name cleared.


Categories: Personal

March 11, 2007
@ 02:14 PM

Yesterday I went shopping and every store had reminders that daylight saving time begins today. Every year before "springing forward" or "falling back" I always double check the current time at time.gov and the US Naval Observatory Master Clock Time. However neither clock has sprung forward. Now I'm not sure who I can trust to tell me the right time. :(

Update: Looks like I spoke too soon. It seems most of the clocks in the house actually figured out that today was the day to "spring forward" and I had the wrong time. :)


Categories: Technology

Every once in a while someone asks me about software companies to work for in the Seattle area that aren't Microsoft, Amazon or Google. This is the third in a series of weekly posts about startups in the Seattle area that I often mention to people when they ask me this question.

AgileDelta builds XML platforms optimized for low power, low bandwidth mobile devices. They have two main products: Efficient XML and Mobile Information Client. I'm more familiar with Efficient XML since it has been selected as the basis for the W3C's binary XML format and has been a linchpin of a lot of the debate around binary XML. The Efficient XML product is basically a codec which allows you to create and consume XML in their [soon to be formerly] proprietary binary format that makes it more efficient for use in mobile device scenarios. A quick look at their current customer list indicates that their customer base is mostly military and/or defense contractors. I hadn't realized how popular XML was in military circles.

AgileDelta was founded by John Schneider and Rich Rollman who are formerly of Crossgain, a company founded by Adam Bosworth which was acquired by BEA. Before that Rich Rollman was at Microsoft and he was one of the key folks behind MSXML and SQLXML. Another familiar XML geek who works there is Derek Denny-Brown who spent over half a decade working as a key developer on the XML parsers at Microsoft.

Press: AgileDelta in PR Newswire

Location: Bellevue, WA

Jobs: careers@agiledelta.com, current open positions are for a Software Engineer, Sales Professional, Technical Writer and Quality Assurance Engineer.


March 10, 2007
@ 03:25 AM

Today I was taking a look at my referer logs and stumbled upon a post entitled TechCrunch Resolution on Wikipedia by Jonathan Stokes which contains the following anecdote

A Brief History

The edit war was prompted by the now famous scandal in which Microsoft paid a Wikipedian to favorably edit Microsoft articles on Wikipedia. Michael Arrington of TechCrunch covered the Microsoft story in a post that was largely sympathetic.

Perceiving unfairness in the issue, Microsoft employee Dare Obasanjo, aka Carnage4Life, retaliated against TechCrunch by adding an extensive criticism section to Wikipedia’s TechCrunch article. He then wrote about his “experiment” on his blog, 25HoursaDay.com.

Ensuing Uproar

Michael Arrington was not happy to be slandered by a Microsoft employee, in response to Microsoft coverage. Obasanjo expressed surprise at Arrington’s response, but did not apologize. I blogged this chapter of the Microsoft controversy.

Judging from his blog comments, Dare does not seem to have a high respect for Wikipedia. He has previously violated Wikipedia rules by anonymously writing his own Dare Obasanjo article on Wikipedia. Humorously, it appears to include inside jokes with other Microsoft employees, such as:

Dare has lunch once a month with Don Box to rinse the SOAP off of Don while Don simultaneously attempts to lather up Dare.

Edit War

With traffic pouring into Wikipeda through TechCrunch and Digg, an all-out edit war ensued between long-time Wikipedians and anonymous vandals. The vandals began attacking the userpages of Wikipedians trying to protect the TechCrunch article. It finally escalated to a point where this anti-TechCrunch user was banned for repeatedly blocking out user pages with disturbing death threats.


Wikidemo came to the rescue by establishing a Wikipedia Mediation. She invited all editors involved to the discussion, even going so far as to invite me on this blog, and Dare Obasanjo on his blog.

Anthony cfc handled the mediation. Notably, none of the controversial IPs showed up to state their case. With help from Anthony cfc and Computerjoe, we have now restored the Wikipedia TechCrunch article, and hopefully made a few minor improvements as well.

In the process, I earned my first Wikipedia Barnstar for Civility from Anthony cfc. Kind of neat to see Wikipedia in action.

Some days the Daily Show just writes itself. I'm crapping myself in amusement at how seriously these people take this nonsense. I am especially amused by all the bits in red font since they are either borderline libel or just straight up hilarious. And I thought Mike Arrington emailing folks at Microsoft trying to get me in trouble after I apologized on his blog was the most absurd turn this story would take.

It's like Nick Carr wrote in his post Essjay's world, Wikipedia seems to be full of the kind of people who used to play Dungeons & Dragons back in the day and now have difficulty separating the real world from the fantasy world they've created in their heads.


Categories: Personal

March 8, 2007
@ 12:56 PM

My website is going to be down for a few days as I make some changes. While I'm gone you can check out some of these blogs instead

I'll see y'all this weekend.


Categories: Personal

It seems just like yesterday when the tech blogosphere was abuzz with news that analyst Michael Gartenberg was leaving Jupiter Research for Microsoft. So you can imagine my surprise to fire up his blog today to find the post And Back to Analyst… where he writes

This is a difficult post to write. But after much thought, I have decided not to remain with Microsoft and I am returning to JupiterResearch as of Monday 3/12.

At my core, I am an analyst. It’s what I do and I do it well and after much thought, I realize I’m just not ready to stop doing that job just yet. I believe Jupiter itself is poised for some amazing things in the future and I’ve invested too much in the company to feel good about walking away at this point. Therefore I have decided to return and I am pleased that I have been welcomed back. My thanks to everyone I have worked with at Microsoft.

Wow, that was quick.


Categories: Life in the B0rg Cube

March 7, 2007
@ 06:47 PM

Marvel Comics has been ticking me off for a few months now with their mediocre Avengers Disassembled, House of M and Civil War trilogy, but it looks like they finally found a way to push me over the edge. According to MSNBC in Death to ‘America’: Comic-book hero killed off

Captain America has undertaken his last mission — at least for now. The venerable superhero is killed in the issue of his namesake comic that hit stands Wednesday, the Daily News reported.

On the new edition's pages, a sniper shoots down the shield-wielding hero as he leaves a courthouse, according to the newspaper.
In the comic-book universe, death is not always final. But even if Captain America turns out to have met his end in print, he may not disappear entirely: Marvel has said it is developing a Captain America movie.

This reminds me of a headline from the 1990s, Superman killed by falling comic book sales, when DC Comics tried a similar stunt back in the day. The overuse of crossover stories and superhero shockers (like radical changes to a character's history or killing them off) seems symptomatic of the decline of comic books as an entertainment genre. I buy comics from a local comic book store on a monthly basis and I don't think I've ever seen anyone under the age of 25 in the five and a half years that I've been using that store. Well, there was the one time that one of the guys who worked there brought his grandson into work. 

Even though superhero movies featuring A-list and B-list superheroes from Spider-Man to Ghost Rider are making hundreds of millions of dollars at the box office, they are pretty much milking a fan base that grew up with these heroes instead of introducing these characters to a new audience. This is similar to the way that George Lucas milked a fan base that grew up on Star Wars with his series of horrific prequels, although in his case I suspect that there probably is a market for Star Wars pre-prequels in another 20 years.

Without a continuous influx of fans who are interested in the source material (i.e. comic books), there won't be a next generation of fans to buy all the overpriced merchandising and special effects laden movies. However I doubt that stunts like this are a good way to get people reading the comic books again, even though it did work when they killed Superman...I was one of the suckers who bought all the books. :)

Although Cap is dead, his memory will live on...on YouTube.


Categories: Comics

March 6, 2007
@ 03:33 PM

The hottest story on Planet Intertwingly today is Rob Yates' blog post entitled Safe JSON which goes over a number of security issues one can face when exposing services using JSON. He goes on to describe the following approaches:

Approach 1 - Plain JSON: Simply return JSON

Approach 2 - var assignment: Assign the JSON object to some variable that can then be accessed by the embedding application (not an approach used by Yahoo).

Approach 3 - function callback: When calling the JSON Web Service pass as a parameter a callback function.  The resulting JSON response passes the JSON object as a parameter to this callback function.

There are a number of good responses in the comments. The only thing I wanted to point out is that only Approach 1 can really be called using JSON. The other two are "accepting arbitrary code from a third party and executing it" which is such a ridiculously bad idea that we really need to stop talking about it in civilized company. It seems silly to point out specific security or privacy issues with using that approach when the entire concept of building Web services in that manner is fundamentally insecure and downright dangerous.
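To make the distinction concrete, here's a minimal JavaScript sketch of the difference (the endpoint URL and callback name are illustrative, not from Yates' post):

```javascript
// Approach 1: plain JSON. The response is *data*, which the caller
// parses explicitly. A malicious response can at worst fail to parse.
const response = '{"user": "alice", "unread": 3}'; // fetched from your own service
const data = JSON.parse(response);
console.log(data.unread); // 3

// Approach 3: function callback (JSONP). The response is *code* that
// the embedding page executes via a <script> tag, e.g.:
//
//   <script src="http://example.com/svc?callback=handleData"></script>
//
// which returns something like:
const jsonpResponse = 'handleData({"user": "alice", "unread": 3})';
// The browser runs that text verbatim, so a compromised or malicious
// service can execute arbitrary script with the page's privileges.
```

The first approach keeps the third party in the role of data provider; the second hands it the keys to your page.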

PS: It looks like Rob Sayre beat me to saying this. :)


Categories: Web Development

March 6, 2007
@ 03:11 PM

I've been thinking about AJAX a lot recently. Between reviewing a book on the subject, reading some of the comments coming out of the Adobe Engage event and chatting with some coworkers at dinner about WPF/E I've had a lot of food for thought.

I'll start with an excerpt from Ted Leung's post entitled Adobe wants to be the Microsoft of the Web

The problem as I see it
I think that a lot (but not all) apps will become RIA’s, and the base platform technology for RIA’s is very important. Too important to be controlled, or designed by any single party. The current vogue toolchain, AJAX, has this property. It also has the property of being a cross platform development nightmare. On the desktop, you commit yourself to a single cross platform library/technology, and then you spend the rest of your time wrestling with it. In AJAX, you have multiple browsers on each platform that you want to support. Not only that, you have multiple versions of each browser.
Enter Flash/Flex. Flash has a great cross platform story. One runtime, any platform. Penetration of the Flash Player is basically the same as penetration of browsers capable of supporting big AJAX apps. There are nice development tools. This is highly appealing.

What is not appealing is going back to a technology which is single sourced and controlled by a single vendor. If web applications liberated us from the domination of a single company on the desktop, why would we be eager to be dominated by a different company on the web?

Most people who've done significant AJAX development will admit that the development story is a mess. I personally don't mind the JavaScript language but I'm appalled that the most state-of-the-art development process I've found is to use Emacs to edit my code, Firebug to debug in Firefox and attaching Visual Studio to the Internet Explorer process to debug in IE. This seems like a joke when compared to developing Java apps in Eclipse or .NET applications in Visual Studio. Given how hypercompetitive the "Web 2.0" world is, I doubt that this state of affairs will last much longer. There is too much pressure on Web companies to improve their productivity and stand out in a world full of derivative YouTube/MySpace/Flickr knockoffs. If one company finds a way to be more productive and/or build richer Web applications the rest of the industry will follow. This is pretty much what happened with Google and AJAX as well as with YouTube and Flash Video. Once those companies showed how much value they were getting from technologies which were once passé, everyone jumped on the bandwagon. This means that it is inevitable that Rich Internet Applications will eventually be built on a platform that provides a better development experience than AJAX does today. The only questions are how quickly this will happen and which technology will replace AJAX.

Ted Leung mentions two contenders for the throne; Flash/Flex and OpenLaszlo. I'll add a third entry to that list, Windows Presentation Foundation/Everywhere (WPF/E). Before discussing what it will take for one of these contenders to displace AJAX, I should point out that being "open" has nothing to do with it. Openness is not a recipe for success when it comes to development platforms. According to TIOBE, Java is the most popular programming language today, and it was a proprietary language tightly controlled by Sun Microsystems. Before that, it was commonly stated that Visual Basic was the most popular programming language, and it was a proprietary language controlled by Microsoft. I believe these count as existence proofs that a popular development platform can rise to the top while being controlled by a single vendor. 

So what will it take for an RIA platform to displace the popularity of AJAX besides being able to build richer user interfaces?

  1. Ubiquity: Over 95% of Web users are using an AJAX capable browser. Any replacement for AJAX must have similar penetration or it's dead in the water. No one wants to turn away customers, especially when its competitors aren't doing anything that stupid. 

  2. Debug Once, Run Anywhere: The biggest problem with AJAX is that it isn't a single development platform. Being able to write an application and debug it once instead of having a different debugging and runtime experience for Internet Explorer, Firefox and Safari is the killer productivity enhancer. Of course, there will always be differences between environments but if we can move to a world where RIA development is more like cross-platform Java development as opposed to cross-platform C++ development (yes, I know that's an oxymoron) then we all win.

  3. Continuum of Development Tools: I don't need expensive development tools to become an AJAX developer; however, if I feel that I need heavy duty tooling I can buy Visual Studio 2005 then download ASP.NET AJAX to get a fancy integrated development environment. Any platform that expects to replace AJAX needs to have a continuum with high quality, free & Open Source tools on one end and expensive, proprietary and "rich" tools at the other. The Java world, with its culture of Open Source tools like Eclipse, JBoss and Hibernate coexisting with overpriced tools from big vendors like IBM WebSphere and BEA WebLogic, is the best example of this to date. That way the hackers are happy and the suits are happy as well. 
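The cross-browser pain behind point 2 started at the very first line of any AJAX app in this era: even creating the request object required browser-specific branches. A sketch of the classic workaround (the ProgID list reflects common IE6-era fallbacks):

```javascript
// Creating an XMLHttpRequest circa 2007: the standard constructor for
// Firefox/Safari/Opera, with ActiveX fallbacks for older Internet Explorer.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();
  }
  // IE6 and earlier exposed the request object via ActiveX, and even
  // the ProgID varied between MSXML releases.
  const progIds = ["Msxml2.XMLHTTP", "Microsoft.XMLHTTP"];
  for (const id of progIds) {
    try {
      return new ActiveXObject(id);
    } catch (e) { /* try the next ProgID */ }
  }
  throw new Error("No AJAX support in this browser");
}
```

Multiply this kind of branching across event handling, DOM manipulation and CSS quirks and you get the "cross platform development nightmare" Leung describes.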

So far Adobe seems closer than anyone to pulling off the trifecta. In a year or two, things might look different.


Categories: Web Development

For the current release of RSS Bandit we decided to forgo our homegrown solution for providing search over a user's subscribed feeds and go with Lucene.NET. The search capabilities are pretty cool but the provided APIs leave a lot to be desired. The only major problem we encountered with Lucene.NET is that concurrency issues are commonplace. We decided to protect against this by having only one thread modify the Lucene index, since a lot of problems seemed to occur when multiple threads were trying to modify the search index.

This is where programming with Lucene.NET turns into a journey into Daily WTF territory.

WTF #1: There are two classes used for modifying the Lucene index. This means you can't just create a singleton and protect access to it from multiple threads. Instead one must keep instances of two different types around and make sure if one instance is open the other is closed.

WTF #2: Although the classes are called IndexReader and IndexWriter, they are both used for editing the search index. There's a fricking Delete() method on a class named IndexReader.

Code adapted from the Lucene examples (the elided method bodies are reconstructed here):

public void DeleteDocument(int docNum) {
    lock (directory) {
        CreateIndexReader(); // reconstructed: ensure the reader is the open instance
        indexReader.Delete(docNum);
    }
}

void CreateIndexReader() {
    if (indexReader == null) {
        if (indexWriter != null) {
            indexWriter.Close(); // reconstructed: the writer must be closed first
            indexWriter = null;
        }
        indexReader = IndexReader.Open(directory);
    }
}

public void AddDocument(Document doc) {
    lock (directory) {
        CreateIndexWriter(); // reconstructed: ensure the writer is the open instance
        indexWriter.AddDocument(doc);
    }
}

void CreateIndexWriter() {
    if (indexWriter == null) {
        if (indexReader != null) {
            indexReader.Close(); // reconstructed: the reader must be closed first
            indexReader = null;
        }
        indexWriter = new IndexWriter(directory, analyzer, false);
    }
}

As lame as this is, Lucene.NET is probably the best way to add desktop search capabilities to your .NET Framework application. I've heard they've created an IndexModifier class in newer versions of the API so some of this ugliness is hidden from application developers. How anyone thought it was OK to ship with this kind of API ugliness in the first place is beyond me.


Categories: Programming

Every once in a while someone asks me about software companies to work for in the Seattle area that aren't Microsoft, Amazon or Google. This is the first in a series of weekly posts about startups in the Seattle area that I often mention to people when they ask me this question.

The iLike service from GarageBand.com is one of a new breed of "social" music services, a category popularized by Last.fm. The service consists of two primary aspects:

  1. A website where one can create a profile, add friends, view stats about the music you listen to and see what music is popular among iLike users.
  2. An iTunes plugin which recommends songs from signed and unsigned artists based on what you are listening to and also allows you to see what your friends are currently listening to.

I tried the service and definitely like the concept of getting music recommendations from directly within iTunes. The only downside is that you get samples of the recommended songs (probably the same snippets from the iTunes music store) instead of having the entire recommended song streamed to you. I guess that makes sense since it is a free service and likely makes money via an affiliate program. The company recently got a bunch of funding from Ticketmaster so I expect that they will soon start integrating concert ticket recommendations into their user experience which would explain why they require a zip code when signing up for the service.

The president of iLike is Hadi Partovi, who recently left Microsoft for the second time after a stint as a General Manager at MSN, where he greenlighted start.com, which eventually morphed into the live.com personalized page. One of the key developers of iLike is Steve Rider, who was the original developer of start.com.

Press: Seattle Times on iLike

Number of Employees: 25

Location: Seattle, WA (Capitol Hill)

Jobs: jobs@iLike-inc.com, current open positions are for a Web / Server (Ruby) engineer, Software Development Engineer in Test, Web/DHTML engineer, Database engineer, and desktop client engineer


Although this has taken much longer than I expected, the Jubilee release of RSS Bandit is now done and available for all. Besides the new features there are a number of performance improvements, especially with regard to the responsiveness of the application.

Major differences between v1.5.0.10 and v1.3.0.42 below

This release is available in the following languages; English, German, Polish, French, Simplified Chinese, Russian, Brazilian Portuguese, Turkish, Dutch, Italian, Serbian and Bulgarian.

Download the installer from RssBandit1.5.0.10_installer.zip. A snapshot of the source code will be available later in the week as a source code release.

New Features Major Bug Fixes

Categories: RSS Bandit

March 2, 2007
@ 12:23 AM

I just got a phone call from an RSS Bandit user whose daily workflow had been derailed by a bug in the application. It seems that we were crashing with an ArgumentException stating "Argument already exists in collection" when she tried to import an OPML file. This seemed weird because I always make sure to check if a feed URL exists in the table of currently subscribed URIs before adding it. Looking at the code made me even more confused

f1.lastretrievedSpecified = true;
f1.lastretrieved = dta[count % dtaCount];
_feedsTable.Add(f1.link, f1); /* exception thrown here */

So I looked at the implementations of ContainsKey() and Add() in my data structure, which led me to the conclusion that we need better unit tests:

public virtual bool ContainsKey(String key) {			
  return (IndexOfKey(key) >= 0);
}

public virtual void Add(String key, feedsFeed value) {
	if ((object) key == null)
		throw new ArgumentNullException("key");

	/* convert the URI to a canonicalized absolute URI */ 
	try { 
		Uri uri = new Uri(key); 
		key = uri.AbsoluteUri;
		value.link = key; 
	} catch {}

	int index = IndexOfKey(key);

	if (index >= 0)
		throw new ArgumentException(
			"Argument already exists in collection.", "key");

	Insert(~index, key, value);
}

My apologies to any of our users who have been hit by this problem. It'll be fixed in the final release of Jubilee.
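The underlying pitfall is language-agnostic: if Add() canonicalizes the key but ContainsKey() doesn't, the two methods disagree about whether a key is present. A minimal JavaScript sketch of the same bug (the class name and URL canonicalization are illustrative, not RSS Bandit's actual code):

```javascript
// A keyed collection where add() canonicalizes the key but has()
// does not -- the same mismatch behind the RSS Bandit crash.
class FeedTable {
  constructor() { this.map = new Map(); }

  // Canonicalize: lowercase scheme/host, normalize the path, etc.
  canonicalize(url) { return new URL(url).href; }

  has(url) {
    // BUG: checks the raw key, not the canonical form.
    return this.map.has(url);
  }

  add(url, feed) {
    const key = this.canonicalize(url); // canonicalized only here...
    if (this.map.has(key)) {
      throw new Error("Argument already exists in collection.");
    }
    this.map.set(key, feed);
  }
}

// "HTTP://Example.com" and "http://example.com/" canonicalize to the
// same URL, so has() reports the feed as absent...
const table = new FeedTable();
table.add("http://example.com/", {});
console.log(table.has("HTTP://Example.com")); // false
// ...yet table.add("HTTP://Example.com", {}) would throw. The fix is
// to canonicalize in has() exactly the way add() does.
```

A unit test that round-trips a non-canonical URL through both methods catches this class of bug immediately.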


Categories: Programming | RSS Bandit

From the blog post entitled The i'm Initiative and new secret emoticon on the Windows Live Messenger team's blog we learn

Not everyone has the financial ability to give money to the causes they care about. That is where the i'm Initiative steps in - it enables Windows Live Messenger users to make a difference by directing a portion of Messenger's advertising revenue to a cause of their choosing.
Wonderful! How does it work?

  1. Use Messenger 8.1
  2. Add the i'm emoticon to your display name by entering the code of the cause you would like to support 
  3. Send and receive IMs
  4. A portion of the advertising revenue generated by your usage of Messenger will be donated to your cause. So the more IMs you send and receive the more money will be donated to your cause.
How does Messenger even generate revenue/money anyway?

Windows Live Messenger is a free service to users. We do include advertisements in the client that help pay for the service and our salaries. With the i'm Initiative you get to decide where a portion of the revenue goes.

The list of codes to create the emoticon are listed in the blog post. I'm using *9mil in my IM handle. This trend of tying charitable donations to the usage of Windows Live services is interesting. It's kinda cool for our users to feel like they are contributing to the betterment of the world simply by using our software the same way they do every day. Good stuff.


Categories: Windows Live

While I was house hunting a couple of weeks ago, I saw a house for sale that had a sign announcing an "Open House" that weekend. I had no idea what an "Open House" was so I asked a real estate agent about it. I learned that during an "Open House", a real estate agent sits in an empty house that is for sale and literally leaves the door open so that people interested in the house can look around and ask questions. The agent pointed out that with the existence of the Internet, this practice has become outdated because people can get answers to most of their questions, including pictures of the interiors of houses for sale, on real estate listing sites.

This got me to thinking about the Old Way vs. Net Way column that used to run in the Yahoo! Internet Life magazine back in the day. The column used to compare the "old" way of performing a task such as buying a birthday gift from a store with the "net" way of performing the same task on the Web.

We're now at the point in the Web's existence where some of the "old" ways to do things are now clearly obsolete in the same way it is now clear that the horse & buggy is obsolete thanks to the automobile. After looking at my own habits, I thought it would be interesting to put together a list of the top five industries that have been hurt the most by the Web. From my perspective they are

  1. Map Makers: Do you remember having to buy a map of your city so you could find your way to the address of a friend or coworker when you'd never visited the neighborhood? That sucked, didn't it? When was the last time you did that versus using MapQuest or one of the other major mapping sites?

  2. Travel Agents: There used to be a time when if you wanted to get a good deal on a plane ticket, hotel stay or vacation package you had to call or visit some middle man who would then talk to the hotels and airlines for you. Thanks to sites like Expedia the end result may be the same but the process is a lot less cumbersome.

  3. Yellow Pages: When I can find businesses near me via sites like http://maps.live.com and then go to sites like Judy's Book or City Search to get reviews, the giant yellow page books that keep getting left at my apartment every year are nothing but giant doorstops.

  4. CD Stores: It's no accident that Tower Records is going out of business. Between Amazon and the iTunes Music Store you can get a wider selection of music, customer reviews and instant gratification. Retail stores can't touch that.

  5. Libraries: When I was a freshman in college I went to the library a lot. By the time I got to my senior year most of my research was exclusively done on the Web. Libraries may not be dead but their usefulness has significantly declined with the advent of the Web.

I feel like I missed something obvious with this list but it escapes me at the moment. I wonder how many more industries will be killed by the Internet when all is said and done. I suspect real estate agents and movie theaters will also go the way of the dodo within the next decade.

PS: I suspect I'm not the only one who finds the following excerpt from The old way vs. the net way article hilarious

In its July issue, it compared two ways of keeping the dog well-fed. The Old Way involved checking with the local feed store and a Petco superstore to price out a 40-lb. bag of Nutra Adult Maintenance dog food. The effort involved four minutes of calling and a half-hour of shopping.

The Net Way involved electronically searching for pet supplies. The reporter found lots of sites for toys and dog beds, but no dog food. An electronic search specifically for dog food found a "cool Dog Food Comparison Chart" but no online purveyor of dog chow. Not even Petco's Web site offered a way to order and purchase online. The reporter surfed for 30 minutes, without any luck. Thus, the magazine declared the "old way" the winner and suggested that selling dog food online is a business waiting to be exploited.

Yeah, somebody needs to jump on that opportunity. :)


Categories: Technology