July 29, 2005
@ 03:06 PM

Mike Torres has a blog post entitled On "MSN: Social Networking Edition" where he points to news stories about a core piece of a future version of MSN Spaces that he and I have been working on for the past few months. Mike writes

On "MSN: Social Networking Edition"

Wow, is this actually true?  If so, I wonder how this will change the way people find and communicate with others in the future.

Quote (emphasis mine)

Microsoft Monitor: MSN: Social Networking Edition
A forthcoming "friend of friends" feature will add personal networks of friends to a MSN Space. In someone's My Space [sic], there will appear pictures of the friends, which can include friends of other friends. Friends can be added from people known or associated with friends or from MSN Spaces searches. Blake used the example of searching for golf blogs. If friends are on MSN Messenger, an icon indicates so.

More on Blake, Yusuf, MSN Messenger file sharing, and "Microsoft's new Web-based mail system" as well.  It is an interesting read... 

Funniest part of the post: "small consumer adoption [of Spaces]".  Joe is usually quick to take a shot at Microsoft without actually learning the facts first.  Come on, Joe...  I think the most recent public number was 17 million spaces created worldwide.  Compare that to any of our competitors in their first 3 years of existence.  Yeah...  small consumer adoption.  Maybe 17 million Roombas created those spaces :)

Mary Jo Foley has a little bit more here: Microsoft Watch. My favorite part:

"Our ability to enter, differentiate and compete has never been stronger," Mehdi told the Wall Street analysts and media representatives who attended the analyst meeting.

Couldn't agree more

Although I've primarily been talking about my work on getting a public API for MSN Spaces off the ground, I also work on our social software platform on the back end as well. Once we ship the next version I'll be able to talk a bit more about some of the design decisions we made and I'll get to see how users end up utilizing the features we've been working on.

I love my day job.


 

Categories: MSN

I've been reading some of the hype around microformats in certain blogs with some amusement. I have been ignoring microformats, but now I see that some of their proponents have started claiming that using XML on the Web is bad and that HTML is the only markup language we'll ever need.

In his post Why generic XML on the Web is a bad idea, Anne van Kesteren writes

Of course, using XML or even RDF serialized as XML you can describe your content much better and in far more detail, but there is no search engine out there that will understand you. For RDF there is a chance one day they will. Generic XML on the other hand will always fail to work. (Semantics will not be extracted.)

An example that shows the difference more clearly:

<em>Look at me when I talk to you!</em>

… and:

<angry>Look at me when I talk to you!</angry>

The latter element describes the content probably more accurately, but on ‘the web’ it means close to nothing. Because on the web it’s not humans who come by and try to parse the text, they already know how to read something correctly. No, software comes along and tries to make something meaningful of the above. As the latter is in a namespace no software will know and the latter is also not specified somewhere in a specification it will be ignored. The former however has been here since the beginning of HTML — even before its often wrongly considered presentational equivalent i — and will be recognized by software.

This post in itself isn't that bad; if anything it is just somewhat misguided. However, Tantek Celik followed it up with his post Avoiding plain XML and presentational markup, which boggled my mind. Tantek wrote

The marketing message of XML has been for people to develop their own tags to express whatever they wanted, rather than being stuck with the limited predefined tag set in HTML. This approach has often been labeled "plain XML" or "generic XML" or "SGML, but easier, better, and designed just for the Web".

The problem with this approach is that while having the freedom to make up all your own tags and attributes sounds like a huge improvement over the (mostly perceived) limits of HTML, making up your own XML has numerous problems, both for the author, and for users / readers, especially when sharing with others (e.g. anything you publish on the Web) is important.

This post by no means contains a complete set of arguments against plain/generic XML and presentational markup, nor are the arguments presented as definitive proofs. Mostly I wanted to share a bunch of reinforcing resources in one place. Readers are encouraged to improve upon the arguments made here.

The original impetus for creating XML was to enable SGML on the Web. People had become frustrated with the limited tag set in HTML and the solution was to create a language that enabled content creators to create their own tags yet have them still readable in browsers via stylesheet technologies (e.g. CSS). Over time, XML has failed to take off as a generic document format used by content authors for creating human readable documents on the Web but has become popular as a data format used for machine-to-machine communications on the Web (RSS, XML-RPC, SOAP, etc.).

Thus any arguments against XML usage on the Web today are really arguments about using XML as a data format, since it isn't really used as a document format except for XHTML [and even that is only by markup geeks like Tantek & Anne].

Anyway let's look at some of Tantek's arguments against using XML on the Web...

Tower of Babel Problem

If everyone invents their own tags and attributes, pretty soon you get people calling the same thing by different names and different things by the same name. While avoiding both of those occurrences completely is very difficult (many of the microformats principles are designed to help avoid those problems), downright encouraging authors to make up their own tags and attributes makes it much worse and all you end up with are a bunch of documents that give you the illusion of self-description.

Didn't the XML world solve this with XML namespaces like six or seven years ago?
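For what it's worth, namespaces do exactly that: two vocabularies can each define a title element without colliding. Here's a quick sketch using nothing beyond Python's standard library; the namespace URIs are the real Atom and Dublin Core ones, but the sample document itself is made up.

```python
# Two vocabularies both define a "title" element; the namespace URI on each
# element is what keeps software from confusing them.
import xml.etree.ElementTree as ET

doc = """
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:dc="http://purl.org/dc/elements/1.1/">
  <title>An Atom entry title</title>
  <dc:title>Same local name, different vocabulary</dc:title>
</entry>
"""

ATOM = "{http://www.w3.org/2005/Atom}"
DC = "{http://purl.org/dc/elements/1.1/}"

root = ET.fromstring(doc)

# Software that understands Atom asks for the Atom title and never trips
# over the Dublin Core element that happens to share its local name.
print(root.find(ATOM + "title").text)   # -> An Atom entry title
print(root.find(DC + "title").text)     # -> Same local name, different vocabulary
```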

Temptation of Presentational Markup

What happens all too often when authors or developers make up their own tags is that they choose tags that are tightly tied to a specific presentation rather than abstracting them with semantics. Quite similar to the phenomenon of authors picking presentational class names.

As a casual user of HTML, I personally haven't seen a good explanation of why <strong> is better than <b>, so arguments whose entire basis is "presentational markup is evil" don't carry much weight in my book. If I come up with a custom markup format and it has a <bold> element, is that really so evil? I'm pretty sure that the XML formats used by OpenOffice or Microsoft Office contain markup that is presentational in nature, whether it is setting font sizes, text colors or paragraph alignment. Are they evil or does the fact that they aren't intended for the Web give them a pass?

Preferring Semantic Richness

Sometimes something is a bad idea not just in absolute terms, but also relative to other approaches and solutions.

A while ago I wrote about a semantic richness spectrum on the www-style mailing list which went into a bit more detail. Håkon Wium Lie wrote a paper that both predated my rough summary by a couple of years, and provided a much more thorough analysis.

 Languages with well-known semantics are preferred to proprietary/made-up XML. This is for many reasons, including accessibility, cross-device support, and future user agent support.

This seems to be arguing that instead of cooking up your own custom format you should pick an established format with the semantics you want if one exists. This is regularly practiced in the XML world especially when it comes to the Web so I don't see how this is an argument against using XML.

--

Seriously, I feel like I am in some bizarre alternate universe if having aggregators subscribe to HTML web pages is being advocated as a better idea than using specialized XML formats like RSS & Atom.

That's it...I'm going back to my vacation. The world has gone too loopy for me.


 

Categories: XML

It looks like MSN Virtual Earth is live. In an attempt to beat everyone to the punch when it comes to building an application using the MSN Virtual Earth API, I present http://www.25hoursaday.com/moviefinder

It requires JavaScript (duh) and only works in IE because I didn't have time to figure out the Firefox equivalent of innerHTML. Enjoy.

Pardon my brevity; I'm posting this from a kiosk at Heathrow Airport.


 

Categories: MSN | Web Development

July 24, 2005
@ 02:40 AM

I'll be leaving for Nigeria in the next couple of days and should be gone for a few weeks. Going home is always fun: I'll have my mom asking me when I'm going to settle down with a nice Catholic Nigerian girl while my dad wonders when I'm going to stop goofing off at Microsoft and go back to school to finish my education.

Of course, the best part about going on vacation is the mountain of email and spec bugs I know will be waiting for me when I get back. :)

 


 

Categories: Ramblings

A couple of MSN betas snuck out into the world this week.

In his post MSN Shopping Officially Launches New Beta Site, Chris Jolley talks about the new http://beta.shopping.msn.com. Some of the new features include

  • RSS Feeds – it remains to be seen how aggressively consumers take to this, but I think it's super-cool how we are embracing an emerging technology.    
  • Ratings & Reviews - allows consumers to see what other consumers think of a product, providing them with increased confidence in their purchase decisions
  • Ability to browse our complete selection through an easy-to-understand taxonomy and powerful search
  • Refine searches with expanded, relevant criteria
    •  By price, rating, popularity, name, best match
    • Multiple Views - Results can be viewed in three different views
  • Usability enhancements:
    • Recently viewed– helpful little personalization feature that helps continuity between sessions
    • Compare prices across stores, find out who has free shipping, hot deals, etc.
    • Clean and consistent UI – The UI is clean and simple and designed to NOT get in the user's way
  • Useful Merchandising – Comprehensive set of merchandising, including the Gift Center , Seasonal Shops & Guides , and more

I met with the PM who owns the RSS feeds feature a few months ago to talk about my scenarios and it's great to see that it's finally out there. I use the Amazon RSS feeds all the time and can definitely see myself getting some use out of the news feeds from MSN Shopping. All I can say about the ratings & reviews feature is...FINALLY. I've already begun to scheme about how I can convince Mike that we should totally gank (er, learn from) the feature in Yahoo 360° where you can include your reviews on your "space".

The MSN Screensaver beta is also out. Some of the features include

  • Personalize with background photos and news and weather information from MSN or any RSS feeds from websites you choose.
  • Search the Web and click news headlines directly from the Screen Saver.
  • Stay connected with Hotmail, MSN Messenger, and MSN Spaces. Track how many unread Hotmail messages and current Messenger conversations you have, and display blogs and photos from your friends’ MSN Spaces.
Some of the features of the screen saver are so useful I'm wondering why they are wasted on an app that typically runs when the user is away from the computer.

    And finally, http://www.start.com/myw3b has some new features. Sanaz Ahari has a blog post entitled new things on start.com... where she writes

    first and foremost, check out our new search results:
    - we now have tabbed results that include web, news and rss, so you don't need to do separate types of search; we just give you all the results and it's all ajax based of course
    - the rss results are very cool, cause you can just subscribe by clicking on subscribe and they'll get added to your feed
     
    we've also added themes : ice, granite and ocean... ice is my favorite :)
    super easy to navigate, needless to say all in place and no refreshes required.
     
    we got rid of our not so useful home link and folder view - so now when you click on a folder we don't replace your dashboard with the content of that folder... yes, we listen to our user feedback :)
     
    lastly you can now hit the esc button to close your overlay modules - super usable...

    There's a followup post on the start.com weblog entitled more on start: OPML support and popular feeds which states that they added support for importing OPML files as well.  

    Postscript: Anyone notice what all three of these betas have in common? (Hint: Starts with R ends with S and has an S in the middle).


     

    Categories: MSN

    Shelley Powers posted a comment to a recent post of mine about her experience interviewing for a Microsoft position. She wrote

    Tod, if I sound angry, I am. Bill Gates went in front of Congress and said we should have unlimited H1Bs, because Microsoft just can't find 'good' people in this country.

    I interviewed with Microsoft. As soon as I started talking with the guy, I knew he wasn't interested. Within the first five minutes. I've been around long enough to know when someone isn't interested.

    Microsoft doesn't hire 50 year old men, much less 50 year old women.

    As for interviewing, yes I have interviewed people. And hired them. I've never once had a bad hire. You can tell when you talk to people, their enthusiasm, how they respond to what you say, if they're a good fit. Do they have to have all of the tech background? Not a bit. Oh, I wanted certain things: interest and background with a specific programming language, experience with relational databases, and so on. But as for the nits, most of us can do something called 'learn'. I looked for motivation, interest, fit with the team, experience, but more importantly interest in the job.

    I can understand Shelley's frustration around not being made an offer for a position she feels qualified to handle. Last year, when I decided to leave the XML team I interviewed with a bunch of teams at Microsoft including the newly formed RSS team within the Internet Explorer group. The job would have involved community evangelization around the Longhorn/IE RSS efforts and working on platform APIs for RSS. Given my work on RSS Bandit and the fact that the job I was leaving was designing the core XML APIs in the .NET Framework I thought I was a shoo-in for the job. I wasn't. I didn't get an offer because I wasn't "passionate" or "experienced" enough according to the feedback I got when I inquired as to why I didn't get the job.

    I could have decided to give up because "Microsoft doesn't hire black men" or "Microsoft doesn't hire people in their 20s for high visibility positions" but didn't. Instead I kept interviewing and ended up at MSN working on social software. Personally I think I'm happier at MSN than I would have been on the RSS team, I've gotten to broaden my technical horizons and work with a more diverse set of individuals on a prettier campus. Also I still get to work on RSS stuff both in my personal time and with different teams at Microsoft in an advisor role.

    It's easy to give in to despair. Don't.


     

    Categories: Life in the B0rg Cube

    Jeff Schneider has a post entitled Hey, Don Box where he asks

    All of us in the SOA blogosphere have appreciated your input and insight. However it has come to my attention that Microsoft isn't practicing what they're preaching. I'm hoping that you can tell me that my information is dead wrong.

    I ask one simple favor. Start blogging about how Microsoft is using SOA internally.

    Don works on a platforms team so he is the wrong person to ask about first hand experience at actually deploying XML web services or service oriented applications at Microsoft. As for whether Microsoft uses services internally, this is all we do at MSN. I'm responsible for underlying platform services that are used by MSN Messenger, Hotmail, and MSN Spaces. The fact that you can lock down your space to distinct people in your Hotmail address book or the people who are on your Messenger Allow List (i.e. who you've given access to see your presence online) are all due to our service oriented architecture on our back ends. This would be a lot more difficult if each of these applications was a monolithic app on our back end.

    I'm also responsible for another set of services which are used by MSN Member Directory, MSN Groups, and even Windows MarketPlace, which I found out recently are going to be used by a lot more MSN properties without them ever having to ping me. When services work the way they are supposed to, that's what happens.

    To answer Jeff's question, at least with MSN, Microsoft is definitely practicing what it preaches when it comes to service orientation.


     

    Categories: MSN | XML Web Services

    In a post entitled Atom 0.3 Denouement, Sam Ruby begins his advocacy for developers to stop supporting Atom 0.3 and states his intent to start flagging such feeds as invalid in the Feed Validator come the fall. I had planned to avoid blogging about his post until I saw the following comment by Mark Pilgrim where he wrote

    Atom 1.0 will shortly be an IETF RFC, which makes it as much of a web standard as HTTP.  Atom 0.3 was just some guys (and gals) dicking around on a wiki.  As it turned out, some guys dicking around on a wiki were able to produce a relatively decent standard, but that isn't saying much given the competition.  Atom 1.0 is a great standard, worthy of the label and worthy of being pushed by standards advocacy groups like WaSP...

    Although what Mark Pilgrim has written is factual, it is misleading as well. Although Atom 0.3 was not backed by a standards body (and neither was any flavor of RSS, by the way), it still became a de facto standard thanks to the advocacy of people like Mark & Sam. Specifically, once Google decided to switch their feeds to Atom 0.3, they used their power as a dominant content producer to force every major aggregator to support Atom 0.3.

    At the time I blogged about how this was a stupid thing to do since it basically guaranteed that there would be two conflicting versions of Atom for the immediate future. Now there are hundreds of thousands to millions of aggregator users who will potentially be screwed when Google decides to switch to Atom 1.0. These end users are sacrificial pawns in what has basically been a battle of [male] geek egos over whether a blog post in an XML feed should be contained in an element named atom:entry or item.
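    For anyone who hasn't followed the argument, here is roughly what the difference amounts to from an aggregator's point of view. This is a sketch only: the RSS 2.0 shape and the Atom 0.3/1.0 namespace URIs are the real ones, but none of the error handling a shipping aggregator needs is here.

```python
# The same concept -- "one post in a feed" -- lives under different element
# names and namespaces depending on which format the publisher picked.
import xml.etree.ElementTree as ET

ATOM_10 = "{http://www.w3.org/2005/Atom}"   # Atom 1.0 namespace
ATOM_03 = "{http://purl.org/atom/ns#}"      # Atom 0.3 namespace


def posts_in_feed(feed_xml: str):
    """Return the per-post elements from an RSS 2.0, Atom 0.3 or Atom 1.0 feed."""
    root = ET.fromstring(feed_xml)
    if root.tag == "rss":                    # RSS 2.0: <rss><channel><item>
        return root.findall("channel/item")
    if root.tag == ATOM_10 + "feed":         # Atom 1.0: <feed><entry>
        return root.findall(ATOM_10 + "entry")
    if root.tag == ATOM_03 + "feed":         # Atom 0.3: same shape, older namespace
        return root.findall(ATOM_03 + "entry")
    raise ValueError("unrecognized feed format: " + root.tag)
```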

    The only bright light in all this crap is that, a few years after everyone else figured it out, some of these XML syndication geeks are now realizing that instead of arguing over XML element names it is more interesting to figure out what other kinds of data can be syndicated beyond blog posts and news stories. See Danny Ayers's post Brownian Motion and Bill de hÓra on Atoms in a small world for examples of some of the Atom geeks finally getting it.

    Better late than never, I guess.


     

    July 21, 2005
    @ 03:44 PM

    During my morning blog reading I stumbled on three blog postings about Microsoft and recruiting which paint an interesting picture.

    1. From Shelley Powers's When We Are Needed

      This essay was inspired in no small part by a discussion that occurred at Dori Smith's weblog, when she made the statement about women not being able to find work (linked earlier). In her comments, Robert Scoble said:

      Hmmm, at the same time you say the jobs are disappearing I was just talking with a key manager over on MSN Search and he says he is having trouble finding qualified developers in the United States. I also have had the same feedback from the developer division, the IE group, and quite a few others. And if you think this is a Microsoft thing, you should check with HR people at Google, Yahoo, Cisco, and other Silicon Valley companies. They are all having trouble finding great developers.
      I was angry and blasted Scoble's comment, anger inspired in no small part by the implication that corporations such as Microsoft are just begging for people, when most of us know (and as I discussed earlier), this isn't true. Here is a fact: technology unemployment in this country exceeds overall unemployment. And women in technology have an unemployment rate higher than the men.

    2. From Joel Spolsky on June 15, 2005

      Recruiting

      To Gretchen: recruiting successfully isn't only up to recruiters. The best recruiting department in the world can't make people want to work at a company that's moribund, that can't figure out how to ship a compelling upgrade to their flagship OS, or update their flagship database server more than once every five years, that has added tens of thousands of technical workers who aren't adding any dollars to the bottom line, and that constantly annoys twenty year veterans by playing Furniture Police games over what office furniture they are and aren't allowed to have. Summer interns at Fog Creek have better chairs, monitors, and computers than the most senior Microsoft programmers.

      Recruiting has to be done at the Bill and Steve level, not at the Gretchen level. No matter how good a recruiter you are, you can't compensate for working at a company that people don't want to work for; you can't compensate for being the target of eight years of fear and loathing from the slashdot community, which very closely overlaps the people you're trying to recruit, and you can't compensate for the fact that a company with a market cap of $272 billion just ain't going to see their stock price go up. MSFT can grow by an entire Google every year and still see less than 7% growth in earnings. You can be the best recruiter in the world and the talent landscape is not going to look very inviting if the executives at your company have spent the last years focusing on cutting benefits, cutting off oxygen supplies, and cutting features from Longhorn.

    3. From Steven Sinofsky's Welcome to TechTalk!

      I wanted to start this blog to share information and views about how Microsoft recruits and hires college graduates, and what a career at Microsoft is like, at least from one perspective. I invite questions, points, and counter-points. I'm excited to use this forum to have a discussion about college hiring at Microsoft. The name TechTalk comes from the series of seminars we do during the summer for interns at Microsoft--one of the most fun times of the year for me is to get to present to this group and learn from them how they feel about the work we're doing and the future of Microsoft.

      By way of introduction, my name is Steven Sinofsky and I am a senior vice president at Microsoft in the Office group. You can read my "official" bio on http://www.microsoft.com/presspass/exec/ssinofsky. I've worked on Microsoft Office since Office 4.2d (the last 16 bit release). I've been a program manager and a software design engineer, in addition to a general manager.

    Despite what Shelley thinks, I know for a fact that we have a hard time filling positions at work. Whether this is because of a lack of qualified candidates in the US or because the Slashdot crowd hates Microsoft [as Joel puts it] is something I don't know. I do know our product teams spend a lot of time talking to folks who sound good on paper but don't do so hot when we talk to them. As for whether H1B visas are a good thing or not, well, I'm here on an H1B visa so I guess I'm biased. :)

    It is good to see high level execs like Sinofsky getting directly involved in recruiting efforts. One of the things that is missing in the hundreds of Microsoft blogs is a sense of why the Microsoft internships are so cool. Looking back at my blog posts from when I was an intern four years ago it is fun to see how I became infected by the B0rg. Having someone like Sinofsky take part in showing off why it is so cool to be an intern at Microsoft is goodness. Microsoft's best hires are usually folks who started off as interns.

    The Office guys definitely rock.

     


     

    Categories: Life in the B0rg Cube

    It looks like Karen will be attending the BlogHer conference. From her post it seems she'll be part of a session with the following abstract

    Women around the world are leveraging blogs to get their message across - whether it be to share their experiences, promoting their business or voicing their opinions. But changing the blogosphere doesn't just happen from blogging about it - change can also happen from the source - those who are building the tools and software. Technology isn't as male-oriented as you might think.

    Did you know that many of the big-name blogging tools have women helping to design and build them? Ever wonder how decisions get made or why things are designed or work the way they do? We invite women interested in helping to shape blogging tools and those currently building blogging tools to participate in this forum. You don't have to be a techie - share your thoughts and gripes on blogging tools today. Tell us what you want to see happen.

    To many people, the most familiar female face when it comes to blogging tools is Mena Trott. However there are a bunch of women working on building popular blogging platforms such as MSN Spaces who have been quite influential as well. For example, when I joined MSN the Spaces team had women in key positions: Karen owned the blogging experience, Divya owned photos and Lydia owned profiles. Since then some folks have come and gone and although Divya & Lydia are no longer with us we've added Maya who works on cool top secret stuff and DeEtte who now owns the photo experience.

    Although A-list and tech blogs tend to be filled with testosterone-toting geeks pontificating about pointless geekery, this doesn't mean that there haven't been women involved in bringing one of the biggest revolutions in personal publishing to the world.

    Hopefully, the BlogHer conference will be a useful way for some of these women to network and find a way to increase the visibility of their efforts. If that is what they want.


     

    Categories: MSN

    From the Microsoft-Watch article Google Pinches Another Microsoft Exec 

    Google continues to hire away top Microsoft talent. But this time, Microsoft is fighting back. On Tuesday, Google announced plans to open a product research-and-development center in China, and said it was appointing former Microsoft vice president Kai-Fu Lee to head the operation. On Wednesday, Microsoft announced it was filing a lawsuit against Lee and Google, claiming breach of both employee confidentiality and non-compete agreement.
    ...
    Other mid-level Microsoft executives have joined Google over the past couple of years, as well. And Google opened a product-development office in Kirkland, Wash., late last year. Some industry watchers speculated that Google did so in order to attract more hires from Microsoft, which is headquartered in nearby Redmond, Wash.

    It's the mid-level product and program managers whom Microsoft and other tech companies should guard most jealously, said Directions on Microsoft analyst Michael Cherry.

    "While a lot of people make a big thing about the executives and senior vice presidents that leave, these people rarely ship software," Cherry said. "I think it is a bigger issue when the group program managers and program managers with ten plus years of experience silently leave. No one mourns their departure, but these are the people that take the grandiose architectures and wild-eyed visions and actually make them into products that people can use—and do so in a timely manner.

    "The loss of these silent but hard working employees who keep the teams working together may have a bigger effect on the schedules of products like Yukon and Longhorn, and have a bigger long term impact on the company than any of the growing number of VPs and visionary architects," Cherry added.

    From the Seattle PI article Ex-Microsoft exec sued over Google job

    Microsoft also accused Google of "intentionally assisting Lee."

    "Accepting such a position with a direct Microsoft competitor like Google violates the narrow noncompetition promise Lee made when he was hired as an executive," Microsoft said in its lawsuit. "Google is fully aware of Lee's promises to Microsoft, but has chosen to ignore them, and has encouraged Lee to violate them."
    ...
    Tom Burt, a lawyer for Microsoft, said Lee announced Monday that he was leaving for the Google job and had given no indication that he planned to honor an agreement not to work for a direct competitor for one year.

    "To the contrary, they're saying, 'In your face,'" Burt told The Associated Press.

    Google shot back with a statement saying: "We have reviewed Microsoft's claims and they are completely without merit. Google is focused on building the best place in the world for great innovators to work. We're thrilled to have Dr. Lee on board at Google. We will defend vigorously against these meritless claims."

    Um, OK...

     


     

    Categories: Life in the B0rg Cube

    Andy Edmonds has a post over on the MSN Search blog entitled Tagging Feedback at MSN Search where he talks about the internal app he built that is used to track feature requests and bug reports about MSN Search.

    When the MSN Search team gets feedback or bug reports, each one is "tagged" with multiple keywords/categories which can then be analyzed later by frequency. The example Andy shows in his post is the tag "ypResults", which is used to categorize feature requests for yellow page hits as part of web search results. With this system the search team has a simple yet effective way to keep track of their most hot-button issues.

    Andy showed this to me a few months ago and I thought it was really cool. I'd have loved to have a system like this when I used to work on the XML team to figure out what features/bugs were most often requested by users in a quantitative way.

    Below is a screenshot of the feature (names changed to protect the innocent)


     

    Categories: MSN

    I was at the Anger Management 3 concert last night and it was quite the show. Lil' Jon & The Eastside Boyz were a welcome surprise as the opening act. They cycled through the BME clique hits from "Get Low" to "Salt Shaker" for the 30 minutes they were on stage. The problem with Lil' Jon is that most of the hits you associate him with are collaborations, so at concerts you end up with half the performance not being live for songs such as "Yeah!" or "Lovers & Friends".

    The next set had the entire G-Unit record label including newly signed acts like Mobb Deep & M.O.P. performing for just over an hour. The first part of the G-Unit set sucked because we had to sit through the crap singles from Tony Yayo's, Young Buck's and Lloyd Banks' solo efforts as well as some of the crud from The Massacre. Halfway through it picked up with the better songs from The Massacre (Disco Inferno, Candy Shop), old hits from Get Rich or Die Tryin' (P.I.M.P., In Da Club, Wanksta) and G-Unit's Beg for Mercy (I Wanna Get To Know Ya). M.O.P. did their hit from a few years ago, "Ante Up", and Mobb Deep hit the crowd with "Quiet Storm" without Lil' Kim. There was a momentary infusion of crap when a lot of time was devoted to a new 50 Cent & Mobb Deep song but the show got back on track after that. The G-Unit set was OK but I'd have loved to hear some of their mix tape cuts instead of just mainstream tracks.

    Eminem killed. He made the concert go from OK to fantastic with almost an hour and a half of performances from himself and D12. Even 50 Cent got in on the act when they performed "Patiently Waiting" and "Gatman & Robin". The parts of the show where Eminem riffed with the audience about tabloids, Mariah Carey and Michael Jackson were also golden.

    If this show is going to hit your town you should definitely check it out.


     

    Categories: Music

    My buddy Erik Meijer and Peter Drayton have written a paper on programming languages entitled Static Typing Where Possible, Dynamic Typing When Needed: The End of the Cold War Between Programming Languages. The paper is meant to seek a middle ground in the constant flame wars over dynamically typed vs. statically typed programming languages. The paper is pretty rough and definitely needs a bunch of work. Take the following excerpt from the first part of the paper

    Static typing fanatics try to make us believe that “well-typed programs cannot go wrong”. While this certainly sounds impressive, it is a rather vacuous statement. Static type checking is a compile-time abstraction of the runtime behavior of your program, and hence it is necessarily only partially sound and incomplete. This means that programs can still go wrong because of properties that are not tracked by the type-checker, and that there are programs that while they cannot go wrong cannot be type-checked. The impulse for making static typing less partial and more complete causes type systems to become overly complicated and exotic as witnessed by concepts such as "phantom types" and "wobbly types"
    ...
    In the mother of all papers on scripting, John Ousterhout argues that statically typed systems programming languages make code less reusable, more verbose, not more safe, and less expressive than dynamically typed scripting languages. This argument is parroted literally by many proponents of dynamically typed scripting languages. We argue that this is a fallacy and falls into the same category as arguing that the essence of declarative programming is eliminating assignment. Or as John Hughes says, it is a logical impossibility to make a language more powerful by omitting features. Defending the fact that delaying all type-checking to runtime is a good thing, is playing ostrich tactics with the fact that errors should be caught as early in the development process as possible.

    We are interested in building data-intensive three-tiered enterprise applications. Perhaps surprisingly, dynamism is probably more important for data intensive programming than for any other area where people traditionally position dynamic languages and scripting. Currently, the vast majority of digital data is not fully structured, a common rule of thumb is less than 5 percent. In many cases, the structure of data is only statically known up to some point, for example, a comma separated file, a spreadsheet, an XML document, but lacks a schema that completely describes the instances that a program is working on. Even when the structure of data is statically known, people often generate queries dynamically based on runtime information, and thus the structure of the query results is statically unknown.
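    To make that last point concrete, here's a tiny illustration: the field names of this record set exist only at runtime, read from the data itself, so no static type could have described the rows in advance. The file contents and field names are invented for the example.

```python
# The column names come from the first line of the data at runtime, so the
# shape of each row is unknown until the file is actually read.
import csv
import io

data = io.StringIO("name,price,in_stock\nwidget,9.99,yes\ngadget,24.50,no\n")

for row in csv.DictReader(data):
    # row is a plain dict keyed by whatever the header line contained;
    # a statically typed view of it would have to be generated or inferred.
    print(row["name"], float(row["price"]), row["in_stock"] == "yes")
```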

    The comment about making programming languages more powerful by removing features being a logical impossibility seems rather bogus and out of place in an academic paper. Especially when one can consider the 'removed features' to be restrictions which limit the capabilities of the programming language.

    I do like the fact that the paper tries to dissect the features of statically and dynamically typed languages that developers like instead of simply arguing dynamic vs. static as most discussions of this sort do. I assume the purpose of this dissection is to see if one could build a programming language with the best of both worlds. From personal experience, I know Erik has been interested in this topic from his days.

    Their list of features runs the gamut from type inference and coercive subtyping to lazy evaluation and prototype inheritance. Although the list is interesting I can't help but think that Erik and Peter already came to a conclusion and tried to fit the list of features included in the paper to that conclusion. This is mainly because a lot of the examples and features are taken from Cω instead of popular scripting languages.

    This is definitely an interesting paper but I'd like to see more inclusion of dynamic languages like Ruby, Python and Smalltalk instead of a focus on C# variants like Cω. The paper currently looks like it is making an argument for Cω 2.0 as opposed to real research on what the bridge between dynamic and static programming languages should be.


     

    Categories: Technology

    Robert Scoble has posted a series of entries comparing the Bloglines Citations feature with Technorati.com for finding out how many sites link to a particular URL. His conclusion seems to be that Technorati sucks compared to Bloglines, which has led to an interesting back & forth discussion between him and David Berlind.

    I've been frustrated by Technorati.com for quite a while and have been quietly using Bloglines Citations as an alternative when I want to get results from a web search and PubSub for results I want to subscribe to in my favorite RSS reader. Technorati seems to lack the breadth of either service when it comes to finding actual blog posts that link to a site, and unlike Technorati, neither of them brings up unrelated crap such as blogrolls in its results.

    The only problem with Bloglines is that their server can't handle the load and the citations feature is typically down several times during the day. Technorati has also had similar problems recently.

    At this point all that Technorati seems to have going for it is first mover advantage. Or is there some other reason to use Technorati over competitors like Bloglines or PubSub that I've missed?


     

    From Omar's post on Sender ID I see that Forbes has an article entitled Microsoft, Yahoo! Fight Spam--Sort Of. The article gives a pretty even-handed description of the various approaches both Yahoo! and MSN are taking in dealing with phishing and spam.

    In the article we learn

    While some e-mail services have adopted SenderID, there are still many that have not. According to Cox, the other reason for the false positives is that not all users remain on a single server. “SPF says, ‘All of my mail should come from these servers,’” says Cox. For many of EarthLink’s customers, they can be legitimately on a variety of servers, such as a corporate server, and still send and receive mail using their EarthLink address. For those users, SPF fails.

    EarthLink started testing DomainKeys in the first quarter of 2005 and now signs over 70% of all outgoing mail. Other companies are also testing DomainKeys. Yahoo! Mail claims to be receiving approximately 350 million inbound DomainKeys signed messages per day.

    Critics have accused Microsoft of forcing SenderID on the industry without addressing questions about perceived shortcomings. The company drew fresh criticism recently when reports claimed that its Hotmail service would delete all messages without a valid SenderID record beginning in November. While AOL uses SPF, many e-mail systems do not. If Microsoft went through with this, for example, a significant portion of valid e-mails would never reach intended Hotmail recipients.

    Microsoft says that Hotmail will not junk legitimate e-mail solely because the sending domain lacks an SPF record. The company says SenderID will be weighed more heavily in filtering e-mails, but will remain one of the many factors used when evaluating incoming e-mail. The company did say that with increased adoption of Sender ID and SPF, it will eventually become a more reliable indicator.

    Both SenderID and DomainKeys filter messages with spoofed e-mail addresses in which the sender has changed the "From:" field to make it look like someone else has sent the e-mail. For example, many phishing scams come from individuals posing as banks. Under the SenderID framework, if the bank has published an SPF record, the receiving server can compare the originating server against the SPF record. If they don’t match, the receiving server flags it as spam. DomainKeys perform a similar comparison but use an encrypted key in each message and the public key unique to each domain to check where the message originated.

    The amount of phony email I get per week claiming to be from Paypal & eBay and requesting that I 'confirm my account info or my account will be cancelled' is getting ridiculous. I welcome any technology that can be used to fight this flood of crap.
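    For the curious, here's a rough sketch of the SPF half of the comparison the article describes. It uses the third-party dnspython package, handles only the simple ip4/ip6 mechanisms (a real validator implements the whole spec, with include:, a:, mx: and redirect terms), and the domain and address in the example are placeholders.

```python
# Naive SPF-style check: does the domain's published policy list the server
# that actually delivered the message?
import ipaddress

import dns.exception
import dns.resolver  # pip install dnspython


def naive_spf_check(sender_domain: str, connecting_ip: str) -> str:
    """Return 'pass', 'fail' or 'none' for a deliberately simplified check."""
    try:
        answers = dns.resolver.resolve(sender_domain, "TXT")
    except dns.exception.DNSException:
        return "none"

    # Find the domain's SPF policy among its TXT records.
    policy = None
    for rdata in answers:
        text = b"".join(rdata.strings).decode()
        if text.startswith("v=spf1"):
            policy = text
            break
    if policy is None:
        return "none"   # No policy published; the receiver falls back to other signals.

    ip = ipaddress.ip_address(connecting_ip)
    for term in policy.split()[1:]:
        if term.startswith(("ip4:", "ip6:")):
            network = ipaddress.ip_network(term.split(":", 1)[1], strict=False)
            if ip in network:
                return "pass"   # Sent from a server the domain vouches for.
    return "fail"               # Not listed; filters weigh this against the message.


if __name__ == "__main__":
    print(naive_spf_check("example.com", "192.0.2.1"))  # placeholder values
```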


     

    Categories: MSN

    July 17, 2005
    @ 05:54 AM

    From Tim Bray's post entitled Atom 1.0 we learn

    There are a couple of IETF process things to do, but this draft (HTML version) is essentially Atom 1.0. Now would be a good time for implementors to roll up their sleeves and go to work.

    I'll add this to the list of things I need to support in the next version of RSS Bandit. The Longhorn RSS team will need to update their implementation as well. :)

    I couldn't help but notice that Tim Bray has posted an entry entitled RSS 2.0 and Atom 1.0, Compared, which is somewhat misleading and inaccurate. I find it disappointing that Tim Bray couldn't simply announce the upcoming release of Atom 1.0 without posting a FUD-style anti-RSS post as well.

    I'm not going to comment on Tim Bray's comparison post beyond linking to other opinions such as those from Alex Bosworth on Atom Failings and Don Park on Atom Pendantics.


     

    The list of PDC 2005 sessions is out. The website is rather craptacular since I can't seem to link directly to search results or directly to sessions. However, thanks to some inside information from my man Doug, I found that if you search for "POX" in the session track list, you'll find the following session abstract

    Indigo: Web Services for XML Programmers
    If you love XML, you will love this session. Learn how to write services that range from Plain Old XML (POX) to WS-I Basic Profile and WS-* using Indigo. Learn best practices for transforming and manipulating XML data as well as how and when to expose strong-typed views. If you use XML, XSLT, XSD, and serialization directly in your Web services today, this session offers the information you need to successfully migrate your services to Indigo.
    Session Level(s): 300
    Track(s): Communications

    Microsoft's next generation development platforms are looking good for web developers. AJAX support? check. RSS support? check. And now it looks like the Indigo folks will be enabling developers to build distributed applications on the Web using plain old XML (POX) over HTTP as well as SOAP. 

    A number of popular services expose APIs on the Web using POX (although they mistakenly call them REST APIs). In my post Misunderstanding REST: A look at the Bloglines, del.icio.us and Flickr APIs I pointed out that the Flickr API, del.icio.us API and the Bloglines sync API are actually examples of POX web services not REST web services. This approach to building services on the Web has grown increasingly popular over the past year and it's great that Microsoft's next generation distributed computing platform will support this approach.
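    To be concrete about what "POX over HTTP" means, here's a minimal sketch: an HTTP GET that returns an XML document which the client parses directly, with no SOAP envelope or WS-* machinery in sight. The endpoint URL and the shape of the response are invented for illustration, not taken from any of the services named above.

```python
# Plain old XML over HTTP: fetch a URL, parse the XML it returns.
import urllib.request
import xml.etree.ElementTree as ET

URL = "https://api.example.com/v1/posts/recent"   # hypothetical POX endpoint

with urllib.request.urlopen(URL) as response:
    root = ET.fromstring(response.read())

for post in root.findall("post"):                 # hypothetical response element
    print(post.get("href"), post.get("description"))
```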

    I spent a bunch of time convincing the Indigo folks to consider widening their view of Web services and, thanks to open-minded folks like Doug, Don & Omri, it looks like I was successful.

    Of course, it isn't over yet. The icing on the cake would be the ability to get full support for using REpresentational State Transfer (REST) in Indigo. Wish me luck. :)

    Update: I was going to let the Indigo guys break this themselves but I've been told that it is OK to mention that there will be first class support for building REpresentational State Transfer (REST) web services using Indigo.


     

    Categories: XML Web Services

    July 13, 2005
    @ 01:36 PM

    I stumbled on Bus Monster last week and even though I don't take the bus I thought it was a pretty cool application. There's a mapping application that I've been wanting for a few years and I instantly realized that given the Google Maps API I could just write it myself.

    Before starting I shot a mail off to Chandu and Steve on the MSN Virtual Earth team and asked if their API would be able to support building the application I wanted. They were like "Hell Yeah" and instead of working on my review I started hacking on Virtual Earth. In an afternoon hacking session, I discovered that I could build the app I wanted and learned new words like geocoding.

    My hack should be running internally on my web server at Microsoft before the end of the week. Whenever Virtual Earth goes live I'll move the app to my personal web site. I definitely learned something new with this application and will consider Hacking MSN Virtual Earth as a possible topic for a future Extreme XML column on MSDN. Would anyone be interested in that?


     

    Categories: MSN | Web Development | XML

    Sometime during the past week, the number of downloads of RSS Bandit from SourceForge crossed 100,000 for the most recent release and 300,000 total downloads since the project moved to SourceForge a year and a half ago. This isn't bad for a project that started off as a code sample in an MSDN article a few years ago.

    However, even though Torsten and I have been improving the original code for about two years now, there is still a bunch of work to do. Some of these areas for improvement were recently pointed out by Jack Vinson in his posts RSSBandit Thoughts and More RSSBandit Experience. Below are his comments and responses from me.

    "Next unread item" means the oldest unread item, rather than the youngest.  This seems to run counter to most of the aggregators, which present the newest unread item.  Interestingly, the "newspaper" view shows items in reverse chronological order.

    I like to read posts in the order they were written, especially when a newer post might be a follow-up to an older post [as is the case with the latter post by Jack]. In general, I don't think anyone has really complained about this before.

    Space bar goes to "next unread," rather than doing a "scroll" in the current reading pane window when viewing in newspaper mode.  If the reading pane has focus, it will scroll there.  When reading a single post, it does scroll as expected.

    The behavior of going to the next unread item on hitting the space bar predates the newspaper view. The problem we had when coming up with newspaper views was how to integrate both features in a way that was intuitive. The main issue is that if the space bar scrolls you through the newspaper and you scroll halfway down then click somewhere else, do you expect that half the posts from the newspaper view should be marked as read or stay unread? We didn't have a good idea of what the right choice would be so we punted on the problem by not scrolling in the newspaper view when you hit space but instead keeping the old "Next Unread" behavior.

    RSS Bandit is much more sensitive to errors in the feeds - more accurately, it tells me that there are errors in some feeds.  They provide a "feed error" folder that lists problems as they arise.  But I see that the feeds it has trouble with work fine elsewhere.  Not good.

    Some of the errors we report really aren't worth showing to end users. Things like HTTP timeouts and the like are really transient issues that are more likely due to the user's network than a problem with the feed. We need to do some filtering of these errors in future releases.

    I can't get the full text on excerpt-only feeds.  This is probably the biggest loss of moving from the old reader.

    If the feed only has excerpts, how do we get the full text of the entry?

    I like the newspaper view, when I select a folder (they call them "categories").  Articles are listed in descending order, but are grouped by feed.  I don't quite understand how the feeds are sorted (it's not by the feed with the most recent article is at the top.)  This is a handy mode for reading unread stuff once or twice a day.

    I like this feature as well. In the next version we'll be adding the ability to flag or mark items as read/unread from the newspaper view. The feeds should be sorted by the order they appear in the tree view.

    RSS Bandit is a stand-alone application, but it uses the Internet Explorer engine to render HTML and XSLT.  By default, it opens links in tabs within the app.  You can also have it open links in the default browser.  I like the tabs in the application.  Now I need to find out if there are keyboard shortcuts for navigating the tabs.

    Tabbed browsing is definitely cool. You can navigate between tabs by pressing the Ctrl+Tab or Shift+Ctrl+Tab keys. It's pretty sweet.

    The BlogJet This plug-in works in the reading windows.  But the BlogJet This plug-in for IE does not work in the tabs that open within RSS Bandit

    Weird. I'm not sure why this is the case but can look into it.

    Email this only emails the URL of the post.  I'd rather it give the entire text (HTML) of the item (along with the URL). 

    I've kind of wondered about this myself but since no one has ever really complained I never changed it. Are there other RSS Bandit users out there that would prefer that "Email This" sent the body of the post and not just the URL as it does today?

    I'm not quite clear on how the user interface is responding. Sometimes I will select a folder/category that has updated feeds, and I will get a view that lists just the new entries. Other times the newspaper will show both new and old entries. The topic list always shows both the new and old.

    For search folders the newspaper view shows all the items in the folder while for regular feeds or folders/categories it shows the unread items.

    One can create search folders to display ONLY unread messages, for example.
    It seems slow, but this is my complaint with many of these apps. Maybe I just read too many feeds. Marking about 80 unread items read (when in the "unread view") took quite a while. Even 28 unread items took 10-15 seconds to "process." This seems to be a memory issue, because the next time I hit "mark all read" in the same usage session, it is much faster.

    I agree that it does seem to take far too long for an operation like "Mark All Read" to be performed in a search folder. I'll work on improving the performance of this for the next version.

    There seems to be no easy way to tell the software that I'm offline and to not bother downloading.

    Go to the File menu and select "Work Offline". We also detect if you select this option directly from Internet Explorer as well.

    When it's checking feeds, it eats a lot of resources. So much so, that I can't even scroll the current window, much less select a new feed to read. (Outlook has been doing the same thing to me lately.)

    Downloading feeds is pretty CPU intensive for us. Not because of the actual downloading of the files but because we run the algorithm that infers relationships across different posts so we can show them as threaded conversations. I hacked on this code during the last release but only made it slightly less CPU intensive. I've considered just having an option to turn off this feature for the folks who'd rather have a more responsive UI than the threaded conversation feature.


     

    Categories: RSS Bandit

    July 12, 2005
    @ 02:14 PM

    My usage of Wikipedia as an online reference has continued to grow over the last couple of months. Although in general most of the entries are well written, I've seen a couple that need a bunch of work. Some of these I could actually help with but just don't have time right now. I have a list of these entries and thought it might make sense to post them on my blog on the off chance that someone reading my blog may be interested in updating them.

    The entries I'd like to update if I had some time are

    All of these are entries that I consider incomplete and believe I could flesh out. Also some of the writing is extremely sloppy but I guess that's what you get when anyone with a pulse can edit an entry. 


     

    Categories: Ramblings

    A recent comment left in my blog by someone named Nigel states "Not only is Microsoft unable to create technological tidal waves, it constantly misses the waves produced by others. Aren't you guys learning from the past?"

    After watching a number of recent trends in the software industry I've been struck by how many of them were originally started by Microsoft but later abandoned only to be picked up by the rest of the software industry a few years later.

    EXHIBIT A - XML Syndication (CDF & ActiveDesktop)
    Content syndication using RSS has emerged as the next big thing. We have Apple's iTunes supporting podcasting, VC funds dedicated to funding startups building on RSS and practically every news organization on the Web sporting RSS feeds.

    However the basic approach and technology behind RSS and XML content syndication was originally proposed by Microsoft with its Channel Definition Format (CDF) and ActiveDesktop technology. As with most aspects of the push technology fad of the late 1990s, usage of the technology languished. However CDF did inspire Dave Winer and Netscape to become interested in content syndication using XML. In 2000, Dan Brickley sent a mail to the RSS-DEV mailing list entitled RSS-Classic, RSS 1.0 and a historical debt which points out that the syndication formats created by Dave Winer & Netscape owed a lot to CDF.

    Of course, the original killer app for RSS has been blogging. Without the rise of blogging it is unlikely that RSS would be as popular as it has become today.

    EXHIBIT B - AJAX (DHTML & XMLHTTP):
    Another popular trend on the Web today is using DHTML and server callbacks to build Web applications. This approach has recently been named Asynchronous Javascript & XML, or AJAX for short. This trend really became hot after Jesse James Garrett penned his article Ajax: A New Approach to Web Applications, which highlighted the power of applications built using this approach.

    As Adam Bosworth points out in his post AJAX Reconsidered and Scott Guthrie in his post Atlas Project, the basic building blocks of AJAX (DHTML & the XMLHTTP object) were invented by Microsoft. However, as both Adam & Scott point out, the primary focus of building AJAX applications at Microsoft was on targeting business customers with applications like Outlook Web Access. Eventually interest in building rich internet applications at Microsoft swung towards XAML and Avalon and away from DHTML.

    Until Google unleashed GMail, Google Maps and Google Suggest on the Web. Now AJAX is the new hotness.  

    EXHIBIT C - Web APIs & Web 2.0 (Hailstorm)
    If you hang around web development pundits long enough, you'll eventually hear the phrase "Web 2.0". This is a moniker for the increasing trend of treating web sites as web platforms. Every cool website has APIs these days. At the Google Web APIs page you can find APIs for Google Maps, Google Search and Google AdSense. At the Yahoo! Developer Network you can find APIs for Yahoo! Maps, Yahoo! Search, Flickr & Yahoo! MyWeb. On the Amazon Web Services page you can find APIs for creating and searching listings on Amazon. At the eBay Developer Program you can find the same for eBay. Smaller sites also have APIs, as evidenced by the del.icio.us API, Bloglines API or the 43 Things API. Then there are all the weblogging APIs and RSS feeds out there that allow users to create and consume content outside of the traditional window of the Web browser.

    Turning web sites into web platforms that can be interacted with from any platform running on any device was a key part of the original vision behind Microsoft's Hailstorm initiative. However there were other parts of the initiative that didn't sit well with potential customers and it was quietly abandoned.

    LESSONS LEARNED?
    I'm not sure whether the aforementioned trends count as "technological tidal waves" but they are definitely significant to how developers and end users utilize the Web. In all three situations Microsoft started with a vision that was quite innovative, hit some roadblocks and scrapped initiatives instead of changing tactics. Eventually our competitors learn from our mistakes and make us look late to the party when we finally get over our initial failure enough to try again.

    I suspect that in a few years a fourth example, comparing Passport to efforts such as the Liberty Alliance, will belong on this list. Then again, from reading Kim Cameron's blog it seems that we are trying to switch tactics in the digital identity space instead of giving up. That is a welcome change.

     

    A few months ago I wrote a blog post entitled What Blog Posting APIs are supported by MSN Spaces?, which explained the various options we saw for providing an API that would allow desktop tools and web applications to interact with MSN Spaces programmatically. Since I wrote that post, I've had a number of people inquire about when we'd provide an API and what form it will take.

    Our current plan is to provide an implementation of the MetaWeblog API, along with some methods from the Blogger API, using HTTPS/SSL for security. These APIs are widely supported by various weblog applications and already have a vibrant developer ecosystem around them. The API will enable people to create, edit and delete posts on the blog in their space. One of our goals is to ensure that bloggers who are already using blog posting tools such as Blogjet or w.bloggar can use them to interact with MSN Spaces when we launch the API without having to upgrade or switch clients. Similarly, web sites which allow users to post to their blogs, such as Flickr and Zoto, should be able to support our API without making significant changes.
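
    For developers who haven't worked with these APIs before, here is a rough sketch of what a metaWeblog.newPost call looks like on the wire. The endpoint URL, blog id and credentials below are placeholders rather than actual MSN Spaces values; the method name and parameter order (blogid, username, password, content struct, publish flag) come from the MetaWeblog API spec, and in practice the XML-RPC library in your blogging tool builds this payload for you.

    // Placeholder endpoint -- not the actual MSN Spaces API URL
    var endpoint = "https://example.com/metaweblog";

    // XML-RPC payload for metaWeblog.newPost(blogid, username, password, struct, publish)
    var payload =
      '<?xml version="1.0"?>' +
      "<methodCall><methodName>metaWeblog.newPost</methodName><params>" +
      "<param><value><string>MyBlogId</string></value></param>" +    // blogid (placeholder)
      "<param><value><string>someuser</string></value></param>" +    // username (placeholder)
      "<param><value><string>secret</string></value></param>" +      // password (placeholder)
      "<param><value><struct>" +
      "<member><name>title</name><value><string>Hello from the API</string></value></member>" +
      "<member><name>description</name><value><string>Posted via metaWeblog.newPost</string></value></member>" +
      "</struct></value></param>" +
      "<param><value><boolean>1</boolean></value></param>" +          // publish immediately
      "</params></methodCall>";

    fetch(endpoint, {
      method: "POST",
      headers: { "Content-Type": "text/xml" },
      body: payload
    }).then(function (response) {
      return response.text();
    }).then(function (xml) {
      // A successful call returns a methodResponse whose value is the new post's id
      console.log(xml);
    });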

    The launch date of the API is yet to be determined but will be in the near future. In the meantime, we'd like to get developers of blog posting tools and web applications that would like to integrate with MSN Spaces into a beta program to test our implementation of these APIs. If you are a developer of a blog posting tool or web application that wants to use our API and don't mind signing an NDA, then you should send me mail at dareoATmicrosoftDOTcom to get into our beta program.

    If you are interested in us providing other APIs, such as allowing programmatic access to the photo albums or the various lists on a space, I'd also like to hear from you. Please send me mail about your scenario and what platform/device your application will be running on.


     

    Categories: MSN

    My friend Kitty has been working on a bunch of cool projects at work over the past year. She was instrumental in the recently announced PC-to-Mobile Instant Messaging Between MSN Messenger and Vodafone Messenger. Her most recent project as part of our team has been working with other folks at MSN to get http://rockstar.msn.com/ launched.

    The details are in the recent press release MSN Launches Official Web Site for Mark Burnett Productions’ “Rock Star: INXS,” Giving Viewers New Ways to Engage With Reality Show, which is excerpted below

    MSN is giving fans new ways to take part in the reality-show craze by launching http://rockstar.msn.com , the official Web site for Mark Burnett Productions’ "Rock Star: INXS." The show, which aims to find a new lead singer for the multiplatinum rock band INXS, premieres July 11 on the CBS Television Network in the U.S... Rockstar.msn.com extends and enhances “Rock Star: INXS” by giving fans unique opportunities to connect with the contestants and with one another
    ...
    Beginning July 11, viewers can do the following:

    • Vote for their favorite contestants through http://rockstar.msn.com and MSN® Messenger, which allows people to vote while chatting with friends about the show in real time. Wireless voting also will be available.

    • Watch exclusive "Rock Star: INXS" video not seen on TV, available only through MSN Video.

    • Watch streaming video of contestant performances on MSN Video to relive the highlights and the lowlights before casting a vote.

    • Purchase the contestant performances on MSN Music. Not only can fans download and own their favorite musical moments from the show, downloads of the original artists’ versions of contestant performance songs also will be available for purchase on MSN Music.

    • Read contestant blogs on MSN Spaces that tell fans about everything from their backgrounds to what it's really like onstage, offstage and backstage.

    • Chat with other "Rock Star: INXS" fans through MSN Messenger and download special "Rock Star: INXS" emoticons, dynamic display pictures, backgrounds and winks to spice up their instant messaging (IM) conversations.

    • Play rock-and-roll trivia games created by Cranium Inc. for MSN Encarta®.

    • Sign up for MSN Alerts and a weekly newsletter that give fans the scoop on everything related to "Rock Star: INXS."

    • View weekly "Rock Star" photo galleries and rock-and-roll fashion features.

    • Buy merchandise featured on "Rock Star: INXS" through MSN Shopping.

    • Get weekly fashion tips from the "Rock Star: INXS" official show stylist.

    I've been watching a lot more reality TV than I care to admit, so it is fun to see that lots of us at MSN are also into this guilty pleasure. Now if only we'd come up with an MSN spin on Being Bobby Brown, my reality TV fix would be complete.


     

    Categories: MSN

    Since Sam Ruby asked, I feel I must oblige. There have been a bunch of posts on Sam's blog pointing out that the RSS parser used by Apple's iTunes accepts invalid RSS feeds, which in turn encourages content producers to publish invalid RSS feeds that only work in iTunes.

    In the post entitled Insensitive iTunes, Sam wrote

    Mark Pilgrim: it appears that iTunes uses a real, draconian, namespace-aware XML parser... except that namespaces are case-insensitive.

    What’s worse, is that the high profile Disney The Gears Behind the Ears feed appears to be depending on this functionality, as well as on other non-standard element definitions.

    There are a couple more issues with the iTunes parser mentioned by Mark Pilgrim in the comments to that post. The reason this is actually an issue at all is spelled out by Mark in another response to Sam's post, where he wrote

    Am I the only one who doesn’t think this is such a big deal?

    Apple is an 800-lb. gorilla in this space (at least until Microsoft releases an RSS-enabled IE in Longhorn).  iTunes is to podcasting as Internet Explorer is to HTML.  RSS interoperability, at least as far as podcasting goes, now means “works with iTunes.”  Thousands of people and companies will begin making podcasts that “work with iTunes,” but unintentionally rely on iTunes quirks (e.g. Disney’s incorrect namespace).  This in turn will affect every developer who wants to consume RSS feeds, and who will be required to emulate all the quirks of iTunes to remain competitive.

    Apple has effectively redefined the entire structure of an RSS feed, added multiple core RSS elements, made all RSS elements case-insensitive, made XML namespaces case-insensitive, created a new date format, made several previously required attributes optional, and created a morass of undocumented and poorly-documented extensions... to what was already a pretty messy format to begin with.

    Case in point: my Universal Feed Parser, which already has 2751 test cases and is so incredibly liberal that it can parse an ill-formed EBCDIC-encoded RDF feed with regular expressions, will require hundreds of new test cases to cover all the schlock that iTunes accepts.  And I’m one of the lucky ones.

    The supreme irony of all this is that I remember Dave Hyatt (Apple Safari developer) bitching and moaning about all the work he had to do to make Safari emulate the buggy, undocumented behavior of Internet Explorer, and how the world would be so much better if only everything used XML and everyone implemented draconian error handling.  Never mind the fact that the vast majority of problems that iTunes creates have nothing to do with XML well-formedness; iTunes doesn’t even require well-formed XML in the first place.  Utopia, it seems, will have to wait another decade.

    Just like the browser wars, I suspect this is going to get a lot worse before it gets any better. Hopefully the folks working on RSS at Apple [and at Microsoft] are paying attention to this discussion and will do the right thing.

    The main problem is that every RSS reader is "liberal" to some degree, which means aggregator developers end up being asked to be bug-compatible with other popular RSS readers. I get complaints all the time that RSS Bandit is more strict than RSS readers like SharpReader, but I often resist making changes to copy every quirk in other RSS readers. Once an RSS reader rises to dominance, the definition of a valid RSS feed won't be what is in the spec but whatever that reader supports. This is what often happens in the software industry, from web browsers to C compilers. It's great to see Sam fighting to prevent this in the RSS space, and his Feed Validator has gone a long way toward doing so. I can only hope that the iTunes folks realize that it is best for everyone if they favor spec compliance over being liberal in what they receive.
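
    To make the namespace point concrete, here is a quick experiment you can run in any modern browser console. XML namespace URIs are compared character by character, so to a conformant namespace-aware parser a differently-cased URI is simply a different namespace; the feed snippets and URIs below are illustrative and not taken from the Disney feed.

    // Same markup, but the second feed gets the casing of the namespace URI wrong
    var correct =
      '<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">' +
      "<channel><itunes:author>Jane Doe</itunes:author></channel></rss>";
    var wrongCase =
      '<rss version="2.0" xmlns:itunes="http://www.itunes.com/DTDs/Podcast-1.0.dtd">' +
      "<channel><itunes:author>Jane Doe</itunes:author></channel></rss>";

    var parser = new DOMParser();
    var ns = "http://www.itunes.com/dtds/podcast-1.0.dtd";

    var doc1 = parser.parseFromString(correct, "application/xml");
    var doc2 = parser.parseFromString(wrongCase, "application/xml");

    console.log(doc1.getElementsByTagNameNS(ns, "author").length); // 1 -- found in the expected namespace
    console.log(doc2.getElementsByTagNameNS(ns, "author").length); // 0 -- a different namespace to a strict parser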


     

    Yesterday, while browsing comments on Slashdot, I found a link to an article at LinuxToday on Tim O'Reilly's Open Letter: Rethinking the One-Click Patent, which contains the following excerpt from a posting by Tim O'Reilly on the Amazon 1-Click patent controversy

    People in many areas of commerce, not just on the Web but also TV and radio (as evidenced by some of our prior art submissions), have put a lot of thought into making the shopping experience quicker and easier. And yet none of these folks really managed to simplify it to the same degree that Amazon did with 1-Click. In the end, we did not have a winner, and it doesn't look as if the prior art submitted can "knock out" the 1-Click patent...So I want to offer Jeff something of an apology. At the same time, at the risk of appearing a "sore loser," I want to reiterate that my fundamental issue with Amazon was never the specific claims of the 1-Click patent. Even if Amazon did create a genuine e-commerce innovation, I maintain that it was still a mistake for them to patent it.

    I remember the hubbub on Slashdot about Amazon's 1-Click patent and Tim O'Reilly's bounty for prior art but don't remember this ever getting posted. So it seems that despite all the claims of "obviousness" from the Slashdot crowd, no prior art could be found. I guess it is true that all innovations look obvious in hindsight.

    Another interesting data point is this post on Slashdot about the various patent lawsuits Amazon is currently fighting. Lots of people like to polarize the debate about software patents but in truth the situation isn't as cut and dried as folks on either side of the debate like to make it seem.


     

    It's been hard to escape coverage of the Live 8 concerts since that's all MTV showed over the weekend and the news channels have been covering it in the mornings while I work out. Events like Live 8 always leave me feeling ambivalent. On the one hand it is great to see people trying to help with the problems facing Africa, and on the other it perpetuates the notion that Africa is the world's charity case. After some consideration, I definitely think my feelings about the concerts are mostly positive.

    I've seen some blog posts complaining that not enough African artists were included in the concerts and others criticizing them by asking what good a rock concert will do in influencing the members of the G8.

    My thoughts are similar to those David Weinberger expressed in his post Live 8: Cause or fashion statement? where he wrote

    For me it comes down to this: I can't imagine that people going to a big rock concert will change the mind of any G8 leader, but if Live 8 makes debt relief trendy, I'm all for it. After all, trendiness seemed to have an effect on ending Apartheid in the 80s.

    In a similar vein, I echo the sentiments from the post in Brian's Black Star Journal entitled Development issues and celebrities, where he wrote

    I remember back when Princess Diana got involved in the landmine question. I wondered how those ordinary activists felt. They worked on the issue for years to little effect but then this fancy royal flies in and suddenly it's the cause célèbre du jour.

    But on the other hand, at the end of the day, the Ottawa treaty banning landmines was signed. Most countries (not including the US) do not use landmines anymore. Is it really important who gets credit? As an activist, is it about you or the cause? Do you think any anti-landmine activist would say, "I think we should revoke the Ottawa treaty because it wouldn't have passed without star power"? I hope not. If so, they are not real activists.

    Despite these sentiments, I agree with the economists and aid groups cited in various news stories about Live 8 that at the end of the day what African nations need more than aid and debt cancellation is better governance and fuller participation in international trade. Better governance simply cannot be overemphasized. In certain nations, African governments have really, really screwed things up. For many nations, without regime change, giving more aid is just sending good money after bad.

    Unfortunately, there are no easy answers.


     

    Categories: Ramblings

    If you like World of Warcraft, cute Asian girls and cheesy commercials, you will love this World of Warcraft Coke commercial from China.

    Via tokyo-genki.com.  


     

    Over the weekend, while watching a video by The Game, I started thinking of various beefs across Hip Hop history and came up with a mental list of my favorite diss tracks. Below is my top 5 list in no particular order.

    1. Wit Dre Day by Snoop Doggy Dogg & Dr. Dre - dissing Tim Dog, Luke & Eazy E
    2. Mama Said Knock You Out by LL Cool J - dissing Kool Moe Dee
    3. Backdown by 50 Cent - dissing Ja Rule (honorable mention to the entire Invasion part II mix tape)
    4. Ether by Nas - dissing Jay-Z
    5. Stomp by Young Buck, Ludacris & T.I. - T.I. and Ludacris dissing each other

    The last entry isn't as good as the others but the fact that both rappers diss each other on the same track is what propelled it onto the list. What does your list look like?


     

    Categories: Music

    Last week was the O'Reilly Where 2.0 Conference, where a number of players in the online mapping space, including Yahoo!, Google and MSN, announced API plans for their various services.

    The Yahoo! Maps Web Services provides a way to display a map on the Yahoo! website populated with locations specified by the caller. To specify the locations on the map, one uses an RSS feed where each item in the feed corresponds to a location on the map, with its geographical address specified using a combination of geoRSS extensions and proprietary Yahoo! extensions. Rather than allowing one to POST the RSS feed to the service, the Yahoo! Maps API requires that a URL to the RSS feed be provided. This prevents the API from being easily used by desktop applications or by users who don't have access to a web server where they can place XML files online. Clicking on the following URL should show the API in action: http://api.maps.yahoo.com/Maps/V1/AnnotatedMaps?appid=YahooDemo&xmlsrc=http://developer.yahoo.net/maps/sample.xml
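
    In other words, the developer's only jobs are to publish the feed somewhere public and to build a link to the AnnotatedMaps service that points at it. Here is a trivial sketch that reuses the appid and sample feed from the demo URL above:

    var base = "http://api.maps.yahoo.com/Maps/V1/AnnotatedMaps";
    var appid = "YahooDemo";                                    // your application id
    var feedUrl = "http://developer.yahoo.net/maps/sample.xml"; // RSS feed listing the locations

    // The feed must already be reachable on a public web server, since the
    // service fetches it by URL rather than accepting a POSTed document.
    var mapUrl = base + "?appid=" + encodeURIComponent(appid) +
                 "&xmlsrc=" + encodeURIComponent(feedUrl);

    // Navigating to mapUrl displays a Yahoo! Maps page with a pin for each item in the feed
    window.location.href = mapUrl;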

    The Google Maps API allows one to embed Google Maps on specific web pages. To include a map on a web page, one adds to it a Javascript file which exposes a complete object model for Google Maps. The Javascript include is of the form

    <script src="http://maps.google.com/maps?file=api&v=1&key=abcdefg" type="text/javascript"></script>

    file=api indicates that the file being returned is the Google API file, v=1 indicates that version 1 of the API is being requested and key=abcdefg is used to specify the developer key being used to access the service.

    Once the script is included, developers can write code such as

    // Create a map inside the HTML element with id "map"
    var map = new GMap(document.getElementById("map"));
    // Add the small pan/zoom control and the map type (Map/Satellite) toggle
    map.addControl(new GSmallMapControl());
    map.addControl(new GMapTypeControl());
    // Center the map at the given longitude/latitude and zoom level
    map.centerAndZoom(new GPoint(-122.141944, 37.441944), 4);

    which creates an effect on the Web page similar to the example at controls.html.

    It's interesting to see how radically different the approaches taken by Yahoo! and Google are in providing what is basically the same functionality. The Yahoo! approach seems to me to be more declarative and straightforward than Google's approach. However, Google's approach is definitely a lot more flexible.

    As for MSN, we announced that Virtual Earth will provide an API, free for non-commercial use, that utilizes both URLs and a JScript Map control. Some of the highlights of the conference presentation are in Chandu Thota's post Where 2.0 and Virtual Earth.


     

    Categories: MSN | XML Web Services