Danah Boyd has an excellent post on the differences between adults and adolescents when it comes to blogging and other forms of online expression. In her post designing for life stages, she writes

Identity formation

When youth are coming into a sense of self, they move away from the home and look to the social world to build a socio-culturally situated identity. In other words, they engage in the public in order to make sense of social boundaries/norms and to develop a sense of self in relation to the broader social context. Youth go to the public to see and be seen and they negotiate a presentation of self depending on the reactions of peers and adults. Public performance is about getting those reactions in order to make sense of the world.

A main role of things like MySpace and Facebook is to produce a public sphere in order for youth to negotiate their peers and learn about the social world. People often ask me why teens don't just go out in a physical public. Simply put, they can't. We live in a culture of fear where most parents won't allow their children to go anywhere without supervision. Youth no longer have access to the streets or even neighborhood gathering spots. They are always in controlled locations where the norms are strictly dictated by adults - this is not a public sphere in which teens can make sense of sociability. Thus, they create their own. (Note: the production of a public and its implications is the cornerstone of my dissertation.)...

Contributive Participant in Society

And then we become adults. The bulk of adult-hood is evaluated based on contribution to society, participation, what you can create and do. It's about being a good citizen, laborer, parent. It's about the act of doing things. Your identity gets wrapped up in how you contribute to society ("So, what do you do?"). We ask youth about their hobbies and friends; we ask adults about their jobs and children. When we speak, we think that we have to produce information, be relevant, be efficient, be contributive. (And people wonder why growing up sucks.)

Nowhere is this shift more apparent than blogging land. While youth are doing identity production in terms of sociability, adults are creating new tasks for themselves - documenting, informing, conversing. It's all wrapped up in being part of the conversation, not in simply figuring out who you are.

This is one of the reasons why I laugh whenever I see words like blogosphere. The world's largest blogging site is probably MySpace. I suspect that MSN Spaces is the second largest, although I'd have to ping folks from work to confirm. Both sites have significant populations of young adults (aka teenagers or adolescents). However, whenever someone says blogosphere, they usually mean some specific subset of blogs, such as technology- or politics-focused blogs. Although A-list bloggers like Doc Searls, Dave Winer and Robert Scoble give the impression that blogging is about amateur punditry that competes with journalism and corporate blogging, the fact is that a large segment of the blogging population is just trying to express their identity and discover themselves online.

People building social software need to understand the needs of both classes of users. In fact, it's more complex than that because you often have to factor in cultural differences as well, since the Web is international. If you are interested in blogging and other aspects of social software, you really should subscribe to Danah's blog.


 

Categories: Social Software

It is interesting to see people rediscover old ideas. Robert Scoble has a post entitled Silicon Valley got my attention: the future of Web businesses where he writes

What is Zvents capturing? Well, they know you like football. They know you probably are in San Francisco this weekend. And, if you click on one or two of the events, they know you’re interested in them. Now, what if you see an ad for a pair of Nikon binoculars. If you click on that, then Zvents would be able to capture that as well.

Now, what other kinds of things might football fans, who are interested in binoculars, who are in San Francisco, want to do this weekend? Hmmm, Amazon sure knows how to figure that kind of problem out, right? (Ever buy a Harry Potter book on Amazon? They suggest other books for you to buy based on past customer behavior!!!)

It goes further. Let’s say this is 2007. Let’s say that Google (or Yahoo or MSN) has a calendar "branding" gadget out. Let’s say they have a video "monetization" gadget out. Zvents could build the calendar "branding" gadget into their page. What would they get out of that? Lots of great PR, and a Google (or MSN or Yahoo) logo in everyone’s face. But, they would also know where you’d be this weekend. Why? Cause you would have added the 49ers football game to your calendar. So, they would know where you are gonna be on Sunday. And, that you just bought binoculars. Over time Google/MSN/Yahoo would be able to learn even more about you and bring you even more ads. How?
...
It’s all attention. So, now, what if Zvents and Google shared their attention with everyone through an API. Now, let’s say I start a new Web business. Let’s call it "Scoble’s tickets and travel." You come to my site to book a trip to London, let’s say. Well, now, what do I know about you? I know you were in San Francisco, that you like coffee, that you just bought some binoculars, that you like football. So, now I can suggest hotels near Starbucks and I can suggest places where you’ll be able to use your binoculars (like, say, that big wheel that’s in the middle of London). Even the football angle might come in handy. Imagine I made a deal with the local soccer team. Wouldn’t it be useful to put on my page "49ers fans get $10 off European football tickets."

Four years ago, while interning at Microsoft, I saw a demo about Hailstorm which made me suspect the project's days were numbered. The demo involved a scenario very similar to what Robert describes in his post. Just substitute "online CD retailer" for Zvents and "upcoming concerts gadget" for calendar gadget, and Robert's scenario is the Hailstorm demo I saw.

The obvious problem with this "Attention API" and Hailstorm is that it requires a massive database of customer behavior. At the time, Microsoft's pitch with Hailstorm was that online retailers and other potential Hailstorm partners should give all this juicy customer data to Microsoft and then pay to access it. It doesn't take a rocket scientist to tell that most of them told Microsoft to take a long walk off a short pier.

Now let's use a more concrete example, like Amazon. The folks at Amazon know exactly what kind of movies, music and books I like. It's possible to imagine them making a deal with TicketMaster to show me upcoming concerts I may be interested in when I visit their site. The reverse is also possible: Amazon may be able to do a better job of recommending music to me based on concerts whose tickets I purchased via TicketMaster.

What's the first problem that you have to solve when trying to implement this? Identity. How do you connect my account on Amazon with my account on TicketMaster in a transparent manner? This is one of the reasons why Passport was such a big part of the Hailstorm vision. It was how Microsoft planned to solve the identity problem, which was key to making a number of the Hailstorm scenarios work. Almost half a decade later, the identity problem is still not solved.

Identity is just problem #1.

If you scratch at this problem a little, you will likely find an ontology problem as well. How do I map the concepts in Amazon's database (hip hop CDs) to related concepts in TicketMaster's database (rap concerts)? The Hailstorm solution was to skip solving this because it was all coming out of the same database. However, even simple things like mapping Rap to Hip-Hop or Puff Daddy to P. Diddy can be fraught with problems if both databases weren't created by the exact same organization. Trying to scale this across different business partners is a big problem and is pretty much a cottage industry in the enterprise world.

Thus, ontology mapping is problem #2.
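To make the scale of that problem concrete, below is a minimal sketch. The vocabularies and mapping tables are entirely made up for illustration; this isn't anything Amazon, TicketMaster or Hailstorm actually shipped.

```typescript
// Entirely made-up vocabularies: imagine one catalog files albums under
// "Hip-Hop" while its partner files concerts under "Rap". Someone has to
// curate a mapping like this for every pair of partners, and keep it current
// as artists rename themselves and genres split.
const genreMapping: Record<string, string> = {
  "Rap": "Hip-Hop",
  "Hip-Hop": "Hip-Hop",
  "Soul": "R&B",
  "R&B": "R&B",
  "Rock": "Rock",
};

const artistMapping: Record<string, string> = {
  "Puff Daddy": "P. Diddy",
  "P. Diddy": "P. Diddy",
};

// Unmapped terms simply fall through: there is no principled fallback, which
// is why this degenerates into hand-maintained tables per business partner.
function mapConcept(table: Record<string, string>, term: string): string | undefined {
  return table[term];
}

console.log(mapConcept(genreMapping, "Rap"));         // "Hip-Hop"
console.log(mapConcept(artistMapping, "Puff Daddy")); // "P. Diddy"
console.log(mapConcept(genreMapping, "Crunk"));       // undefined -- nobody curated it yet
```

Every new partner, renamed artist or split genre means another round of curation, which is exactly why this kind of mapping work keeps enterprise integration folks employed.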

There are more problems to discover as one attempts to build the Attention API and an Attention economy. At least it was a fun trip down memory lane remembering my intern days. :)


 

Categories: Technology

Scott Isaacs has written a series of posts pointing out one of the biggest limitations of applications that use Asynchronous JavaScript and XML (AJAX). The posts are XMLHttpRequest - Do you trust me?, XMLHttpRequest - Eliminating the Middleman, and XMLHttpRequest - The Ultimate Outliner (Almost). The limitation is discussed in the first post in the series, where Scott wrote

Many web applications that "mash-up" or integrate data from around the web hit the following issue: How do you request data from third party sites in a scalable and cost-effective way? Today, due to the cross-domain restrictions of xmlhttprequest, you must proxy all requests through a server in your domain.

Unfortunately, this implementation is very expensive. If you have 20 unique feeds across 10000 users, you are proxying 200,000 unique requests! Refresh even a percentage of those feeds on a schedule and the overhead and costs really start adding up. While mega-services like MSN, Google, Yahoo, etc., may choose to absorb the costs, this level of scale could ruin many smaller developers. Unfortunately, this is the only solution that works transparently (where user's don't have to install or modify settings).

This problem arises because the xmlhttprequest object can only communicate back to the originating domain. This restriction greatly limits the potential for building "mash-up" or rich aggregated experiences. While I understand the wisdom behind this restriction, I have begun questioning its value and am sharing some of my thoughts on this

I encountered this limitation in the first AJAX application I wrote, the MSN Spaces Photo Album Browser, which is why it requires you to add my domain to your trusted websites list in Internet Explorer to work. I agree with Scott that this is a significant limitation that hinders the potential of various mashups on the Web today. I'd also like to see a solution to this problem proposed. 
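To make the proxying workaround Scott describes concrete, here is a rough sketch of the pattern. The /proxy endpoint and the Node-style server are my own illustrative choices, not anything Scott or MSN ships: the page only ever talks to its own origin, and the server makes the third-party request on its behalf, which is exactly where the costs pile up.

```typescript
// --- Browser side ---------------------------------------------------------
// The cross-domain request is disallowed, so the page asks a proxy endpoint
// on its own origin instead and passes the real target as a query parameter.
function fetchFeedViaProxy(feedUrl: string, onDone: (xml: string) => void): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/proxy?url=" + encodeURIComponent(feedUrl));
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4 && xhr.status === 200) onDone(xhr.responseText);
  };
  xhr.send();
}

// --- Server side (Node 18+ style sketch) ----------------------------------
// Every user's request for a third-party feed becomes a request *your* server
// must make and pay for, which is Scott's scaling complaint in a nutshell.
import { createServer } from "node:http";

createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname !== "/proxy") {
    res.writeHead(404).end();
    return;
  }
  const target = url.searchParams.get("url");
  if (!target) {
    res.writeHead(400).end("missing url parameter");
    return;
  }
  const upstream = await fetch(target);                 // the cost lands here
  res.writeHead(upstream.status, { "Content-Type": "text/xml" });
  res.end(await upstream.text());
}).listen(8080);
```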

In his post, Scott counters a number of the reasons usually given for why this limitation exists, such as phishing attacks, cross site scripting and leakage of private data. However, Derek Denny-Brown describes the big reason why this limitation exists in Internet Explorer in his post XMLHttpRequest Security, where he wrote

I used to own Microsoft's XMLHttpRequest implementation, and I have been involved in multiple security reviews of that code. What he is asking for is possible, but would require changes to the was credentials (username/password) are stored in Windows' core Url resolution library: URLMON.DLL. Here is a copy of my comments that I posted on his blog entry:

The reason for blocking cross-site loading is primarily because of cached credentials. Today, username/password information is cached, to avoid forcing you to reenter it for every http reference, but that also means that script on yahoo.com would have full access to _everything_ in your gmail/hotmail/bank account, without a pop-up or any other indication that the yahoo page was doing so. You could fix this by associating saved credentials with a src url (plus some trickery when the src was from the same sight) but that would require changes to the guts of windows url support libraries (urlmon.dll)

Comparing XML to CSS or images is unfair. While you can link to an image on another sight, script can't really interact with that image; or example posting that image back to the script's host sight. CSS is a bit more complicated, since the DOM does give you an API for interacting with the CSS, but I have never heard of anyone storing anything private to the user in a CSS resource. At worst, you might be able to figure out the user's favorite color.

Ultimately, it gets back to the problem that there needs to be a way for the user to explicitly enable the script to access those resources. If done properly, it would actually be safer for the user than the state today, where the user has to give out their username and password to sights other than the actual host associated with that login.
I'd love to see Microsoft step up and provide a solution that addresses the security issues. I know I've run against this implementation many times.

That makes sense; the real problem is that a script on my page could request http://www.example.com/yourbankaccount and access your account info because of your cookies & cached credentials. That is a big problem and one that the browser vendors should work towards fixing instead of allowing the status quo to remain.

In fact, a proposal already exists for what this solution would look like from an HTTP protocol and API perspective. Chris Holland has a post entitled ContextAgnosticXmlHttpRequest - An informal RFC where he discusses some of the pros and cons of allowing cross-site access with IXmlHttpRequest while having the option not to send cookies and/or cached credentials.
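For illustration only, here is a sketch of the idea behind such a proposal. This is not Chris Holland's actual API nor anything implemented in a browser; the class below merely simulates the behavior being argued for, namely that a cross-domain request is allowed as long as the browser refuses to attach cookies or cached credentials to it.

```typescript
// Purely hypothetical sketch -- no browser ships this API. It simulates an
// "anonymous" cross-domain request using fetch with credentials omitted.
// (Today's browsers would still require the server's cooperation to allow the
// cross-origin read; the point here is only the credential-free request shape.)
class ContextAgnosticRequest {
  onload: (body: string) => void = () => {};

  open(method: "GET" | "POST", url: string): void {
    this.method = method;
    this.url = url;
  }

  send(): void {
    // Refuse to attach cookies or cached credentials, so the third-party site
    // only ever sees an anonymous request from this user's browser.
    fetch(this.url, { method: this.method, credentials: "omit" })
      .then((r) => r.text())
      .then((body) => this.onload(body));
  }

  private method: "GET" | "POST" = "GET";
  private url = "";
}

// Usage: a page on one domain reads a public feed from another domain without
// leaking the user's session on that other domain.
const req = new ContextAgnosticRequest();
req.onload = (xml) => console.log("got", xml.length, "bytes of feed data");
req.open("GET", "http://feeds.example.org/public.xml");
req.send();
```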


 

Categories: Web Development

October 29, 2005
@ 04:33 PM

Yesterday I was at a Halloween event at a local grade school and I was peeved by at least three things I saw

  1. The lunch menu had "pizza", "cheese sticks and sauce" and "mini cheeseburgers" on it. Feeding growing kids junk food for lunch on a regular basis just seems like starting them off on the wrong foot nutritionally.

  2. The whiteboard in the gym had a list of reasons to exercise, and on it cardiorespiratory was incorrectly spelled as cardiorespitory several times.

  3. There were a few bean bag toss games set up. Each kid got the same prize regardless of how well or badly they did. The kids who got the bean bag through the difficult holes all 3 times got the same amount of candy as the kid who missed all three. That seems to send the wrong message about competition.

I could actually see myself complaining about one or more of the above if I were a parent with kids at the school. It looks like I'm going to be one of those parents when my time comes.


 

Categories: Ramblings

October 27, 2005
@ 05:53 PM

I read Anil Dash's post The Interesting Economy a few days ago and didn't think much about it. Below is a key excerpt from his post

Today, Flickr has interestingness, which is a measure of some combination of how many times a picture has been viewed, how many comments it has, how many times it's been tagged or marked as a favorite, and some other special sauce. I suppose revealing the exact mix would encourage even more people to game the system, but the fact that it's not disclosed has led to a number of attempts to reverse-engineer the system. I doubt any of them are/will be successful (Flickr can update/evolve fast enough to change the algorithm if they figure it out) but that's probably going to be an ongoing dialogue.
...
What I'm wondering is, how is Flickr's interestingness different than the economy in Game Neverending? Than Second Life? (Or in Evercrack or Neverwinter or any of the other gaming platforms.) Is interestingness its own reward? Why don't I get to level up or power up when I create something interesting?

More to the point, the in-game economies of these games translate pretty cleanly into real-world cash, with eBay amplifying the efficiency of the currency conversion. And interestingness in other online media (like blogs) is rewarded by cash in a pretty straightforward way; I can sign up for TypePad, check a box to enable text ads, and pay for my account or point the proceeds to my PayPal account when I start getting lots of visitors.

But interestingness in Flickr doesn't pay. At least not yet. Non-pro users are seeing ads around my photos, but Yahoo's not sharing the wealth with me, even though I've created a draw. Flickr's plenty open, they're doing the right thing by any measure of the web as we saw it a year ago, or two years ago. Today, though, openness around value exchange is as important as openness around data exchange.

Since I read this, there seems to have been a bunch of blog buzz about Anil's post. I found this out via Robert Scoble's post Anil Wants Flickr to Pay. Robert seems to think that the current trend towards "user generated content" is really about companies exploiting end users for money. I guess I'm biased because I work on services such as MSN Groups and MSN Spaces, but I disagree with Robert and Anil.

Using free services on websites like Flickr is a commercial exchange of goods and services. Flickr gives you a place to host your photos so you can share them with friends, and in return they get paid for their services by placing ads around your photos. If you disagree with the terms of the service, you can choose another service such as Kodak's EasyShare Gallery (formerly Ofoto).

As with all things there will be some photo albums that will be more popular than others. These photo albums will likely bring in more ad clickthroughs and thus more money than the average photo album. Is this unfair? I don't think so. Is it unfair that my use of Google or MSN's search engines is subsidized by people who click on ads and I don't? Should people who click on more ads than the average user of a search engine be paid for doing so? 

Getting back to Flickr, since using the service is a commercial exchange entered into willingly by both parties, I don't see why one could claim it is unfair. I can see the argument that Flickr should figure out how to reward its customers who bring in substantially more ad revenue than the average user, but that would just be good business sense, not something they are obligated to do.


 

Categories: Social Software

From the press release MSN Search Announces MSN Book Search we learn

SAN FRANCISCO — Oct. 25, 2005 — MSN Search today announced its intention to launch MSN® Book Search, which will support MSN Search’s efforts to help people find exactly what they’re looking for on the Web, including the content from books, academic materials, periodicals and other print resources. MSN Search intends to launch an initial beta of this offering next year. MSN also intends to join the Open Content Alliance (OCA) and work with the organization to scan and digitize publicly available print materials, as well as work with copyright owners to legally scan protected materials.

"With MSN Book Search, we are excited to be working with libraries worldwide to digitize and index information from the world’s printed materials, taking another step in our efforts to better answer people’s questions with trusted content from the best sources," said Christopher Payne, corporate vice president of MSN Search at Microsoft Corp. "We believe people will benefit from the ability to not just view a page, but to easily act on that data in contextually relevant ways, both online in the search experience and in the applications they are using."

MSN will first make available books that are in the public domain and is working with the Internet Archive to digitize the material. MSN will then work to extend its offering to other types of offline content. The digitized content will primarily be print material that has not been copyrighted, and Microsoft will clearly respect all copyrights and work with each partner providing the information to work out mutually agreeable protections for copyrights.

If you're keeping track, that means all three major search engines (Yahoo!, Google and MSN) have announced book search engines. So far only Google is facing lawsuits from publishers, because it plans to digitize copyrighted works unless the copyright holders explicitly opt out. Expecting publishers and authors to go to each search engine vendor that plans to offer a book search service and explicitly tell them not to redistribute their works seems to place an unnecessary burden on copyright holders and runs counter to the spirit of copyright.

The lawsuits around Google Print may turn out to be an interesting turning point in how copyright is viewed in the digital era.


 

Categories: MSN

From the post Google Base Was Sort of Live on Google Blogoscoped we learn

Several people report Google Base (as predicted yesterday) went live, or at least, its login-screen. I can’t reach it at the moment as it seems Google took it down again, but Dirson posted a screenshot to Flickr. So what is it? Quoting from Dirson’s screenshot of the login screen:

Post your items on Google.

Google Base is Google’s database into which you can add all types of content. We’ll host your content and make it searchable online for free.

Examples of items you can find in Google Base:

• Description of your party planning service
• Articles on current events from your website
• Listing of your used car for sale
• Database of protein structures

You can describe any item you post with attributes, which will help people find it when they search Google Base. In fact, based on the relevance of your items, they may also be included in the main Google search index and other Google products like Froogle and Google Local.

This reminds me of Amazon Simple Queue Service except they started with a web page instead of an API. I can't imagine that Google Base will be terribly useful without an API so I expect that one will show up with the release or shortly thereafter. I'm still unclear as to why this is an interesting idea although I'm sure some clever geek will find some way to build something cool with it. I also wonder if this will spur Amazon into doing more interesting things with their service as well.

Who would have thought that online lists would be a hot area? I guess it's time for some BigCo to snap up Ta-da Lists for a couple million dollars. ;)


 

Categories: Web Development

Since the release of the installer for the alpha version of the Nightcrawler edition of RSS Bandit we have fixed a number of bugs. We worked on a number of performance issues related to the responsiveness of the application. We also fixed several issues with our newsgroup support, including problems with password protected newsgroups. We also think we tracked down the issue that led to some items occasionally showing up in the wrong feeds.

You can download the next iteration of the Nightcrawler alpha at RssBandit.1.3.0.36.Nightcrawler.Alpha.zip.

We aren't ready to release a beta version of the installer because we aren't feature complete yet. There are two features that still need to be completely implemented: downloading of enclosures/podcasts and notifications of new comments on "watched" posts.

Also some features need fleshing out. Torsten pointed out this morning that I need to add support for RFC 2047 so that we can handle non-ASCII author names and post titles as part of the newsgroup support. I had hoped to find a free library with code that already does that but it seems that the only ones I can find are for sale. I guess writing that code must suck so much that no one wants to give it away for free. There goes a weekend or two. :)
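For anyone curious what that entails, here is a rough sketch of RFC 2047 "encoded-word" decoding. It's in TypeScript rather than the C# RSS Bandit is written in, and it skips corner cases (adjacent encoded words, unknown charsets, malformed input) that a real implementation has to handle.

```typescript
// A minimal sketch of decoding RFC 2047 encoded-words -- the =?charset?enc?text?=
// tokens that carry non-ASCII author names and subjects in NNTP/mail headers.
function decodeEncodedWord(header: string): string {
  return header.replace(
    /=\?([^?]+)\?([bBqQ])\?([^?]*)\?=/g,
    (_match, charset: string, encoding: string, text: string) => {
      let bytes: Uint8Array;
      if (encoding.toUpperCase() === "B") {
        // Base64 ("B") encoding: turn atob()'s binary string into raw bytes.
        const bin = atob(text);
        bytes = Uint8Array.from(bin, (c) => c.charCodeAt(0));
      } else {
        // Q-encoding: "_" means space, "=XX" is a hex-escaped byte.
        const out: number[] = [];
        for (let i = 0; i < text.length; i++) {
          if (text[i] === "_") {
            out.push(0x20);
          } else if (text[i] === "=" && i + 2 < text.length) {
            out.push(parseInt(text.substring(i + 1, i + 3), 16));
            i += 2;
          } else {
            out.push(text.charCodeAt(i));
          }
        }
        bytes = Uint8Array.from(out);
      }
      return new TextDecoder(charset).decode(bytes);
    }
  );
}

console.log(decodeEncodedWord("=?UTF-8?B?VG9yc3Rlbg==?="));   // "Torsten"
console.log(decodeEncodedWord("=?iso-8859-1?Q?Andr=E9?="));   // "André"
```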

NEW FEATURES IN NIGHTCRAWLER

  • NNTP Newsgroups support: Users can specify a public NNTP server such as news.microsoft.com and subscribe to newsgroups on that server. Permalinks in a newsgroup post point to the post on Google Groups.

  • Item Manipulation from Newspaper Views:  Items can be marked as read or unread and flagged or unflagged directly from the newspaper view. This improves end user work flow as one no longer has to leave the newspaper view and right-click on the list view to either flag or mark a post as unread. 

  • Subscription Wizard: The process for subscribing to newsgroups, search results and web feeds has been greatly simplified. For example, users no longer need to know the web feed of a particular web site to subscribe to it but can instead specify the web page URL and discovery of its web feed is done automatically. 

  • Synchronization with Newsgator Online: Users can synchronize the state of their subscribed feeds (read/unread posts, new/deleted feeds, etc) between RSS Bandit and their account on Newsgator Online. This allows the best of both worlds where one can use both a rich desktop client (RSS Bandit) and a web-based RSS reader (Newsgator Online) without having to worry about marking things as read in both places.

  • Using back and forward arrows to view last post seen in reading pane: When navigating various feeds in the RSS Bandit tree view it is very useful to be able to go back to the last feed or item viewed especially when using the [spacebar] button to read unread items. RSS Bandit now provides a way to navigate back and forth between items read in the reading pane using the back and forward buttons in the toolbar. NEW!!!

  • Atom 1.0 support: The Atom 1.0 syndication format is now supported. 

  • Threaded Posts Now Optional: The feature where items that link to each other are shown as connected items reminiscent of threaded discussions can now be disabled and is off by default. This feature is very processor intensive and can slow down the average computer to the point that it is unusable if one is subscribed to a large number of feeds.

  • UI Improvements: Icons in the tree view have been improved to make them more distinctive and also visually separate newsgroups from web feeds.

 

I've been wondering if we shouldn't just lock down this release and push out the podcasting features to the Jubilee release. What do y'all think?


 

Categories: RSS Bandit

October 24, 2005
@ 04:06 PM

I recently read two posts on the official Google blog about the recent hubbub around their efforts to digitize books and make them searchable over the Web. The posts are Why we believe in Google Print and The point of Google Print.

My immediate personal reaction was how different Google is from Microsoft when it comes to blogging. On the one hand Google is quick to fire people who don't toe the party line in their blogs while Microsoft encourages its employees to show their individual voices even if they sometimes disagree with the company's party line. On the other hand, Microsoft frowns on employees commenting on pending legal actions such as lawsuits while Google has its employees blogging their side of the story in an official capacity. The common thread here is "controlling the message". Google is all about that.

The other thing that struck me about Google's messaging around Google Print was pointed out by Dave Winer in his post A turning point for the Web?

It's time to realize that Google is no longer the little company we used to love. They're now a huge company that pushes individuals around like a lot of other huge companies. They need some balance to their power. And it's ridiculous to blindly take their side on every issue. Sometimes they're wrong, and I believe this is one of those times. It's certainly worth considering the possibility that they're wrong.

Here's where the point about controlling the message shows up. By any measure, Google is a multi-billion dollar, multinational corporation. However, whenever its executives speak, they do an excellent job of portraying the company as if it were the altruistic side project of a bunch of geeky college kids. I don't just mean their corporate slogan of "Don't Be Evil", although it is one manifestation of this strategy. A better example is Sergey Brin's comments at the recent Web 2.0 conference, where he stated that their motive for creating the Google AdSense program was to help content-based websites stay in business. Of course, syndicating ads now brings in about three quarters of a billion dollars in revenue for them each quarter.

So what does this have to do with Google Print? Well, I personally don't buy computer books anymore thanks to the Web and search engines. The last book I bought was Beginning RSS and Atom Programming and that's only because I wrote one of the forewords. The only time I've opened a computer book in the past year was recently when I cracked open the reference section of Dynamic HTML when looking for some JavaScript minutiae. If I had a good Web-based search engine for content within the book I wouldn't have needed the book. Also, I've been wanting a cheap or free Integrated Development Environment (IDE) for JavaScript for quite some time. If I'd found an ad for a JavaScript IDE while searching for content within the book in my 'hypothetical book search engine' I definitely would have clicked on it and maybe purchased the IDE. My 'hypothetical book search engine' would wean me completely off of needing to buy computer books while probably making a tidy sum for itself by selling my eyeballs to software companies trying to sell me IDEs, profilers, debuggers and software training.

My point is that Google Print will likely make the company a lot of money and could cost certain publishers a lot of money in lost sales. Even if it doesn't, the publishing industry will likely cede some control to Google. That's what these lawsuits are about, and from that perspective I can understand why various publishers have initiated lawsuits against Google. To frame this as 'the evil publishing industry is trying to prevent us from completing our corporate mission of making information more accessible to users' is disingenuous at best and downright manipulative at worst.

Markets are conversations; to succeed in the marketplace you have to dominate the conversation and control it to suit your needs. Google is definitely good at that.


 

Categories: Current Affairs

October 24, 2005
@ 03:07 AM

The official Xbox website recently announced the My Xbox service, which offers seamless integration between Xbox 360, Xbox Live®, and Xbox.com. The announcement contains the following excerpt

Clinton: Once you've created your gamer card, we give you all kinds of ways to show it off. It'll appear next to all of your posts in the forums, plus you can pop it into your personal Web page or blog, and even display it on your active Windows® desktop.

TriXie: Very cool! I heard a rumor that we'll be able to get some cool Xbox® functionality with MSN Spaces.

Clinton: That's true. When the new version of MSN Spaces launches—around the time Xbox 360 hits store shelves—you'll be able to use Xbox 360 themes on your Space, plus drop in new Xbox modules like your gamer card and your games list.

TriXie: Wow, you guys have thought of everything. I can't wait to get my hands on all these great My Xbox features! Thanks Clinton.

We definitely have some holiday gifts for our millions of users in the next version of MSN Spaces. It is very exciting to think that in the near future millions of people will be using the features I worked on to better express themselves and share content with their friends and family online. As Mike mentioned on his blog, "You can blog on MSN Spaces, but MSN Spaces is not just a blogging service."


 

Categories: MSN

October 21, 2005
@ 03:35 PM

As I mentioned in my previous post on Understanding Web 2.0, the "web 2.0" meme isn't about technology or people; it's about money and hype primarily geared at VCs and big companies looking for small companies to buy so they look hip. The recently launched Flock web browser is one example of a "Web 2.0" product whose creators look like they just played a game of buzzword bingo when deciding what to do with their millions in VC funding. It is built on Firefox (bing), integrates with del.icio.us (bing) and Flickr (bing), plus it comes with blog posting (bing!) and RSS reading features (bingo!).

I have to agree with Joel Spolsky's claim that the Architecture Astronauts Are Back when he wrote

I'm starting to see a new round of pure architecture astronautics: meaningless stringing-together of new economy buzzwords in an attempt to sound erudite.

When I wrote my original complaint about architecture astronauts more than four years ago, it was P2P this and messaging that.

"That's one sure tip-off to the fact that you're being assaulted by an Architecture Astronaut: the incredible amount of bombast; the heroic, utopian grandiloquence; the boastfulness; the complete lack of reality. And people buy it! The business press goes wild!"

Now it's tagging and folksonomies and syndication, and we're all supposed to fall in line with the theory that cool new stuff like Google Maps, Wikipedia, and Del.icio.us are somehow bigger than the sum of their parts. The Long Tail! Attention Economy! Creative Commons! Peer production! Web 2.0!

The term Web 2.0 particularly bugs me. It's not a real concept. It has no meaning. It's a big, vague, nebulous cloud of pure architectural nothingness. When people use the term Web 2.0, I always feel a little bit stupider for the rest of the day.
...
Not only that, the very 2.0 in Web 2.0 seems carefully crafted as a way to denegrate the clueless "Web 1.0" idiots, poor children, in the same way the first round of teenagers starting dotcoms in 1999 dissed their elders with the decade's mantra, "They just don't get it!"

I'll do my part. I hereby pledge never again to use the term "Web 2.0" on this blog, or to link to any article that mentions it. You're welcome.

I feel the same way. I am interested in discussions on the Web as a platform and even folksonomies (not tagging), but the marketplace of ideas has been polluted by all this "Web 2.0" garbage. Once again, I've flipped the bozo bit on Web 2.0. Like Joel, you won't see any use of the term on my blog or in items I link to from now on.


 

Categories: Web Development

It seems some folks at TheServerSide.com have started bashing AJAX because they see it as a threat to Java. This has led to fairly ridiculous posts such as this one entitled But most of all samy is my hero which states

The story is, a myspace user named samy wanted to be popular. He wanted to make his page do things that others couldn’t and in the process devised a cross system scripting (XSS) attack that managed to add his profile to more then a million other users of the system. To do this he used a combination of AJAX and JavaScript.

It is not the intention to make samy even more famous but he has exposed a serious weakness in the AJAX security model. All samy did was figure out how to upload some JavaScript into his profile and this was despite myspace’s best efforts to limit this type of activity.

With respect to security, the web is already a hostile environment. Will a move to use AJAX and JavaScript further enlarge the security holes that already exist? Could myspace have done more to prevent this type of attack and still afford their users the flexibility to manage their pages as they do now?

Even though I haven't looked at the code of the exploit, I think it is fair to say that this issue has little to do with "the AJAX security model" as implied by the author of the post. Any system that accepts user input has to worry about how it scrubs that data, because of malicious users. Not properly scrubbing input data leads to all sorts of security problems, including buffer overflows and cross site scripting attacks.
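As a trivial illustration of what "scrubbing" means in the cross site scripting case, here is a minimal sketch (not MySpace's actual filtering, which has to be far more thorough) of escaping user-supplied content before echoing it back into a page:

```typescript
// Neutralize the characters that let user input turn into markup or script.
// Real sites need much more than this (attribute and URL contexts, allow-listed
// markup, etc.); this only shows the principle of treating input as data.
function escapeHtml(untrusted: string): string {
  return untrusted
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

const profileBio = '<script>addMeAsYourHero()</script> hi everyone!';
// Rendered as inert text instead of executing in other users' browsers.
console.log(escapeHtml(profileBio));
// -> &lt;script&gt;addMeAsYourHero()&lt;/script&gt; hi everyone!
```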

I'd suggest that some of the folks on TheServerSide need to read up on some of the FAQs on cross site scripting attacks before blaming AJAX for problems that have nothing to do with it.


 

Categories: Web Development

October 20, 2005
@ 02:55 PM

A couple of recent stories in the news remind me that there is still a long way to go for race relations in America.

From the story A Polling Free-Fall Among Blacks in the Washington Post

In what may turn out to be one of the biggest free-falls in the history of presidential polling, President Bush's job-approval rating among African Americans has dropped to 2 percent, according to a new NBC/Wall Street Journal poll.

The drop among blacks drove Bush's overall job approval ratings to an all-time low of 39 percent in this poll. By comparison, 45 percent of whites and 36 percent of Hispanics approve of the job Bush is doing.

Thanks to Jonathan Marsh for that link. This reminds me of a skit on the Dave Chappelle show where a game show host asks a black guy, "Why didn't black people trust Ronald Reagan?" and he responded "I didn't know we were supposed to trust him in the first place". Of course, it was the right answer.

From the story NBA's dress code blasted in the Miami Herald

The NBA has announced that a dress code will go into effect at the start of the season. Players will be required to wear business-casual attire when involved in team or league business. They can't wear visible chains, pendants or medallions over their clothes.

Jackson, who is black, said the NBA's new rule about jewelry targets young black males because chains are associated with hip-hop culture, and he said the league is afraid of becoming ''too hip-hop.'' In protest, he wore four chains to the Pacers' exhibition game against San Antonio on Tuesday.

Philadelphia's Allen Iverson also was critical of the new rule, which the NBA enacted Monday.

''I feel like if they want us to dress a certain way, they should pay for our clothes,'' he said. "It's just tough, man, knowing that all of a sudden you have to have a dress code out of nowhere.''

Boston Celtics star Paul Pierce agreed that the new rule targeted young, black players.

''When I saw the part about chains, hip hop and throwback jerseys, I think that's part of our culture,'' Pierce said. "The NBA is young black males.''

I guess it's OK for the NBA rosters to be dominated by blacks as long as they don't dress or act "too black". 
 

It's been over a month since we shipped the alpha version of the Nightcrawler release of RSS Bandit. Since then we've fixed a number of annoying bugs and polished a number of our features. An example of the kind of polish we've added since the alpha is shown in the screenshot below.

There are three main classes of subscriptions we now support in RSS Bandit: feeds (Atom or RSS), newsgroups (NNTP) and search results. We made search results a first class subscription type because I suspect that subscribing to search results, especially on various blog search engines, is only going to increase in popularity. The process for adding a new search engine is still too "techie-focused" for my liking. I'd love it if our users could just add the URL of their search engine of choice; we'd then check whether it supports Amazon OpenSearch and, if so, add it as one of the choices in the Search Results subscription wizard. The current process for adding a search engine whose results can be subscribed to involves users adding a URL showing the query string format of the engine (e.g. http://search.msn.com/results.aspx?q={0}&format=rss where {0} is a placeholder that shows where the query string should be inserted).

I should investigate how many search engines provide an OpenSearch description document. If enough of them do, it may be worthwhile for our users if we go ahead and support it. That way they can just add 'http://search.msn.com' to their favorite search engine list and we autodiscover the rest.
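Here is a rough sketch of what that autodiscovery could look like. It is not RSS Bandit's actual code (which is C#), and it assumes the OpenSearch description document's URL is already known or discovered separately (e.g. via a link on the engine's home page):

```typescript
// Fetch an OpenSearch description document and pull out the RSS result
// template so the user never has to know the engine's query-string format.
// Namespace handling and error cases are omitted for brevity.
async function getOpenSearchRssTemplate(descriptionUrl: string): Promise<string | null> {
  const response = await fetch(descriptionUrl);
  const doc = new DOMParser().parseFromString(await response.text(), "text/xml");
  // OpenSearch description documents list one <Url> element per result format.
  for (const url of Array.from(doc.getElementsByTagName("Url"))) {
    if (url.getAttribute("type") === "application/rss+xml") {
      return url.getAttribute("template"); // e.g. "http://.../results.aspx?q={searchTerms}&format=rss"
    }
  }
  return null;
}

function buildSearchFeedUrl(template: string, query: string): string {
  // OpenSearch templates use {searchTerms}; RSS Bandit's manual format uses {0}.
  return template
    .replace("{searchTerms}", encodeURIComponent(query))
    .replace("{0}", encodeURIComponent(query));
}
```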

The Newsgator API has been a source of mild frustration for me since I added support for it. The existing synchronization features in RSS Bandit involve uploading/downloading a single file containing the state of the application. The Newsgator API assumes that any application using it for synchronization is also using it as a source for RSS feeds. From my perspective this seems like a very big assumption to make, but it is understandable when one considers that the original purpose of the API was for their in-house applications. This assumption manifests itself in the requirement that to synchronize the state of a feed I'm subscribed to, I need to fetch that feed from Newsgator Online. This means that if I'm subscribed to 100 feeds in RSS Bandit, then I might need to download up to 100 feeds from Newsgator Online each time I sync. This makes the synchronization process a lot slower than I expected. I'm now wondering whether we should rethink the user flow for our synchronization step, since we currently lock the UI while syncing to prevent users from making changes mid-sync. With synchronization to Newsgator this could take several minutes, as opposed to a minute or less with our other synchronization methods. I did make some performance improvements since we shipped the alpha but it still takes a while longer than I'd like. :(
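The difference between the two models is easy to see in sketch form. The function names below are hypothetical placeholders, not the actual Newsgator API or RSS Bandit code:

```typescript
// Single-state-file model used by RSS Bandit's other sync endpoints:
// one request, regardless of how many feeds the user is subscribed to.
async function syncViaStateFile(downloadState: () => Promise<string>): Promise<void> {
  const state = await downloadState();
  applyRemoteState(state);
}

// Per-feed model the Newsgator API assumes: synchronizing read state means
// one round trip per subscribed feed, so 100 subscriptions can mean 100 requests.
async function syncViaPerFeedFetch(
  feedIds: string[],
  fetchFeed: (id: string) => Promise<string>
): Promise<void> {
  for (const id of feedIds) {
    const feedXml = await fetchFeed(id);
    applyRemoteState(feedXml);
  }
}

// Placeholder for however the client merges remote read/unread flags locally.
function applyRemoteState(_payload: string): void {
  /* merge into local cache */
}
```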

The winner of the RSS Bandit New Logo design contest has been announced. Congrats to Eric Winchester.

Old Logo:

New Logo:

I'd like to thank all the folks who took the time to submit entries and those who voted for their favorite logos. We greatly appreciate your support.

It is quite likely that Torsten and I will ship a 'refresh' of the alpha installer this weekend. The reason it isn't a beta is that we are not yet feature complete. The code for downloading enclosure/podcasts still isn't all there and I haven't started on my idea for 'watching' posts for new comments. With any luck we should have all this done in the next few weeks.

After the beta, we'll focus primarily on performance issues. We've already fixed a number of issues that were causing lots of CPU usage, but our memory consumption is still higher than I'd like. I expect that the final version of Nightcrawler will ship during the holiday season.


 

Categories: RSS Bandit

A recent comment on the Groklaw blog entitled Which Binary Key? claims that one needs a "binary key" to consume XML produced by Microsoft Office 2003. Specifically the post claims
No_Axe speaks as if MS Office 12 had already been released and everyone was using it. He assumes everyone knows the binary key is gone. Yet Microsoft is saying that MS Office 12 is more or less a year away from release. So who really knows when and if the binary key has been dropped? All i know is that MSXML 12 is not available today. And that MSXML 2003 has a binary key in the header of every file.
...
So let me close with this last comment on the fabled “binary key”. In March of 2005, when phase II of the ODF TC work was complete, and the specification had been prepared for both OASIS and ISO ratification, the ODF TC took up the issue of “compliance and conformance” testing. Specifically, we decided to start work on a compliance testing suite that would be useful for developers and application providers to perfect their implementations of ODF. Guess who's XML file format was the first test target? Right. And guess what the problem is with MSXML? Right. It's the binary key. We can't do even a simple transformation between MSXML and ODF!

As someone who's used the XML features of Excel and Word, I know for a fact that you don't need a "binary key" to process the files using traditional XML tools. Brian Jones, who works on a number of the XML features in Office, has a post entitled The myth of the Binary Key where he mentions various parts of the Office XML formats that may confuse one into thinking they are some sort of "binary key", such as namespace URIs, processing instructions and Base64-encoded binary data. All of these are standard aspects of XML which one typically doesn't see in simple uses of the technology, such as RSS feeds.

Since I used to work on the XML team, there is one thing I want to add to Brian's list which often confuses people trying to process XML: the Unicode byte order mark (BOM). This is often found at the beginning of documents saved in UTF-16 or UTF-8 encoding on Windows. However, as the Wikipedia entry on BOMs states

In UTF-16, a BOM is expressed as the two-byte sequence FE FF at the beginning of the encoded string, to indicate that the encoded characters that follow it use big-endian byte order; or it is expressed as the byte sequence FF FE to indicate little-endian order.

Whilst UTF-8 does not have byte order issues, a BOM encoded in UTF-8 may be used to mark text as UTF-8. Quite a lot of Windows software (including Windows Notepad) adds one to UTF-8 files. However in Unix-like systems (which make heavy use of text files for configuration) this practice is not recommended, as it will interfere with correct processing of important codes such as the hash-bang at the start of an interpreted script. It may also interfere with source for programming languages that don't recognise it. For example, gcc reports stray characters at the beginning of a source file, and in PHP, if output buffering is disabled, it has the subtle effect of causing the page to start being sent to the browser, preventing custom headers from being specified by the PHP script. The UTF-8 representation of the BOM is the byte sequence EF BB BF, which appears as the ISO-8859-1 characters "ï»¿" in most text editors and web browsers not prepared to handle UTF-8.

I wouldn't be surprised if the alleged "binary key" was just a byte order mark which caused problems when trying to process the XML file using non-Unicode savvy tools. I suspect some of the ODF folks who had problems with the XML file would get some use out of Sam Ruby's Just Use XML talk at this year's XML 2005 conference. 
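For what it's worth, dealing with a UTF-8 BOM is trivial once you know it's there. A small sketch:

```typescript
// A BOM-aware XML parser simply skips the EF BB BF prefix; a tool that treats
// the file as plain ASCII/ISO-8859-1 sees three junk characters before "<?xml".
// Stripping the BOM before handing the bytes to such a tool fixes the problem.
function stripUtf8Bom(bytes: Uint8Array): Uint8Array {
  if (bytes.length >= 3 && bytes[0] === 0xef && bytes[1] === 0xbb && bytes[2] === 0xbf) {
    return bytes.subarray(3);
  }
  return bytes;
}

// Example: a document saved as UTF-8 by Windows Notepad, BOM and all.
const saved = new Uint8Array([
  0xef, 0xbb, 0xbf,
  ...new TextEncoder().encode('<?xml version="1.0"?><doc/>'),
]);
console.log(new TextDecoder().decode(stripUtf8Bom(saved))); // <?xml version="1.0"?><doc/>
```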


 

Categories: XML

I've been using MSN Virtual Earth for a while now and like it quite a lot. However there is definitely room for improvement and I'm glad to see that the team is soliciting feedback from users on what features they consider most important for the next release. In the post Suggestions for Virtual Earth Release 2? they write

The VE team is rolling on the next release and is interested in your feature requests. Lets make it interesting - You have 10 bucks to spend on features. How would you spend 'em? Post your comments here. My shopping list might look like this -
 
4  Street maps for Italy
3  Driving directions integrated in the application and not linked off to maps.msn.com
3  improved WiFi coverage for the Locate me feature in rural areas.
 
You get the idea. Go ahead, buy your features. Just remember to stay within your budget :-)

If you use VE and have some features you'd like to see in the next release, go ahead and post a comment with your requests. Here's how I'd spend my $10 on features: $2 to expand the Virtual Earth API to include conversions from physical addresses to latitudes & longitudes (aka geocoding), $3 to integrate driving directions into VE as opposed to being linked to http://maps.msn.com as is done today, $4 to add the ability to store my favorite locations in VE, and $1 to add maps of Canada to the service.

So how would you spend your dollars on VE features?


 

Categories: MSN

October 17, 2005
@ 04:51 PM

Every week or so someone using Safari on Mac OS X complains to me that my blog looks wacky in their browser. I finally got around to fixing the templates used by my blog and now it should look fine in Safari.

The following sites were helpful in showing me what my site looked like in Safari: http://www.danvine.com/icapture/ and http://www.fundisom.com/g5/. Thanks to Martin Dittus for pointing me to these sites, without which I wouldn't have been able to confirm my changes.


 

Categories: Ramblings

Dave Sifry, the CEO of Technorati, has a regular series of posts called The State of the Blogosphere where he provides various statistics about the number of blogs Technorati is tracking. In State of the Blogosphere, October 2005 Part 1: On Blogosphere Growth he writes

About 70,000 new weblogs are tracked every day, which is about a new weblog created each second, somewhere in the world. It also appears that blogging is taking off around the world, and not just in English. Some of the significant increases we've seen over the past 3 months have been due to a proliferation of chinese-speaking weblogs, both on MSN Spaces as well as on Chinese sites like blogcn.com .

The growth of the Chinese blogosphere on MSN Spaces is a trend those of us working on Spaces have seen first hand. I wouldn't be surprised if we are one of the biggest blog hosting services for Chinese bloggers. An interesting side effect of this growth is that an increasing number of blogs in the Technorati Top 100 are blogs that are popular with Chinese users of MSN Spaces.

Below is a list of the MSN Spaces on today's version of the top 100 list 

27. spaces.msn.com/members/princesscecicastle
11,999 links from 3,455 sites.

30. Hack MSN Spaces
Spaces Customization at its Best™
By Devdutt Parikh
12,540 links from 3,329 sites.

41. spaces.msn.com/members/slim
By slim
8,569 links from 2,771 sites.

47. Herramientas para Blogs
Tools for spaces. A blog about customization of spaces
By mmadrigal madrigal
7,309 links from 2,578 sites.

49. Scott's "SiteExperts" Place
Web developers, Web developers, Web developers! MSN Client architect who shares his thoughts on DHTML, AJAX, Client Frameworks, etc., and how we are engineering MSN properties.
By Scott Isaacs
7,103 links from 2,509 sites.

66. spaces.msn.com/members/flowersummer
6,405 links from 2,118 sites.

71. spaces.msn.com/members/locker2man
By locker2man
5,358 links from 2,026 sites.

74. spaces.msn.com/members/hcy521
6,640 links from 2,007 sites.

It is interesting to note that every space on the Technorati Top 100 list is either Chinese or is about customizing/hacking the MSN Spaces user interface which is popular among our Chinese users. I'd never have guessed that these would be the most popular spaces when we launched the service last year.


 

Categories: MSN

A comment on my post Some Thoughts on the Mini-Microsoft blog struck me as so good that it was worth sharing, so I'm reposting it here for others to see:

The Lessons of Longhorn
I’ve worked at MS for many, many years in the product groups. I love the company, and have prospered with it. I’m not some disgruntled flunky. I manage a big group, and am committed to doing everything I can to make my group a great place to be and build really compelling products that lots of customers will want to buy. We were and still are a great company in many ways. But we could be even greater.

The Longhorn saga highlights some stark lessons about why employees are pissed off and frustrated with the very top handful of execs. We are all held to very high standards. We write annual commitments, and work very hard to achieve them. If we don’t achieve them, we know we will not be rewarded. We want to do great work, make great products, and be rewarded for it, personally and financially. We don’t shirk from this challenge, we are up to it! But, we expect these rules to apply to everyone, evenly and openly. All the way to the top.

Longhorn will be a good product when it ships, but it will ship two years later than it should have. That extra two years represents what, maybe 8,000 man years of work? At a fully burdened cost of say $150k/head/year that’s $1.2Billion in direct costs of our resources flushed down the toilet. But far worse than those direct costs are the lost opportunity costs of not having the product in market two years earlier and getting started on Vnext.

Who is to blame for this debacle? First BillG himself, for pushing the Windows group to take on huge, extremely difficult technical projects that destabilize all the core parts of the OS, and hold shipping hostage. Even worse, in some cases these efforts seem to be little more than ‘pet’ ideas of Bill’s, with little clear customer value, at least to my understanding. Second, the very top handful of execs in the Windows group are to blame, for placating Bill and not applying the most basic good judgment on engineering and project management. From my perspective, it was clear to nearly every engineer in every product group at MS that Longhorn was badly screwed up, for far too long. But no one at the top would admit it or come to grips with it for far too long. For top product execs as MS, there is a long history of a culture that Bill is right, do what he says, always stay in his good graces no matter what. If you do that, you will likely make a huge fortune. If you don’t, your career at MS is over. I understand the pressure on execs to behave that way and always say ‘Yes’ to Bill. But that’s not the leadership we need. We are not helping anyone with this game, neither customers nor ourselves.

All of us know that if we screwed up like this, we would likely be forced out of our groups, with our reputations as product people shot, and for good reason. But when Bill and Jim et al screw up, nothing happens.

I really want Bill to be man enough to stand up and say, “I made a big mistake. This is what we’ve learned, and this is how we are going to do even better.” Bill is a tremendous thinker, but he is human too, and sometimes can make mistakes. We can’t have a culture that holds he is semi-divine. We need leaders who really lead, pragmatically and effectively, who hold themselves openly to the same standards that we are all held to. That is how we can become an even better company and reach more of our still great potential.



 

Categories: Life in the B0rg Cube

In his post Betty Dylan, Railroad Tavern, Sunday 8PM Jon Udell writes

I wondered why online services like upcoming.org hadn't yet gone viral, and I made a few suggestions, which were well received. But to be honest, the Keene, NH metro in Upcoming is no more lively now -- a day after Yahoo acquired Upcoming -- than it was six months ago.

Case in point: the Betty Dylan band is coming to Keene on Sunday and Monday. I know this because a friend organized the event. But neither of the venues' websites -- Railroad Tavern and Keene State College -- has the information. Nor does the Keene Sentinel. What's more, none of these three websites makes calendar information available as RSS feeds.

Yahoo's acquisition of Upcoming will certainly help move things along. As will the growing visibility of other such services, notably EVDB's Eventful. But since I expect no single one of these to dominate, or to supplant the existing calendars maintained by newspapers, colleges, and other venues, we have to think in terms of syndication and federation.

RSS is a big part of the story. Calendar publishers need to learn that information made available in RSS format will flow to all the event sites as well as to individual subscribers.

I think Jon Udell, like me, is grabbing hold of the wrong end of the stick. When I first started working on the platform behind MSN Spaces, one of my pet scenarios was making it easier to create blog posts about events and then syndicate them. One of the things I slowly realized is that unlike blogging, which has a killer app for consuming syndicated content (the RSS reader), there really isn't anything similar for calendar events, nor is anything compelling likely to appear in that space in the near future. The average home user doesn't use calendaring software, nor is there much incentive to start using it. Even if every eventing website creates RSS feeds of events, the fact is that my girlfriend, my mom and even I don't maintain calendars that would benefit from being able to consume this data.

The corporate user is easier, since calendaring software is part of communications clients like Outlook and Lotus Notes. However, corporate users aren't really the targets of sites like Upcoming or Eventful, even though I suspect they are their best bets for potential users in the near term.


 

October 16, 2005
@ 05:46 PM

Richard McManus has a blog post on his Read/Write Web blog entitled craigslists gets heavy with Oodle where he writes

Uber classifieds site craiglist has requested that Oodle, a classifieds 'meta' search engine, refrain from scraping its content. This has the potential to be the first high-profile case of a mash-up site being slapped for taking another site's content.

In a recent ZDNet post, I wrote that the business models for Web 2.0 mash-ups are beginning to ramp up. Some of the revenue possibilities for sites like Oodle are advertising, lead generation and/or affiliates, transactional, subscription.

Oodle wrote on their blog that they send craigslist "free traffic" and they "don't compete with them by taking listings." John Battelle said that craigslist's response to Oodle "feels counter to the vibe craigslist has always had".

This reminds me of the panel on business models for mash-ups at the recent Web 2.0 conference. One of the things that had me scratching my head about the panel is that it seemed to have skipped a step. Before you talk about making money from mash-ups, you have to talk about how people providing the data and services mash-ups are built on make money.

Providing data and services isn't free; servers cost money, system administrators and other operations folks cost money, and even the bandwidth costs money. Simply claiming to send a service "free traffic" may not justify the cost to them of supporting your service and others like it. Then there's the fact that the site may have strategic reasons for not wanting its data syndicated on other sites.

A good example that comes to mind is eBay. Although eBay has an extensive API for developers who want to consume its data and services, it aggressively defends itself against screen scraping services, to the extent that it established a legal precedent for services that screen scrape its site to be treated as trespassers. On the one hand this seems schizophrenic, but it makes sense from a business perspective for eBay to do this.
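As a small, hedged illustration of the gap between scraping and being invited in: the sketch below does nothing more than check a site's robots.txt before fetching a page, which is about the least a 'meta' search engine can do to respect a data provider's wishes. The user agent string is made up for the example; this is not how Oodle or eBay actually operate.

from urllib import robotparser

def may_scrape(site, path, user_agent="example-meta-crawler"):
    # user_agent is a made-up name, used purely for illustration.
    rp = robotparser.RobotFileParser()
    rp.set_url(site.rstrip("/") + "/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt
    return rp.can_fetch(user_agent, site.rstrip("/") + path)

if __name__ == "__main__":
    # Whether this returns True or False is entirely up to the site owner,
    # which is precisely the point.
    print(may_scrape("http://www.craigslist.org", "/about/"))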

Personally, I think more companies need to be like eBay and decide where it makes business sense to open up their website or service as a web platform. As part of my ThinkWeek paper on MSN as a Web Platform I've tried to create a taxonomy that can be used as a starting point for web companies to decide where they should consider being open. This taxonomy will  also be part of my talk at O'Reilly's Emerging Technology Conference if my proposal gets accepted. 


 

Categories: Web Development

I've been to two O'Reilly conferences this year and both times I've been struck by the homogeneity of the audience. Most of the speakers and attendees are white males in their mid-twenties to mid-thirties. There are few Blacks, women, Indians or East Asians; far fewer than I'm used to seeing during my typical workday or at other conferences I have attended. Shelley Powers has mentioned this before in posts such as Maids, Mommies, and Mistresses but today was the first time I've seen this commented on by one of the folks I'd consider to be in the 'inner circle' of the O'Reilly conference set.

In his post What it's like at Web 2.0 Anil Dash writes

So, there's the Old Boy's Club. And surprisingly, there's a 50-50 ratio of wanna-bes to real successes within that club. But the unsurprising part is probably what the makeup of that club looks like. Web 2.0 might be made of people, as Ross Mayfield said, but judging by the conference, Web 2.0 is pretty much made of white people. I'm not used to any event in a cosmopolitan area being such a monoculture.

Now, the folks who organized Web 2.0 are good people whom I genuinely believe want their event to be inclusive. But the homogeneity of the audience doesn't just extend to ethnicity, it's even more evident in the gender breakdown. There are others who've covered this topic better than me, but it's jarring to me not merely because the mix was such a poor representation of the web that I know, but because I think it's going to come back and bite the web in the ass if it doesn't change eventually.

See, it's not just making sure the audience and speakers represent the web we're trying to reach, but the fact that Bay Area tech conferences are so culturally homogenous is dangerous for the web industry. When people talk about buying a song on the iTunes music store, they're still using some tired Britney Spears example, or if they're under 35 or so, they might mention Franz Ferdinand. This is not an audience in touch with Bow Wow or Gretchen Wilson, even though they've sold millions of tracks. When they talk about television, they're talking about broadcasting Lost or Desperate Housewives, but they're not aware of Degrassi or Ultimate Fighting. Worse, I met a number of people who were comfortable with being culturally illiterate about a great many people who live right here in the U.S.; I can't imagine how they would reach out to other cultures or countries.

I've been quite surprised by how much O'Reilly conferences fail to reflect the diversity of the software industry as I've experienced it, let alone the Web at large. This is "Web 2.0"? I surely hope not.


 

Categories: Ramblings

I've been watching this unfold at work and it's great to know it's now official. From the press release Microsoft and Yahoo! Announce Landmark Interoperability Agreement to Connect Consumer Instant Messaging Communities Globally we learn

SUNNYVALE, Calif., and REDMOND, Wash. — Oct. 12, 2005 — Yahoo! Inc. (Nasdaq: YHOO) and Microsoft Corp. (Nasdaq: MSFT) today announced a landmark agreement to connect users of their consumer instant messaging (IM) services on a global basis. The industry's first interoperability agreement between two distinct leading global consumer IM providers will give MSN® Messenger and Yahoo!® Messenger users the ability to interact with each other, forming what is expected to be the largest consumer IM community in the world, estimated to be more than 275 million strong.

Being able to instant message between IM communities is one of the features most requested by MSN Messenger and Yahoo! Messenger users, and Microsoft and Yahoo! share a commitment to provide IM interoperability while keeping consumer security and privacy first and foremost. In addition to exchanging instant messages, consumers from both communities will be able to see their friends’ online presence, share select emoticons, and easily add new contacts from either service to their friends’ list, all as part of their free IM service.* Yahoo! and Microsoft plan to introduce these interconnectivity capabilities between MSN Messenger and Yahoo! Messenger to customers around the world in the second quarter of 2006, and in doing so expect to help make IM an even more useful part of consumers’ online communications and communities.

This is really good news and a step in the right direction with regards to interoperability across instant messaging applications. Now I have to go nag the folks across the hall about what this means for folks like me who use our "@yahoo.com" email addresses as our Messenger sign-in names. I already had to switch once from my "@microsoft.com" address when Microsoft started using Live Communication Server internally. 

I'll see what I can find out from folks when they get into work later today and report back.


 

Categories: MSN

More info keeps spilling out about the beta of the Hotmail "Kahuna" release. If you go to http://join.msn.com/mailbeta/features, you'll get an overview of the new features in the next version of Hotmail, including screenshots.

As the saying goes a picture is worth a thousand words. It's a lot easier to appreciate the work that's gone into the next version of Hotmail when you actually see it. Even better is using it, so don't forget to sign up for the beta by going to http://www1.imagine-msn.com/minisites/hotmail/Default.aspx.


 

Categories: MSN

October 11, 2005
@ 02:32 PM

I never got to try out Google Reader last week because the service was too slow, so I gave it a shot again this morning. My thoughts on the application are pretty much identical to Dave Winer's, who wrote

I tried the Google news reader again, this morning, after it had loaded all my feeds (it seems to take quite a few hours to do that). This is the second blog-related product they've come out with recently that appears not to have been touched by human beings before it was introduced to the world (the other was the ridiculous blog search). I think they need to start using their own stuff before releasing it. And maybe look at the competition for ideas. When you're first into a market there's an excuse for being so wrong. But the first of this kind of software shipped six years ago. To give you a comparison, Visicalc shipped in 1979. By 1985 we had been through two generations of spreadsheets with Lotus 1-2-3 and Excel. Google's reader is a huge step backward from what was available in 1999. The arrogance is catching up with them.

I actually tried writing my own review but gave up because it kept seeming too negative and I try not to snipe at products made by our competitors. Still, I am stunned that they let this application out of the door in the shape it's in.


 

October 10, 2005
@ 09:19 PM

I've seen a couple of complaints online from people who saw the video of the Hotmail "Kahuna" release but couldn't get into the beta. The beta is now open to the general public. If you'd like to sign up for the beta, just click on the link below and follow the steps listed

http://www1.imagine-msn.com/minisites/hotmail/Default.aspx

I'm totally digging the beta and have definitely been impressed with the improvements in the service. Kahuna is gearing up to be an excellent release.


 

Categories: MSN

October 10, 2005
@ 03:16 PM
From the post "Darkness went with them, and they cried with the voices of death. " on the Making Light blog

The nine Senators who voted against the anti-torture amendment:

  1. Sen. Wayne Allard [R-Colorado]
  2. Sen. Kit Bond [R-Missouri]
  3. Sen. Tom Coburn [R-Oklahoma]
  4. Sen. Thad Cochran [R-Mississippi]
  5. Sen. John Cornyn [R-Texas]
  6. Sen. James Inhofe [R-Oklahoma]
  7. Sen. Pat Roberts [R-Kansas]
  8. Sen. Jeff Sessions [R-Alabama]
  9. Sen. Ted Stevens [R-Alaska]
Henceforth to be known as the Nazgul.

(Meme via Jim Henley.)

If you haven't been following this story you can catch up on it in the Telegraph news article entitled Bush will veto anti-torture law after Senate revolt.


 

October 9, 2005
@ 11:45 PM

John Montgomery has a blog post entitled Why Ning? where he asks

Not to complain or anything, but I don't get Ning. For the past fifteen minutes, I've been clickning through hotornot-like scenarios. Some of them are hysterical (try "Which driver has the smaller penis?") and some are mundane (Which is the better beer?). But I'm looking for why this is the next bit thning and can't figure it out.

The potential that Ning presents is counterbalanced by how hokey it is. On the one hand, it is an attempt to cash in on the "Web 2.0" hype by creating a build-your-own-web2.0-website toolkit in much the same way build-your-own-eCommerce-website toolkits were popular a few years ago. I imagine the scenario outlined in Dave Winer's post Editorial: Ning harkens back to 1999 is closer to the truth than we suspect.

On the other hand, Ning points to the next stage in the evolution of building mash ups and web platforms. Forrester Research's Charlene Li has a post entitled The Roll-Your-Own Mash-up Challenge where she writes

I'm at Web 2.0, which is just a great conference. One of the hot discussions is around mash-ups (def) which combine the functionality from two different applications into a new one. One of the best ones is housingmaps.com, which combines apartment rentals etc., from Craig's List and Google Maps.

To me, this is the next step in the "social computing", Web 2.0, or whatever-you-call-it evolution. First, we had personalized content a la RSS-generated content on My Yahoo! At the same time, widgets, which are now hitting their stride, gave us our own customized set of applications on the desktop.

The next step now is creating customized applications that can be paired with the content of our choice - yup, mash-ups. (I wrote about this previously in the context of widgets.) But to do that today, you have to know Javascript, Flash, and AJAX. I'm looking forward to the day when there will be simple interfaces into these APIs so that consumers like me (OK, I'll admit, I'm a pretty geeky, techno consumer) can create our own mash-ups.

This is because I think we're just at the very tip of the revolution. Imagine if we could tap into the collective creativity of thousands, millions of consumers. How many times have you said to yourself, "Wouldn't it be nice if I could just..." And here's the killer part - what if someone built a platform for consumers to do this, and then enabled a way to SHARE those innovations? Some of them would float to the top (thanks to ratings, tagging, etc.) and you could actually start monetizing them. Now that's tapping into the power of consumers!

So here's my challenge - what mash-ups would YOU create? Add them to the comments below - I'm curious to see what all you bright minds can come up with!

I personally think Ning is ahead of its time. What we need today is more web sites turning themselves into web platforms, as well as business models for both the web platforms AND the developers building on them. As an industry we're still muddling our way through at this point. Once we have an ecosystem of web platforms as well as sustainable business models for the various offerings, the next step is figuring out how to broaden the target of these platforms beyond the traditional developer market. This is similar to the way Microsoft brought programming against COM components to the masses back in the 1990s with Visual Basic, except this time the platform is the entire Web and not just one vendor's operating system.
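For a sense of where the bar sits today, here is a minimal sketch of the kind of mash-up a developer can hand-roll in a few minutes, assuming both data sources expose RSS (the feed URLs below are placeholders, not real endpoints). Tools like Ning are interesting precisely because this sort of recombination is still out of reach for people who would never write code like this.

import feedparser

# Placeholder feed URLs, used purely for illustration.
FEEDS = [
    "http://example.org/apartment-listings.rss",
    "http://example.org/neighborhood-news.rss",
]

def merged_items(feed_urls):
    # Combine several RSS feeds into one list, newest items first.
    items = []
    for url in feed_urls:
        for entry in feedparser.parse(url).entries:
            published = entry.get("published_parsed") or (0,) * 9
            items.append((published, entry.get("title", ""), entry.get("link", "")))
    items.sort(reverse=True)
    return items

if __name__ == "__main__":
    for _published, title, link in merged_items(FEEDS):
        print(title, "-", link)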

That's why Ning is cool.


 

Categories: Social Software

Mini-Microsoft has a blog post on middle management, entitled Middle Managers, Bureaucracy, and No Birds at Microsoft, where he offers pointers to some counterarguments to his regular bashing of bureaucracy and middle management at Microsoft. The various linked posts and some of the comments do a good job of presenting an alternate perspective on the various complaints people make against bureaucracy and middle management.

Derek Denny-Brown, a friend of mine who just left Microsoft, has a blog post entitled The curse of Middle management where he writes

I had a long discussion with a friend of mine about Longhorn aka Windows Vista. He had just caught up with news and some of the recent interviews with Jim Allchin. He knew I had some involvement with the OS divisions, and was just generally curious for my perspective 2 weeks out of the company.

In my view, a lot of the problems at Microsoft stem from bad middle management. Microsoft has built up a whole ecology of managers, who are at least as concerned with their career as they are with making good decisions. I've interacted more than I like to admit. The effect is that upper management doesn't hear a clear story about what is really going on. I think the phrase I used was that they 'massage the message'. Combine that with long release cycles and lack of accountability falls out as an inevitability.

One of the reasons I left is because I just don't see any way out of that mess. I am humbled by MiniMicrosoft and his determination to be part of the solution.

I tend to agree with Derek about lack of accountability being a problem here. One thing I've noticed about bad middle managers is that (i) it is almost impossible to get them out of their positions and (ii) all it takes is one or two to seriously derail a project. I've personally been surprised at how many bad middle managers just keep on ticking at Microsoft. It seems it is a lot easier to see individual contributors or even VPs pushed out for incompetence than middle management (Dev Managers, General Managers, Product Unit Managers, Group Program Managers, etc).

It is also surprising how much damage a well-placed yet broken middle-management cog can do to the smooth functioning of the corporate machine. I've lost count of the number of times I've asked some member of a team that created a much-reviled product why their product sucked so much, and the response would be "our test/dev/general manager didn't see fixing those problems as a high priority". As Derek states, it is even worse when you get the ones that present a false impression to upper management about how things are going.

Michael Brundage also covered some of this in his essay on Working at Microsoft which is excerpted below

+ Company Leadership

Bill Gates and Steve Ballmer get most of the press, but it's an open secret that all of the division heads (and their staff, and their staff) are top-notch. I'm (happily) oblivious to how that circle operates, so I can only judge them on their results.

Given that Microsoft's been convicted of monopolistic practices, it may shock you when I say that Microsoft's upper management strikes me as very ethical. They talk about ethical behavior all the time, and as far as I've seen, lead by example. Maybe I'm being naive, but I find Microsoft's upper management to be very trustworthy. They're also thinking very far ahead, and doing a good job getting the information they need to make solid decisions.

Microsoft's leaders are also very generous, and frequently encourage the rest of us to make charitable donations (both money and time) a priority. Giving is a large part of Microsoft's corporate culture.

It's refreshing to work at a company where you can trust that the upper echelon is smart, hardworking, and making right decisions. I don't have to worry that my general manager or vice-president will drive our division (or company) into the ground through incompetence or greed. Microsoft's no Enron or WorldCom.

- Managers

In contrast, most of the middle management should be tossed.

Did I mention I've had six or seven managers in five years? I've only changed jobs twice; the others were "churn" caused by reorganizations or managers otherwise being reassigned. In fact, in the month between when I was hired and when I started, the person who was going to be my manager (we'd already had several phone/email conversations) changed! It's seven if you count that, six if you don't.

None of these managers were as good as my best manager at NASA. Of the six-seven managers I've had, I'd relish working for (or with) only two of them again. Two were so awful that if they were hired into my current organization (even on another team), I'd quit on the spot. The other two-three were "nngh" -- no significant impact on my life one way or another. I'd love to think this is some kind of fluke, that I've just been unlucky, but many other Microsoft employees have shared similar experiences with me.

I think part of the problem is that Microsoft doesn't generally hire software developers for their people- or leadership-skills, but all dev leads were developers first. Part of the problem is also that (unlike some companies that promote incompetence) good leads are usually promoted into higher positions quickly, so the company's best managers rise to the top. Consequently, the lower ranks are filled with managers who either have no interest in advancing up the management chain (which is fine) or else are below-average in their management skills (which is not).

But it's more complex than this. At Microsoft, many managers still contribute as individuals (e.g., writing code) and are then judged on that performance (which is mostly objective) as much or more than they're judged on their leadership performance (which is mostly subjective). Because individual developers have so much freedom and responsibility, it's easy and typical to give individuals all the credit or blame for their performance, without regard to the manager's impact. Conversely, managers' performance often does not translate into tangible effects for their teams (other than the joy or misery of working for them). For example, I can still get a great review score even if my manager is terrible. I think these factors contribute to management skills being undervalued.

Microsoft also suffers from a phenomenon that I've seen at other companies. I describe this as the "personality cult," wherein one mid-level manager accumulates a handful of loyal "fans" and moves with them from project to project. Typically the manager gets hired into a new group, and (once established) starts bringing in the rest of his/her fanclub. Once one of these "cults" is entrenched, everyone else can either give up from frustration and transfer to another team, or else wait for the cult to eventually leave (and hope the team survives and isn't immediately invaded by another cult). I've seen as many as three cults operating simultaneously side-by-side within a single product group. Rarely, a sizeable revolt happens and the team kicks the cult out. Sometimes, the cult disintegrates (usually taking the team with it). Usually, the cult just moves on to the Next Big Thing, losing or gaining a few members at each transfer.

I think these "cults" are a direct result of Microsoft's review system, in which a mid-level manager has significant control over all the review scores within a 100+ person group (so it's in your best interest to get on his/her good side), and conversely needs only a fraction of that group's total support to succeed as a manager (so it's in his/her best interest to cultivate a loyal fanclub to provide that support). The cult gives the manager the appearance of broad support, and makes the few people who speak out against him/her look like sour grapes unrepresentative of a larger majority. After a string of successes, the manager is nearly invincible.

Fortunately, these managers are unlikely to move further up the ranks, due to the inherent deficiencies in their characters (which are usually visible to upper management and enough to prevent their advancement, but not so severe as to warrant firing them).

These "personality cults" always negatively impact the group eventually (while they're there and/or when they leave), but counterintuitively sometimes these personality cults have a large positive initial effect. Many successful Microsoft products have come into existence only through the actions of such personality cults. Some of these products even survived after the personality cult left for the Next Big Thing.

I totally agree with Michael's analysis. Like Derek, I'm unsure as to how one would go about reversing this trend. However I definitely think the way we assess accountability of folks in [middle and executive] management needs an overhaul.


 

Categories: Life in the B0rg Cube

October 9, 2005
@ 08:18 PM

Below is a mishmash of thoughts that crossed my mind while at the Web 2.0 conference last week.

  • If you attended the conference you'd have gotten the impression that "web 2.0" isn't about technology or about social effects, it is about money. Specifically "web 2.0" is a meme that tries to describe the characteristics of the new generation of startups that have gained success either by IPOing (e.g. Google) or being bought by large companies (e.g. Flickr). The targets of this meme are VCs (to tell them what kinds of companies to invest in), startups (to tell them what kinds of companies to emulate) and big companies (to tell them what kinds of companies to buy). The fact that multiple panelists were VCs is also very telling.

  • Google is the new Microsoft. Several times during the conference, it was brought up that Google has replaced Microsoft as the company that Silicon Valley companies love to hate because it enters nascent markets and dominates them (e.g. there were visibly negative reactions from some audience members to the announcement of Google Reader).

  • Microsoft isn't on anyone's radar in this audience except as an example of an aging dinosaur or the butt of some joke. This is consistent with the impressions I've seen at other conferences such as ETech and Gnomedex.

  • Just like in the dotbomb days of 1999/2000, most "web 2.0" companies don't have a business model besides getting bought by a big company. The new crutch many of them lean on is Google AdSense when queried about how they plan to be profitable.

  • Sun Microsystems is living on borrowed time. Their business strategy now seems to be one giant Hail Mary play.

  • The folks at Yahoo! Inc totally get it. If I was at Google I'd probably spend as much time worrying about them as I do about Microsoft.

  • Meebo reminds me a lot of the dotbomb days. It's basically an AJAX version of Trillian.  Building an AJAX version of Trillian is cool from a geeky perspective but I honestly don't see them lasting long except if they get bought by some bigger player like Google, Yahoo!, MSN or maybe even Trillian.


 

Categories: Current Affairs

The panel on what teens want was hosted by Safa Rashtchy, who asked questions of 5 teenagers. There were three males and two females.

Earlier in the day, I was chatting with Mike and I pointed out that all through the conference I hadn't heard mention of the kind of Web apps that excite the younger generation. I hadn't heard MySpace mentioned once, and the only times instant messaging came up was in the context of Skype being sold to eBay for $4 billion. The Web 2.0 conference seemed dedicated to applications mostly of interest to the twenty-five and over crowd.

This changed during the session when Safa Rashtchy questioned the teenagers about various aspects of their computer usage. The notes summary below is mainly from memory since I didn't take notes during this session.

Three out of five of the teenagers used MySpace. One of them said he spends all his free time waiting for comments on his space. Another teenager said she had stopped using MySpace when she went to college because it was too "high school" and now she used Facebook, which was more college oriented.

One of the teenagers said he spent up to $50 a month on ring tones. Four of them had iPods and all of them rarely [if ever] paid for music. It seemed they had all tried the iTunes Music Store at one time or the other but eventually succumbed to the lure of file sharing networks.

They all used AOL Instant Messenger and one other IM client. Two used MSN Messenger, mainly because they had friends outside the US (Mexico & Brazil) and MSN Messenger is very popular outside the US. One or two used Yahoo! Messenger. None of them used Skype; in fact they sounded like they had never heard of it and didn't seem interested.

They all used Google for search.

Two of them had used eBay but worried about being ripped off online.  

When asked what kind of applications they'd like to see on the Web, they asked for "more free stuff" and to "get rid of spyware".

The most amusing part of the session was when Safa was trying to find out what eCommerce sites they'd visit. He first asked where they'd buy a cellular phone and each kid said they'd go to the website of their current cellular service provider. Then Safa tried another tack and asked where they'd buy a CD player online and the first kid went "CD Player?" with the same tone of voice and expression on his face I'd have if asked where I could buy a record player online. The audience found this hilarious.

PS: This panel is almost identical to a similar one at the Microsoft Research Social Computing Symposium 2005 held earlier this year. MSR has a video of that panel available online.


 

Categories: Trip Report

After lunch on Friday, there was a surprise session. John Battelle announced that he was going to have a conversation with Sergey Brin. Throughout the interview Sergey came off as very affable and it's easy to see how he can tell his employees that their corporate motto is "Do No Evil" without them questioning its naiveté.

John Battelle started off by asking "It's been a long strange trip to where you are today, how's your head?". Sergey responded that they were very fortunate to have started at Stanford. Being in Silicon valley turned out to be very helpful and influential to the course his life has taken today. When he and Larry first started Google they had planned to open source the Google code. The main reason they decided to start a company was because they needed money to purchase the significant computing resources that the Google search engine needed.

John Battelle then asked Sergey to respond to Terry Semel's comments from the previous day that Google is an extraordinary search engine but as a portal they probably rank as number 4. Sergey responded by jokingly stating that although their cafeteria is nice and they keep trying to improve the quality of the food, they aren't in the top 10 or top 100 restaurants in the world. This elicited loud laughter from the audience.

John followed up by asking Sergey what he thought of the comments by Yusuf Mehdi of Microsoft that they are now the underdog.  Sergey replied that he is very excited that Google is considered a leader in terms of technology. He knows they may not be number 1 when it comes to big business deals or creating huge platforms like Microsoft but they are definitely a technology leader.

John then asked Sergey whether he felt any pressure due to their high share of the Web search market and high stock market valuation. Sergey said he wasn't a valuation expert so he couldn't comment on that. As for high market share in the search market, he is glad that so many people use their search engine based on word of mouth. It shows they have built a quality product. They have some promotional partnerships but for the most part their market share has grown due to the great search experience they provide.

The next question from John was whether Google would keep the clean look on their search page. The response from Sergey was that they will continue with that look on their front page but there will arise the need for other kinds of products from Google. For example, GMail arose out of the need for a better user experience around Web mail. Not only have they improved the web mail experience for GMail users but they have also bettered the lot of users of competing services since competitive responses have increased the mail quota size on various services by 100 times or more.

John began his next question by bringing up a topic that had been an undercurrent in various conversations at the conference. Google has become the new Microsoft, in that they are the 800 lb. gorilla that enters markets and takes them over from existing players. John gave the specific example that the newly launched Google Reader has now scared vendors of web-based RSS readers. Sergey responded by pointing out that when Google enters markets it usually leads to good things for existing parties in the market because small companies get bought and new companies get funded. He used GMail as an example of the entrance of Google into a market leading to a flurry of positive M&A activity. Secondly, Sergey stated that some of their offerings are intended to benefit the Web at large. He said they created AdSense as a way for Web publishers to make money and stay in business. Google had become concerned that a lot of web publishers were going out of business which meant less content on the Web which was bad for their search engine.

The questions from John Battelle ended and a Q&A session with the audience began.

The first question asked was about the rumored office suite being developed by Google. Just like Ray Ozzie and Jonathan Schwartz had done when asked the question, Sergey said he didn't think that it made sense to simply port outdated ideas like the mini-computer to the Web. The audience laughed at the comparison of Microsoft Office to the mini-computer. Sergey did say that Google would likely be creating new kinds of applications that solved similar problems to what people currently use traditional Office suites to solve.

The next question from a member of the audience was whether Sergey thought that click fraud was a big problem for Google. Sergey felt that click fraud wasn't a big problem for Google. He said that like credit card companies they have lots of anti-fraud protections. Additionally their customers calculate their ROI on using Google's services and know they get value. Finally, he added that the algorithms that power their advertising engine are fairly complex and not easy to game.

Continuing with the "Google as the new Microsoft" meme, the next question from a member of the audience was what markets did Google not plan to enter in the near future so VCs could tell where was safe to invest. Sergey joked that he thought the various markets entered were good investments. His serious response was that Google is a very bottom up company, and their engineers usually end up deciding what becomes products instead of directives from the executives. John Battelle then jumped in and asked if the company wasn't being directed in its recent offerings then how come most of the offerings seem to be echoing the offerings found in traditional portal sites. Sergey's response was that it was probably because Google's engineers wanted to build better products than the existing offerings in the market place.

I asked Sergey that given Terry Semel's comments that search only accounts for about 5% of page views on the Web while content consumption/creation and communications applications made up 40% of user page views each, what was Google's vision for communications and content related applications. Sergey said that Google definitely plans to improve the parts of the Web where people spend a lot of their time which is part of the motivation for them shipping GMail.


 

Categories: Trip Report

The second From the Labs was presented by Usama Fayyad and Prabhakar Raghavan.

The presentation started by listing a number of Yahoo!'s recent innovative product releases such as Yahoo! 360°, Yahoo! MyWeb 2.0, Yahoo! Mail beta, Yahoo! Music Engine and Yahoo! Messenger with voice. Yahoo! launches hundreds of products a year and recently started spending more resources on research. They want to create a science to explain the various characteristics of the Web which they can use to build innovative products.

So far they have launched the Yahoo! Tech Buzz Game, which debuted at the O'Reilly Emerging Technology Conference earlier this year. It is a fantasy prediction market for high-tech products, concepts, and trends. They also demoed Yahoo! Mindset, which enables you to sort search results by your intent. The example scenario on the website is being able to sort search results for terms like "HDTV" based on whether you are doing research or trying to buy something. This is something Robert Scoble was recently asking for in his blog as the next generation in search. It is very impressive if they can actually scale it out to be more than demoware.

Finally they showed off an application called Tagline which was a visual representation of popular photos and tagging trends in Flickr over time. It was a very flashy looking application but I couldn't see what the practical uses could be.


 

Categories: Trip Report

The final From the Labs was presented by Alan Eustace and Jason Shellen.

Alan began by stating that Google is focused on innovation, which is why they have small teams [to promote spontaneity] and give engineers 20% time so they are free to pursue projects they are passionate about. They demoed two recent efforts.

The first was an image recognition engine that could identify the sex of people by their faces using machine learning. They had trained it with over 2 billion images and its accuracy had gotten up to 90%. The long term goal is to enable scenarios such as 'identify the people in this picture, then find other pictures with these people in them'. That would be very cool if they can actually get it to work correctly.

The second project that was demoed was the Google Reader. Actually, it wasn't really demoed. It was announced. Like everyone else I tried to use it by navigating to the site but it was so abominably slow, I gave up.


 

Categories: Trip Report

The first From the Labs was presented by Gene Becker.

Gene started off by asking how many people in the audience were growing bored with the traditional computing interface of keyboard, mouse and monitor. He called the current computing interface a kazoo when we really need a virtuoso violin. HP Labs is focused on utility and ubiquitous computing. The Web has become increasingly social, diverse, mobile, creative, experiential, contextual, and physical. HP Labs is designing software and hardware for this new world.

He showed a number of interesting developments from HP Labs such as physical hyperlinks, mediascapes - digital landscapes overlaid on physical locations by combining GPS + wireless + audio + iPAQs, the Misto table - an interactive digital coffee table, and Virgil - a context-aware, extensible, media-driven browser. Gene also mentioned that HP has been making strides in utility computing by renting out their grid networks to animators such as DreamWorks SKG. Their grid has also enabled independent animators to have access to large-scale render farms that would traditionally be out of their price range.


 

Categories: Trip Report

I attended the discussion on open versus closed models, which featured Danny Rimer, Jeff Barr, Toni Schneider and Sam Schwartz.

Tim O'Reilly began the session by talking about openness and how this is a central theme of Web 2.0. However he pointed out that at the end of the day to have value a company must own something. He then asked the various members of the panel what their company owned.

Jeff Barr said that although Amazon has open APIs they do own their customer database, the buying experience, as well as the procurement and shipping process. Toni Schneider responded that Yahoo! wants to own the user's trust so that users have no qualms about placing their data in Yahoo!'s services. Tim then asked if Yahoo!'s users could export the data they have in Yahoo!'s services. Toni responded that there were ways to get data out of Yahoo!'s services and this was mostly provided based on customer demand. For example, one can export photos uploaded to Flickr but this reduces their worth since the user loses the network effects from being part of the Flickr ecosystem.

Tim's next target was Danny Rimer, whom he asked whether Skype wasn't as proprietary as AOL Instant Messenger since its IM protocol isn't based on open standards like Jabber. Danny responded that although the IM protocol is closed they did have a client-side API. He also stated that the main reason the VOIP protocol isn't more open is that they are still working out the kinks. However he did note that the Skype API hasn't gotten a lot of traction.

Tim O'Reilly then asked the participants where they resided on the continuum of control versus openness. Sam Schwartz mentioned that there was a delicate balance between the old school and new school of thought at Comcast. Toni said Yahoo! believes in opening up their platform, which is why they created YSDN; however this doesn't mean throwing things over the wall without support. Tim O'Reilly stated that Yahoo! seemed more intent on controlling things since the primary list of Yahoo! Maps mashups is hosted on a Yahoo! site while the primary list of Google Maps mashups is not hosted on a Google owned website, so it seems Google's API efforts are more community driven than Yahoo!'s. Toni responded by saying that YSDN is a good first step for Yahoo!. They aren't just about enabling people to put data in their system but also enabling them to get it out as well.

As someone who has had to drive developer efforts at Microsoft, first for the XML team and now at MSN, it is a very delicate balance between enabling the community and dominating it. Unlike Tim, I interpret Yahoo!'s efforts as highlighting the efforts of their developer community and also I'd point out that Google does the same as well. It seems weird to criticize companies for highlighting the efforts of people using their platform.

Tim then asked what VCs like Danny were looking for in today's startups. Danny replied that he is now primarily interested in companies from geographies outside the United States such as China and Israel. Sam responded that Comcast is looking for people and services that use a lot of broadband resources.

Tim followed up by asking about business models; it seemed to him that the goal of a lot of startups, such as Toni's Oddpost, was to be bought by a big company. Toni agreed that a lot of people were building applications without a business model. It was also argued by the group that we need better business models for Web 2.0 besides affiliate programs like those used by Amazon and eBay. Danny argued that it isn't a bad thing if new startups end up being fuel for innovation at big companies by being purchased. My assumption is that since he's a VC he gets paid either way. ;)


 

Categories: Trip Report

The session on Open Source and Web 2.0 was an interview of Mitchell Baker and Jonathan Schwartz by Tim O'Reilly.

By the end of this talk I was pretty stunned by some of the things Jonathan Schwartz said. He sounded like a dot bomb CEO from 1999. If I had any SUNW stock, I definitely would have ditched it after the talk.

Tim O'Reilly began the session by stating that the title of the talk was misleading. He asked for a show of hands how many people used Linux and then for how many people used Google. A lot more hands showed up for Google users than Linux users. Tim said that while people tend to limit their idea of Linux applications to desktop software like the Gimp, Google is probably the most popular Linux app. So the discussion is really about relationship between Open Source and Web 2.0.

Tim began the interview by asking Jonathan Schwartz what he felt was fresh about Sun's Open Source strategy. Jonathan said that the key thing behind Open Source's rise isn't the availability of source code but because it is available for free (as in beer). This is what is so cool about Google, they provide their services online for free which increases their reach. Sun is embracing this notion to increase the usage of its software.

Tim then asked Jonathan to talk about Sun's grid computing efforts. Jonathan said they recently moved to a self-service model for their computing grid. Customers no longer need to sign contracts up front; instead they just need to go to a webpage on Sun's website and have their credit card ready. Tim O'Reilly commented that customer self-service is one of the pillars of Web 2.0. Since Sun moved to this model they have sold out their grid services, primarily to Texas oil & gas companies wanting to run simulations related to Hurricane Rita. Sun's goal is for their grid to target the long tail, in which case Stanford students working on their Ph.D.s and the like may become their primary customers.

The previous night Tim O'Reilly had asked Ray Ozzie if he felt new revenue models such as ad-supported Web-based software would make as much money as old revenue models such as selling shrinkwrapped software. Tim continued with this theme by asking Jonathan if he thought that Sun's new revenue model of renting out their grid would bring in as much money as their old model of selling hardware. Jonathan said their internal models show that the grid business will be very profitable for them. At this point Mitchell Baker jumped into the conversation to add that the old models currently suffer from needing a control structure which eats into revenue. Controls such as DRM, anti-piracy measures, EULAs, etc add cost to existing business models and once we move to more open models based on customer self service the savings will be huge.

Tim then asked Mitchell whether she thought that Firefox's trump card was the fact that anyone can modify the application to meet their needs since it is open source and customizable. Mitchell replied that it was less about source code availability and more about the culture of participation within the Firefox community.  

Tim then asked Mitchell if she felt that Greasemonkey would be widely adopted. Mitchell said she thought that would be extremely unlikely. She pointed out that the average user already is confused by the difference between their web browser and a web page let alone adding something as complex as Greasemonkey into the mix. I have to agree with Mitchell here, I recently found out that a surprising number of end users navigate the Web by entering URLs into search boxes on various web search engines instead of using the address bar of their browser. The web is already too confusing to these users let alone 'remixing' the Web using applications like Greasemonkey.

A number of times while he was speaking, Tim O'Reilly gave the impression that extensions like Greasemonkey are examples of Firefox's superiority as a browser platform. I completely disagree with this notion, and not only because Internet Explorer has Greasemonkey clones like Trixie and Turnabout. The proof is in the fact that the average piece of Windows spyware actually consists of most of the core functionality of Greasemonkey. The big difference is that Firefox has a community of web developers and hobbyists who build cool applications for it while most of the folks extending Internet Explorer in the Windows world are writing spyware and other kinds of malware.

It isn't the availability of the source that's important. It's the community around the application.

The next question was for Jonathan and it was about the recent announcement between Sun and Google. Jonathan started by stating that although many people wanted the announcement to be about an AJAX Office suite, that wasn't on the horizon. He said the deal was about distribution and communities, which are very important. He pointed out that there are a number of widely distributed and near-ubiquitous platforms such as Flash and Java which aren't Open Source. Having a wide distribution network with Java deployed on many desktops means that one could automatically download new applications such as a VOIP client or toolbar application onto any desktop with Java or StarOffice installed. Mitchell jumped in to point out that well-distributed but lousy products don't work. She went on to add that the new distribution model is no longer about being distributed with the OS but instead is powered by word of mouth on the Web. Firefox has gotten 90 million downloads with no traditional distribution mechanisms.

Tim asked Mitchell whether there would be a return to the browser wars of the nineties where Netscape and Microsoft one-upped each other with incompatible, proprietary new features on a regular basis. Mitchell said there were two things wrong with the browser wars in the nineties; the war itself which led to incompatibilities for web developers and Netscape's defeat which led to stagnation of the Web browser. Mitchell said that Firefox will innovate with new features but they plan to ensure that these features will not be incompatible with other browsers or at least will degrade well in them.

Tim asked Jonathan what was behind the thinking that led him to become one of the most senior regular bloggers in the technology industry. Jonathan replied that he believes very strongly in community. He felt that developers don't buy things, they join movements. In this case, Sun's transparency is a competitive weapon. This is especially true when they can't compete with the $500 million to $1 billion marketing budgets of companies like Microsoft and IBM.

Tim asked whether Jonathan's blog is always transparent and whether he ever attempts to mislead or provoke. Jonathan said that he definitely provokes but never misdirects. Even then, the legal department at Sun doesn't read his entries before he posts them, although a bunch of lawyers now have him on their speed dial and often ask him to include disclaimers in his posts.

Tim then asked Mitchell whether the large number of Google employees working on Firefox caused problems since the company is notoriously secretive. Mitchell responded by pointing out that there are people from lots of different companies working on Firefox, it's just that the Google folks get the most press. All the Google folks are still active on the core of the browser and they know that anything that goes into the core must be open for discussion. She stated that if they began to be secretive about code that would be shipping in the core of the browser then they'd be asked to put those changes in extensions instead.

The questions ended and the Q & A session began. I asked a question each of Mitchell and Jonathan.

My question for Mitchell was that, given that the rise of AJAX is primarily because Firefox copied the XMLHttpRequest object from Internet Explorer, was there a policy of keeping abreast of innovations in IE? "Not always" was Mitchell's answer. On the one hand they did copy XMLHttpRequest but on the other hand they didn't clone ActiveX even though they took a lot of heat for not doing so. Given all the security woes with ActiveX, she felt that in retrospect they had made the right decision.

My question for Jonathan was why he dismissed the idea of an AJAX Office suite earlier during the talk. Jonathan said he thought that in some cases not every application transferred well to the Web as an AJAX application. He gave examples of Firefox and Photoshop as applications that wouldn't make sense to build as AJAX applications.

Another member of the audience asked what Sun had learned from Netscape's open sourcing of Mozilla in their efforts. Jonathan replied that everything Sun produces from now on will be Open Source. He encouraged companies to join the Open Source community since he saw no down side. His goal was to get as wide a distribution as possible and then figure out how to give value to their shareholders after that.


 

Categories: Trip Report

I attended the session where Tim O'Reilly interviewed Terry Semel, who is the chairman and CEO of Yahoo! Inc. I was very impressed with Terry Semel's talk and it very much cemented for me that Yahoo! will be a company to watch over the next few years. I hope my notes do his words justice.

Terry joined Yahoo! after spending years in Hollywood because he wanted a change and saw the immense opportunity for advertising on the Web. Tim O'Reilly pointed out that some members of the mainstream media have been worried about some of Yahoo!'s media moves such as the recent hiring of Lloyd Braun, former Chairman of ABC Entertainment Television Group, to run Yahoo!'s Media Group. Tim asked if Terry was trying to turn Yahoo! into the interactive studio of the future.

Terry said he isn't sure what exactly Tim meant by an "interactive studio". Terry pointed out that when he was in Hollywood, they cared about two things; content and distribution. Technology wasn't a big deal, it was something that changed every decade or so that the studios could take advantage of. In the 21st century, Terry believes there are now three pillars of media; content, distribution and technology. Yahoo! is in distribution since it reaches over 400 million people. Yahoo! delivers content. Yahoo! is a technology company. Yahoo! is a 21st century technology company that drives great media.

Tim then asked whether the fact that Yahoo! hires reporters doesn't put them in conflict with traditional news organizations like CNN or the LA Times. Terry responded that Yahoo! is all about content. Sometimes it is user generated content. Sometimes it is licensed content from a media company. And finally, sometimes it is content created by Yahoo! as they experiment with discovering the future of content generation. Yahoo! wants to take a leadership role in redefining the nature of content generation. Terry gave the example of travel reviews and how often it is more authentic to obtain user generated content and photos about a trip than a professional travel reporter's opinion. Yahoo! is trying to enable all those scenarios with their various offerings.

Tim then asked how Yahoo! can reconcile the difference between their role as a service provider with being a news organization. Tim brought up the recent incident where Yahoo! turned over information about a Chinese dissident which led to him being jailed. Tim argued that traditional news organizations would not have given up the information. Terry began by clarifying that 99% or more of Yahoo!'s news content is syndicated from other sources and they aren't a news organization. Secondly, any organization that operates in China has to observe the rules and regulations of the land. This doesn't just apply to China but to any country a company does business in, from the EU to the United States. Although some of these laws may be unsettling to him, the fact is that those are the laws in those lands and everyone who lives in and/or operates a business in these countries knows the law. Terry also does not agree with the opinion that Western companies shouldn't do business with China. Exposing Chinese audiences to Western cultures is much better than a policy of isolation. After all, who would have thought that the Berlin Wall and the Iron Curtain would fall?

Tim asked what Terry thought about Google. Terry gave Google credit for doing great work in search. Yahoo! just got into it 18 months ago, and other companies such as Microsoft are coming along as well. If he were Google he would worry about the fact that only 5% of page views on the Web are from search yet they account for about 40% of the revenue generated on the Web. Google realizes that they have to diversify and become a portal, which is why they now have offerings like maps, shopping, email, customizable homepages, etc. However, since they are becoming a portal one should rate them as a portal and not just as a search engine. And as a portal, they would probably rank fourth behind portals such as Yahoo! and MSN. Yahoo! has a number of superior offerings, from shopping to mail. Yahoo!'s revamped email offering was rated as superior to GMail by the Wall Street Journal and Yahoo! Mail has 10 times the user base of GMail.

The fact is that people spend 40% of their time online consuming content and another 40% in communications programs. There seems to be a great opportunity to monetize the time Web users spend online that hasn't been seized yet. Yahoo! plans to seize this opportunity.

Tim asked what Terry thought about the fact that the stock market is rating Google higher than Yahoo! when it comes to market capitalization. Terry stated that the market is currently focused on search and the revenue from search but there are avenues for deeper engagement with end users with richer opportunities when it comes to communications and content.

Tim then asked if Terry would consider giving Google access to the information in HotJobs if they launched jobs.google.com? Terry responded that Yahoo! is more of an open platform company than Google and has deeply embraced syndication technologies like RSS.


There was a brief Q & A after the interview.

I asked Terry what he felt was Yahoo!'s biggest strength and Google's biggest weakness. I also pointed out that I agreed with his statement that, as portals go, Google is #4, which prompted him to ask if I worked at Yahoo!. When I responded that I worked at MSN, this seemed to make the audience laugh. Terry responded that he doesn't talk about the weaknesses of his competitors in public BUT he would say that Yahoo!'s strength is that they have assets that encourage deep consumer engagement, particularly with regards to communication and content.

Another audience member asked whether Yahoo! considered user generated content to be important. Terry stated that they did, which is why they have offerings that enable people to share their experiences with others, such as Flickr and Yahoo! 360°. Terry stated that people tend to polarise discussions, such as when they ask whether to bet on branded advertising or sponsored ads in search. It isn't an either-or situation. It's like asking a parent to pick which of their genius kids is the favorite. Can't they love both? If sponsored search ads are huge then Yahoo! benefits. If branded advertising becomes huge then Yahoo! benefits as well.


 

Categories: Trip Report

I attended the Web 2.0 dinner hosted by Ray Ozzie, Gary Flake, and Yusuf Mehdi.

During the dinner there was a Q & A session with the Microsoft folks with John Battelle and Tim O'Reilly asking the questions. My notes below are heavily paraphrased since I took the notes on my phone.

Q: Over the past year the sentiment is slowly becoming stronger that Microsoft isn't the dominant player it once was. How does it feel not to be the big dog?

A: Yusuf -  It's great to be the underdog.

Q: How do you feel about some the new contenders in today's market?

A: Ray - The software industry is changing and Microsoft will have to adapt. Old business models are giving way to new ones and we will have to pay attention to them.

Q: MSN used to be a discarded group within Microsoft but now it is getting a lot of focus. How will MSN make a difference?

A: Gary - Web 2.0 is about software ecosystems and developer platforms. Microsoft and hence MSN, has a lot of experience when it comes to fostering developer platforms and software ecosystems.

Q: What do you think of the fact that big money makers like Office & Windows are imperilled by Web-based offerings and the new markets may not be as profitable as existing ones?

A: At Microsoft we are big believers in the value of Web-based services and the new business models they present. However we are also investing in our existing products like Office and Windows.

Q: When will we see Office on the Web?

A: Ray - It is a process. It makes sense to move some stuff to the Web such as email but other applications such as graphics editing will likely be on the desktop for a while. We are still figuring out how to strike the right balance.

Q: Will we see rapid development from Microsoft?

A: Ray/Yusuf/Gary - It depends on the product. MSN ships software in timeframes measured in months. Other parts of Microsoft ship in timeframes measured in years. However, how long something takes to ship is really a function of the complexity and maturity of the application. The more complex it is, the longer it takes to ship. After all, Netscape used to ship a new browser every couple of months back in the early days of the Web. Now things are a bit different.

Q: Is Groove technology going to show up in Microsoft products?

A: Ray - It's part of Office and will go through all the things needed to make it part of the suite, including consistent UI with other Office apps and localization in dozens of languages. However the model of a smart client that harnesses the power of the network will permeate across the company.

Q: What about new revenue models such as ad-supported offerings?

A: Ray - The new paid search model was pioneered by Overture and perfected by Google. However we as an industry still haven't fully figured out exactly how much software can be ad-supported vs. paid.

Q: Are you guys buying AOL?

A: No comment.

Q: You recently launched MSN AdCenter. How do you plan to get new advertisers? Will it be due to access to high traffic MSN sites or by undercutting prices?

A: Yusuf - It'll be a little of both. There definitely will be something of a bonanza when it comes to purchasing keywords when we launch the system but in the long term the value of our ad network will be the high traffic from our sites.

Q: What assets does Microsoft have that give it an edge?

A: Ray - We work better together across devices from mobile to PC. We also have the experience of having both consumer offerings and business offerings.


Q: Will there be a free ad-supported version of Office? Perhaps this is the answer to piracy in emerging markets?

A: Gary - Show of hands, how many people want an ad-based Office suite? How many people don't? [ed note -- a seemingly even number of people raised their hands both times]. See our problem? Ray - Ads aren't always the answer. It is one thing to show a user ads when they are looking for a product in a search engine and quite another to shove ads in their face when they are creating a document.

Q: In the past, Microsoft has locked users in with proprietary Office formats such as .doc and .xls. Will the next versions of Office support open formats?

A: Ray - There have been two big changes in the next version of Office. The first is that the user interface has been totally revamped while the other is that the file formats are now open XML formats with a liberal license.

Q: Ray, are you having fun at Microsoft?

A: Ray - Yes!!! I thrive in startups, building ideas from scratch, but sometimes my ideas are bigger than I can implement on my own; now with Microsoft I have access to lots of resources. I am particularly impressed with the mobile platform; if you haven't checked out Windows Mobile 5.0, you should give it a look.


 

Categories: Trip Report

I attended the panel on business models for mash-ups hosted by Dave McClure, Jeffrey McManus, Paul Rademacher, and Adam Trachtenberg.

A mash-up used to mean remixing two songs into something new and cool but now the term has been hijacked by geeks to mean mixing two or more web-based data sources and/or services.

Paul Rademacher is the author of the Housing Maps mash-up, which he built as a way to find a house using Craig's List + Google Maps. The data obtained from Craig's List is fetched via screen scraping. Although Craig's List has RSS feeds, they didn't meet his needs. Paul also talked about some of the issues he had with building the site, such as the fact that since most browsers block cross-site requests made with XMLHttpRequest, a server needs to be set up to aggregate the data instead of all the code running in the browser. The site has been very popular and has garnered over 900,000 unique visitors based solely on word-of-mouth.
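
To make the cross-site restriction concrete, here is a minimal sketch (in Python, with a made-up upstream URL) of the server-side proxy pattern Paul described: the mash-up's server fetches the third-party data and re-serves it from its own origin so the browser-side script can get at it. This is illustrative only and is not Housing Maps' actual code.

    # A minimal sketch of the server-side proxy pattern mash-ups use to work around
    # the browser's same-origin restriction on XMLHttpRequest. Not Housing Maps' code;
    # the upstream URL below is a placeholder.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    UPSTREAM = "http://example.org/listings.rss"  # hypothetical data source to aggregate

    class ProxyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Fetch the third-party data on the server, then hand it to the browser
            # from our own origin so client-side script can consume it freely.
            data = urlopen(UPSTREAM).read()
            self.send_response(200)
            self.send_header("Content-Type", "application/xml")
            self.end_headers()
            self.wfile.write(data)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ProxyHandler).serve_forever()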

The question was asked as to why he didn't make this a business but instead took a job at Google. He listed a number of very good reasons:

  1. He did not own the data that was powering the application.
  2. The barrier to entry for such an application was low since there was no unique intellectual property or user interface design to his application.

I asked whether he'd gotten any angry letters from the legal department at Craig's List and he said they seem to be tolerating him because he drives traffic to their site and caches a bunch of data on his servers so as not to hit their servers with a lot of traffic. 

A related mash-up site called Trulia, which scrapes real estate websites, was then demoed. A member of the audience asked whether Paul thought the complexity of mash-ups using more than two data sources and/or services increased in a linear or exponential fashion. Paul said he felt it increased in a linear fashion. This segued into a demo of SimplyHired which integrates with a number of sites including PayScale, LinkedIn, job databases, etc.

At this point I asked whether they would have service providers giving their perspective on making money from mash-ups, since they own the data and/or services mash-ups are built on and are thus the gating factor. The reply was that the eBay & Yahoo! folks would give their perspective later.

Then we got a demo of a Google Maps & eBay Motors mash-up. Unlike the Housing Maps mash-up, all the data is queried live instead of cached on the server. eBay has dozens of APIs that encourage people to build against their platform and they have an affiliates program so people can make money from building on their API. We also got shown Unwired Buyer, which is a site that enables you to bid on eBay using your cell phone and even calls you just before an auction is about to close. Adam Trachtenberg pointed out that since there is a Skype API perhaps some enterprising soul could mash up eBay & Skype.

Jeffrey McManus of Yahoo! pointed out that you don't even need coding skills to build a Yahoo! Maps mash-up since all it takes is an RSS feed with longitude and latitude elements on each item to have it embedded in the map. I asked why, unlike Google Maps and MSN Virtual Earth, Yahoo! Maps doesn't allow users to host the maps on their page nor does there seem to be an avenue for revenue sharing with mash-up authors via syndicated advertising. The response I got was that they polled various developers and there wasn't significant interest in embedding the maps on developers' sites, especially when this would require paying for hosting.
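
For illustration, the sketch below prints the kind of RSS item Jeffrey was describing, with per-item latitude/longitude elements. I'm assuming the W3C geo vocabulary (geo:lat/geo:long) here; the exact element names Yahoo! Maps expects may differ, and the item data is made up.

    # A sketch of an RSS item carrying per-item coordinates using the W3C geo
    # vocabulary (geo:lat / geo:long). Whether Yahoo! Maps expects exactly these
    # element names is an assumption; the item data is invented.
    FEED_TEMPLATE = """<?xml version="1.0"?>
    <rss version="2.0" xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#">
      <channel>
        <title>Places to plot</title>
        <link>http://example.com/places</link>
        <description>Items with coordinates for a map mash-up</description>
        <item>
          <title>{title}</title>
          <link>{link}</link>
          <geo:lat>{lat}</geo:lat>
          <geo:long>{lng}</geo:long>
        </item>
      </channel>
    </rss>"""

    print(FEED_TEMPLATE.format(title="Pier 39", link="http://example.com/pier39",
                               lat=37.8087, lng=-122.4098))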

We then got shown a number of mapping mash-ups including a mash-up of the London bombings which used Google Maps, Flickr & RSS feeds of news (the presenter had the poor taste to point out opportunities to place ads on the site), a mash-up from alkemis which mashes Google Maps, A9.com street level photos and traffic cams, and a mash-up from Analygis which integrates census data with Google Maps data.

The following items were then listed as the critical components of mash-ups:
 - AJAX (Jeffrey McManus said it isn't key but a few of the guys on the panel felt that at least dynamic UIs are better)
 - APIs
 - Advertising
 - Payment
 - Identity/Acct mgmt
 - Mapping Services
 - Content Hosting
 - Other?

On the topic of identity and account management, the question of how mash-ups handle user passwords came up as a problem. If a website is password protected then users often have to enter their usernames and passwords into third party sites. An example of this was the fact that PayPal used to store lots of username/password information of eBay users, which caused eBay some consternation since they went through a lot of trouble to protect this sensitive data only to have a lot of it stored on PayPal servers.

eBay's current solution is similar to that used by Microsoft Passport in that applications are expected to have users log in via the eBay website, after which the user is redirected to the originating website with a ticket indicating they have been authenticated. I pointed out that although this works fine for websites, it offers no solution for people trying to build desktop applications that are not browser based. The response I got indicated that eBay hasn't solved this problem.
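
As a rough sketch of the redirect-plus-ticket flow described above (endpoint URLs and parameter names below are hypothetical placeholders, not eBay's actual API), the website sends the browser to the provider's login page and later receives a ticket it can verify:

    # A minimal sketch of a redirect-based "ticket" sign-in flow like the one described;
    # endpoint URLs and parameter names are hypothetical placeholders, not eBay's API.
    from urllib.parse import urlencode

    PROVIDER_LOGIN_URL = "https://signin.example-provider.com/login"   # hypothetical
    MY_RETURN_URL = "https://mymashup.example.com/auth/return"         # hypothetical

    def build_login_redirect(session_id: str) -> str:
        """URL to send the browser to; the provider authenticates the user there."""
        return PROVIDER_LOGIN_URL + "?" + urlencode({"ru": MY_RETURN_URL, "state": session_id})

    def handle_return(query_params: dict) -> bool:
        """Called when the provider redirects back with a ticket proving the login."""
        ticket = query_params.get("ticket")
        # A real integration would verify the ticket against the provider's API
        # before trusting it; here we only check that one was supplied.
        return bool(ticket)

    print(build_login_redirect("abc123"))
    print(handle_return({"ticket": "opaque-token-from-provider"}))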

My main comment about this panel is that it didn't meet expectations. I'd expected to hear a discussion about turning mashups [and maybe the web platforms they are built on] into money making businesses. What I got was a show-and-tell of various mapping mashups. Disappointing.


 

Categories: Trip Report

I attended the first board meeting of the AttentionTrust which was open to all and hosted by Steve Gillmor, Hank Barry and Seth Goldstein.

Steve Gillmor began by talking about the history of the organization and why it got started. As has been stated previously the core goal of the organization is that attention data - that is, data that describes what you're paying attention to - has value, and because it has value, when you give someone your attention you should expect to be given something in return. And just because you give someone your attention, it doesn't mean that they own it. You should expect to get it back.

Seth Goldstein mentioned that they are now officially a non-profit. Seth also mentioned that there is now a post on their group blog that goes into some detail to clarify their intentions. They have worked with the developer of the Outfoxed Firefox extension (Stan James) to build an Attention Recorder plugin for Firefox. This plugin records a user's attention data to their hard drive with the assumption being that eventually there will be AttentionTrust certified companies that users can trade this information with.

Stan James gave a demo of the Attention Recorder plugin and stated that it has two main features

  1. The toolbar button which lights up if you are on an AttentionTrust certified site.
  2. The Attention Recorder logs all of a user's web traffic and can send the clickstream to a particular website. Sites can be excluded from being logged so one doesn't accidentally log access to sensitive websites. One can also configure the sites to which the clickstream logs should be sent.

After Stan's demo, the rest of the session turned into a Q & A session. Below are paraphrased questions and answers.

Q: Why would users share their attention data without getting paid? Is the promise of killer apps that can harness the user's attention data enough for users?

A: The value/price of a user's attention data is up to agreements between the users and companies.

Q: What about legacy attention data?

A: The AttentionTrust has partnered with companies that have some attention data today such as Rojo to see if they can expose it in AttentionTrust compliant ways.

There was a brief interlude at this point where Stan James went over some of the implementation details of the Attention Recorder as well as showed examples of the XML format it uses for logging a user's clickstream. On seeing the XML format, Dave Sifry, who was in the audience, brought up the point of ensuring that sensitive data such as usernames and passwords isn't being sent to services.

Greg Yardley is part of a lead-generation company building a service to match up users to businesses using attention clickstream data. He's been making sure to follow the principles of the AttentionTrust while building their application. For example, users can delete their attention data if needed. Steve Gillmor asked Greg what kinds of apps people could build if they had more access to users' attention data. Greg responded with a lot of examples such as more accurate personalized search engines, searching only over websites the user has seen recently, more accurate data for dating sites to use in matching people up, RSS readers that know to mark items as read if you read them from your browser, and a number of other interesting ideas.

A member of the audience asked how AttentionTrust compliant lead-generation companies  could be marketed as being better than their traditional alternatives that used slimy methods. The response from Seth Goldstein was that leads generated from attention data would be of higher quality (e.g. leads for mortgage customers generated from people searching for "refinance" are better than leads from people signing up to receive free iPods). Another member of the audience disagreed with Seth and pointed out that it isn't so cut and dried. She pointed out that an unemployed college student could spend their days surfing shopping sites for luxury goods but that doesn't make them a good lead for companies trying to sell luxury goods.

Another audience member asked what ways exist to convince users to choose AttentionTrust companies. Seth said that people building cool apps for themselves based on their local attention data is probably the key. I jumped into the discussion and used Amazon as an example of end users giving a company their attention data on music and books they like either implicitly (by buying them) or explicitly (by rating them). My question was how the AttentionTrust could convince Amazon to open up all their attention data. Steve Gillmor replied that it isn't likely that they'd be able to convince the incumbents to follow the principles of the AttentionTrust, but if enough small players got together and started building some of these cool apps then great things could happen.

I believe there was also a positive comparison to Richard Stallman and the Free Software Movement but I've forgotten the exact details.


 

Categories: Trip Report

I attended the panel on Open Source Infrastructure hosted by Marc Canter, Tantek Çelik, Brian Dear, Matt Mullenweg and Toni Schneider.

Marc Canter coined the term "Open Source infrastructure" while pitching OurMedia to big companies while seeking funding. He pointed out that in the Web 2.0 world we are in danger of swapping the lock-in of desktop platforms controlled by big companies like Apple and Microsoft for lock-in of Web platforms controlled by big companies like eBay and Amazon. The same way we have open source platforms to prevent desktop lock-in, we need open source Web infrastructure to prevent platform lock-in.

Brian Dear of EVDB is working on making event publishing on the Web easier. Eventful is a website that aggregates events. The website is built on the EVDB API, which is itself built on an EVDB index over the EVDB database. This same API is available for third parties to build applications against. Brian divides events into high definition and low definition events. Low definition events are easy to create and have simple metadata, usually just a title and start/end time for the event. However, simple events are hard to discover due to the lack of structured metadata. On the other hand, high definition events have lots of fields (title, start/end time, description, price, recurrence, etc.) which makes them harder to create but easier to discover by applications. They have created SES (simple event sharing), which is a mechanism for web sites to ping an event server with changes, similar to how weblogs currently ping places like Weblogs.com and blo.gs when they are updated.

At this point Marc Canter interjects and asks where the events are located. Will they be able to suck up events from sites like Craig's List or will they have to be on Eventful? Brian Dear states that he prefers the model where aggregators of events such as Eventful point to existing sites such as Craig's List, especially since those events are not metadata rich (i.e. low definition events). Marc then points to someone in the audience who has a similar site and asks whether they use a ping server model; the person mentions that they crawl the Web instead.

This segued into Matt Mullenweg talking about ping servers. Matt talked about Ping-O-Matic, which aggregates ping servers so blogs can just ping it and it pings the dozens of ping servers out there instead. However, they have significant scaling issues, with some days seeing up to 4,000,000 pings. Unsurprisingly, a lot of the pings turn out to be spam. Matt has asked for help from various sources and has gotten servers from Technorati. Marc asks whether pings can grow beyond blogs to events, group creation and other types of pings. Although Matt seemed hesitant, Brian points out that they have already extended the ping format from what sites like Weblogs.com use to accommodate their needs for events.
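
For reference, this is roughly what a weblogUpdates-style ping looks like; the sketch below uses Python's built-in XML-RPC client. The Ping-O-Matic endpoint shown is the one I believe it uses, so treat the URL as an assumption.

    # A sketch of the weblogUpdates-style ping blogs send to ping servers such as
    # Weblogs.com or Ping-O-Matic. The endpoint URL is assumed; the blog details are made up.
    import xmlrpc.client

    def ping(server_url: str, blog_name: str, blog_url: str):
        server = xmlrpc.client.ServerProxy(server_url)
        # weblogUpdates.ping takes the weblog's name and URL and returns a status struct
        return server.weblogUpdates.ping(blog_name, blog_url)

    # Example (would actually hit the network):
    # print(ping("http://rpc.pingomatic.com/", "My Blog", "http://example.com/blog"))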

Yahoo! just bought Upcoming, which is an event site, and Toni Schneider [who used to be at Oddpost] was representing Yahoo! on the panel. Toni believes that big companies shouldn't own the core services on the Web, which is one of the motivations for Yahoo! opening up their APIs. The biggest developer communities around Yahoo!'s offerings come from the entities they have purchased such as Flickr & Konfabulator. Turning their services into developer platforms is now big at Yahoo!. Another thing that is really big at Yahoo! is using RSS. Yahoo! doesn't care that it isn't an official standard produced by a standards body; it gets the job done. They are also acquiring core online services like blo.gs and keeping them open to benefit the developer world. Marc asks how they decide between working with others (e.g. MediaRSS) versus buying companies (blo.gs and Upcoming). Toni replies that with formats they first look at whether anything exists before creating something new (e.g. GeoRSS used in the Yahoo! Maps API vs. MediaRSS which they created).

Tantek talked about Web 2.0 themes ('you control your own data', 'mix and match data', 'open, interoperable and web friendly data formats & protocols'). Marc points out that the Web is not the end all and be all, so 'Web friendly' is cool but not overriding. Tantek also talked about microformat design principles and the various themes within the microformats community (open source tools, open communications and focus on concrete scenarios). He then briefly talked about two microformats. The first was hReview, which is a format for reviews that got input from folks at Yahoo!, SixApart and MSN among others; there are a lot of websites for reviews (Amazon, Yahoo! Local) but no real standard. The second was hCard, which is now being used by http://www.avon.com to mark up the contact information for over 40,000 Avon representatives. Tantek also showed that you can mark up the Web 2.0 speakers list this way; he wrote a bookmarklet that can suck up all the speakers into his address book.
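
As a rough illustration of the idea (not Tantek's or Avon's actual markup), here is a tiny generator that emits contact information using a few of hCard's class names:

    # A rough sketch of hCard: contact data marked up with a few of the microformat's
    # class names (vcard, fn, url, org). The person and URLs are made up.
    def render_hcard(name: str, org: str, url: str) -> str:
        return (
            '<div class="vcard">\n'
            f'  <a class="url fn" href="{url}">{name}</a>\n'
            f'  <div class="org">{org}</div>\n'
            '</div>'
        )

    print(render_hcard("Jane Doe", "Example Cosmetics", "http://example.com/janedoe"))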

During the Q & A session I asked three questions:

  1. Isn't the focus on centralized services for pinging worrying for services with lots of users, because it is quite possible for us to overwhelm those services with our user base? Matt responded that this is why he was seeking donations from large companies.
  2. Currently microformats seem to be going fine since Tantek is the main guy creating them, but what happens when others start creating them, given that unlike XML, which has namespaces, there is no way to disambiguate them? Tantek responded that there were already many people creating microformats. He also stated that the issue of duplicate or redundant microformats would be dealt with via the community process.
  3. Isn't the big problem with the lack of adoption of standards for creating events the lack of a killer app for consuming events? Tantek responded that the killer apps may be here already by showing how he wrote an app to consume hCalendar events and place them in his iCal calendar. Brian mentioned that Eventful uses hCalendar and hCard.

In general, I'd have preferred if the panel were more of a discussion as opposed to an hour or more of sales pitches from the panelists with 10 minutes of Q & A at the end. I have a feeling that a lot more knowledge would have been gained by having members of the audience riff along with the panelists instead of the traditional "wisdom from above" model used in this panel.

An additional meta-comment about the conference is that so far I've been unable to get into 2 out of the 3 sessions I wanted to attend this morning because they were filled to overflowing. Given how much folks are paying for the conference and how much experience the O'Reilly folks have with holding conferences, one wouldn't expect such problems to occur.


 

Categories: Trip Report

Brent Simmons has confirmed that Newsgator has purchased NetNewsWire. One of the reasons Brent gave for the purchase is excerpted below

The first is that we get requests constantly about syncing—not just better syncing, not just between copies of NetNewsWire, but with Windows RSS readers, PDAs, Outlook, and so on. People even ask us to create a website version for when they’re away from their normal computers.

We couldn’t do all this on our own—but we agree completely with NetNewsWire users who tell us that RSS is hugely important, too important to have to read the same news items twice on different computers and different devices.

NewsGator was already working on this—but they didn’t have a Mac client. It was almost like putting together a jigsaw puzzle: NetNewsWire fit right in!

Users of RSS Bandit already have access to a robust syncing mechanism that allows them to use the application on multiple computers without having to mark the same item as read twice. However there are several limitations to the mechanisms used by RSS Bandit today.

  • The user has to set up a server (FTP or WebDAV)
  • The user cannot synchronize with an online reader for the times they aren't at a PC with RSS Bandit installed
  • The user cannot synchronize with a Mac-based reader for the times they are using a Mac and not a PC

I have been hit by all three of these limitations at one time or another this year. All three of these problems will be a thing of the past with the Nightcrawler release of RSS Bandit, which will support synchronization with NewsGator Online via the NewsGator API.
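
To make concrete what "not marking the same item as read twice" boils down to, here is a trivially small sketch of the merge step any sync mechanism has to perform; the names are illustrative and this is not the NewsGator API.

    # A trivially small sketch of read-state synchronization: an item marked read on
    # any machine should end up read everywhere. Names are illustrative, not an API.
    def merge_read_state(local_read: set, remote_read: set) -> set:
        """Union of item IDs marked read on either side."""
        return local_read | remote_read

    laptop = {"item-1", "item-3"}
    desktop = {"item-2", "item-3"}
    print(sorted(merge_read_state(laptop, desktop)))   # ['item-1', 'item-2', 'item-3']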

Congratulations to Brent and Greg on their union. And congratulations to RSS Bandit users who will now be able to sync between NetNewsWire on the Mac and RSS Bandit on the PC.


 

Categories: RSS Bandit

October 5, 2005
@ 03:11 AM

I took a look at Ning today and I find it interesting from a number of perspectives. On the one hand, it's simply a toolkit for building websites. The same kind of toolkit you see hawked on infomercials while channel surfing at midnight because you have insomnia. The main difference is that instead of a CD that contains software for building eCommerce websites it's an online website for building 'social software' websites. On the other hand, it shows an interesting direction for web platforms to move in.

Let's say you are Yahoo! or MSN and you now have an extensive array of web services at http://developer.yahoo.com or http://msdn.microsoft.com/msn. So now developers can build desktop applications or their own websites that interact with your APIs. These are akin to the low level APIs for interacting with your web platform. The next step is to build the equivalent of rich, high-level components for interacting with your services so people don't have to mess with the low level details. What I find interesting about Ning isn't "Here's another place where I can build yet another social network different from my IM buddy list/Orkut/Yahoo! 360/Friendster/etc clique" but instead "Here's a place where I can build new apps that harness my existing social network from my IM buddy list/Orkut/Yahoo! 360/Friendster/etc". That would be pretty revolutionary if the various parties involved were interested in opening their social networks to such mash-ups.

I suspect I'll be trying to track down some folks from Ning while at the Web 2.0 conference this week.

Anyway, gotta go catch my flight.


 

I've been reading the Mini-Microsoft blog for a couple of months now and recently considered unsubscribing. The main reason I haven't is that the recent story in Business Week about the blog has attracted a lot of readers which makes the comment threads interesting in a Slashdot kinda way. Since I couldn't locate an email address for the blog's author on the front page of the blog, I'll just post my comments here on why the blog jumped the shark for me.

  1. Complaints about Symptoms instead of the Root Problems: A lot of the things complained about by the author of the Mini-Microsoft blog are symptoms of a single problem. About six years ago, Motley Fool ran an article entitled The 12 Simple Secrets of Microsoft Management which listed a number of characteristics of the company that made it successful. The fourth item on the list is Require Failure and it states

    In contrast, at Microsoft, failure is expected, and even required because risking failure is the only way to push the envelope. As a result, Microsofties relentlessly pursue success without fear of failure. And if they fail, they understand that the key is to fail quickly and not waste time.

    One of the unfortunate things about a culture that turns a blind eye to failure is that eventually there is no difference between requiring failure and a lack of accountability. A lot of the things Mini complains about point to an environment where a lack of accountability runs rampant. I'd rather see him ring the bell about these issues [which he does every once in a while] as opposed to meaningless distractions like complaining about vague ship dates or asking for mass firings because the company is "too large".

  2. Microsoft is Lots of Little Companies: One thing that isn't clear from Mini's posts is that a number of the complaints he raises are organization specific. The culture in the Office group is different from that at MSN, and the issues facing the people working in the Windows group are different from those facing the folks working on XBox. Mini's blog not only isn't representative but it doesn't seem like he pays much attention to what is happening outside of his group. For example, it is quite telling that he didn't know that the ship date for Visual Studio 2005 was announced a while ago. Given how many people within the company work on and are impacted by the shipping of Whidbey & Yukon, it seems clear that Mini doesn't pay much attention to what is going on outside of his group.

  3. Stack Ranking: This point is probably a repeat of my first one. First of all, when it comes to performance reviews, I tend to agree with Joel Spolsky's Incentive Pay Considered Harmful. Joel wrote

    And herein lies the rub. Most people think that they do pretty good work (even if they don't). It's just a little trick our minds play on us to keep life bearable. So if everybody thinks they do good work, and the reviews are merely correct (which is not very easy to achieve), then most people will be disappointed by their reviews. The cost of this in morale is hard to understate. On teams where performance reviews are done honestly, they tend to result in a week or so of depressed morale, moping, and some resignations. They tend to drive wedges between team members, often because the poorly-rated are jealous of the highly-rated, in a process that DeMarco and Lister call teamicide: the inadvertent destruction of jelled teams.

    In general, systems where you try to competitively rank people and reward them based on their rankings suck. A lot. When you combine this with the current rewards associated with positive rankings at Microsoft, then you have a system that doubly sucks. I think Mini gets this but he keeps talking about alternative performance review systems even though there is lots of evidence that incentive pay systems simply do not work.

Those are the top three reasons that I find myself losing interest in keeping up with the Mini-Microsoft blog. However I'll probably keep reading because the comments now have me gawking at them regularly, sometimes even more than I do at Slashdot.


 

Categories: Life in the B0rg Cube

October 2, 2005
@ 02:02 AM

Tim O'Reilly has posted What Is Web 2.0? : Design Patterns and Business Models for the Next Generation of Software which further convinced me that the definition of Web 2.0 used by Tim O'Reilly and his ilk may be too wide to be useful. In the conclusion of his article he writes

Core Competencies of Web 2.0 Companies

In exploring the seven principles above, we've highlighted some of the principal features of Web 2.0. Each of the examples we've explored demonstrates one or more of those key principles, but may miss others. Let's close, therefore, by summarizing what we believe to be the core competencies of Web 2.0 companies:

  • Services, not packaged software, with cost-effective scalability
  • Control over unique, hard-to-recreate data sources that get richer as more people use them
  • Trusting users as co-developers
  • Harnessing collective intelligence
  • Leveraging the long tail through customer self-service
  • Software above the level of a single device
  • Lightweight user interfaces, development models, AND business models

The next time a company claims that it's "Web 2.0," test their features against the list above.

The list seems redundant in some places and could probably be reduced to 3 points. Half the bullet points seem to say that the company should expose Web services [in this context I mean services over the Web whether they be SOAP, REST, POX/HTTP, RSS, etc.]. So that's point number one. The second key idea seems to be that of harnessing collective intelligence such as with Amazon's recommendation engine, Wikipedia entries and folksonomies/tagging systems. The final key concept is that Web 2.0 companies leverage the long tail. One example of the difference between Web 1.0 and Web 2.0 when it comes to harnessing the long tail is the difference between http://www.msn.com, which is a portal that has news and information of general interest that aims at appealing to broad audiences (one size fits all), and http://www.start.com, which encourages people to build their own portal that fits their needs (every niche is king).

So let's review. Tim O'Reilly's essay can be reduced to the following litmus test for whether an offering is Web 2.0 or not

  • Exposes Web services that can be accessed on any device or platform by any developer or user. RSS feeds, RESTful APIs and SOAP APIs are all examples of Web services.
  • Harnesses the collective intelligence of its user base to benefit users
  • Leverages the long tail through customer self-service

So using either Tim O'Reilly's list or mine, I'd be curious to see how many people think http://www.myspace.com is a Web 2.0 offering or not. If so, why? If not, please tell me in my comments why you think all the folks who've called MySpace a Web 2.0 offering are wrong. For the record, I think it isn't but would like to compare my reasons with those of other people out there.


 

Categories: Web Development

October 2, 2005
@ 12:47 AM

Brian Jones has a post entitled Native PDF support in Office "12" where he writes

Today's another exciting day as we move closer to Beta 1. We are just wrapping up the MVP summit here in Redmond and we've finally announced another piece of functionality I've wanted to talk about for a long time now. This afternoon Steven Sinofsky announced to our MVPs that we will build in native support for the PDF format in Office "12".  I constantly get asked by customers if we can build in this support for publishing documents as PDF files, and now I can thankfully say "yes!" It's something we've been hearing about for years, and earlier in this project we decided that while there were already existing third party tools for doing this, we should do the work to build the functionality natively into the product.

The PDF support will be built into Word, Excel, PowerPoint, Access, Publisher, OneNote, Visio, and InfoPath! I love how well this new functionality will work in combination with the new Open XML formats in Word, Excel, and PowerPoint. We've really heard the feedback that sharing documents across multiple platforms and long term archiving are really important. People now have a couple options here, with the existing support for HTML and RTF, and now the new support for Open XML formats and PDF!

This is a very welcome surprise. The Office team is one of the few groups on main campus who seem to consistently get it. Of course, the first thought that crossed my mind was the one asked in the second comment in response to Brian's post.


 

There's been a bunch of MSN Virtual Earth hacking going on in my building over the past couple of weeks. There was my Seattle Movie Finder hack. Recently two folks on the MSN Messenger team created a shared map browsing application using the recently released MSN Messenger Activity API. Chandu Thota has the details in his post Virtual Earth and MSN Messenger : Peer-2-Peer Mapping Experience

If you are running MSN Messenger 6.0 or higher, open a conversation with your contact and click on "Activities" menu item; it will display a list of activities that you can use which includes "Virtual Earth Shared Map" as shown below:

Once you and your contact accept this activity you both can find, pan and zoom on the Virtual Earth map all interactively, like the one shown below:

Okay, I don't want to waste your time anymore - this is one of the coolest things I have seen in this space - try it out! you won't be disappointed! :)

PS: Kudos to Steve Gordon and Shree Madhavapeddi from MSN for creating such a wonderful app!

I got a demo of this from Steve and Shree last week, but I didn't realize that it would show up in the MSN Messenger application so soon. That is some quick turnaround time.

I also got a demo of a cool Start.com gadget which uses MSN Virtual Earth from Matt this week. I wonder how long that will take to sneak out onto the Web.

PS: In his post about this, Robert Scoble states that the application was created by Scott Swanson. This isn't accurate; Scott wrote a similar application as a PDC demo but the version you can get in the MSN Messenger Activities menu isn't it.


 

Categories: MSN