Kent Newsome has a blog post entitled Educating Kent: Facebook where he asks

I have a genuine question.

What is so much better about Facebook (and MySpace and other similar platforms) than an ordinary blog on a popular platform- say WordPress?

I would love it if someone could explain this to me.

As someone who's worked on a blogging/social networking service for the past two and a half years, I have some perspective on why social networking sites are more popular than blogs (i.e. more people have a social network profile than a traditional "blog").

MY ANSWER: Social networking sites [especially Facebook] take better advantage of the human need to communicate by leveraging the following trends, which became obvious once blogging took off:

  1. Personal publishing is more than just text; it spans all media. Videos, music and photos are just as important for people to share as text is. Traditional blogging tools/services like WordPress and Blogger have not taken advantage of this fact.

  2. People like to be informed about what is going on in their circle of friends (i.e. their social networks). Bloggers tend to do this by subscribing to RSS feeds in their favorite RSS reader. Unfortunately, subscribing to RSS feeds has always been, and always will be, a fairly cumbersome way to satisfy this need, regardless of how many browsers, email clients and Web sites add RSS reading functionality. On the other hand, a model where subscription is automatic once a user declares another user as being of interest to them (e.g. adding them as a friend), as opposed to locating and subscribing to their RSS feed, is easier for users to adopt and use. In addition, integrating the process of keeping abreast of updates from "friends" into an existing application the user is familiar with and uses regularly is preferable to introducing a new application. I like to call this the LiveJournal lesson.
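The "automatic subscription" model in point 2 is easy to sketch. The following is a hypothetical illustration of the idea only; the class and method names are mine, not any real service's API. Declaring a friend doubles as subscribing to their updates, and a news feed falls out of aggregating those updates:

```python
# Hypothetical sketch of the "subscription is automatic" model:
# declaring a friend implicitly subscribes you to their updates,
# with no separate feed URL to locate.

class User:
    def __init__(self, name):
        self.name = name
        self.friends = set()   # adding a friend doubles as subscribing
        self.updates = []      # things this user has published

    def add_friend(self, other):
        """One step: declare interest. Subscription comes for free."""
        self.friends.add(other)

    def publish(self, item):
        self.updates.append(item)

    def news_feed(self):
        """Aggregate friends' updates -- the News Feed / gleams idea."""
        return [(f.name, u) for f in self.friends for u in f.updates]

alice, bob = User("alice"), User("bob")
alice.add_friend(bob)           # no RSS reader, no feed discovery
bob.publish("posted new photos")
print(alice.news_feed())        # [('bob', 'posted new photos')]
```

Contrast this with the RSS model, where the equivalent of `add_friend` requires finding a feed URL and adding it to a separate reader application.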

The above phenomena are the reason that MSN Spaces (now Windows Live Spaces) grew to over 100 million unique visitors less than two years after it first showed up. MSN Spaces was one of the first major personal publishing sites to place publishing of other media (e.g. photo albums) on the same footing as blogging/creating a journal. This was a big hit with users, and the service followed up with tools for embedding music and videos; however, we didn't provide media hosting or a library of content which users could choose from. These mistakes weren't made by MySpace, which, thanks to its widget platform, could rely on services like PhotoBucket and YouTube to provide both media hosting and a library of content for users to share. Now MySpace is one of the most popular sites on the Web.

The second major reason for the initial success of MSN Spaces (now Windows Live Spaces) lies in its integration with Windows Live Messenger. The key aspect of this integration was the gleams feature, which Paul Thurrott described as follows in his review of MSN Messenger 7:

Additionally, when you click on your own display picture in Messenger 7.0, your Contact Card displays (Figure). This small window provides a range of personal information and links to other MSN services. You can access other users' Contact Cards by clicking their picture in the main Messenger window (Figure). But Messenger 7.0 takes this capability a bit further with another new feature called a gleam, that visually reminds you when one of your contacts has updated their MSN Spaces blog or other personal information. Gleams appear as small orange stars next to contact pictures in the main MSN Messenger window (Figure).

With gleams, the act of adding someone as an IM buddy also subscribes you to updates about changes on their Windows Live space. Our users loved it. In hindsight, where we dropped the ball is that it isn't much of a stretch to imagine a Web interface which summarizes these updates from your friends so you can access them from anywhere, not just your IM client. In addition, it is also lame that we don't provide details of the nature of the update inline and instead require users to click on the contact card to tell which of their friends' information has changed. Once you add those two features, you've pretty much got Twitter (text only) and the Facebook News Feed, both of which have turned out to be big hits on the Web.

To recap, social networking sites like MySpace and Facebook are bigger than blogging sites because they enable people to connect, communicate and share with each other in richer and easier ways than blogging does.


 

Categories: Social Software

Recently the mainstream media has been running profiles on people who have built businesses worth hundreds of millions of dollars by buying lots of Internet domain names then filling them with Google ads. Last week, CNN Money ran an article entitled The man who owns the Internet which contained the following excerpt

When Ham wants a domain, he leans over and quietly instructs an associate to bid on his behalf. He likes wedding names, so his guy lifts the white paddle and snags Weddingcatering.com for $10,000. Greeting.com is not nearly as good as the plural Greetings.com, but Ham grabs it anyway, for $350,000. Ham is a devout Christian, and he spends $31,000 to add Christianrock.com to his collection, which already includes God.com and Satan.com. When it's all over, Ham strolls to the table near the exit and writes a check for $650,000. It's a cheap afternoon.
...
Trained as a family doctor, he put off medicine after discovering the riches of the Web. Since 2000 he has quietly cobbled together a portfolio of some 300,000 domains that, combined with several other ventures, generate an estimated $70 million a year in revenue. (Like all his financial details, Ham would neither confirm nor deny this figure.)
...
And what few people know is that he's also the man behind the domain world's latest scheme: profiting from traffic generated by the millions of people who mistakenly type ".cm" instead of ".com" at the end of a domain name. Try it with almost any name you can think of -- Beer.cm, Newyorktimes.cm, even Anyname.cm -- and you'll land on a page called Agoga.com, a site filled with ads served up by Yahoo

The New York Times has a profile on another multimillion dollar company in the same business in today's article entitled Millions of Addresses and Thousands of Sites, All Leading to One which contains the following excerpts

What Internet business has raised $120 million in financing in the last year, owns 725,000 Web sites, and has as its chief executive the former head of Primedia and International Data Group? If you guessed NameMedia, a privately held owner and developer of Web sites based in Waltham, Mass., you take the prize.
...
“What we’ve wanted to do, quietly, is amass the largest real estate position on the Internet, which we feel we have,” Mr. Conlin said. Some of those properties, he said, are the equivalent of “oceanfront” sites, or high-value addresses like Photography.com or DailyHoroscope.com that NameMedia will populate with relevant editorial content. Those who type in any of NameMedia’s other 6,000 or so photography-related Internet addresses, like photographyproducts.com, will land on Photography.com.
...
So far the company’s strategy is paying off, Mr. Conlin said, with company revenue doubling last year, to $60 million.

Companies like this are bad for the Internet for several reasons. For one, they artificially reduce the pool of domain names, which has resulted in legitimate businesses having to choose names that are either awful misspellings or sound like they were stolen from Star Wars. Secondly, a lot of these sites tend to clog up search results, especially when they have generic domain names and a couple thousand sites all linking or redirecting back to one domain. Finally, the fact that these companies are making so much money in a manner that is user-hostile and ethically questionable encourages the formation of more such businesses which prey on naive Internet users.

What I've found most shocking about this trend is that the big Web advertising companies like Google go out of their way to court these businesses. In fact, Google has a service called Google AdSense for Domains [with the tastefully chosen URL http://www.google.com/domainpark] which caters exclusively to these kinds of sites.

One of the things I've disliked about the rush towards advertising-based business models on the Web is that, if unchecked, it leads to user-hostile behavior in the quest to satisfy the bottom line. The recent flap over Google and Dell installing the equivalent of spyware on new PCs to show users ads when they make a typo while browsing the Web is an example of this negative trend. Now it turns out that Google is in bed with domain name squatters. These are all examples of Google's Strategy Tax: the fact that they make their money from ads compromises their integrity whenever there is a conflict between doing what's best for users and doing what's best for advertisers.

Do no evil. It's now Search, Ads and Apps


 

Earlier this week, there was a flurry of blog posts about the announcement of the Facebook platform. I've taken a look at the platform and it does seem worthy of the praise that has been heaped on it thus far.

What Facebook has announced is their own version of a widget platform. However, whereas most social networking sites like MySpace, Bebo and Windows Live Spaces treat widgets as second-class citizens that exist within some tiny box on the user's profile page, widgets hosted on Facebook are allowed to integrate deeply into the user experience. The page Anatomy of a Facebook Application shows ten integration points for hosted widgets, including integration into the left nav, showing up in the user's news feed and adding themselves as action links within the user's profile. This is less of a widget platform and more like the kind of plug-in architecture you see in Visual Studio, Microsoft Office or the Firefox extension infrastructure. This is an unprecedented level of integration into a Web site being offered by Facebook to third-party developers.

Widgets for the Facebook platform have to be written in a proprietary markup language called Facebook Markup Language (FBML). The markup language is a collection of "safe" HTML tags like blockquote and table, and custom Facebook-only tags like fb:create-button, fb:friend-selector and fb:if-is-friends-with-viewer. The most interesting tag I saw was fb:silverlight which is currently undocumented but probably does something similar to fb:swf. Besides restricting HTML to a "safe" set of tags there are a number of other security measures such as disallowing event handlers like onclick, stripping Javascript from CSS style attributes and requesting images referenced in FBML via their proxy server so Web bugs from 3rd party applications can't track their users.
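To make the flavor of FBML concrete, here is a hypothetical fragment. The tag names are taken from the documentation mentioned above, but the attributes and overall structure are my own illustrative guesses, not verified against the FBML spec:

```html
<!-- Hypothetical FBML sketch: tag names from the docs, attributes assumed -->
<fb:if-is-friends-with-viewer>
  <blockquote>Content shown only to the profile owner's friends.</blockquote>
</fb:if-is-friends-with-viewer>
<fb:friend-selector />
<fb:create-button href="invite.php">Invite a friend</fb:create-button>
<fb:swf swfsrc="http://example.com/movie.swf" />
```

Note how standard "safe" HTML like blockquote mixes freely with the custom fb: tags, which Facebook expands server-side after applying the security measures described above.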

Facebook widgets can access user data by using either the Facebook REST API or Facebook Query Language (FQL) which is a SQL-like query language for making requests from the API for developers who think constructing RESTful requests is too cumbersome.
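As a rough illustration of the difference, a hypothetical FQL query for the names of a user's friends might look something like the following. The user and friend table names and the uid columns are assumptions based on the FQL documentation, and 12345 is a made-up user id:

```sql
-- Hypothetical FQL: fetch the names of user 12345's friends
SELECT name
FROM user
WHERE uid IN (SELECT uid2 FROM friend WHERE uid1 = 12345)
```

The REST equivalent would be a series of HTTP requests (fetch the friend list, then each friend's info), which is exactly the kind of request construction FQL is meant to collapse into a single query.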

Color me impressed. It is clear the folks at Facebook are brilliant on several levels. Not only have they built a killer social software application but they've also pulled off one of the best platform plays I've seen on the Web yet. Kudos all around.


 

Categories: Social Software | Platforms

A little while ago Facebook added the News Feed feature, which is basically a river of news on your Facebook home page showing what people in your social network are up to. What I hadn't noticed until today is that sometimes there are ads in there masquerading as updates from people in your friends list. Here's what I saw when I logged in today to take a look at the much ballyhooed Facebook developer platform.

It's one thing to inject ads into a social experience like Facebook has done here; it's another thing for the ad to contain disgusting imagery from a big budget snuff film. Seriously...WTF?


 

Categories: Rants | Social Software

May 26, 2007
@ 01:00 AM

Mike Champion has a blog post entitled WS-* and the Hype Cycle where he writes

There's a persistent theme talked up by WS-*ophobes that it's all just a fad, rapidly sliding down toward the "Trough of Disillusionment" in the Gartner Hype Cycle. I've come to the opposite conclusion after six weeks back in the web services world. The WS technologies are taking hold, deep down in the infrastructure, doing the mundane but mission critical work for which they were designed. Let's consider one example, WS-Management, which I had barely heard of when I started in CSD Interoperability.
...
At first glance this appears to duplicate widely deployed bits of the web.  For example, it depends on the oft-criticized WS-Transfer spec, and some are advocating using Atom and the Atom Publishing Protocol rather than WS-* for describing collections and subscribing to notifications of their contents.  On closer examination, WS-Management is widely used today in situations where the web-scale alternatives really don't fit, such as deep within operating systems or in the firmware of chips.
...
In short, from what I have learned recently, the trajectory of WS-* isn't pointing toward oblivion; it looks headed toward the same sort of pragmatic ubiquity that XML has achieved. That's not to say all is rosy; there is lots of confusion and dissension, again just like there was in the early days of the Web and XML. Likewise, "ubiquity" doesn't mean that the WS technologies are the best or the only option across the board, but that they are increasingly available as a very viable option when developers need protocol-neutrality, security, identity, management capability, etc.

I was going to post a response when I first read his post on Monday but I decided to wait a few days because I couldn't think of anything nice to say about this post and Mike Champion is someone I consider a friend. After a few days of calm reflection, the only thing I can say about this post is...So what?

It seems Mike is trying to argue that, contrary to popular belief, WS-* technologies are still useful for something. Seriously, who cares? The general craptacular nature of WS-* technologies was a major concern when people were marketing them as a way to build services on the Web. Now it is quite obvious to anyone with a clue about Web development that this is not the case. None of the major Web companies or "Web 2.0" sites is taking a big bet on WS-* Web services for providing APIs on the Web; instead, they are all providing RESTful XML or JSON Web services.

Now if WS-* technologies want to own the niche of one proprietary platform technology talking to another in a homogeneous, closed environment...who cares? Good riddance I say. Just keep that shit off the Web.


 

Categories: XML Web Services

Duncan Riley over at TechCrunch lets us know in Digg API Visualization Contest Delivers Apollo Powered Applications, specifically:

The Digg API Visualization Contest held to celebrate the launch of the Digg API is now in its final stages with 10 shortlisted candidates.

Four of the ten finalists are Adobe Apollo based applications, remarkable for a platform launched just over 2 months ago.

Agreed, it's pretty remarkable to see so many desktop applications in a Web mashup contest. As John Dowdell warns in his post Apollo ain't casual an Apollo application is a desktop app with all the security implications that come with that. So it is definitely impressive and a little scary to see so many people downloading random executables off of the Web and voting for them in what you'd expect to be a Web-based mashup contest.

It's also somewhat interesting that all the apps seem to be written using some variation of the Flash platform: Apollo, Flex or Flash Lite. I guess it just goes to show that for snazzy data visualization, you really can't beat Flash today. On reading the contest rules, it seems the contest is sponsored by Adobe, given that the prizes are primarily Adobe products and submissions are required to be written in Flash. Too bad; it would have been interesting to see some AJAX/DHTML or Silverlight visualizations going up against the Flash apps.


 

Categories: Programming

I've been reading the Google Data APIs blog for a few months and have been impressed at how Google has been quietly executing on the plan of having a single uniform RESTful Web service interface to their various services. If you are unfamiliar with GData, you can read the GData overview documentation. In a nutshell, GData is Google's implementation of the Atom 1.0 syndication format and the Atom Publishing Protocol with some extensions. It is a RESTful XML-based protocol for reading and writing information on the Web. Currently one can use GData to manipulate and access data from the following Google services

with more on the way. Contrast this with the API efforts on Yahoo! Developer Network or Windows Live Dev which are an inconsistent glop of incompatible RESTful protocols, SOAP APIs and XML-RPC methods all under the same roof. In the Google case, an app that can read and write data to Blogger can also do so to Google Calendar or Picasa Web Albums with minimal changes. This is not the case when using APIs provided by two Yahoo! services (e.g. Flickr and del.icio.us) or two Windows Live services (e.g. Live Search and Windows Live Spaces) which use completely different protocols, object models and authentication mechanisms even though provided by the same vendor.

One way to smooth over this disparity is to provide client libraries that aim to present a uniform interface to all of a vendor's services. However, even in that case, the law of leaky abstractions holds; the fact that these services use different protocols, object models and authentication mechanisms ends up surfacing in the client library. Secondly, since all of Google's services use GData, it is now possible to create a single library that knows how to talk to all of Google's existing and future Web services. It is also a lot easier to provide "tooling" for these services than it would be for Yahoo's family of Web services, given that they use a simple and uniform interface. So far none of the big Web vendors have done a good job of providing deep integration with popular development environments like Eclipse or Visual Studio. However, I suspect that when they do, Google will have an easier time than the others due to the simplicity and uniform interface that is GData.
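The practical payoff of the uniformity is that one small Atom parser works against feeds from any GData service. Here's a minimal sketch in Python; the feed XML below is a hand-written Atom 1.0 sample for illustration, not actual output from a Google service:

```python
# Because every GData service speaks Atom 1.0, a single parser handles
# Blogger, Calendar, and Picasa Web Albums feeds alike.
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

# Hand-written sample feed, standing in for any GData service response.
SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Feed</title>
  <entry>
    <title>First post</title>
    <id>tag:example.com,2007:entry-1</id>
  </entry>
</feed>"""

def entry_titles(atom_xml):
    """Extract entry titles from any Atom 1.0 feed, whichever service produced it."""
    root = ET.fromstring(atom_xml)
    return [e.findtext(ATOM_NS + "title") for e in root.findall(ATOM_NS + "entry")]

print(entry_titles(SAMPLE_FEED))  # ['First post']
```

The same function would work unchanged on a Blogger feed or a Calendar feed, which is precisely the property the inconsistent glop of protocols on other developer networks lacks.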


 

May 25, 2007
@ 09:19 PM

Via Robert Scoble's blog post entitled Microsoft postpones PDC we learn

Mary Jo Foley (she’s been covering Microsoft for a long time) has the news: Microsoft has postponed the PDC that it had planned for later this year.

The PDC stands for “Professional Developer’s Conference.” It happens only when Microsoft knows it’ll have a major new platform to announce. Usually a new version of Windows or a new Internet strategy.

So, this means a couple of things: no new Windows and no major new Internet strategy this year.
...
Now that Google, Amazon, Apple, are shipping platforms that are more and more interesting to Microsoft’s developer community Microsoft has to play a different game. One where they can’t keep showing off stuff that never ships. The stakes are going up in the Internet game and Microsoft doesn’t seem to have a good answer to what’s coming next.

Interesting analysis from Robert. I agree with him that Microsoft no longer has the luxury of demoing platforms it can't or won't ship, given how competent a number of its competitors have shown themselves to be on the platform front. The official Microsoft cancellation notice states:

As the PDC is the definitive developer event focused on the future of the Microsoft platform, we try to align it to be in front of major platform milestones. By this fall, however, upcoming platform technologies including Windows Server 2008, SQL Server codenamed "Katmai", Visual Studio codenamed "Orcas" and Silverlight will already be in developers’ hands and approaching launch

This makes sense; all the interesting near-term future stuff has already been announced at other recent events. In fact, when you think about it, it is kinda weird for Microsoft to have one conference for showing next generation Web platform stuff (i.e. MIX) and another for showing general next generation platform stuff (i.e. PDC), especially since the Web is the only platform that matters these days.

My assumption is that Microsoft conference planners will figure this out and won't make the mistake of scheduling MIX and PDC a few months from each other next time.
 

Every couple of months I like to give a shout out to the blogs I'm currently reading and think are worth recommending. Below is my current list of top five blogs.

  1. Jeff Atwood: Every modern developer worth their salt should have read Mythical Man-Month, should know the common refactorings by heart, and should be reading Jeff Atwood's blog. It's that good. He covers a broad range of topics which are always of interest to developers, from interesting glimpses into our shared computing history in posts such as Meet The Inventor of the Mouse Wheel and EA's Software Artists, to excellent advice on designing applications for non-programmers such as his post Reducing User Interface Friction, as well as the occasional rant on pet peeves that a lot of developers share, such as when he pointed out C# and the Compilation Tax.

  2. The Secret Diary of Steve Jobs: This is the best fake celebrity blog I've ever seen. The author is definitely up on his knowledge of Steve Jobs and Apple. The funniest posts are the ones where he gives an [evil] Steve Jobs perspective on current Apple affairs in posts such as So, you leaked an email to Engadget?, They call me Mr. Integrity and Congratulations, Jon Ive.

  3. Pat Helland: An old school Microsoft architect from Developer Division who recently came back to Microsoft after a two year stint at Amazon. Before leaving Microsoft two years ago, Pat wrote some well respected articles on building distributed systems such as Metropolis & Data on the Outside vs. Data on the Inside. He has now come back to the company with some practical experience from working on one of the largest Web sites on the planet. His most recent post, SOA and Newton's Universe, introduced me to the CAP Conjecture: Consistency, Availability, and Partition-tolerance. Pick two. Specifically, trying to maintain data consistency in a distributed system is in direct opposition to having high availability. I'd observed this anecdotally while working on services in Windows Live, but it was still interesting to read papers explaining this, complete with mathematical proofs of why it is the case.

  4. Uncov: This site picks up where Dead 2.0 left off as the anti-TechCrunch by attempting to inject some snarky reality in the face of all the overhyped, me too, built to flip, "Web 2.0" startups we keep hearing about these days. Some of the more amusing recent posts are Meebo: Yahoo Chat was awesome in 1997, Mpire: Liked It Better When It Was Called Pricewatch and of course Web 2.0: So great you can't define it.

  5. Casey Serin: Since I recently became a homeowner, I've become interested in all this talk of real estate collapses and subprime loan crises. The USA Today article 10 mistakes that made flipping a flop describes Casey Serin as a poster child for everything that went wrong in the real estate boom. In under a year, the 24-year-old website-designer-turned-real estate-flipper bought eight homes in four states — and in every case but one, he put no money down. Over half of the homes have been foreclosed and he now has over $140,000 in debt. His blog documents his trials and tribulations trying to get out of debt. The comments are the best part, it seems his audience is split down the middle between people who cheer him for trying to get out of debt and others who attack him for seemingly getting away with abusing the system.

Do you have any similar recommendations?


 

From Mike Arrington's post $100 Million Payday For Feedburner - This Deal Is Confirmed we learn

Rumors about Google acquiring RSS management company Feedburner from last week, started by ex-TechCrunch UK editor Sam Sethi, are accurate and are now confirmed according to a source close to the deal. Feedburner is in the closing stages of being acquired by Google for around $100 million. The deal is all cash and mostly upfront, according to our source, although the founders will be locked in for a couple of years.

I use FeedBurner to track stats for my blog and RSS feed, so this is great news because it means the service is here to stay. I've exchanged mail with Eric Lunt a bunch of times about issues I've had with the service, and he was always quick to respond with a solution or an ETA for a fix. Google has landed some great folks who built a killer service.

I hope that now that they have Google-level resources at their disposal, users of the service can get historical statistics for their blogs and feeds instead of being limited to only the last 30 days. I'm curious about what my most popular posts of all time are, not just the most popular in the last 30 days.