December 8, 2006
@ 04:59 PM

From Jon Udell's blog post entitled A conversation with Jon Udell about his new job with Microsoft, he writes

Q: Your new job is with Microsoft?

A: That's right. My last day at InfoWorld will be Friday Dec 15. On Jan 15, after a month-long sabbatical, I'll become a Microsoft employee. My official title will be Evangelist, and I'll report to Jeff Sandquist. He's the leader of the team that creates Channel 9 and Channel 10, websites that feature blogs, videos, screencasts, and podcasts for Microsoft-oriented developers.

Q: What will your role be?

A: The details aren't nailed down, but in broad terms I've proposed to Microsoft that I continue to function pretty much as I do now. That means blogging, podcasting, and screencasting on topics that I think are interesting and important; it means doing the kinds of lightweight and agile R&D that I've always done; and it means brokering connections among people, software, information, and ideas -- again, as I've always done.

Q: Why are you doing this?

A: I'm often described as a leading-edge alpha geek, and that's fair. I am, and probably always will be, a member of that club. But I'm also increasingly interested in reaching out to the mainstream of society.

For those of us in the club, it's a golden age. With computers and networks and information systems we can invent new things almost as fast as we can think them up. But we're leaving a lot of folks behind. And I'm not just talking about the digital divide that separates the Internet haves from the have-nots. Even among the haves, the ideas and tools and methods that some of us take for granted haven't really put down roots in the mainstream.

I had dinner with Jon a couple of weeks ago when he came up to Microsoft for interviews and I was impressed with the plan he described for the future of his career. I was pretty sure that once anyone interviewing him spent even a few minutes talking to him, they'd be convinced they'd found the right person for the job, even though the job was Jon's idea. I was honored that Jon contacted me to talk about his plans, and I've been on pins & needles wondering whether the folks at Microsoft would hire him.

Congrats to Jeff Sandquist. First Rory, now Jon Udell. You're hiring all the right folks.


 

Categories: Life in the B0rg Cube

December 7, 2006
@ 01:27 AM

Via Sam Ruby's blog post entitled Equal Time, I noticed that there has been an interesting conversation brewing about message security and RESTful Web services between Pete Lacey and Gunnar Peterson. However, they both seem to be cherry-picking parts of each other's arguments to dispute, which reduces some of the educational value of their posts.

Gunnar Peterson started the discussion going with his post REST Security (or lack thereof) where he writes

So the whole REST security thing just gets funnier, the S for Simple folks forget that S also stands for security. Here was a response to my post on the fact that people who say REST is simpler than SOAP with WS-Security conveniently ignore things like, oh message level security:

HTTP Basic or HTTP Digest or SSL (certificate-based) for authentication. SSL for encryption and digital signatures. You know, the way we've been doing things since 1995.

Where to start? Right, it was state of the art in 1995. no bout a doubt it. The world has moved on slightly since then. You know a couple 97 million stolen identities, endless phishing/pharming (growing double digit pct each month), malware taking 60% cpu utilization on consumer desktops. You know little stuff like that
...
Now if you are at all serious about putting some security mechanisms in to your REST there are some good examples. One being Amazon's developer tokens using HMAC for authentication at the message level (you know where the data is). But if you are going to say that REST is so much simpler than SOAP then you should compare REST with HMAC, et. al. to the sorts of encryption and signature services WS-Security gives you and then see how much simpler is. And, you know, maybe even see, oh golly gee I don't know, which one protects your customers' data better? Until then, we'll just continue (as Gene Spafford said) using an armored car to deliver between someone living in a cardboard box and someone living on a park bench.

Gunnar has a good point, which he ruins with some of his examples. The point is that HTTP authentication and SSL aren't the be all and end all of securely communicating on the Web. However, his examples of spyware and phishing are unrelated to that point and end up harming his argument. For one, there's nothing one can do at the service security layer to protect against a user who has malware running on their computer. Once the user's machine has been compromised, it is over. As for phishing, that is a problem that relies on the unique combination of social engineering and the unfortunate characteristics of email readers and Web browsers. Phishing is not really an architectural problem that affects machine-to-machine interaction via Web services. It is an end user problem of the HTML Web.

In Pete Lacey's response entitled RESTful Security he writes

Gunnar notes that the world has moved past SSL etc., and cites as examples identity theft, phishing/pharming, and malware. But these security threats are completely orthogonal to the security concerns SSL addresses. Ditto, I might add, WS-Security. Both of these standards address identity propagation, message encryption, and message integrity only, and neither will protect you from the threats just mentioned. Security is a BIG subject and the areas covered by SSL and WS-Security are just one small part of it. We also need good practices around securing persisted data (and what data to persist); education to prevent social engineering attacks; properly designed operating systems that won’t run anything with a .exe extension or run needless services; developers who are cognizant of buffer overflows, SQL injection, and cross-site scripting attacks; properly managed perimeter defenses; and so on and so on.
...
With all of that behind us, I can get on to what seems to be Gunnar’s main point and the only significant difference (outside of the whole simplicity and interoperability thing) between SSL and WS-Security. And that is that SSL provides transport level, point-to-point security while WSS provides message level, end-to-end security. That’s true, but that doesn’t provide WSS with magical security powers, it just solves a different problem. Nor does it relegate SSL to the scrap heap of history. SSL is not a security panacea–nothing is, but it does what it is does very well. Regardless, there is nothing in REST that prohibits the use of message-level encryption, though the mechanism–should it be needed–would need to be spec’d out.

I’m not dismissing WSS, it’s a perfectly adequate specification for what it does (though it requires the WS-I Security Profile to introduce enough constraints to provide a reasonable chance of interoperability). But the value of message level security should still be questioned. For one thing, what’s the business case? If message-level encryption is so important, why isn’t anyone using it? When Burton Group queried its clients as to their use of WSS, it was found that the only use was to pass identity tokens over HTTPS. When I was working at Systinet (now HP) I vividly recall the WASP (not Systinet Server for Java) product manager spitting nails because his team had just spent six months implementing WSS at our customer’s request and no one–not even those who requested the feature–was using it. Also, this is not the first time message level security has been proposed. When I was working at Netscape back in 1997 I spent a fair amount of my time advocating for S/MIME. Now, nearly ten years later, how many people are using S/MIME to secure their email? And how many are using SSL? Exactly.

I tend to agree with Pete Lacey that a lot of the people who claim they need message level security are actually fine with the transport level security provided by SSL. Message level security is primarily needed if the message will be passing through hostile intermediaries without secure point-to-point communications between the sender and receiver. But how often does that really happen on the Web? One could argue that Gunnar Peterson's vaunted example, Amazon Web Services, which uses an HMAC-SHA1 signature keyed with a developer's secret key for authentication, could just as easily have been implemented using SSL. After all, man-in-the-middle attacks are prevented in both cases. If the issue is what happens when the sender's machine has been compromised (e.g. by malware), then both approaches fall down flat.
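
To make the Amazon example concrete, here's a minimal sketch in Python of how that style of HMAC-based request signing works: the client computes a keyed hash over a canonical form of the request and the service recomputes it to verify that the sender holds the shared secret. The canonical string, parameter names, and URL path below are illustrative placeholders of mine, not Amazon's actual wire format.

```python
import base64
import hashlib
import hmac
import time

def sign_request(secret_key: str, verb: str, path: str, timestamp: str) -> str:
    """Compute an HMAC-SHA1 signature over a canonical form of the request.

    The canonical string (verb + path + timestamp) is a simplified placeholder;
    a real service defines its own canonicalization rules.
    """
    string_to_sign = f"{verb}\n{path}\n{timestamp}"
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode("ascii")

def verify_request(secret_key: str, verb: str, path: str,
                   timestamp: str, signature: str) -> bool:
    """Server side: recompute the signature and compare it in constant time."""
    expected = sign_request(secret_key, verb, path, timestamp)
    return hmac.compare_digest(expected, signature)

# Client signs the request, server verifies it.
secret = "my-shared-secret"  # hypothetical developer secret key
ts = str(int(time.time()))
sig = sign_request(secret, "GET", "/photos/recent", ts)
assert verify_request(secret, "GET", "/photos/recent", ts, sig)
```

Note that this only proves who sent the request and that it wasn't tampered with in transit; it doesn't encrypt anything, which is why the comparison with SSL above is mostly about where the protection applies rather than how strong it is.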

That said, there are times when one has to author an application where the message has to pass through potentially hostile intermediaries, and message level security is needed. I've actually had to deal with one such situation in my day job, so I know they are real, although I doubt many people will encounter the exact same problem we did at work.

Once you get to that point, the tough problems are usually around key exchange, key protection and key revocation, not around the niceties of whether you should roll your own usage of XML Signatures or go with a fully featured yet inconsistently implemented protocol like WS-Security. Using Amazon Web Services as an example, I couldn't find any information on how to protect my secret key beyond admonitions "not to send it around in email", nor did I find any mechanism to revoke or reissue my secret key if it became compromised. As a Web service developer, you'll likely spend more time worrying about those issues than you will figuring out how to integrate signing or encryption of XML documents into your RESTful Web Service.
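
Since the only guidance on key protection is "don't send it around in email", the least a developer can do is keep the secret key out of source code and version control so it can be swapped out if reissuing ever becomes possible. Here's a minimal sketch, assuming the key is provisioned through an environment variable or a permission-restricted file; the variable name and file path are hypothetical deployment choices of mine, not anything Amazon documents.

```python
import os
import stat
from pathlib import Path

def load_secret_key(env_var: str = "SERVICE_SECRET_KEY",
                    key_file: str = "~/.my_service/secret_key") -> str:
    """Load the shared secret from the environment or a restricted file.

    Keeping the key out of source code means it can be rotated without
    touching (or redeploying) the application itself.
    """
    secret = os.environ.get(env_var)
    if secret:
        return secret.strip()

    path = Path(key_file).expanduser()
    mode = stat.S_IMODE(path.stat().st_mode)
    if mode & (stat.S_IRGRP | stat.S_IROTH):
        raise PermissionError(f"{path} is readable by other users; tighten its permissions")
    return path.read_text(encoding="utf-8").strip()
```

None of this solves revocation, of course; if the service offers no way to reissue a compromised key, no amount of client-side hygiene gets it back.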


 

Categories: XML Web Services

Linking to Niall Kennedy's blog reminded me that I owed him an email response to a question he asked about a month ago. He asked what I thought about the diversity of speakers at the Widgets Live conference, given my comments on the topic in my blog post entitled Who Attends 'Web 2.0' Conferences.

After thinking about it off and on for a month, I realize that I liked the conference primarily because of its content and focus. The speakers weren't the usual suspects you see at Web conferences, nor were they homogeneous in gender and ethnic background. I assume the latter is a consequence of the fact that the conference was about concrete technical topics, as opposed to a gathering to gab with the hip Web 2.0 crowd, which meant that the people who actually build stuff were there...and guess what, they aren't all Caucasian males in their 20s and 30s, regardless of how much conferences like The Future of Web Apps and Office 2.0 pretend otherwise.

This is one of the reasons I decided to pass on the Web 2.0 conference this year. It seems I may have made the right choice, given John Battelle's comments on the fact that a bunch of the corporate VP types who spoke at the conference ended up losing their jobs the next week. ;)


 

Categories: Trip Report

December 6, 2006
@ 02:50 AM

Niall Kennedy has been on a roll in the past couple of weeks. He has a blog post entitled Brands will be widgetized, but who is the author? which tackles the interesting problem of widgets, branding and customer confusion. He writes

Sites with personal user data placed behind a username and password may be subject to new types of phishing attacks from the widget web. A user will likely locate your service's widget through the widget provider's directory, searching for terms such as "Gmail" and "eBay" to access their latest mail messages or watched auction items. These widgets will prompt the user for their login information before delivering personalized information from each service, leaving the trust of a brand in the hands of a third-party developer who may or may not act in the best interest of the data provider.

If Google Mail and eBay worked directly with the large widget producers to establish certified or trusted widget status they could reduce opportunities available for third party widgets offering enticing functionality to send messages to a remote server with collected user data. The trusted, certified, or verified seals provided by each widget platform is one way to ensure users receive the official product and not a knock-off.

This issue has been rattling around in my head ever since I wrote a Flickr gadget and a Blufr gadget for Windows Live Spaces. After all, I don't work for either company, yet here I am writing gadgets that are being used by hundreds of users in their name. Who ends up getting the taint if my gadget is buggy or causes problems for the user? Me or Flickr? What happens if legitimate-looking gadgets like mine are actually fronts for phishing attacks? How can Flickr protect its users and its brand from malicious or just plain sloppy developers? I like the idea of the major widget galleries like Windows Live Gallery, Yahoo! Widget Gallery and Spring Widgets coming up with a notion of trusted or certified gadgets, but it seems like an unfortunate hoop that web sites now need to jump through to police their brands on the various widget sites on the Web. It reminds me of trademark holders having to rush to register their brand names as domains whenever new TLDs are opened up.

PS: This is one of the reasons you don't see a bunch of Windows Live gadgets out there today. The brand dilution and phishing problem is a real one that worries lots of folks over here.


 

If you are a regular reader of Slashdot, you probably stumbled on a link to the Groklaw article Novell "Forking" OpenOffice.org by Pamela Jones. In the article, she berates Novell for daring to provide support for the Office Open XML formats in its version of OpenOffice.

Miguel de Icaza, a Novell employee, has posted a response entitled OpenOffice Forks? in which he writes

Facts barely matter when they get in the way of a good smear. The comments over at Groklaw are interesting, in that they explore new levels of ignorance.

Let me explain.

We have been working on OpenOffice.Org for longer than anyone else has. We were some of the earliest contributors to OpenOffice, and we are the largest external contributor to actual code to OpenOffice than anyone else.
...
Today we ship modified versions of OpenOffice to integrate GStreamer, 64-bit fixes, integrate with the GNOME and KDE file choosers, add SVG importing support, add OpenDMA support, add VBA support, integrate Mono, integrate fontconfig, fix bugs, improve performance and a myriad of others. The above url contains some of the patches that are pending, but like every other open source project, we have published all of those patches as part of the src.rpm files that we shipped, and those patches have eventually ended up in every distribution under the sun.

But the problem of course is not improving OpenOffice, the problem is improving OpenOffice in ways that PJ disapproves of. Improving OpenOffice to support an XML format created by Microsoft is tantamount to treason.

And of course, the code that we write to interop with Office XML is covered by the Microsoft Open Specification Promise (Update: this is a public patent agreement, this has nothing to do with the Microsoft/Novell agreement, and is available to anyone; If you still want to email me, read the previous link, and read it twice before hitting the send button).

I would reply to each individual point from PJ, but she either has not grasped how open source is actually delivered to people or she is using this as a rallying cry to advance her own ideological position on ODF vs OfficeXML.

Debating the technical merits of one of those might be interesting, but they are both standards that are here to stay, so from an adoption and support standpoint they are a no-brainer to me. The ideological argument on the other hand is a discussion as interesting as watching water boil. Am myself surprised at the spasms and epileptic seizures that folks are having over this.

I've been a fan of Miguel ever since I was a good lil' Slashbot in college. I've always admired his belief in "Free" [as in speech] Software and the impact it has on people's lives as well as the fact that he doesn't let geeky religious battles get in the way of shipping code. When Miguel saw good ideas in Microsoft's technologies, he incorporated the ideas into Bonobo and Mono as a way to improve the Linux software landscape instead of resorting to Not Invented Here syndrome.

Unfortunately, we don't have enough of that in the software industry today.


 

Categories: Mindless Link Propagation | XML

December 5, 2006
@ 02:53 PM

I'm a big fan of alcopops, but it seems like every time I settle on one I like, it stops being carried in my local grocery stores. Here's my list so far:

  1. Mike's Hard Iced Tea [relegated to urban legend]
  2. Brutal Fruit [discontinued]
  3. Bacardi Silver O3 [tasty and hard to find]
  4. Hornsby's Amber Hard Cider [so far so good]
This is just my way of warning you folks out there that if you like Hornsby's Amber Hard Cider, you'd better stock up, because given my luck it's going to be discontinued in the next couple of months. :)
 

Categories: Personal | Ramblings

December 5, 2006
@ 01:26 PM

Any Zune owners got any good stories about using the music sharing feature yet? The Zune ads make it look so cool, but I wonder how it actually works out socially. Do people ask each other for songs or is it more like the comic strip above?
 

Categories: Music

By now, hard-core RSS Bandit fans have found out that the installer for the Jubilee release of RSS Bandit is available. A bunch of people have tried it out and given us a lot of good feedback on how some of our new features can be tweaked to make them even better. One of the places we got good feedback [and bug reports] has been our behavior when automatically downloading podcasts from a feed. One significant bug is that in the beta, RSS Bandit doesn't keep track of what enclosures it has previously downloaded, so it may download the same enclosures several times. However, even with this bug fixed, we realized there is a problem when one first subscribes to a podcast feed, especially if the feed has videos, such as Microsoft's Channel 9. The first time you subscribe to that feed, RSS Bandit would automatically start downloading 2-3 gigabytes of videos from the site, since that's how many are exposed in the feed. This seems like a bad thing, so we added two new options, which are shown in the screenshot below

My main question is what default values we should use. I was thinking 'Only download the last 2 podcasts' and 'Only download files smaller than 500MB' as the defaults. What do you guys think?
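
To make the question concrete, here's a rough sketch of how those two settings could interact the first time a feed is processed, using the proposed defaults. The function, option names, and bookkeeping are simplified illustrations of mine, not RSS Bandit's actual implementation.

```python
def select_enclosures(enclosures, already_downloaded,
                      max_items: int = 2,
                      max_size_bytes: int = 500 * 1024 * 1024):
    """Pick which enclosures to download from a feed.

    enclosures: list of (url, size_in_bytes) tuples, newest first.
    already_downloaded: set of URLs fetched on previous runs, so the same
    enclosure is never downloaded twice.
    """
    selected = []
    for url, size in enclosures:
        if len(selected) >= max_items:
            break                   # "only download the last N podcasts"
        if url in already_downloaded:
            continue                # skip enclosures we've already fetched
        if size > max_size_bytes:
            continue                # "only download files smaller than X"
        selected.append(url)
    return selected

# A newly subscribed video feed with several large episodes, newest first.
feed = [("http://example.com/ep10.wmv", 700 * 1024 * 1024),
        ("http://example.com/ep09.wmv", 300 * 1024 * 1024),
        ("http://example.com/ep08.wmv", 250 * 1024 * 1024),
        ("http://example.com/ep07.wmv", 200 * 1024 * 1024)]
print(select_enclosures(feed, already_downloaded=set()))
# ['http://example.com/ep09.wmv', 'http://example.com/ep08.wmv']
```

One open design question the sketch glosses over is whether an oversized enclosure should still count against the item limit or be skipped in favor of the next smaller one, as it is here.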


 

Categories: RSS Bandit

From the Windows Live Search team's blog post entitled Search on the Go with Live Search for Mobile Beta we learn

we’re proud to announce three new ways to search on the go:

Mobile Software - Download an application to your phone for local search, maps, driving directions, and live traffic information in a faster, richer and more interactive user interface. It's the best way to search from your phone.


 

Mobile Browsing - Access maps and directions directly on your phone’s browser. Simply enter mobile.live.com/search into your phone’s address bar and select Map. Choose from the scopes of Local, Web, Map, News and Spaces and get Live Search from your mobile device.

Text Messages (SMS) - If you don’t have a data plan, you can simply send a text message to 95483 (WLIVE) with a query like “Toys Chicago, IL” or “Coffee 90210” and you’ll immediately receive a text message reply with the nearest business listings with address and phone numbers.

This is a pretty sweet release and I can't wait to get it on my AudioVox SMT 5600. So far, the release has been favorably reviewed by those who have tried it, including Gizmodo, whose article entitled Windows Live Search For Mobile vs. Google Maps Mobile ends on the following note

If you're using a Windows Mobile phone, we'd definitely recommend you try out Windows Live Search. The Java-based Google Maps is just too buggy and slow, not to mention clunky, to be useful to us.

Not bad, eh? I thought Google was the king of innovative search products. :) Speaking of innovation and Microsoft, there is a debate between Robert Scoble and Dave Winer in a recent Wall Street Journal article Is Microsoft Driving Innovation Or Playing Catch-Up With Rivals? which has both bloggers going head to head on whether Microsoft is innovative or not. Interesting read.


 

Categories: Windows Live

First it was Yahoo! Mail that swallowed the AJAX pill only to become unusably slow, and now it looks like Yahoo! TV is another casualty of this annoying trend.

Dave Winer writes

Yahoo says they improved Yahoo TV, but imho, they broke it. The listings page, which until today was the only page I knew or cared about (they just added a bunch of community features) took a few seconds to load, now it's an Ajax thing, and it loads as you scroll. Great. There's a delay every time I hit Page Down. Now instead of finding out if there's anything on in seconds it takes minutes. That's an improvement? 

In his post entitled Yahoo TV Goes 2.0. Argh. Paul Kedrosky writes

Well, Yahoo in its wisdom has launched a 2.0-ified version of its TV listings tonight, complete with an Ajax-y interface, cool blue colors, social rating of programs, etc. That's all swell, and frankly I wouldn't care one way or the other (other than they broke my URL for full listings), but the darn thing is sooooo much slower than the old listings. Tables have to get populated, drop-downs have to ... drop, and sliders have to slide while data creakily loads.

It's really irritating -- so irritating, in fact, that rather then wade back in to find out what time tonight the new Frontline episode is out about credit cards, I think I'll just watch it on the Frontline site.

Seriously, who's making these decisions at Yahoo? Don't they realize that slower websites cost them money, regardless of how buzzword-compliant the redesign makes them?