Every once in a while I check out the forums on Channel 9 to see what conversations are happening between Microsoft and its customers. Today I found a post entitled MSNFOUND is so sad... which is in reference to the MSN.:FOUND website. The comments about the site in the Channel 9 forums seem to be universally negative. Looking a little further, I noticed that checking the Bloglines citations for http://www.msnfound.com also turns up numerous blog postings about the site whose feedback is primarily negative.

The reasons for this are unsurprising. They were listed in Robert Scoble's blog posting where he lambasted the creators of the site and asked for them to be fired. I agree with most of what Robert wrote in his post. I'd only add one thing: please don't hold this against us. We aren't as stupid as our marketing people make us look.


 

Categories: Life in the B0rg Cube | MSN

For the second time in two months I was contacted by a reporter about my blog. Howard Wen has written an article in InternetWeek entitled Blogging About Work is Risky Business on some of the concerns around blogging and the workplace. I referred him to our PR folks for answers to most of his questions but did have some words to say about what guidelines I use when deciding what to post. From the article

For the employee blogger, finding out what one's company policy is toward blogging would be the first step. If the company doesn't have one, or has one that is not exactly clear about the matter of blogging, it may be best to practice a little paranoia.

"I've had my blog postings appear in such publications as the Wall Street Journal," Microsoft employee Dare Obasanjo says. "So the exhortation to treat blog posts as if they could appear on the cover of a national newspaper is something I keep in mind whenever I write my blog."

Obasanjo keeps two blogs, one on the Microsoft developer network and another on a personal server for non-work-related topics.

I also sent him a link to some of Robert Scoble's blogging guidelines but I guess they didn't make it past editing. My original quote was longer and began with the principle that one should treat one's blog postings as if they were going to appear on the cover of a national newspaper. You never know which of your posts some reporter or A-list blogger will decide to bring to prominence by quoting or linking to it, so you should never put anything on a public weblog you wouldn't be comfortable having read by your friends, family, and coworkers as well as thousands of total strangers.

Speaking of reporters, I recently noticed that my blog post on Google Toolbar's AutoLink feature was referenced in the eWeek article Google's Tool Bar Links Stir Debate. Like I said, you never know if or when one of your posts could end up as fodder for a news story, so always think before you post.


 

Categories: Life in the B0rg Cube

As a user of Microsoft products and an employee of the company, I am of two minds about its entrance into the anti-spyware and anti-virus arenas. I agree with the sentiments Michael Gartenberg of Jupiter Research shared in his post Microsoft and Security - Demand if they do and demand if they don't

It's tough to be Microsoft. On one hand, folks insist that security, spyware and virus are issues that they must own. On the other hand, when they do own it and respond, they get dinged for it. Microsoft's decision to get into the business and make these tools available should be lauded. Security is a tough issue that I've written about before and users need to also take on a share of the responsibility of keeping their systems safe. The fact is, even with good solutions on the market, not enough users are protecting their systems and if Microsoft entering the game can help change that, it's a good thing.

Given that spyware is quite likely the most significant problem facing users of Windows, I believe that Microsoft has a responsibility to its customers to do something about it. Others may disagree. For example, Robert X. Cringely attacks Microsoft for its recent announcements about getting into the anti-spyware market in his post Killing Us With Kindness: At Microsoft, Even Good Deeds Are Predatory. He writes

How can giving away software tools be bad? Look at how Microsoft is doing it. Their anti-virus and anti-spyware products are aimed solely at users of Windows XP SP2. This has very different effects on both the user base and the software industry. For users, it says that anyone still running Windows 98, ME, or 2000 has two alternatives -- to upgrade to XP/SP2 or to rely on non-Microsoft anti-virus and anti-spyware products. Choosing to upgrade is a 100+ million-unit windfall for Microsoft. That's at least $10 billion in additional revenue of which $9 billion is profit -- all of it coming in the next 12 months. That $10 billion, by the way, comes from you and me, and comes solely because of Microsoft's decision to offer "free" software.

Of course, you can decide not to upgrade to XP/SP2 and rely instead on third-party products to protect your system. But Microsoft has set the de facto price point for such products at zero, zilch, nada. By doing this, they have transferred their entire support obligation for these old products to companies like Symantec and Network Associates without transferring to those companies any revenue stream to make it worth their trouble. Maybe Peter Norton will say, "Screw this, I'm not going to support all these millions of people for nothing." Well, that's Symantec (not Microsoft) apparently forcing users into the same upgrade from which Microsoft (not Symantec) gains all the revenue.
...
Look how this decision transforms Microsoft. By choosing to no longer support a long list of products (is that even legal?), hundreds and perhaps thousands of developers will be switching to new duties. If this were any other company, I would predict layoffs, but a key strategy for Microsoft is hiring the best people simply to keep them from working elsewhere, so I don't think layoffs are likely. What IS likely is an exodus of voluntary departures. What's also likely is that those hundreds or thousands of reassigned developers will be moved to some other doomsday product -- something else for us to eagerly anticipate or fear.

Cringely seems to be confusing supporting old versions of Windows with providing applications that run on current versions of Windows. Windows 98, Windows 98 SE and Windows Millennium are old versions of Windows whose support life cycle was supposed to end about a year ago but was extended by Microsoft. At the current time, Microsoft will continue to provide critical security updates for these operating systems, but new applications won't target them; instead they will target the current desktop version of Windows, which is Windows XP.

Microsoft's anti-spyware and anti-virus applications are not an operating system upgrade but rather new applications targeting the current versions of Windows. Even if they were, Windows 98, Windows 98 SE and Windows Millennium are past the stage in their support life cycle where they'd be eligible for such upgrades anyway. Given that Robert X. Cringely is quite familiar with the ways of the software industry, I'm surprised that he expects a vendor of shrinkwrapped software to provide new features for seven-year-old versions of its software when current versions exist. This practice is extremely uncommon in the software industry; I personally have never heard of such an instance by any company.


 

While playing Magic: The Gathering with Michael Brundage and a bunch of former co-workers last week, he informed me that he had updated his Working At Microsoft essay given some comments and feedback he had seen. Some excerpts from the additions to his essay are included below

Unreality

As a parent, I've come to understand that there's a wide gray area between overprotecting your children and creating a nurturing environment in which they can develop. I think Microsoft struggles with a similar problem with its employees. Microsoft provides its employees with a nurturing environment in which they can be most productive. But like children, these employees also need to be grounded in reality and exposed to ideas that can be disruptive or even disturbing. Otherwise a sheltered monoculture can develop that's unhealthy for everyone involved.

It's hard for people who don't work at Microsoft's main campus to understand just how unreal the experience of working there can become. Some employees forget that most of the world doesn't have broadband wireless networking, high-end consumer electronics, luxury vehicles, and enough money that they don't need to live on a budget. Some employees spend so much time using Microsoft products that they forget about the competition and/or lose touch with typical customers' needs.

...

Microsoft's Not Evil

The reality is that Microsoft is made up of mostly honest, earnest, hardworking people. People with families. People with hardships. People with ordinary and extraordinary lives. People who make wise and foolish decisions. Some employees are bad apples, and some leaders make poor decisions (which their employees may or may not support). Both usually meet with failure. All the Microsoft employees I know are internally driven to "succeed," where success sometimes means outselling the competition but always means doing your personal best and improving people's lives with your work.

Although groups don't have intentions, it's true that group policies reward some kinds of behavior over others. So perhaps "Microsoft is evil" is shorthand for "Microsoft's policies are evil."

The thing is, I haven't seen any evidence of that on the inside and I'm usually very critical of these things. For as long as I've worked at Microsoft, ethics have been a real part of employee performance reviews. It's not just talk, but the way work goes each day. Most product designs revolve around addressing specific customer needs. No one ever says "Hey, let's go ruin company P" or other things that could be construed as "evil." Instead, it's "customers Q and R are having trouble with this, and I have an idea how we could fix it..." and other positive, constructive statements.

If anything, Microsoft seems to have the opposite problem, in which employees sometimes design or cut a feature or product without fully appreciating the huge impact their decision can have outside the company. When the media goes wild with knee-jerk reactions for or against something Microsoft did, often the employees responsible for the decision are caught off-guard by the disproportionate public attention.

I came to similar conclusions about Microsoft employees a year or two ago. The main problem with Microsoft employees is not that they are out to 'cut off the air supply' of their competitors. Instead it is that they often make decisions without awareness of the impact those decisions will have on the software industry as a whole. I have often been surprised to realize that some program manager for a new feature or component in a Microsoft product has failed to realize that Microsoft's entrance into that space could put partners or competitors out of business. This isn't to say that the execs don't realize that Microsoft entering markets usually decimates competition, but a lot of the rank and file implementing these decisions or coming up with some of these ideas often don't realize the impact of their actions.

One of the reasons I like working at MSN is that we aren't under any illusions about these issues. We aren't "adding a new feature to Visual Studio to make developers more productive" or "adding some new functionality to Windows to make it more useful for end users" which also happens to have been a thriving ISV market until we decided to enter it. Instead we are directly competing with folks like Yahoo, Google or AOL for a particular market. This situation definitely sits a lot better with me.


 

Categories: Life in the B0rg Cube

February 22, 2005
@ 05:36 PM

The next release of RSS Bandit is scheduled for next month. However we've hit a bit of a snag with regard to translations into languages other than English. We currently have translators signed up for Japanese, Simplified Chinese, Hindi, Turkish, Brazilian Portuguese, and German. The problem is that in the previous version of RSS Bandit we had translations for a larger set of languages but haven't gotten any responses from the original translators for this new version. This means the next version of RSS Bandit may not be localized in Polish, French and Russian even though the previous one was.

If you'd be interested in translating RSS Bandit into your native language or acting as a proofreader for one of our existing translations, please contact us at the email address provided below.

RSS Bandit development team contact email address
 

Categories: RSS Bandit

Saturday was museum day for me. I visited both the Science Fiction Museum and Hall of Fame in Seattle and the Museum of Glass in Tacoma. The entrance fee for the Science Fiction Museum was $12.95, which is a bit overpriced considering what one gets out of the tour. The museum is primarily a collection of old science fiction books and magazines as well as props from various movies. Some of the props are quite cool, such as the alien queen and Ripley's construction suit from Aliens, while others, such as the collection of phasers from the various Star Trek movies, failed to light my fire. At the end it seemed more like I'd been shown some geek's private hoard of science fiction memorabilia than like I'd visited an actual museum or hall of fame. I probably would have felt less ripped off if the cover fee was $5 instead of almost $13.

The Museum of Glass was more satisfying, although it also felt like it was over too soon. The Einar and Jamex de la Torre: Intersecting Time and Place exhibit was amazing although a bit gory. The artwork by the de la Torre brothers had demons, exposed human organs and Catholic religious relics as recurring themes. It made for some interesting artwork which I could see some of the parents with younger children had difficulty explaining to their kids. The Solid Cinema: Sculpture by Gregory Barsamian exhibit was also impressive. The pieces were mechanical animated sculptures illuminated by strobe lights. There were only three of them but they were extremely well done and I spent some time scratching my head trying to figure out how they worked. The Museum of Glass was definitely worth the $10.

The next local museum on my list is the Museum of Flight.


 

Categories: Ramblings

Michael Gartenberg has a blog posting entitled Is Google doing what Microsoft couldn't with their new search bar? where he writes

As Yogi would say, "it's deja vous, all over again". When Google introduced the newest version if their toolbar, it seems they added a feature that sounds very similar to what Microsoft wanted to do with SmartTags. Apparently the new software will create links in web text that will send you back to Google sites or sites of their choosing. If I recall correctly, there was a huge outcry over the SmartTag feature. Even petitions. How come there is no outcry here? Is it because Google does no evil?

Like I said yesterday, who needs a new browser to do stuff like this when you can co-opt IE with a toolbar?

This is one of the key differences between Google and Microsoft: perception. I am glad to see Google imitating one of Microsoft's innovations from a few years ago. After all, imitation is the sincerest form of flattery. As can be expected, Dave Winer is already on the offensive.

Personally, I can't wait to see how much cognitive dissonance this causes the Slashdot crowd.


 

Categories: Technology

February 16, 2005
@ 04:17 PM

From the first day we added the Search Folder feature to RSS Bandit, I've wanted to be able to create a folder containing unread messages no more than a week old whose titles contain 'Microsoft', drawn only from the Slashdot, InfoWorld or Microsoft-Watch feeds. However the existing Search Folder feature did not allow you to restrict the targets of a search to particular feeds or categories. The back end code to do this has existed for a while but Torsten never got around to adding the UI for it, until yesterday. It's looking like the Wolverine release is shaping up to be the best release yet.
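A scoped search folder of this sort boils down to a predicate applied to each feed item. Here is a rough Python sketch of the idea; the feed names, item fields, and matching rules are illustrative only, not RSS Bandit's actual data model.

```python
from datetime import datetime, timedelta

# Hypothetical scope: only these feeds feed the search folder.
SCOPED_FEEDS = {"Slashdot", "InfoWorld", "Microsoft-Watch"}

def matches(item, now=None):
    """True if the item belongs in the search folder: unread, no more
    than a week old, from a scoped feed, with 'Microsoft' in the title."""
    now = now or datetime.now()
    return (not item["read"]
            and item["feed"] in SCOPED_FEEDS
            and now - item["date"] <= timedelta(days=7)
            and "Microsoft" in item["title"])

items = [
    {"feed": "Slashdot", "title": "Microsoft Ships Anti-Spyware Beta",
     "date": datetime.now() - timedelta(days=2), "read": False},
    {"feed": "Wired", "title": "Microsoft News",
     "date": datetime.now() - timedelta(days=2), "read": False},
]
folder = [i for i in items if matches(i)]
print([i["feed"] for i in folder])  # → ['Slashdot']
```

The second item is filtered out purely because its feed is outside the scope, which is exactly the restriction the old Search Folder UI couldn't express.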


 

Categories: RSS Bandit

This morning I saw a post by Tim Bray entitled Another Loyal Oppositionist where he pointed to a post by James Governor stating that Microsoft is ignoring the demand for toolkits that support plain old XML over HTTP and instead focusing on SOAP-based XML Web Services and the WS-* family of specifications. Before I could respond I saw that Mike Champion had already beaten me to the punch with his post MS Ignoring developer demand for REST tools? where he writes

The long running PR battle is heating up again between those who advocate the implementation of service architectures [1] "RESTfully" using XML transferred via HTTP, vs those who work with SOAP, WSDL, and the specs built on them known as "WS-*".  The item that finally got me to blog about this painful [2] subject was James Governor's SOAP is boring, wake up Big Vendors or get niched

Evidence continues to mount that developers can't be bothered with SOAP and the learning requirements associated with use of the standard for information interchange.  ...Developers are turning their backs on the standard. Folks that is, building interesting information splicing apps--semantically rich platforms like flickr and Amazon are being accessed by RESTful methods, not IBM/MS defined "XML Web Services" calls.

Yesterday my partner Stephen issued a wake up call for middleware and tools vendors - give developers what they want, not what you think they should have. ...

One big question is why haven't IBM and Microsoft responded? The obvious answer is vested interest. When you have "bet the company" on a technology stack its kind of a drag to have to respond to something else.

A few thoughts: For one thing, I'd note that Microsoft  has responded to this need ... in about 1999.  The XMLHTTPRequest object implemented since the second version of the MSXML library provides such a powerful and convenient way to use XML and HTTP together that this API has been implemented in competing platforms, and was featured in an XML.com article just last week.   Forgive us for not touting Microsoft's support for this newfangled "RESTful" approach, I guess people here thought it was old news :-)

When I read James Governor's post, I also wondered what he was expecting from a toolkit for supporting XML Web Services that just used POX (plain old XML). We've shipped multiple XML APIs and have libraries for working with HTTP in the .NET Framework. In MSXML, we shipped XMLHTTP which recently started making headlines again because Google has been using it heavily in their recent web applications such as Google Suggest. Microsoft has and will continue to ship libraries and tools that make it easier for developers to work with plain old XML over HTTP.
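The POX style needs no Web service toolkit at all: fetch an XML document over HTTP and pick out the elements you need. A minimal Python sketch (the payload below is a made-up stand-in for an HTTP response; in 2005 the same pattern is what XMLHTTP gave JavaScript in the browser):

```python
import xml.etree.ElementTree as ET

# In a real client the payload would come over HTTP, e.g.:
#   payload = urllib.request.urlopen("http://example.com/items.xml").read()
# This literal document stands in for such a response.
payload = """<items>
  <item><title>First</title><link>http://example.com/1</link></item>
  <item><title>Second</title><link>http://example.com/2</link></item>
</items>"""

root = ET.fromstring(payload)
titles = [item.findtext("title") for item in root.findall("item")]
print(titles)  # → ['First', 'Second']
```

No WSDL, no envelope, no generated proxy classes: the "contract" is just the shape of the XML, which is both the appeal and the limitation of plain old XML over HTTP.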

SOAP and the WS-* family of specifications target developer problems with more sophisticated requirements and needs than can be satisfied by simply using POX. As I stated in my recent post  On Interoperability and Tim Ewald's 3 Web Services Stacks

However context is everything. Replacing a world of distributed applications written with DCOM, CORBA, Java RMI, etc with one where they are written using the WS-* protocols is a step in the right direction. I completely agree that it leads to better interoperability across technologies produced by the big vendors who are using incompatible technologies today.

But when your plan is to reach as many parties as possible one should favor simpler Web services technologies like plain old SOAP or just plain old XML (aka POX). Plain old XML doesn't necessarily mean following the principles of REST either. After all a number of successful plain old XML services on the Web don't follow these principles to the letter. For example, Joe Gregorio pointed out in his article Amazon's Simple Queue Service that Amazon's Queueing Service violated these principles. In Sam Ruby's post entitled Vacant Space he points out that plain old XML web services exposed by Bloglines and del.icio.us aren't RESTful either.

Given that Don Box linked to the above post and commented favorably on it I believe this makes it clear that we at Microsoft understand that POX, SOAP, and WS-* each have a part to play in the world of XML Web services.


 

Categories: XML Web Services

For Her and For Him. Too bad I found out about these after I got my gifts. I guess there's always next year.


 

February 11, 2005
@ 04:57 AM

Steve Vinoski has a blog posting entitled Focus on the contract where he writes

Tim offers some extremely excellent advice (as usual) regarding what really matters when you write your services. If I may paraphrase what he says and perhaps embellish it a bit, starting from the implementation language and generating your contracts from it is just plain wrong, wrong, wrong, at least for systems of any appreciable magnitude, reach, or longevity. Instead, focusing on the contracts first is the way to go. I've written about this for many years now, starting well over a decade ago.

When you start with the code rather than the contract, you are almost certainly going to slip up and allow style or notions or idioms particular to that programming language into your service contract. You might not notice it, or you might notice it but not care. However, the guy on the other side trying to consume your service from a different implementation language for which your style or notions or idioms don't work so well will care.

Although Steve Vinoski's argument sounds convincing, there is one problem with it. It is actually much easier to make an uninteroperable Web service if one starts with the service contract instead of with object oriented code. The reason for this is quite simple and one I've harped on several times in the past: the impedance mismatch between XSD and objects is quite significant. There are several constructs in W3C XML Schema that simply have no counterpart in traditional object oriented languages, which causes current XML Web service toolkits to barf when consuming them. For example, the XmlSerializer class in the .NET Framework supports about half the constructs in W3C XML Schema. Most XML Web Service toolkits support a similar number [but different set] of features of W3C XML Schema.
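To make the mismatch concrete, here is the kind of construct I have in mind. This fragment is purely illustrative (not from any real service contract): a type that allows mixed content plus an unbounded choice of child elements.

```xml
<!-- Illustrative only. "Text freely interleaved with a repeating
     choice of elements" has no natural representation as a class
     with typed fields, so schema-to-object tools tend to reject
     such types or flatten them lossily. -->
<xs:complexType name="Section" mixed="true"
    xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:choice minOccurs="0" maxOccurs="unbounded">
    <xs:element name="para" type="xs:string"/>
    <xs:element name="code" type="xs:string"/>
  </xs:choice>
</xs:complexType>
```

A schema author working contract first can write this without a second thought; a developer generating classes from it finds out the hard way that the toolkit has no idea what to do with it.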

This isn't theoretical. More than once while I was the program manager for XML Schema technologies in the .NET Framework, I had to take conference calls with customers who'd been converted to the 'contract first' religion only to find out that toolkits simply couldn't handle a lot of the constructs they were putting in their schemas. Those conversations were never easy.

The main thing people fail to realize when they go down the 'contract first' route is that it is quite likely that they have also gone down the 'XML first' route which most of them don't actually want to take. Folks like Tim Ewald don't mind the fact that sometimes going 'contract first' may mean they can't use traditional XML Web Service toolkits but instead have to resort to SAX, DOM and XSLT. However for many XML Web Service developers this is actually a problem instead of a solution.


 

Categories: XML | XML Web Services

Tim Ewald has a post entitled My 3 Web Services Stacks where he writes

The point Chris was making was that you created a COM class, your choice of data type determined who could use your component. In other words, your choice affected how far your code could reach. There will be a similar split in the Web services world, hence the title of this post. This time though the spin will be a little different. It won't focus on data format so much (it's all just XML), but on behavior. To wit, “My 3 Web Service Stacks“:

First, there is basic SOAP over HTTP described with WSDL and XSD. It hits a sweet spot for developers who want tools to hide XML details but is still simple enough to implement everywhere. Then there is WS-*, which offers richer functionality but is harder to implement and will not be supported by all products, especially legacy systems. Finally, there are toolkit specific features like BEA's idiom for async messaging over HTTP using a header to convey a callback URL, Iona's Artix support for a CORBA binding, or Microsoft's Indigo support for binary messaging.

In this world, your choice of feature will decide how far you can reach: to everyone, to everyone using WS-* enabled tools and infrastructure, or only to others using the same toolkit you are.

I completely agree with this and in fact back when I was on the XML team I used to be frustrated that, given all the reams of text coming out of the XML Web Services Developer Center on MSDN, Microsoft never did a good job of explaining this to developers. Of course, this would require admitting that many developers don't require the functionality of the WS-* specs and that, due to their complexity, in certain cases they would actually hamper interoperability compared to plain old SOAP (aka POS). To some this may contradict Bill Gates's statements on interoperability in his recent executive memo Building Software That Is Interoperable By Design where he wrote

The XML-based architecture for Web services, known as WS-* ("WS-Star"), is being developed in close collaboration with dozens of other companies in the industry including IBM, Sun, Oracle and BEA. This standard set of protocols significantly reduces the cost and complexity of connecting disparate systems, and it enables interoperability not just within the four walls of an organization, but also across the globe. In mid-2003, Forrester Research said that up to a "ten-fold improvement in integration costs will come from service-oriented architectures that use standard software plumbing." 

However context is everything. Replacing a world of distributed applications written with DCOM, CORBA, Java RMI, etc with one where they are written using the WS-* protocols is a step in the right direction. I completely agree that it leads to better interoperability across technologies produced by the big vendors who are using incompatible technologies today.

But when your plan is to reach as many parties as possible one should favor simpler Web services technologies like plain old SOAP or just plain old XML (aka POX). Plain old XML doesn't necessarily mean following the principles of REST either. After all a number of successful plain old XML services on the Web don't follow these principles to the letter. For example, Joe Gregorio pointed out in his article Amazon's Simple Queue Service that Amazon's Queueing Service violated these principles. In Sam Ruby's post entitled Vacant Space he points out that plain old XML web services exposed by Bloglines and del.icio.us aren't RESTful either.

When building distributed applications using XML, one should always keep in mind the 3 web service stacks. My day job now involves designing XML web services for both internal and external consumption so I'll probably start writing more about web services in the coming months.


 

Categories: XML Web Services

February 10, 2005
@ 02:01 PM

Just saw Iran Promises 'Burning Hell' for Any Aggressor and N.Korea Says It Has Nuclear Arms, Spurns Talks. Looks like the world woke up on the wrong side of bed.


 

February 9, 2005
@ 03:05 PM

David Megginson (the creator of SAX) has a post entitled The complexity of XML parsing APIs where he writes

Dare Obasanjo recently posted a message to the xml-dev mailing list as part of the ancient and venerable binary XML permathread (just a bit down the list from attributes vs. elements, DOM vs. SAX, and why use CDATA?). His message including the following:

I don’t understand this obsession with SAX and DOM. As APIs go they both suck[0,1]. Why would anyone come up with a simplified binary format then decide to cruft it up by layering a crufty XML API on it is beyond me.

[0] http://www.megginson.com/blogs/quoderat/archives/2005/01/31/sax-the-bad-the-good-and-the-controversial/

[1] http://www.artima.com/intv/dom.html

I supposed that I should rush to SAX’s defense. I can at least point to my related posting about SAX’s good points, but to be fair, I have to admit that Dare is absolutely right – building complex applications that use SAX and DOM is very difficult and usually results in messy, hard-to-maintain code.

I think this is a pivotal part of the binary XML debate. The primary argument for binary serializations of XML is that certain parties want to get the benefit of the wide array of technologies for processing XML yet retain the benefits of a binary format such as reduced size on the wire and processing time. Basically having one's cake and eating it too.

For me, the problem is that XML is already being pulled in too many directions as it is. In retrospect I realize it was foolish for me to think that the XML team could come up with a single API that would satisfy people processing business documents written in WordprocessingML, people building distributed computing applications using SOAP, and developers reading & writing application configuration files. All of these scenarios use intersecting subsets of the full functionality of the XML specification. The SOAP specs go as far as banning some features of XML, while others are simply frowned upon because the average SOAP toolkit simply doesn't know what to do with them. One man's meat (e.g. mixed content) is another man's poison.

What has ended up happening is that we have all these XML APIs that expose a lot of the cruft of XML that most developers don't need, or even worse make things difficult in the common scenarios, because they want to support all the functionality of XML. This is the major failing of APIs such as the .NET Framework's pull model parser class, System.Xml.XmlReader, DOM and SAX. The DOM also has issues with the fact that it tries to support conflicting data models (DOM vs. XPath) and serialization formats (XML 1.0 & XML 1.0 + XML namespaces). At the other extreme we have APIs that try to simplify XML by only supporting specific subsets of its expressivity, such as the System.Data.DataSet and the System.Xml.XmlSerializer classes in the .NET Framework. The problem with these APIs is that the developer is dropped off a cliff once they reach the limits of the API's XML support and has to either use a different API or resort to gross hacks to get what they need done.
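The gap between the two camps is easy to see in code. Using Python's standard library as a stand-in for the APIs discussed above (the SAX model here is the same one Megginson created; the trivial config document is made up), compare a full-fidelity streaming API against a simplified tree API on the same document:

```python
import io
import xml.sax
import xml.etree.ElementTree as ET

DOC = "<config><timeout>30</timeout><retries>3</retries></config>"

# The "full fidelity" route: a SAX handler must track parser state
# by hand, even for a trivial document.
class SettingsHandler(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.settings = {}
        self._current = None
    def startElement(self, name, attrs):
        if name != "config":
            self._current = name
            self.settings[name] = ""
    def characters(self, content):
        if self._current:
            self.settings[self._current] += content
    def endElement(self, name):
        self._current = None

handler = SettingsHandler()
xml.sax.parse(io.BytesIO(DOC.encode()), handler)
print(handler.settings)  # → {'timeout': '30', 'retries': '3'}

# The "simplified" route: a tree API covers the common case in two
# lines, at the cost of hiding parts of full XML (entities, lexical
# detail, mixed content handling, ...).
root = ET.fromstring(DOC)
print({child.tag: child.text for child in root})
```

Both produce the same dictionary, but the simplified API stops being usable the moment the document needs a feature it hid, which is the cliff described above.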

Unfortunately one of the problems we had to deal with when I was on the XML team was that we already had too many XML APIs as it was. Introducing more would create developer confusion, but trying to change the existing ones would break backwards compatibility. Personally I'd rather see efforts made to create better toolkits and APIs for the various factions that use XML, to make it easier for them to get work done, than constant churning of the underlying format thus fragmenting it.


 

Categories: XML

I got an email from Shelly Farnham announcing a Social Computing Symposium, sponsored by Microsoft Research which will be held at Redmond Town Center on April 25-26. Below is an excerpt from the announcement

In the spring of 2004 a small two-day Social Computing Symposium, sponsored by Microsoft Research, brought together researchers, builders of social software systems, and influential commentators on the social technology scene...A second symposium at Microsoft Research is planned for April 25th-26th, 2005...

If you are interested in attending the symposium, please send a brief, 300-500 word position paper. The symposium is limited to 75 people, and participants will be selected on the basis of submitted position papers.

Position papers should not be narrowly focused on either academic study or industry practice. Rather, submissions should do one or more of the following: address theoretical and methodological issues in the design and development of social computing technologies; reflect on concrete experiences with mobile and online settings; offer glimpses of novel systems; discuss current and evolving practices; offer views as to where research is needed.

We are particularly interested in position papers that explore any of the following areas. However, given the symposium’s focus on new innovation in social technologies, we are open to other topics as well.

a) The digitization of identity and social networks.

b) Proliferation and use of social metadata.

c) Mobile, ubiquitous social technologies changing the way we socialize.

d) Micropublishing of personal content (e.g. blogs), and the democratization of information exchange and knowledge development.

e) Social software on the global scale: the impact of cross-cultural differences in experiences of identity and community.

Please send your symposium applications to scspaper@microsoft.com by February 28th.

I would like to attend, which means I have to cough up a position paper. I have three ideas for position papers: (i) Harnessing Latent Social Networks: Building a Better Blogroll with XFN and FOAF, (ii) Blurring the Edges of Online Communication Forms by Integrating Email, Blogging and Instant Messaging or (iii) Can Folksonomies Scale to Meet the Challenges of the Global Web?

So far I've shared these ideas with one person and he thought the first was the best. I assume some of the readers of my blog will be at this symposium. What would you like to see a presentation or panel discussion on?


 

February 9, 2005
@ 01:43 PM

The team I work for at MSN has open developer and test positions. Our team is responsible for the underlying infrastructure of services like MSN Messenger, MSN Spaces, MSN Groups and some of Hotmail. We frequently collaborate with other properties across MSN such as MyMSN and MSN Search, as well as with other product units across Microsoft, as in the case of Outlook Live. If you are interested in building world class software that is used by hundreds of millions of people and the following job descriptions interest you, then send me your resume.

Software Design Engineer (Developer)

The team builds key platforms that facilitate storing and sharing photo albums, files, blogs, calendars, buddy lists, favorites, etc. We are looking for a strong software developer with good problem solving skills and at least 2-3 years of experience with C++ or other programming languages. Experience in relational databases especially data modeling, schema design, developing stored procedures and programming against the databases is required. Candidates with good SQL knowledge and performance tuning databases will be preferred.

Software Design Engineer/Test (Tester)

The MSN Communication and Platform Services team is looking for a passionate and experienced SDET candidate with a strong programming, design, and server background to help us test next generation internet sharing and communication scenarios at MSN. You will be responsible for designing and creating the test automation and tools for the MSN Contacts service, which stores over 9 billion live MSN Hotmail and MSN Messenger contacts for over 400 million users. You should be able to demonstrate thorough understanding of testing methodologies and processes. Requirements for this position include strong skills in C++ or C#, design, problem solving, troubleshooting, proven experience with test case generation techniques or model based testing methodologies, communication (written & verbal), and a strong passion for quality. Your position will include key tasks such as writing test plans and test cases, working with PM and Development to provide them with integration and architecture feedback, working across teams with major partners such as Hotmail, Office, and Windows, and driving quality and performance related issues to resolution.

Email your resume to dareo@msft.com (replace msft with microsoft) if the above job descriptions sound like they are a good fit for you. If you have any questions about what working here is like, you can send me an email and I'll either follow up via email or my blog to answer any questions of general interest [within reason].


 

Categories: Life in the B0rg Cube | MSN

February 9, 2005
@ 12:27 PM

Abbie, who is one of the coolest MSN Spaces users around, has posted a collection of links to various posts showing how to get extra mileage out of MSN Spaces. Check out her post MSN Spaces Tips, Tricks, Gods and More. Some of my favorite links from her page include

Alerts For Your Space - want to set up alerts or learn how they work? Read here!

Edit It! Button - Scott's trick for obtaining additional blog editing features.

Guide to Trackbacks - What are trackbacks and how do you use them?

Edit! Help for FireFox Users - Some editing perks for you FireFox users.

Understanding Layout Customization - Learn your way around customizing your Space.

Minimizing Content Spam - Great post by Mike regarding spam in your Space.

Podcasting Your Space - Great information on how to set up your own podcast of your Space.

Deleting Your Space - What really happens when you delete your Space?

Take Too Long and Lose It - Did you know you could lose your post if you take too long to type it out?  Read and learn how to prevent it.

Add A GuestBook - a unique way to add a guestbook to your Space

Give Your FeedBack and Ideas for Spaces - Have an idea for Spaces? Get it heard here!

Who Owns Your Spaces Content - a small but great FYI post regarding your content

There is at least one neat trick that Abbie missed from her list. Jim Horne shows how to embed audio and video into a blog post on a Space. Excellent.


 

Categories: Mindless Link Propagation | MSN

February 7, 2005
@ 02:24 PM
"If you're Iraqi, you gotta look at America a little bit different. You gotta look at America like the uncle who paid for you to go to college ... but molested you."

The above sentence, slightly modified from a quote in Chris Rock's Never Scared, sums up my thoughts on the elections in Iraq.


 

Categories: Ramblings

We are now feature complete for the next release of RSS Bandit. Interested users can download it at RssBandit.1.3.0.18.Wolverine.Beta.zip. All bugs specific to the previous alpha release have been fixed. The next steps for us are to finish working on documentation, translations and fixing any bugs that are reported during the beta, with the intention of shipping a final release in the next two weeks.

Check out a screenshot of the Wolverine Beta which shows some of the new features such as newspaper styles and the additional columns in the listview. The major new features are listed below, but there are a couple of minor ones, such as proxy exclusion lists and better handling of SSL certificate issues, which aren't listed. There will be a comprehensive list of bug fixes and new features in the announcement for the final release.

NEW FEATURES (compared to v1.2.0.117)

  • Newspaper styles: Ability to view all unread posts in feeds and categories or all posts in a search folder in a Newspaper view. This view uses templates that are in the same format as those used by FeedDemon so one can use RSS Bandit newspaper styles in FeedDemon and vice versa. One can choose to specify a particular stylesheet for a given feed or category. For example, one could use the slashdot.fdxsl stylesheet for viewing Slashdot, headlines.fdxsl for viewing news sites in the News category and outlook-express.fdxsl for viewing blog posts.  

  • Column Chooser: Ability to specify which columns are shown in the list view from a choice of Headline, Topic, Date, Author, Comment Count, Enclosures, Feed Title and Flag. One can choose to specify a column layout for a particular feed or category. For example, one could use the layout {Headline, Author, Topic, Date, Comment Count} for a site like Slashdot but use one like {Headline, Topic, Date} for a traditional blog that doesn't have comments.

  • Category Properties: It is now possible to specify certain properties for all feeds within a category such as how often they should be updated or how long posts should be kept before being removed. 

  • Identities: One can create multiple personas with associated information (homepage, name, email address, signature, etc) for use when posting comments to weblogs that support the CommentAPI.

  • del.icio.us Integration: Users who have accounts on the del.icio.us service can upload links to items of interest directly from RSS Bandit.  

  • Skim Mode: Added option to 'Mark All Items As Read on Exiting a Feed' 

  • Search Folder Improvements: Made the following additions to the context menu for search folders; 'New Search Folder', 'Refresh Search', 'Mark All Items As Read' and 'Rename Search Folder'. Also, deletion of a search folder now prompts the user to prevent accidental deletion.

  • Item Deletion: News items can be deleted by either using the [Del] key or using the context menu. Deleted items go to the "Deleted Items" special folder and can be deleted permanently by emptying the folder or restored to the original feed at a later date.

  • UI Improvements: Tabbed browsers now use a multicolored border reminiscent of Microsoft OneNote.

POSTPONED FEATURES (will be in the NightCrawler release)

  • NNTP Support
  • Synchronizing state should happen automatically on startup/shutdown
  • Applying search filters to the list view 
  • Provide a way to export feeds from a category

 

Categories: RSS Bandit

When I used to work on the XML team at Microsoft, there were a number of people I interacted with who were so smart I used to feel like I learned something new every time I walked into their office. These folks include

  • Michael Brundage - social software geek before it was a hip buzzword, XML query engine expert and now working on the next generation of XBox

  • Joshua Allen - semantic web and RDF enthusiast; if not for him I'd dismiss the semantic web as a pipe dream evangelized by a bunch of theoreticians who wouldn't know real code if it jumped them in the parking lot and defecated on their shoes; he now works at MSN but not on anything related to what I work on

  • Erik Meijer - programming language god and leading mind behind X#, Xen and Cω; he is a co-inventor on all my patent applications, most of which started off with me coming into his office to geek out about something I was going to blog about

  • Derek Denny-Brown - XML expert from back when Tim Bray and co. were still trying to figure out what to name it, one heckuva cool cat

Anyway, that was a bit of a digression before posting the link mentioned in the title of this post. Michael Brundage has an essay entitled Working at Microsoft where he provides some of his opinions on the good, the bad, and the in-between of working at Microsoft. One key insight is that Microsoft tends to have good upper management and poor middle management. This insight strikes very close to home, but I know better than to give examples of the latter in a public blog post. Rest assured it is very true, and the effects on the company have cost it millions, if not billions, of dollars.


 

Categories: Life in the B0rg Cube

One of the biggest assumptions I had about software development was shattered when I started working on the XML team at Microsoft. This assumption was that standards bodies know what they are doing and produce specifications that are indisputable. However, I've come to realize that the problems of design by committee affect illustrious names such as the W3C and IETF just like everyone else. These problems become even more pernicious when trying to combine technologies defined in multiple specifications to produce a coherent end to end application.

An example of the problems caused by contradictions in core specifications of the World Wide Web is summarized in Mark Pilgrim's article, XML on the Web Has Failed. The issue raised in his article is that determining the encoding to use when processing an XML document retrieved off the Web via HTTP, such as an RSS feed, is covered by at least three specifications which somewhat contradict each other: XML 1.0, HTTP 1.0/1.1 and RFC 3023. The bottom line is that most XML processors, including those produced by Microsoft, ignore one or more of these specifications. In fact, if applications suddenly started following all these specifications to the letter, a large number of the XML documents on the Web would be considered invalid. By Mark Pilgrim's measurement, about 40% of 5,000 RSS feeds chosen at random would be considered invalid, even though they'd work in almost all RSS aggregators and be considered well-formed by most XML parsers, including the System.Xml.XmlTextReader class in the .NET Framework and MSXML.
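The contradiction is easy to see in code. Below is a hedged Python sketch, not a description of how any particular aggregator or Microsoft parser actually behaves, of one defensible precedence order: trust the HTTP charset parameter first (as RFC 3023 requires), fall back to the XML declaration, then to the XML 1.0 default of UTF-8. A client that instead trusts the XML declaration first will decode some documents differently, which is exactly the interoperability problem Pilgrim describes.

```python
# A sketch of one possible encoding-resolution policy for XML over HTTP.
# Real-world clients disagree on this precedence, which is the point.
import re

def guess_encoding(content_type, body):
    """Pick a decode encoding for an XML document fetched over HTTP."""
    # 1. charset parameter in the Content-Type header, per HTTP/RFC 3023
    m = re.search(r'charset=["\']?([\w.-]+)', content_type or "")
    if m:
        return m.group(1).lower()
    # 2. encoding pseudo-attribute in the XML declaration, per XML 1.0
    m = re.match(rb'<\?xml[^>]*encoding=["\']([\w.-]+)["\']', body)
    if m:
        return m.group(1).decode("ascii").lower()
    # 3. fall back to the XML 1.0 default of UTF-8
    return "utf-8"

# The conflict in action: the header and the declaration disagree, and
# this policy sides with the header.
enc = guess_encoding("text/xml; charset=iso-8859-1",
                     b"<?xml version='1.0' encoding='utf-8'?><feed/>")
```

Note this sketch deliberately ignores further wrinkles, such as RFC 3023's rule that text/xml with no charset parameter defaults to us-ascii rather than to the XML declaration.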

The newest example of XML specifications that should work together but instead become a case of putting square pegs in round holes is Daniel Cazzulino's article, W3C XML Schema and XInclude: impossible to use together??? which points out

The problem stems from the fact that XInclude (as per the spec) adds the xml:base attribute to included elements to signal their origin, and the same can potentially happen with xml:lang. Now, the W3C XML Schema spec says:

3.4.4 Complex Type Definition Validation Rules

Validation Rule: Element Locally Valid (Complex Type)
...

3 For each attribute information item in the element information item's [attributes] excepting those whose [namespace name] is identical to http://www.w3.org/2001/XMLSchema-instance and whose [local name] is one of type, nil, schemaLocation or noNamespaceSchemaLocation, the appropriate case among the following must be true:

And then goes on to detailing that everything else needs to be declared explicitly in your schema, including xml:lang and xml:base, therefore :S:S:S.

So, either you modify all your schemas so that each and every element includes those attributes (either by inheriting from a base type or using an attribute group reference), or your validation is bound to fail if someone decides to include something. Note that even if you could modify all your schemas, sometimes it means you will also have to modify the semantics, as a simple-typed element which you may have (with the type inheriting from xs:string for example) now has to become a complex type with simple content model only to accommodate the attributes. Ouch!!! And what's worse, if you're generating your API from the schema using tools such as xsd.exe or the much better XsdCodeGen custom tool, the new API will look very different, and you may have to make substantial changes to your application code.

This is an important issue that should be solved in .NET v2, or XInclude will be condemned to poor adoption in .NET. I don't know how other platforms will solve the W3C inconsistency, but I've logged this as a bug and I'm proposing that a property is added to the XmlReaderSettings class to specify that XML Core attributes should be ignored for validation, such as XmlReaderSettings.IgnoreXmlCoreAttributes = true. Note that there are a lot of Ignore* properties already so it would be quite natural.

I believe this is a significant bug in W3C XML Schema in that it requires schema authors to declare up front in their schemas where xml:lang or xml:base may occur in their documents. Since I used to be the program manager for XML Schema technologies in the .NET Framework, this issue would have fallen on my plate. I spoke to Dave Remy, who took over my old responsibilities, and he's posted his opinion about the issue in his post XML Attributes and XML Schema. Based on the discussion in the comments to his post, it seems the members of my old team are torn on whether to go with a flag or try to push an errata through the W3C. My opinion is that they should do both. Pushing an errata through the W3C is a time consuming process, and in the meantime using XInclude in combination with XML Schema is significantly crippled on the .NET Framework (or on any other platform that supports both technologies). Sometimes you have to do the right thing for customers instead of being ruled by the letter of standards organizations, especially when it is clear they have made a mistake.
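For concreteness, here is a sketch of the workaround Daniel describes: explicitly admitting the xml: attributes by importing the W3C-hosted schema for the xml namespace. The chapter element name is just for illustration; the import namespace and schemaLocation are the real W3C values.

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- Pull in the W3C-provided schema for the xml: namespace -->
  <xs:import namespace="http://www.w3.org/XML/1998/namespace"
             schemaLocation="http://www.w3.org/2001/xml.xsd"/>

  <xs:element name="chapter">
    <xs:complexType>
      <xs:simpleContent>
        <xs:extension base="xs:string">
          <!-- Explicitly allow the attributes XInclude may add -->
          <xs:attribute ref="xml:base"/>
          <xs:attribute ref="xml:lang"/>
        </xs:extension>
      </xs:simpleContent>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Note what used to be a simple-typed string element has had to become a complex type with simple content just to admit the attributes, which is exactly the semantic churn the quoted post complains about.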

Please vote for this bug on the MSDN Product Feedback Center.


 

Categories: XML