Now that I've gotten Visual Studio.NET 2003 reinstalled on my machine, I've been working towards fixing the few bugs I own that have prevented us from shipping a release. The most important bug left to fix is [ 930282 ] Remote storage doesn't actually synchronize state, which exists because of laziness on my part.

RSS Bandit gives you the option to download and upload your feed list from a file share, an FTP server or a dasBlog weblog. However, this doesn't actually do much synchronization during the import phase; basically it just adds the feeds that don't currently exist in your aggregator. It doesn't synchronize read/unread message state, remove deleted feeds or remember which items you've flagged for follow up. I am in the process of fixing this for the next release.

I'm currently thinking that we'll break backwards compatibility with this feature. Synchronization will only work between current versions of RSS Bandit. You'll have a choice of two transfer formats: ZIP and SIAM. If you select ZIP then we'll synchronize your search folders, flagged items, replied items, subscribed feeds and read/unread message state, all transferred as a single ZIP file. The SIAM option will synchronize only subscribed feeds and read/unread message state, transferred as a SIAM document. The supported data sources will be WebDAV folders, network shares and FTP. I'm interested in any other options people think would be interesting to support.
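As a rough illustration of the ZIP option, the idea is just to bundle the various state files into one archive that can then be pushed to whichever data source you picked. This is a hypothetical sketch, not RSS Bandit's actual code (the file names and the `pack_state` helper are made up for illustration):

```python
import io
import zipfile

def pack_state(files):
    """Bundle state files (name -> bytes) into an in-memory ZIP archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files.items():
            zf.writestr(name, data)
    return buf.getvalue()

# Illustrative state files; the real application would serialize its
# subscriptions, search folders, flagged items, etc.
blob = pack_state({
    "subscriptions.xml": b"<feeds/>",
    "flagged-items.xml": b"<items/>",
})

# The resulting blob could then be copied to a network share, uploaded
# over FTP (e.g. ftplib.FTP.storbinary), or PUT to a WebDAV folder.
```

The nice property of a single archive is that the upload is atomic from the reader's point of view: another machine either sees the old archive or the new one, never a half-written mix of state files.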

There are a couple of interesting problems to solve before I'm done, mostly to do with how to perform the synchronization as quickly as possible. They revolve around scenarios like "What if my work machine has 3 months of posts while my home machine only has 2 weeks of posts in its cache and I synchronize between them?" or "What happens if I've read different posts from the same feed on my work machine and on my home machine?". The underlying issue is that replacing the existing information with the incoming information, while simple, leads to information loss.
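To make the information-loss problem concrete, here is a hedged sketch of a lossless merge: instead of letting one cache overwrite the other, take the union of items and OR the per-item state together, so nothing read or flagged on either machine is discarded. The data shapes and the `merge_caches` function are illustrative assumptions, not RSS Bandit's internal representation:

```python
def merge_caches(local, incoming):
    """Merge two {item_id: {"read": bool, "flagged": bool}} caches.

    Takes the union of item ids; an item counts as read (or flagged)
    if it is read (or flagged) in *either* cache, so state from both
    machines survives the merge.
    """
    merged = {}
    for item_id in set(local) | set(incoming):
        a = local.get(item_id, {})
        b = incoming.get(item_id, {})
        merged[item_id] = {
            "read": a.get("read", False) or b.get("read", False),
            "flagged": a.get("flagged", False) or b.get("flagged", False),
        }
    return merged

# Work machine has 3 months of posts; home machine has a different,
# overlapping slice with different read/flag state.
work = {"post-1": {"read": True,  "flagged": False},
        "post-2": {"read": False, "flagged": True}}
home = {"post-2": {"read": True,  "flagged": False},
        "post-3": {"read": False, "flagged": False}}

merged = merge_caches(work, home)
# "post-2" ends up both read (from home) and flagged (from work),
# and neither "post-1" nor "post-3" is lost.
```

A straight replace in either direction would have dropped either the flag on post-2 or the older post-1 entirely; the union-plus-OR approach loses nothing, at the cost of having to walk both caches.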

This should be fun; I've wanted this functionality for a while.


Thursday, April 22, 2004 3:57:34 PM (GMT Daylight Time, UTC+01:00)
How about SCP, SFTP, or some other way of securely transferring the files? (It's not that I care so much about who sees what it is that I write. It's more that I have a slight aversion to sending my password in cleartext over the internet.)
Thursday, April 22, 2004 6:22:27 PM (GMT Daylight Time, UTC+01:00)
How about actually placing the "working" files on an FTP/WebDAV/network folder store? Then I could even read my feeds on two machines simultaneously. (With some sort of caching and synch-in-background, of course.)

This would mean a massive backwards-compatibility break, and you'd possibly have to break the single XML files down into per-feed files, so that one doesn't have to upload 1MB files per synchronization step.

I am thinking of something like: store each item as [guid].xml inside a [feedID/Name] folder. When I read an item, the [feedID/Name]/[guid].xml file gets updated with a "read" attribute and is immediately uploaded to the storage folder. A small global file in the root folder contains a list of recently read/updated items, so that another RSS Bandit can check that file periodically (or manually) for updates. So the other Bandit sees "the status of feed item [feedID]/[guid].xml has been updated", reads the file, and updates its database.

This would still leave some room for "race conditions" but at least it solves the problem of not being forced into a master/slave situation.
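The per-item layout the commenter describes could be sketched roughly like this. Everything here is a hypothetical illustration of the scheme (the path convention, the guid hashing, and the `record_update` index helper are all made-up names, not anything RSS Bandit implements):

```python
import hashlib
import posixpath

def item_path(feed_id, guid):
    """Map a feed id and item guid to a stable remote path.

    Guids are often URLs, so hash them to get a filesystem-safe name
    for the [feedID]/[guid].xml layout.
    """
    safe_guid = hashlib.md5(guid.encode("utf-8")).hexdigest()
    return posixpath.join(feed_id, safe_guid + ".xml")

def record_update(index, feed_id, guid):
    """Append a changed item to the small root index that other
    clients poll, so they know which per-item files to re-fetch."""
    index.append(item_path(feed_id, guid))
    return index

# Marking an item read on one machine: update its file, then note
# the change in the root index for the other machine to pick up.
index = []
record_update(index, "example-feed", "http://example.com/post/1")
```

Because each read/unread change touches only one small per-item file plus the root index, a sync step uploads a few kilobytes instead of a megabyte-sized monolithic XML file, which is exactly the win the commenter is after (race conditions between two simultaneously active machines remain, as noted).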

My usual setup is like this: During the week I read my feeds on my work machine. On the weekend, I would like to read remaining items and get new items (=sync from master->slave), and when I return to the office, I'd like to synch from slave->master.

Just a thought. :-)