One of the things that has always frustrated me about programming in C# is that it is such a hassle to return multiple values from a method. You either have to create a wrapper class whose entire purpose is to hold two or three variables, or even worse, use ref or out parameters. I used to get around this problem in C++ by using the std::pair utility class, since I often wanted to deal with an object plus some value associated with it. However, this approach quickly breaks down when you have more than two objects you want to associate temporarily for some processing.
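To make that concrete, here's a minimal sketch of what returning multiple values looks like in Python (min_max is a made-up helper, not from RSS Bandit): the function just returns a tuple and the caller unpacks it in one step, no wrapper class or out parameters in sight.

```python
def min_max(values):
    # Return two results at once as a tuple -- no wrapper class required
    return min(values), max(values)

lo, hi = min_max([3, 1, 4, 1, 5])  # tuple unpacking at the call site
print(lo, hi)  # prints: 1 5
```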

For example, in the Top Stories feature of RSS Bandit I have some code that operates on a URL, its weighted score and a list of all the posts that reference it. In C#, there's no good way to deal with those three objects as a single entity without wrapping them in a class definition. In Python, it's quite easy to do that using tuples. Compare the following two blocks of code and notice how I don't need the RelationHRefEntry and RankedNewsItem types in the Python version of the code.

C#:

    /* Tally the votes, only 1 vote counts per feed */
    // RelationHRefEntry is (Href, Score, References); RankedNewsItem is (NewsItem, Score)
    List<RelationHRefEntry> weightedLinks = new List<RelationHRefEntry>();
    foreach (KeyValuePair<RelationHRefEntry, List<RankedNewsItem>> linkNvotes in allLinks) {
        Dictionary<string, float> votesPerFeed = new Dictionary<string, float>();
        // pick the lower vote if multiple links from a particular feed
        foreach (RankedNewsItem voteItem in linkNvotes.Value) {
            string feedLink = voteItem.Item.FeedLink;
            if (votesPerFeed.ContainsKey(feedLink)) {
                votesPerFeed[feedLink] = Math.Min(votesPerFeed[feedLink], voteItem.Score);
            } else {
                votesPerFeed.Add(feedLink, voteItem.Score);
                linkNvotes.Key.References.Add(voteItem.Item);
            }
        }
        float totalScore = 0.0f;
        foreach (float value in votesPerFeed.Values) {
            totalScore += value;
        }
        linkNvotes.Key.Score = totalScore;
        weightedLinks.Add(linkNvotes.Key);
    }
    weightedLinks.Sort(delegate(RelationHRefEntry x, RelationHRefEntry y) { return y.Score.CompareTo(x.Score); });
    weightedLinks = weightedLinks.GetRange(0, numStories);

Python:

    # tally the votes, only 1 vote counts per feed
    weighted_links = []
    for link, votes in all_links.items():
        site = {}
        for weight, item, feedTitle in votes:   #tuple magic happens here            
            site[feedTitle] = min(site.get(feedTitle,1), weight)   #Python dictionaries are smarter than .NET’s 
        weighted_links.append((sum(site.values()), link))   #more tuple magic, no need for manual summing of values 
    weighted_links.sort()
    weighted_links.reverse()
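To see the tally end to end, here is a self-contained sketch of the same logic with made-up data (the URLs, post names and feed titles are hypothetical, not from RSS Bandit). It also shows why the plain sort() works: tuples compare element-wise, so sorting (score, link) pairs orders them by score.

```python
# Hypothetical vote data: each vote is a (weight, item, feed_title) tuple
all_links = {
    "http://example.com/a": [(0.5, "post1", "Feed One"),
                             (0.25, "post2", "Feed One"),
                             (0.75, "post3", "Feed Two")],
    "http://example.com/b": [(0.5, "post4", "Feed One")],
}

# tally the votes, only 1 vote counts per feed
weighted_links = []
for link, votes in all_links.items():
    site = {}
    for weight, item, feed_title in votes:
        site[feed_title] = min(site.get(feed_title, 1), weight)  # lowest vote per feed
    weighted_links.append((sum(site.values()), link))
weighted_links.sort()      # tuples compare element-wise, so score decides the order
weighted_links.reverse()   # highest score first

print(weighted_links)  # prints: [(1.0, 'http://example.com/a'), (0.5, 'http://example.com/b')]
```

For http://example.com/a only the lowest Feed One vote (0.25) counts, so its score is 0.25 + 0.75 = 1.0.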

Now playing: UGK - One Day

Wednesday, 05 December 2007 04:52:32 (GMT Standard Time, UTC+00:00)
Just wait until you start playing with an XPath 2.0 sequence inside of XSLT 2.0. It's at that moment in time when you realize "why have we been doing it the hard way for so many years?" ;-)
Wednesday, 05 December 2007 05:30:03 (GMT Standard Time, UTC+00:00)
C# 3.0 has anonymous types so you can type "var x = new{A=1,y.B,C=3}" and get x.A, x.B, etc.

using System.Linq;
...
var weightedLinks = new List<RelationHRefEntry>();

foreach (var linkNvotes in allLinks.Select(x => new { Link = x.Key, Votes = x.Value })) {
    var site = new Dictionary<string, float>();

    foreach (var vote in linkNvotes.Votes.Select(v => new { v.NewsItem.FeedLink, v.Score })) {
        site[vote.FeedLink] = site.ContainsKey(vote.FeedLink) ? Math.Min(site[vote.FeedLink], vote.Score) : vote.Score;
    }
}

weightedLinks.Add(site.Sum(x => x.Value)); //LINQ Sum
weightedLinks = weightedLinks.Reverse(); //LINQ Reverse
Wednesday, 05 December 2007 05:39:24 (GMT Standard Time, UTC+00:00)
That should be "weightedLinks = weightedLinks.OrderByDescending(x => x.Score)"
Wednesday, 05 December 2007 16:19:06 (GMT Standard Time, UTC+00:00)
Python is all good until you have millions of lines of both client and server code and then find that even with your scalable architecture there's still that sequential Python overhead code to run that trashes performance to unusable on both the client and server side. I would estimate that in 5 years we might have fast enough computers to meet the current needs. But not the needs 5 years from now. This was exactly the situation a couple of years back and it hasn't changed. Just a couple of weeks ago they announced being in the process of building one of Europe's largest supercomputers - and definitely the largest in the world for what they're using it for. Just because of Python I tell you. But it's hard to counter their argument that Python has also to reach this point especially when you take into account when they started their codebase.

Now IronPython might be a lot faster, but then again it doesn't provide the same things their custom implementation does, so porting is a no-go.
ac
Wednesday, 05 December 2007 16:23:01 (GMT Standard Time, UTC+00:00)
s/ they = Certain company famous for their Python use and sponsoring of Python perf improvements. also to = also allowed to
ac
Comments are closed.