You may have noticed that, for the last few days, all links are redirected when clicked. Needless to say, IPs are checked for duplicates, and so on. The user agents are also matched against a list of some 400 known bots, crawlers and spiders. I'll post in more detail about the bot discovery process later in the week.
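The user-agent check could look something like the sketch below. This is only an illustration, not the actual implementation; the pattern list here is a hypothetical handful of entries standing in for the real list of ~400, and the function name is made up.

```python
import re

# Hypothetical excerpt of the known-bot list; the real list
# described above has some 400 entries.
BOT_PATTERNS = [
    "googlebot",
    "slurp",
    "msnbot",
    "crawler",
    "spider",
]

# One case-insensitive regex built from all patterns, so each
# user agent is checked in a single pass.
BOT_RE = re.compile("|".join(re.escape(p) for p in BOT_PATTERNS), re.IGNORECASE)

def is_bot(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known bot pattern."""
    return bool(BOT_RE.search(user_agent or ""))
```

A click handler would then skip counting the hit whenever `is_bot(request_user_agent)` is true, alongside the IP duplicate check.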
I'm still debating whether I want to include the number of reads on the page. I've pretty much decided against including it in the feeds, as it would cause aggregators that ignore modification dates (like Bloglines) to constantly reload the feed.
I'm still working out a few of the details, but as usual, let me know if there are any problems.