Quoting Raphael Manfredi:
> I expect that a lot of PARQ entries will become obsolete at some point:
> servents that have requested them will disconnect and no longer be
> reachable. Only servents up 24x7 will be able to maintain their queued
> entries.

I'm not sure if this exactly applies, but it was something I had considered coding...
Because many of the files I'm looking for are quite rare but fall under a few distinct categories (say, early punk), I have run into situations where one host gave me a query hit for ten or so items I wanted, but was only online long enough to serve two or three of them. (I don't think Crass fans can afford always-on internet...) As a consequence, I regularly have a block at the top of my queue from the same non-existent hosts, and I was trying to think of a way to get them out of there. These entries would not die because, with only one slot per IP allowed, they would hardly ever reach the timeout limit. (Over a few days, they slowly die off.)
A reasonable solution, it seemed to me, would be to count the number of timeouts/connection failures for those hosts and remove all of their entries once that exceeds a user-defined threshold, treating it as the sign of a non-existent host. Of course, failures like "too many uploads" wouldn't count towards this, since they at least indicate that a servent is there.
Peace,
Clayton
_______________________________________________
Gtk-gnutella-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/gtk-gnutella-devel
