I forgot all about dryrun. This sounds like a great thing for somebody to run every so often and post the bad connections to the maintainers.
--
Alexander K. Hansen
Levitated Dipole Experiment
http://www.psfc.mit.edu/LDX


On Jan 13, 2004, at 7:46 PM, Gottfried Szing wrote:


I tried a different approach that does not download all the packages, since just checking whether the file exists on the mirror is enough.


I created a list of downloadable files with the help of fetch-all and the dryrun option, which prints a list of URLs. I always took the first URL in the list (dryrun prints the name of the package, the checksum, and a list of download locations) and checked for the file's existence with a HEAD request (curl supports this via the -I option), going through a proxy. This turned up some non-working locations (503s, 404s, and timeouts occurred).
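For illustration, such a check could look roughly like this (just a sketch: the file name urls.txt, the timeout, and the proxy address are placeholders, not the exact script):

    # urls.txt holds one download URL per line, taken from the
    # fetch-all dryrun output (first URL of each package).
    # -I sends a HEAD request, -s hides the progress meter,
    # -m 30 gives up after 30 seconds, -o /dev/null discards the
    # response headers, and -w prints just the status code and
    # the URL.  Add e.g. -x proxyhost:3128 to go through a proxy.
    while read url; do
      curl -I -s -m 30 -o /dev/null -w "%{http_code} $url\n" "$url"
    done < urls.txt | grep -v '^200'

A failed connection or timeout shows up as code 000, so the grep leaves exactly the broken locations.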

So this keeps traffic low (e.g. about 5 KB per package, which means roughly 15 MB for unstable with its ~3000 packages). Also, sorting the files by server allows curl to combine requests to the same server, which reduces the cost of connection setup; a sketch of that grouping follows below.
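Roughly like this (again only a sketch; the batch size and the RESULT marker are arbitrary choices):

    # Sort by host (field 3 when splitting on "/") so consecutive
    # URLs point at the same server, then check 50 URLs per curl
    # invocation; curl reuses the open connection for consecutive
    # URLs on the same host.  The RESULT marker lets us separate
    # the per-URL status lines from the HEAD response headers.
    sort -t/ -k3 urls.txt |
      xargs -n 50 curl -s -I -w 'RESULT %{http_code} %{url_effective}\n' |
      grep '^RESULT'

Filtering that output further with grep -v ' 200 ' again leaves only the problematic URLs.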

cu, gottfried
