Re: [Fink-devel] unstable and wrong links

2004-01-16 Thread Gottfried Szing
hi alex

> I forgot all about dryrun.  This sounds like a great thing for somebody
> to run every so often and post the bad connections to the maintainers.
wow, just fyi: the situation is not that bad. i have run a dryrun and 
tested most of the urls. about 90 out of 3000 packages result in a 404 
and about 25 in a 503. there are also some other error codes like 30x 
and other 50x, but in total these are only about 180.

seems to be a full-time job to inform the maintainers of the packages. :)

anyway, you are doing a great job, gottfried

---
The SF.Net email is sponsored by EclipseCon 2004
Premiere Conference on Open Tools Development and Integration
See the breadth of Eclipse activity. February 3-5 in Anaheim, CA.
http://www.eclipsecon.org/osdn
___
Fink-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/fink-devel


[Fink-devel] unstable and wrong links

2004-01-13 Thread Gottfried Szing
hi girls and guys,

maybe this question has already been answered in the past, but i cannot 
find a definite answer in the archives or in the FAQ. i have already 
tried the user list, but with no success so far.

since i am using unstable i always have problems with the downloads. i 
mean that the download server has version X while the info file expects 
version Y, with X != Y, so the download obviously fails. ok, the FAQ 
recommends searching for the package and placing it into the /sw/src/ 
directory. sure, this works, if you are lucky enough to find the correct 
version of the file.

but there are some drawbacks:
- it does not work all the time (sometimes i cannot find the right version)
- it is tedious to search the net for so many packages.
- it hurts ease of use: i want to start the update, go for a beer, and 
come back in the morning when everything is done.

i understand that unstable means unstable and i expect some problems. 
why not? maintaining that many packages is really tough.

isn't there a way to automatically check the links and verify that the 
info files are consistent with the download locations? is there such a tool?

if the problems could be found in advance, i think this would also 
improve confidence in fink, even for inexperienced users. is there 
automatic testing of the info files?

maybe this is the wrong place for this post, but any comments and 
corrections are highly appreciated.

gottfried



Re: [Fink-devel] unstable and wrong links

2004-01-13 Thread Gottfried Szing
hi alex

so, for me this is quite fine, because i believe i know what happened 
and i have solved the problem for myself. for an inexperienced user, 
though, this could be a reason to stop using fink, because it can be 
frustrating. sure, nobody forces them to use unstable, but who does not 
run the latest version of the software on a desktop system?

> It's possible, though it seems like most people post to the mailing
> list--which means that the problem gets known.
yep, that's correct. but this requires the user either to subscribe to a 
mailing list or to register at SF to report a bug, both a really huge 
burden for the average mac os user. i think they are a little bit lazy. :)

> and even in the case that the user is using stable, there is the same
> problem (you mentioned it at the beginning of the mail). this means
> that fink and the packagers rely on user feedback if a package no
> longer exists (huh, user feedback, is this really working?). i mean,
> one way to improve normal users' confidence in fink is to ensure that
> at least the stable tree is consistent.
> Every so often somebody does a check by running fink fetch-all, which
> tries to download every package.  This could be done more
> regularly--it's tedious, though, because I think there's about 1000
> packages in the stable tree, and twice that in unstable.
i have tried a different approach which does not download all the 
packages, because just checking that the file exists on the mirror is 
enough.

i created a list of downloadable files with the help of fetch-all and 
the dryrun option, which prints a list of urls. i always took the first 
url in the list (dryrun prints the name of the package, the checksum, 
and a list of download locations) and checked for the existence of the 
file with a HEAD request (curl supports this with the -I option), in 
combination with a proxy. this brought up some non-working locations 
(503s, 404s, and timeouts).

so this produces fairly little traffic (e.g. 5kb per package, which 
means about 15mb for unstable with its roughly 3000 packages). also, 
sorting the urls by server allows curl to combine requests to the same 
server, which reduces the connection-setup cost.
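for what it's worth, the check described above could be sketched roughly 
like this. this is a hypothetical script, not part of fink: the url list, 
the injected head_request function, and all names are illustrative. the 
status-code check is injected so the grouping logic does not depend on 
the network.

```python
from collections import defaultdict
from urllib.parse import urlparse


def group_by_host(urls):
    """Group URLs by server, so requests to the same host can be
    issued together (like sorting the list for curl to reuse
    connections)."""
    groups = defaultdict(list)
    for url in urls:
        groups[urlparse(url).netloc].append(url)
    return dict(groups)


def check_urls(urls, head_request):
    """Return (url, status) pairs for every URL whose HEAD request
    did not come back 200.

    head_request(url) -> HTTP status code. In real use this could
    wrap curl -I or urllib; it is a parameter here so the function
    can be exercised without network access.
    """
    broken = []
    for host, host_urls in group_by_host(urls).items():
        for url in host_urls:  # one host at a time
            status = head_request(url)
            if status != 200:
                broken.append((url, status))
    return broken
```

with curl itself, something like `curl -sI -o /dev/null -w '%{http_code}\n' <url>` 
prints only the status code of a HEAD request, which is all this check needs.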

cu, gottfried


