> > Wget would be a bad choice unless someone fixes it so it doesn't
> > compress multiple slashes into one anymore.
>I usually work around this by encoding one of the slashes; i.e.
>wget -r http://127.0.0.1:8888/SSK%40whateverPAgM/foo/%2F

Calling wget with that trick has not worked for me: the URLs of the links inside the
downloaded HTML pages seem to be rewritten by wget, with double slashes collapsed
into single slashes.
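
For what it's worth, the encoding itself is easy to reproduce; here is a rough Python
sketch of the %2F trick (the key name is just the placeholder from the quoted example).
It only helps for the single URL you pass on the command line, though, not for the links
wget rewrites inside the fetched pages:

    from urllib.parse import quote

    # Placeholder freesite URL taken from the example above.
    base = "http://127.0.0.1:8888/SSK%40whateverPAgM/foo/"

    # Percent-encode one slash so "//" cannot be collapsed into "/".
    url = base + quote("/", safe="")
    print(url)  # http://127.0.0.1:8888/SSK%40whateverPAgM/foo/%2F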

But today I stumbled across "puf", which preserves double slashes perfectly and has
some other nice features, such as a configurable connection timeout and retry count
(data not found from fproxy -> automatic retry by puf; a rough sketch of that retry
idea follows the description below):

puf - Parallel URL fetcher

What is puf?
puf is a download tool for UNIX-like systems. You may use it to download single files
or to mirror entire servers. It is similar to GNU wget (and has a partly compatible
command line), but it can run many downloads in parallel. This is very useful if you
have a high-bandwidth internet connection. See the README for details (not very
informative at the moment).
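
To give an idea of what the automatic retry buys you over a plain single-shot fetch,
here is a rough Python sketch of the behaviour described above (not puf's actual code;
the timeout, retry count and delay values are made up for illustration):

    import time
    import urllib.request
    from urllib.error import HTTPError, URLError

    def fetch_with_retry(url, retries=5, timeout=30, delay=10):
        # Keep retrying while fproxy has not found the data yet (it answers
        # with an error until the key is retrieved) instead of giving up.
        for attempt in range(retries):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read()
            except (HTTPError, URLError):
                time.sleep(delay)  # wait a bit before the next attempt
        return None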

You can find the puf homepage and sources at:
http://puf.sourceforge.net/

It ./configured and compiled without problems on SuSE Linux 8.2 and runs smoothly.

Perhaps give it a try? ;)
