Tony Luck wrote:
Otherwise this looks really nice.  I was going to script something
similar using "wget" ... but that would have made zillions of separate
connections.  Not so kind to the server.

How about building a file list and doing a batch download via 'wget -i /tmp/foo'? A quick test (on my ancient wget-1.7) indicates that it reuses connections when successive URLs point to the same server.
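Something along these lines, say (a rough Python sketch; the URLs are just placeholders, and 'wget -i' does the actual work):

    import os
    import subprocess
    import tempfile

    # Hypothetical list of files to fetch, all from the same server.
    urls = [
        "http://example.org/repo.git/objects/info/packs",
        "http://example.org/repo.git/refs/heads/master",
    ]

    # Write the list to a temp file and hand the whole batch to wget in
    # one invocation, so it can keep a connection open across
    # consecutive same-host URLs instead of reconnecting per file.
    with tempfile.NamedTemporaryFile("w", suffix=".urls", delete=False) as f:
        f.write("\n".join(urls) + "\n")
        listfile = f.name
    try:
        subprocess.run(["wget", "-q", "-i", listfile], check=True)
    finally:
        os.unlink(listfile)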


Writing yet another http client does seem a bit pointless, what with wget and curl available. The real win lies in creating the smarts to get the minimum number of files.
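The loose-object layout at least makes a start on the "minimum number of files" part cheap: before building the wget input list, filter out anything we already have locally. A sketch (assuming the standard .git/objects/xx/... paths; the helper name is made up):

    import os

    def missing_objects(wanted, git_dir=".git"):
        # A loose object abc123... lives at .git/objects/ab/c123...;
        # anything already present locally need not be fetched again.
        missing = []
        for sha1 in wanted:
            path = os.path.join(git_dir, "objects", sha1[:2], sha1[2:])
            if not os.path.exists(path):
                missing.append(sha1)
        return missing

    # The surviving hashes can then be turned into URLs and fed to 'wget -i'.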

--Adam
