Daniel Barkalow wrote:
> On Sat, 16 Apr 2005, Adam Kropelin wrote:
>> How about building a file list and doing a batch download via 'wget
>> -i /tmp/foo'? A quick test (on my ancient wget-1.7) indicates that
>> it reuses connections when successive URLs point to the same server.
>
> You need to look at some of the files before you know what other files
> to get. You could do it in waves, but that would be excessively
> complicated to code and not the most efficient anyway.

Ah, yes. Makes sense. How about libcurl or another HTTP client library, then? Minimizing dependencies on external libraries is good, but writing a really robust HTTP client is a tricky business. (Not that you aren't up to it; I just wonder if it's the best way to spend your time.)
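
For what it's worth, here's a rough sketch of what the libcurl route could look like (not git code; the URLs and filenames are made up). A single easy handle is reused for every request, so the connection to the server stays open between transfers, and each response can be inspected before deciding on the next URL:

#include <stdio.h>
#include <curl/curl.h>

/* Write whatever the server sends into the FILE * we were handed. */
static size_t write_to_file(void *ptr, size_t size, size_t nmemb, void *stream)
{
	return fwrite(ptr, size, nmemb, (FILE *)stream);
}

int main(void)
{
	/* Made-up URLs; in the real walker the next URL is only known
	   after the previous object has been parsed. */
	const char *urls[] = {
		"http://example.com/.git/objects/obj1",
		"http://example.com/.git/objects/obj2",
	};
	char filename[32];
	CURL *curl;
	int i;

	curl_global_init(CURL_GLOBAL_ALL);
	curl = curl_easy_init();
	if (!curl)
		return 1;

	for (i = 0; i < 2; i++) {
		FILE *out;
		sprintf(filename, "object-%d.tmp", i);
		out = fopen(filename, "wb");
		if (!out)
			break;
		curl_easy_setopt(curl, CURLOPT_URL, urls[i]);
		curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_to_file);
		curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
		/* Same handle each time, so libcurl keeps the connection
		   to the server open between transfers. */
		if (curl_easy_perform(curl) != CURLE_OK)
			fprintf(stderr, "failed to fetch %s\n", urls[i]);
		fclose(out);
		/* ...inspect the object here to decide what to fetch next... */
	}

	curl_easy_cleanup(curl);
	curl_global_cleanup();
	return 0;
}

Error handling, redirects, and parallel fetches would take more work, but the point is that the protocol details are libcurl's problem, not ours.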


--Adam
