Dan Jacobson <[EMAIL PROTECTED]> writes:

> Phil> How about
> Phil> $ wget URI1 & wget URI2
>
> Mmm, OK, but unwieldy if many. I guess I'm thinking about e.g.,
> $ wget --max-parallel-fetches=11 -i url-list
> (hmm, with default=1 meaning not parallel, but sequential.)
I suppose forking would not be too hard, but dealing with output from forked processes might be tricky. Also, people would expect `-r' to "parallelize" as well, which would be harder yet.
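In the meantime, the effect of the proposed --max-parallel-fetches flag can be approximated from the shell with xargs, whose -P (--max-procs) option bounds the number of concurrent child processes. A sketch, shown as a dry run (echo prints the wget commands instead of executing them; drop the echo to actually fetch — the url-list contents here are made-up placeholders):

```shell
# Hypothetical url-list; in practice this holds the URLs to fetch.
printf '%s\n' http://example.org/a http://example.org/b http://example.org/c > url-list

# Run at most 4 fetches at a time, one URL per wget invocation.
# `echo` makes this a dry run; remove it to perform real downloads.
xargs -P 4 -n 1 echo wget < url-list
```

Note that this only parallelizes a flat list of URLs; the output-interleaving problem mentioned above still applies (each job writes to the same terminal), and it does nothing for recursive `-r' retrieval, where the URL set is discovered as the crawl proceeds.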