Hello Micah,

Thanks for the feedback on this. I was almost banging my head against
the wall trying to get this to work.
Is the change you mention below something that might be made? Is the
-i option used very often? It seems like it should be really useful.
Anyway, I'll be looking forward to a version with the update when it
comes. Thanks for your help.

Ray Sterner                       [email protected]
The Johns Hopkins University      North latitude 39.16 degrees.
Applied Physics Laboratory        West longitude 76.90 degrees.
Laurel, MD 20723-6099

On Wed, 28 Apr 2010, Micah Cowan wrote:

> Ray Sterner wrote:
> > Hello Micah,
> >
> > When I use wget to grab all the files from the ftp site they
> > download very quickly (relatively). That means it's possible to do.
>
> Yes. It's only a design flaw that prevents this from working on a
> list-of-URLs. The recursive-descent mode, by its very nature, almost
> has to reuse the same connection, but for some reason, the
> list-of-URLs mode doesn't.
>
> > I can see why making a new connection for each file in a list is a
> > reasonable default; they might be scattered all over the web.
>
> If they were, then it would make sense. But it can't be that hard to
> save the connection and check to see if we still have an open
> connection to the host, just as we do on HTTP. Though I suppose the
> main difference there is that HTTP doesn't have to keep track of what
> the current working directory is, the way FTP would need to.
>
> > I guess the get-all-the-files mode must use a single connection for
> > everything on the target site.
>
> Yup.
>
> > Maybe a useful new option would be one that tries to use the same
> > connection for as many files as it can in the given list.
>
> IMO, this doesn't need to be a new option. I don't believe anyone
> _wants_ the current behavior for URL-lists, and if for some reason
> they do, they could just run wget itself in a loop, giving it a
> single arg at a time.
>
> It just needs someone to make the change. :)
>
> --
> Micah J. Cowan
> http://micah.cowan.name/
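
For anyone who needs a workaround in the meantime, the loop Micah
mentions is straightforward. Assuming the URLs live one per line in a
file (urls.txt here is just a stand-in name), something like this gives
each URL its own wget invocation:

    while read -r url; do
        wget "$url"        # one process, one connection, per URL
    done < urls.txt

This reproduces exactly the one-connection-per-URL behavior that -i has
today, which is Micah's point: since the loop is always available to
anyone who wants per-URL connections, nothing would be lost by making
-i itself reuse connections.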
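
The "save the connection and check" idea he describes could be sketched
along these lines. This is only an illustration with invented names
(get_ftp_conn, open_conn, and the one-slot cache are not wget's actual
internals), and as Micah notes, a real patch would also have to track
the FTP working directory:

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    /* One-slot cache for the last FTP control connection
       (hypothetical; not wget's real data structures). */
    static int  cached_fd = -1;
    static char cached_host[256];

    /* Plain getaddrinfo/connect; returns a socket fd or -1. */
    static int open_conn(const char *host, const char *port)
    {
        struct addrinfo hints, *res, *rp;
        int fd = -1;

        memset(&hints, 0, sizeof hints);
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, port, &hints, &res) != 0)
            return -1;
        for (rp = res; rp != NULL; rp = rp->ai_next) {
            fd = socket(rp->ai_family, rp->ai_socktype, rp->ai_protocol);
            if (fd < 0)
                continue;
            if (connect(fd, rp->ai_addr, rp->ai_addrlen) == 0)
                break;
            close(fd);
            fd = -1;
        }
        freeaddrinfo(res);
        return fd;
    }

    /* Reuse the cached control connection when the next URL points
       at the same host; otherwise drop it and dial a fresh one.
       A real version would also compare and restore the server-side
       working directory, which HTTP never has to worry about. */
    int get_ftp_conn(const char *host)
    {
        if (cached_fd != -1 && strcmp(host, cached_host) == 0)
            return cached_fd;
        if (cached_fd != -1)
            close(cached_fd);
        cached_fd = open_conn(host, "21");
        if (cached_fd != -1)
            snprintf(cached_host, sizeof cached_host, "%s", host);
        return cached_fd;
    }

The same host-matching check is what wget already does for persistent
HTTP connections; the extra FTP wrinkle is only the login and working
directory state attached to the control connection.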
