On 9 September 2015 at 11:20, Hubert Tarasiuk <[email protected]> wrote:
> On Sat, Aug 29, 2015 at 12:50 AM, Darshit Shah <[email protected]> wrote:
>> Thanking You,
>> Darshit Shah
>> Sent from mobile device. Please excuse my brevity
>>
>> On 29-Aug-2015 1:13 pm, "Tim Rühsen" <[email protected]> wrote:
>>>
>>> Hi,
>>>
>>> normally it makes much more sense when you have several download
>>> mirrors and checksums for each chunk. The perfect technique for this
>>> is called 'Metalink' (more on www.metalinker.org).
>>> Wget has it in branch 'master'. A GSOC project of Hubert Tarasiuk.
>>>
>> Sometimes evil ISPs enforce a per-connection bandwidth limit. In such
>> a case, multi-segment downloads from a single server do make sense.
>>
>> Since Metalink already has support for downloading a file over
>> multiple connections, it should not be too difficult to reuse that
>> code outside of Metalink.
>
> The current Metalink implementation in Wget will not download from
> multiple mirrors simultaneously, since Wget itself is single-threaded.
> Adding optional (POSIX) threads support to Wget (especially for
> Metalink) could perhaps be worth discussing.
> For now the solution might be to start multiple Wget instances using
> the --start-pos option and somehow limit the length of each download
> (I am not sure if Wget currently has an option to do that).
As was said in the discussion when we were about to introduce the
--start-pos option, the length of a download can be limited with other
utilities such as dd; keeping that logic out of Wget avoids extra
complexity. I have put together a proof-of-concept shell script that
starts multiple wget processes to download an HTTP file concurrently
[1]; a rough sketch of the approach follows below.

[1] Concurrent WGET with --start-pos option.
    https://gist.github.com/yousong/48266375afb68f9fb85f

Cheers,
yousong
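Not the gist itself, just a minimal sketch of the idea: it assumes GNU
wget >= 1.16 (for --start-pos), a server that reports Content-Length and
honors Range requests, and it uses head -c to cap each chunk where
dd bs=1 count="$len" would do the same job.

    #!/bin/sh
    # Download $1 in $2 (default 4) byte ranges, one wget per range.
    url=$1
    n=${2:-4}

    # Total size from the Content-Length header; --spider avoids
    # downloading the body, --server-response prints headers to stderr.
    size=$(wget --spider --server-response "$url" 2>&1 \
          | sed -n 's/.*Content-Length: *\([0-9]*\).*/\1/p' | tail -n1)
    chunk=$(( (size + n - 1) / n ))

    i=0
    while [ "$i" -lt "$n" ]; do
        start=$(( i * chunk ))
        len=$chunk
        [ $(( start + len )) -gt "$size" ] && len=$(( size - start ))
        part=$(printf 'part.%03d' "$i")
        # Each instance seeks into the file with --start-pos, streams
        # to stdout, and is truncated to its share of the bytes.
        wget -q -O - --start-pos="$start" "$url" \
            | head -c "$len" > "$part" &
        i=$(( i + 1 ))
    done
    wait

    # Zero-padded part names concatenate in the right order.
    cat part.* > output
    rm -f part.*

The parts are written in parallel and stitched together at the end, so
each wget keeps its own connection (and its own per-connection
bandwidth cap, where the ISP imposes one).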
