Development Ideas - Request 2 New Features

 

Hi, I think the following two features would be very useful, and based on the
current functionality of Wget, they should not be too difficult to implement.
It would be great if they could be included in the next release of Wget.

 

Feature 1

ADD OPTION:

    --download-range=lower_offset,upper_offset

    --download-length=begin_offset,length

PURPOSE:

    Allow the user to download a specific part (from lower_offset to
    upper_offset) of a file and save that part to a separate file.

APPLICATION:

    When downloading a huge file, we can split it into several parts that can
    be downloaded concurrently by running multiple instances of Wget at the
    same time, in order to save time.

EXAMPLE: 

    Suppose a huge file example.rar (for example 2,000,000,000 bytes) is to
    be downloaded from http://host/example.rar. We can concurrently run 4
    instances of Wget as follows:

      wget --download-length=0,500000000 http://host/example.rar

      wget --download-length=500000000,500000000 http://host/example.rar

      wget --download-length=1000000000,500000000 http://host/example.rar

      wget --download-length=1500000000,500000000 http://host/example.rar

    Note:
      The above 4 parts should be automatically saved as
      example(0..499999999).rar, example(500000000..999999999).rar,
      example(1000000000..1499999999).rar and
      example(1500000000..1999999999).rar, respectively.

    After all four parts are downloaded, we can merge them into a single
    2 GB file, example.rar.
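Since both option forms map directly onto HTTP/1.1 Range requests
("Range: bytes=lower-upper"), the bookkeeping behind the example above can be
sketched in a few lines of Python. The helper names below are hypothetical
illustrations of the proposal, not existing Wget code:

```python
import os

def length_to_range(begin, length):
    # Convert the --download-length form (begin, length) to the
    # inclusive (lower, upper) pair used by --download-range.
    return begin, begin + length - 1

def range_header(lower, upper):
    # Inclusive byte range as an HTTP/1.1 Range header value.
    return "bytes=%d-%d" % (lower, upper)

def part_filename(name, lower, upper):
    # Name a downloaded part the way the example suggests:
    # ("example.rar", 0, 499999999) -> "example(0..499999999).rar"
    root, ext = os.path.splitext(name)
    return "%s(%d..%d)%s" % (root, lower, upper, ext)

def merge_parts(part_paths, out_path, chunk_size=1 << 20):
    # Concatenate the part files, in order, into the final file.
    with open(out_path, "wb") as out:
        for path in part_paths:
            with open(path, "rb") as part:
                while True:
                    buf = part.read(chunk_size)
                    if not buf:
                        break
                    out.write(buf)
```

For instance, --download-length=500000000,500000000 would correspond to the
header "Range: bytes=500000000-999999999" and the part file
example(500000000..999999999).rar.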

 

Feature 2

ADD OPTION: 

    --replace-range=lower_offset,upper_offset

    --replace-length=begin_offset,length

PURPOSE: 

    Allow the user to download a specific part (from lower_offset to
    upper_offset) of a file and automatically replace the corresponding part
    of the local file with the same name.

APPLICATION: 

    Sometimes a huge archive file has been downloaded but turns out to be
    corrupted for some reason. Since the file is very large, redownloading it
    entirely is a great waste of time. Usually, by some means (for example,
    the TEST function of an archive program), we can determine approximately
    which small part of the huge archive file contains erroneous data, and we
    only want to download and replace that corrupt part.

EXAMPLE: 

    Suppose a huge file example.rar (for example 2,000,000,000 bytes) has
    been downloaded, but unfortunately it is corrupted for some reason.
    Through the TEST function of WinRAR, we can infer that only the data
    after the 98% mark is erroneous, so we only want to redownload and
    replace the 98%~100% part of example.rar. We can run Wget as follows:

      wget --replace-range=1960000000,- http://host/example.rar

      OR:

      wget --replace-range=98%,- http://host/example.rar

    Note: here '-' represents the end of the file.
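A rough sketch of the replace semantics, again in Python. The offset grammar
(plain byte offset, "NN%", or "-" for end-of-file) and the function names are
my reading of the proposal, not existing Wget behavior:

```python
def parse_offset(spec, file_size):
    # One endpoint of --replace-range: an absolute byte offset,
    # a percentage such as "98%", or "-" meaning the last byte.
    if spec == "-":
        return file_size - 1
    if spec.endswith("%"):
        return file_size * int(spec[:-1]) // 100
    return int(spec)

def patch_file(path, offset, data):
    # Overwrite part of an existing local file in place, e.g. with
    # bytes fetched from the server via an HTTP Range request.
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)
```

With a 2,000,000,000-byte file, "98%" resolves to offset 1,960,000,000, so
replacing from there to '-' would fetch and overwrite only the last
40,000,000 bytes instead of redownloading the whole file.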

 

   

   

   Hank Hou
