Quoting [EMAIL PROTECTED]:
> When downloading files larger than 2GB, the displayed file size is wrong:
This is a Frequently Asked Question:
http://www.gnu.org/software/wget/faq.html#3.1
// Ulf
Quoting Alan Robinson <[EMAIL PROTECTED]>:
> When downloading a 4.2 gig file (such as from
> ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_of_the_4th_ave.mpeg ),
> the status text (i.e.
> 100%[+===>] 38,641,328 213.92K/s ETA
> ) shows an incorrect size.
Quoting Rainer Zocholl <[EMAIL PROTECTED]>:
> But today I have to report the first problem:
> the attempt to download a 3.3GB file (SUSE Linux image)
> failed in several ways:
> wget seems to get an integer wrap-around,
> showing "-931,424,256" as the size...
It is a Frequently Asked Question, with the answer that people are working on it.
Quoting Cezary Sliwa <[EMAIL PROTECTED]>:
> What about downloading files over 2GB on 32-bit platforms?
It is a Frequently Asked Question, with the answer that people are working on
it.
// Ulf
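For reference, the wrap-around described in these reports is what you get when a
file size is kept in a signed 32-bit integer (a 32-bit "long", as Wget used
before large-file support): anything past 2^31-1 bytes goes negative. A minimal
sketch in C; the 3.3 GB constant below is not from the report, it is chosen only
so the output reproduces the quoted "-931,424,256":

/* Minimal sketch of the wrap-around: a file size stored in a signed 32-bit
 * integer cannot hold values past 2^31-1, so a ~3.3 GB size comes out
 * negative. The constant is picked only to reproduce the reported value. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    long long real_size = 3363543040LL;        /* ~3.3 GB, larger than 2^31-1 */
    int32_t   wrapped   = (int32_t) real_size; /* what a 32-bit counter sees  */

    printf("real size:    %lld bytes\n", real_size);
    printf("wrapped size: %d bytes\n", (int) wrapped);  /* -931424256 */
    return 0;
}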
Quoting Christoph Anton Mitterer <[EMAIL PROTECTED]>:
> It seems that the joecartoon.com server sends the gzip file
> intentionally with an appended 0xA (perhaps it is even an error).
Can you check if the additional 0xA byte is included in the Content-Length or
not? Does it increase the C-L by one?
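One way to check, sketched below: send a raw HTTP/1.0 GET, print the response
headers so the Content-Length can be read off, and count the body bytes actually
received, also noting whether the last byte is 0x0A. The host and path are
placeholders, not the actual joecartoon.com URL.

/* Sketch only: raw HTTP/1.0 GET that echoes the response headers and counts
 * the body bytes received, so the Content-Length header can be compared with
 * the real body size and the last byte checked for a trailing 0x0A.
 * Host and path are placeholders. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/types.h>
#include <sys/socket.h>

int main(void)
{
    const char *host = "www.example.com";  /* placeholder server   */
    const char *path = "/file.swf.gz";     /* placeholder resource */

    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(host, "80", &hints, &res) != 0)
        return 1;
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
        return 1;
    freeaddrinfo(res);

    char req[512];
    snprintf(req, sizeof req,
             "GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n",
             path, host);
    write(fd, req, strlen(req));

    long body_bytes = 0;
    int  in_body = 0, last = -1;
    char c, tail[4] = {0};
    while (read(fd, &c, 1) == 1) {
        if (!in_body) {
            putchar(c);                 /* echo headers, incl. Content-Length */
            memmove(tail, tail + 1, 3);
            tail[3] = c;
            if (memcmp(tail, "\r\n\r\n", 4) == 0)
                in_body = 1;            /* blank line: body starts here */
        } else {
            body_bytes++;
            last = (unsigned char) c;
        }
    }
    close(fd);

    printf("\nbody bytes received: %ld\n", body_bytes);
    printf("last body byte: 0x%02X\n", last);
    return 0;
}

In practice it may be simpler to run wget with -S (which prints the server's
response headers) and compare the Content-Length shown there against the size
of the saved file.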
Quoting Jan Minar <[EMAIL PROTECTED]>:
> (2) Use alternative retrieval programs, such as pavuk, axel, or
> ncftpget.
FWIW, pavuk is much worse security-wise than wget. I've been working on patching
pavuk for a few months, and it has lots of strcpy() and sprintf() calls that
lead to buffer overflows.
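To make the risk concrete, here is a minimal sketch (not pavuk's actual code) of
the pattern being described: copying attacker-controlled input into a fixed-size
buffer with strcpy()/sprintf() versus bounding the write with snprintf().

/* Sketch of the unsafe pattern vs. a bounded alternative. */
#include <stdio.h>
#include <string.h>

static void unsafe_copy(const char *url)
{
    char buf[64];
    strcpy(buf, url);              /* overflows buf if url is >= 64 bytes   */
    sprintf(buf, "GET %s", url);   /* same problem, plus the format prefix  */
    (void) buf;
}

static void safer_copy(const char *url)
{
    char buf[64];
    /* snprintf() never writes more than sizeof buf bytes and always
     * NUL-terminates, so an oversized url is truncated instead of
     * smashing the stack. */
    snprintf(buf, sizeof buf, "GET %s", url);
    (void) buf;
}

int main(void)
{
    char long_url[256];
    memset(long_url, 'A', sizeof long_url - 1);
    long_url[sizeof long_url - 1] = '\0';

    safer_copy(long_url);          /* fine: input is truncated */
    /* unsafe_copy(long_url); */   /* uncommenting this demonstrates the
                                    * stack buffer overflow (undefined
                                    * behaviour, typically a crash).     */
    (void) unsafe_copy;
    return 0;
}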
Hello,
I have found that it's possible for a malicious FTP server to crash GNU
Wget by sending malformed directory listings. Wget will parse them without
checking whether they are in the proper format. It will do a fixed number
of strtok() calls and then atoi() calls, and with the wrong format, strtok()
returns NULL, so the following atoi() call dereferences a null pointer and
crashes Wget.
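A minimal sketch (not Wget's actual listing parser) of that failure mode: a fixed
sequence of strtok() calls with no NULL checks, followed by atoi() on the
results. A listing line with too few fields makes strtok() return NULL, and
atoi(NULL) then crashes; the checked variant shows the obvious fix. The line
format used here is illustrative, not the real FTP listing syntax.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Expects something like "213 1234567 file.iso": a code, a size, a name. */
static long parse_size_unchecked(char *line)
{
    char *code = strtok(line, " ");
    char *size = strtok(NULL, " ");   /* NULL if the field is missing */
    (void) code;
    return atoi(size);                /* crashes when size == NULL    */
}

static long parse_size_checked(char *line)
{
    char *code = strtok(line, " ");
    char *size = strtok(NULL, " ");
    if (code == NULL || size == NULL)
        return -1;                    /* reject malformed listings    */
    return atoi(size);
}

int main(void)
{
    char good[]  = "213 1234567 file.iso";
    char good2[] = "213 1234567 file.iso";
    char bad[]   = "garbage-from-a-malicious-server";

    printf("unchecked, well-formed line: %ld\n", parse_size_unchecked(good2));
    printf("checked,   well-formed line: %ld\n", parse_size_checked(good));
    printf("checked,   malformed line:   %ld\n", parse_size_checked(bad));
    /* parse_size_unchecked(bad) would pass NULL to atoi() and crash. */
    return 0;
}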