Hi,
When using wget (version 1.9.1 running on Debian Sarge) to download
files larger than 2 GB from an FTP server (proftpd), wget reports a
negative length and keeps downloading, but once the file has been
successfully downloaded it crashes (and therefore doesn't download the
rest of the files). Here is t
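The negative length reported above is consistent with the file size being held in a signed 32-bit integer, which wraps negative past 2^31 - 1 bytes (wget gained large-file support only in later releases). A minimal sketch of that arithmetic, using Python's struct module to reinterpret the size the way a 32-bit signed C long would hold it (the exact cause in wget 1.9.1 is an assumption here):

```python
# Reinterpret a byte count modulo 2**32 as a signed 32-bit integer,
# mimicking a file size stored in a 32-bit signed C long.
import struct

def as_signed_32(n):
    """Return n & 0xFFFFFFFF read back as a signed 32-bit value."""
    return struct.unpack("<i", struct.pack("<I", n & 0xFFFFFFFF))[0]

print(as_signed_32(2_147_483_647))   # 2147483647  (largest size that fits)
print(as_signed_32(2_500_000_000))   # -1794967296 (a ~2.5 GB file "goes negative")
```

Any size between 2 GiB and 4 GiB lands in the negative range, which matches the symptom reported here: the length displays as negative while the actual transfer continues.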
Hello,
wget, win32 rel. crashes with huge files.
regards
[EMAIL PROTECTED]
==> Command Line
wget -m ftp://f
Arndt Humpert <[EMAIL PROTECTED]> writes:
> wget, win32 rel. crashes with huge files.
Thanks for the report. This problem has been fixed in the latest
version, available at http://xoomer.virgilio.it/hherold/ .
The obvious problem is that this command lacks --keep-session-cookies,
and the cookie it gets is session-based.
I tried to reproduce the bug in a more generic way.
But there are other problems
as well: if you examine the cookie.txt produced by (the amended
version of) the first command, you'll no
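The session-cookie behavior described above can be illustrated without wget itself: the Netscape cookies.txt format (which wget's --save-cookies writes and Python's http.cookiejar also reads) marks a session cookie with an empty expiry field, and such cookies are skipped on load unless explicitly kept. A small sketch (the domain and cookie names are made up for illustration; loading with ignore_discard=True plays the role of wget's --keep-session-cookies):

```python
# Demonstrate how session cookies (empty expiry field) are dropped from a
# Netscape-format cookies.txt unless explicitly kept.
import http.cookiejar
import os
import tempfile

lines = [
    "# Netscape HTTP Cookie File",
    # fields: domain, domain_specified, path, secure, expires, name, value
    "\t".join([".example.com", "TRUE", "/", "FALSE", "1999999999",
               "persistent_id", "keepme"]),
    # empty expires field -> a session cookie
    "\t".join([".example.com", "TRUE", "/", "FALSE", "",
               "session_id", "dropme"]),
]

fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("\n".join(lines) + "\n")

jar = http.cookiejar.MozillaCookieJar()
jar.load(path)                        # default: session cookies discarded
print(sorted(c.name for c in jar))    # ['persistent_id']

jar.load(path, ignore_discard=True)   # analogue of --keep-session-cookies
print(sorted(c.name for c in jar))    # ['persistent_id', 'session_id']

os.unlink(path)
```

This is why a login cookie that is session-based never survives into the file handed to the second wget invocation unless --keep-session-cookies is given.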
[EMAIL PROTECTED] writes:
>> Is there a publicly accessible site that exhibits this problem?
>
> I've set up a small example which illustrates the problem. Files can
> be found at http://dev.mesca.net/wget/ (using demo:test as login).
Thanks for setting up this test case. It has uncovered at l
[EMAIL PROTECTED] writes:
> I tried to download the European Constitution in English from
>
> http://europa.eu.int/eur-lex/lex/en/treaties/dat/12004V/htm/12004V.html
>
> with the following wget command:
>
> wget -r -l 2
> http://europa.eu.int/eur-lex/lex/en/treaties/dat/12004V/htm/12004V.html
>