It seems that wget uses a signed 32-bit value for the HTTP Content-Length.  I
haven't looked at the code, but this appears to be what is happening.

The problem is that when a file larger than about 2GB is downloaded, wget
reports negative numbers for its size and quits the download right after it
starts.

I would assume that somewhere there is a loop that looks something like:

while( "what I've downloaded" < "what I think the size is" )
{
        //do some more downloading.
}

And after the first read from the stream, the loop condition fails, because the
amount read is indeed larger than a negative number, so it exits.

Of course, this is all speculation on my part about what the code looks like,
but nonetheless, the bug does exist on both Linux and Cygwin.

Thanks,

Matt

---------------------------
BTW:
great job, really...  
on wget and all the GNU software in general...
THANKS
