I was testing the Cherokee webserver on fc2test3 and found that wget
doesn't support downloading files larger than 2GB. This was
originally reported to me by Ali Ebrahim.
The author of Cherokee has a patch to wget:
http://www.alobbs.com/modules.php?op=modload&name=News&file=article&sid=380&mode
I was trying to use wget to download a Fedora iso image and it gave up
at the 2,147,654,328 byte mark, which seems to indicate that a 31- or
32-bit file size value is being used where a 64-bit value should be.
The file size is 4379752448 bytes.
Am I missing some parameter, or does wget simply not support files
this large?
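The numbers in the report do line up with a signed 32-bit size field. A quick sketch of the arithmetic (plain shell, nothing wget-specific; the file size is the one quoted above):

```shell
#!/bin/sh
# The largest value a signed 32-bit file-size field can hold is
# 2^31 - 1 bytes, just under 2 GiB.
limit=$(( (1 << 31) - 1 ))
echo "signed 32-bit limit: $limit bytes"

# The ISO in the report is 4379752448 bytes, so a 32-bit size
# counter overflows long before the download finishes.
size=4379752448
if [ $(( size > limit )) -eq 1 ]; then
    echo "file exceeds the 32-bit limit"
fi
```

This is why the transfer dies a little past the 2,147,483,647-byte boundary rather than at the file's true size.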
Hrvoje Niksic wrote:
>
> Axel Pettinger <[EMAIL PROTECTED]> writes:
>
> > Is there a reason for (or a solution to avoid it) the following
> > message: "wget: strdup: Not enough memory." [1]
>
> Does Wget exit after the error, or does it keep running?
Wget terminates itself after the error. Is i
I can retrieve an html page (with or w/o images) from my web site;
however, I am unable to retrieve all of the html pages from one
directory at one time. I have tried -A.html and -A '*.html' to no
avail. Suggestions?
Jim
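A sketch of the usual answer (the URL is a placeholder, not from the thread): `-A` takes a comma-separated accept list, but it only filters files during a *recursive* retrieval, so `-r` is needed alongside it.

```shell
# -r           recurse
# -l 1         stay one level deep (just this directory's pages)
# -np          never ascend to the parent directory
# -A '*.html'  accept only names matching *.html
#              (quoting stops the local shell from expanding the glob)
# The command is echoed rather than executed so the sketch runs
# without network access.
url="http://example.com/docs/"
echo wget -r -l 1 -np -A '*.html' "$url"
```

Without `-r`, the `-A` pattern has nothing to filter, which would explain "to no avail" above.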
"Arno Schuring" <[EMAIL PROTECTED]> writes:
> The manual (man wget) doesn't say anything about redirecting the logs
> to stdout; but since -O - is explicitly mentioned, I figured I could
> use the same for -o.
Sorry about that. Since -o prints to stdout (ok, stderr) by default,
I didn't
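Since the point above is that Wget's log already goes to stderr by default, plain shell redirection captures it with no `-o` at all. A minimal sketch (the wget run is simulated by a function that writes to stderr, so it works offline):

```shell
# Stand-in for a real "wget URL" invocation; like wget, it sends
# its log output to stderr.
simulated_wget() { echo "downloaded index.html" >&2; }

# Equivalent in spirit to: wget URL 2> wget.log
simulated_wget 2> wget.log
cat wget.log
rm -f wget.log
```

The same redirection (`2>&1 | less`, `2> file`, etc.) works for the real command, since it is the shell, not wget, doing the work.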
Axel Pettinger <[EMAIL PROTECTED]> writes:
> Is there a reason for (or a solution to avoid it) the following
> message: "wget: strdup: Not enough memory." [1]
Does Wget exit after the error, or does it keep running?