wget fails on files >2GB. Patch by Alvaro Ortega

2004-05-07 Thread Yusuf Goolamabbas
I was testing the Cherokee webserver on fc2test3 and found that wget doesn't support downloading files greater than 2GB. This was originally reported to me by Ali Ebrahim. The author of Cherokee has a patch for wget: http://www.alobbs.com/modules.php?op=modload&name=News&file=article&sid=380&mode

ftp size limit problem

2004-05-07 Thread Viktors Berstis
I was trying to use wget to download a Fedora ISO image, and it gave up at the 2,147,654,328 byte mark, which suggests that a 31- or 32-bit file size value is being used where a 64-bit value should be. The file size is 4,379,752,448 bytes. Am I missing some parameter, or does…
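For context, a minimal C sketch of the failure mode (illustrative only, not wget's actual source): a signed 32-bit byte counter cannot represent sizes at or past 2^31 = 2,147,483,648 bytes, while a 64-bit off_t handles the 4.3GB ISO with room to spare. Enabling large-file support at build time is the usual fix:

    /* Illustrative only, not wget's source: why downloads stall near
       2GB when sizes are tracked in a signed 32-bit integer, and why
       a 64-bit off_t fixes it. On 32-bit platforms, compile with
       -D_FILE_OFFSET_BITS=64 so that off_t is 64 bits wide. */
    #include <stdio.h>
    #include <stdint.h>
    #include <sys/types.h>

    int main(void)
    {
        int32_t wall = INT32_MAX;         /* 2,147,483,647: the 2GB wall */
        /* wall + 1 would be signed overflow (undefined behavior); in
           practice it wraps negative, so size/progress checks fail.  */

        off_t iso = (off_t)4379752448LL;  /* the 4.3GB Fedora ISO      */
        printf("int32 max     = %ld bytes\n", (long)wall);
        printf("sizeof(off_t) = %zu bytes\n", sizeof(off_t));
        printf("iso size      = %lld bytes\n", (long long)iso);
        return 0;
    }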

Re: wget: strdup: Not enough memory

2004-05-07 Thread Axel Pettinger
Hrvoje Niksic wrote:
> Axel Pettinger <[EMAIL PROTECTED]> writes:
>> Is there a reason for (or a way to avoid) the following message: "wget: strdup: Not enough memory." [1]
> Does Wget exit after the error, or does it keep running?
Wget terminates itself after the error. Is i…
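That behavior matches the checked-allocation pattern common in GNU tools: the allocation wrapper reports the failure and exits rather than returning NULL. A minimal sketch of the pattern — the name checked_strdup is hypothetical, not wget's actual identifier:

    /* Sketch of a checked strdup wrapper (illustrative; not wget's
       exact code). On allocation failure it prints a message and
       exits, which is why wget terminates after the error. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static char *checked_strdup(const char *s)
    {
        char *copy = strdup(s);
        if (copy == NULL) {
            fprintf(stderr, "wget: strdup: Not enough memory.\n");
            exit(1);      /* fail fast; don't continue with bad state */
        }
        return copy;
    }

    int main(void)
    {
        char *url = checked_strdup("http://example.com/");
        puts(url);
        free(url);
        return 0;
    }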

Retrieving html

2004-05-07 Thread J G Lawrence
I can retrieve an HTML page (with or without images) from my web site; however, I am unable to retrieve all of the HTML pages from one directory at one time. I have tried -A.html and *.html to no avail. Suggestions? Jim
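One common gotcha here (a suggestion, not necessarily what Jim missed): -A only filters files during a recursive retrieval, so without -r wget fetches nothing beyond the single URL given. A typical invocation, with a placeholder URL, would be:

    wget -r -l1 -np -A '*.html' http://example.com/dir/

Here -r enables recursion, -l1 limits the depth to one level, -np keeps wget from ascending to the parent directory, and -A restricts what is saved to names matching the pattern.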

Re: wget -o - outputs to file?

2004-05-07 Thread Hrvoje Niksic
"Arno Schuring" <[EMAIL PROTECTED]> writes: > The manual (man wget) doesn't say anything about redirecting the logs to > stdout; however, but since -O - is explicitly mentioned I figured I could > use the same for -o. Sorry about that. Since -o prints to stdout (ok, stderr) by default, I didn't

Re: wget: strdup: Not enough memory

2004-05-07 Thread Hrvoje Niksic
Axel Pettinger <[EMAIL PROTECTED]> writes: > Is there a reason for (or a solution to avoid it) the following > message: "wget: strdup: Not enough memory." [1] Does Wget exit after the error, or does it keep running?