Re: wget overwriting even with -c (bug?)

2004-06-12 Thread Petr Kadlec
H, sorry, I have just discovered that it was reported about a week ago (http://www.mail-archive.com/wget%40sunsite.dk/msg06527.html). I really did try to search for "overwrite", etc. in the archive, honestly. :-) But that e-mail does not use the word "overwrite" at all... Regards,

wget overwriting even with -c (bug?)

2004-06-12 Thread Petr Kadlec
Hi folks! Sometimes I experience very unpleasant behavior of wget (using a not-really-recent CVS version of wget 1.9, under W98SE). I have a partially downloaded file (usually a big one; an interrupted download is much less likely with a small file), so I want to finish the download

Re: [BUG] wget 1.9.1 and below can't download >=2G file on 32bits system

2004-05-27 Thread Hrvoje Niksic
Yup; 1.9.1 cannot download large files. I hope to fix this by the next release.

[BUG] wget 1.9.1 and below can't download >=2G file on 32bits system

2004-05-24 Thread Zhu, Yi
Hi, I use wget on an i386 Red Hat 9 box to download a 4 GB DVD image from an FTP site. The process stops at: $ wget -c --proxy=off ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso --12:47:24-- ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso => `FC2-i386-DVD.iso

Re: Maybe a bug or something else for wget

2004-05-23 Thread Jens Rösner
Hi Ben! Not a bug as far as I can see. Use -A to accept only certain files. Furthermore, the pdf and ppt files are located across various servers, so you need to allow wget to span hosts other than the original one with -H and then restrict it to only certain ones with -D. wget -nc -x -r -l2 -p
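
(For context, a hypothetical complete invocation along the lines Jens describes might be wget -r -l2 -H -Dhp.com -A pdf,ppt http://devresource.hp.com/drc/topics/utility_comp.jsp -- here -A keeps only the listed suffixes, -H lets the recursion leave the starting host, and -D limits it to the listed domains. The -Dhp.com value is an assumption; the real domain list depends on which servers actually host the files.)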

Maybe a bug or something else for wget

2004-05-23 Thread Gao, Ruidong
Hi, how can I download all pdf and ppt files from the following URL with a command line such as: wget -k -r -l 1 http://devresource.hp.com/drc/topics/utility_comp.jsp I am on Windows 2000 Server SP4 with the latest updates. E:\Release>wget -V GNU Wget 1.9.1 Copyright (C) 2003 Free Software Foundation,

Bug report: two spaces between filesize and Month

2004-05-03 Thread Iztok Saje
Hello! I just found a "feature" in an embedded system (no source) with an FTP server. In the listing, there are two spaces between the filesize and the month. As a consequence, wget always thinks the size is 0. In the procedure ftp_parse_unix_ls it just steps back one blank before cur.size is calculated. My quick hack is j
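
(For illustration only -- this is not wget's ftp_parse_unix_ls(), and the sample line is made up. A listing of the kind described might read "-rw-r--r--   1 user  group  1234567  Jun 12 10:00 file.bin", with two blanks before the month; stepping back exactly one blank from the month lands inside the gap and yields an empty size field, while splitting the line on runs of whitespace is tolerant of it. A minimal C sketch of the tolerant approach:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Split a Unix "ls -l" style listing line on runs of whitespace and
       read the size from the fifth field (perms, links, owner, group, size).
       Extra blanks between fields do not matter.  Illustrative only. */
    int main (void)
    {
      char line[] = "-rw-r--r--   1 user  group  1234567  Jun 12 10:00 file.bin";
      long size = -1;
      int field = 1;
      for (char *tok = strtok (line, " \t"); tok; tok = strtok (NULL, " \t"), field++)
        if (field == 5)
          size = strtol (tok, NULL, 10);
      printf ("size = %ld\n", size);   /* prints 1234567 */
      return 0;
    }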

May not be a Bug more a nice-2-have

2004-04-07 Thread Alexander Joerg Herrmann
Dear Reader, some may not really consider it a bug, so it is maybe more a nice-to-have. When I try to mirror the Internet pages I develop, http://www.nachttraum.de and http://www.felixfrisch.de, wget reports that Linux complains that the file name is too long. It is not exactly a bug, as I use CGI with

wget bug: directory overwrite

2004-04-05 Thread Juhana Sadeharju
Hello. Problem: When downloading all in http://udn.epicgames.com/Technical/MyFirstHUD wget overwrites the downloaded MyFirstHUD file with MyFirstHUD directory (which comes later). GNU Wget 1.9.1 wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Solution: Use of -E o

[BUG?] --include option does not use an exact match for directories

2004-03-28 Thread William Bresler
as nobody puts a new entry on the remote site which matches the -I values, but is not in the -X values, something which I cannot control. So, again, I say this is a bug. I see that frontcmp() is also called by (recur.c)download_child_p which is an HTTP function, so any possible patch would probab

wget bug report

2004-03-26 Thread Corey Henderson
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it bounced and said to try this email address. This bug report is for GNU Wget 1.8.2, tested on both RedHat Linux 7.3 and 9. rpm -q wget wget-1.8.2-9 When I use wget with -S to show the HTTP headers, and I use

Re: Bug report

2004-03-24 Thread Hrvoje Niksic
Juhana Sadeharju <[EMAIL PROTECTED]> writes: > Command: "wgetdir http://liarliar.sourceforge.net". > Problem: Files are named as > content.php?content.2 > content.php?content.3 > content.php?content.4 > which are interpreted, e.g., by Nautilus as manual pages and are > displayed as plain te

Bug report

2004-03-24 Thread Juhana Sadeharju
Hello. This is a report on some wget bugs. My wgetdir command looks like the following (wget 1.9.1): wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Bugs: Command: "wgetdir http://www.directfb.org". Problem: In file "www.directfb.org/index.html" the hrefs of type "/screen

wget bug in retrieving large files > 2 gig

2004-03-09 Thread Eduard Boer
Hi, While downloading a file of about 3,234,550,172 bytes with "wget http://foo/foo.mpg" I get an error: HTTP request sent, awaiting response... 200 OK Length: unspecified [video/mpeg] [ <=> ] -1,060,417,124 13.10
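
(The negative count is exactly what a signed 32-bit counter produces for this file: 3,234,550,172 - 2^32 = 3,234,550,172 - 4,294,967,296 = -1,060,417,124. Anything past 2^31 - 1 = 2,147,483,647 bytes no longer fits in a signed 32-bit integer, which is the same wraparound behind the other >2 GB reports on this page.)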

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-05 Thread Hrvoje Niksic
ne with > the url on the command line; the bug only happens when the url is > passed in with: > > cat < http://... > EOF But I cannot repeat that, either. As long as the consecutive slashes are in the query string, they're not stripped. > Using this method is necessary s

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-04 Thread D Richard Felker III
Connection: Keep-Alive > > ---request end--- > HTTP request sent, awaiting response... > ... > > The request log shows that the slashes are apparently respected. I retried a test case and found the same thing -- the slashes were respected. Then I remembered that I was using -i. Wget seems to work fine with the url on the command line; the bug only happens when the url is passed in with: cat <

Re: bug in use index.html

2004-03-04 Thread Dražen Kačar
Hrvoje Niksic wrote: > The whole matter of conversion of "/" to "/index.html" on the file > system is a hack. But I really don't know how to better represent > empty trailing file name on the file system. Another, for now rather limited, hack: on file systems which support some sort of file attri

Re: bug in use index.html

2004-03-04 Thread Hrvoje Niksic
The whole matter of conversion of "/" to "/index.html" on the file system is a hack. But I really don't know how to better represent empty trailing file name on the file system.

bug in use index.html

2004-03-04 Thread Василевский Сергей
Good day! I use wget 1.9.1. By default, wget converts every link to the site root "/" or "somedomain.com/" into "/index.html" or "somedomain.com/index.html". But some sites don't use index.html as the default page, and if you use timestamping and continue downloading a site over more than one session: 1. wget first downloads inde

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread Hrvoje Niksic
D Richard Felker III <[EMAIL PROTECTED]> writes: >> > Think of something like http://foo/bar/redirect.cgi?http://... >> > wget translates this into: [...] >> >> Which version of Wget are you using? I think even Wget 1.8.2 didn't >> collapse multiple slashes in query strings, only in paths. > > I

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread D Richard Felker III
On Mon, Mar 01, 2004 at 03:36:55PM +0100, Hrvoje Niksic wrote: > D Richard Felker III <[EMAIL PROTECTED]> writes: > > > The following code in url.c makes it impossible to request urls that > > contain multiple slashes in a row in their query string: > [...] > > That code is removed in CVS, so mul

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread Hrvoje Niksic
D Richard Felker III <[EMAIL PROTECTED]> writes: > The following code in url.c makes it impossible to request urls that > contain multiple slashes in a row in their query string: [...] That code is removed in CVS, so multiple slashes now work correctly. > Think of something like http://foo/bar/r

Bug in wget: cannot request urls with double-slash in the query string

2004-02-29 Thread D Richard Felker III
The following code in url.c makes it impossible to request urls that contain multiple slashes in a row in their query string: else if (*h == '/') { /* Ignore empty path elements. Supporting them well is hard (where do you save "http://x.com///y.html"?), and

Re: bug in connect.c

2004-02-06 Thread Hrvoje Niksic
Manfred Schwarb <[EMAIL PROTECTED]> writes: >> Interesting. Is it really necessary to zero out sockaddr/sockaddr_in >> before using it? I see that some sources do it, and some don't. I >> was always under the impression that, as long as you fill the relevant >> members (sin_family, sin_addr, si

Re: bug in connect.c

2004-02-06 Thread Manfred Schwarb
Interesting. Is it really necessary to zero out sockaddr/sockaddr_in before using it? I see that some sources do it, and some don't. I was always under the impression that, as long as you fill the relevant members (sin_family, sin_addr, sin_port), other initialization is not necessary. Was I mi

Re: bug in connect.c

2004-02-04 Thread Hrvoje Niksic
"francois eric" <[EMAIL PROTECTED]> writes: > after some test: > bug is when: ftp, with username and password, with bind address specifyed > bug is not when: http, ftp without username and password > looks like memory leaks. so i made some modification

bug in connect.c

2004-02-03 Thread francois eric
nd.stup.ac.ru FTP server (Version wu-2.6.2-8) ready. ... -- after some tests: bug is when: ftp, with username and password, with bind address specified bug is not when: http, ftp without username and password looks like memory leaks. so i made some modification before bind: src/connect.c: -- ...
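
(The replies above debate whether zeroing out the address structure before bind() is necessary. For reference, a minimal generic sketch of that zero-then-fill pattern in plain POSIX C -- not the actual connect.c modification, which is cut off above, and the function name is made up:)

    #include <string.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    /* Bind a socket to a local IPv4 address, clearing the whole structure
       first so no member is left holding stack garbage.
       Returns 0 on success, -1 on failure. */
    static int
    bind_local (int sock, const char *bind_address, unsigned short port)
    {
      struct sockaddr_in sin;

      memset (&sin, 0, sizeof (sin));                 /* the disputed step */
      sin.sin_family = AF_INET;
      sin.sin_port = htons (port);
      if (inet_pton (AF_INET, bind_address, &sin.sin_addr) != 1)
        return -1;
      return bind (sock, (struct sockaddr *) &sin, sizeof (sin));
    }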

Re: bug report

2004-01-28 Thread Hrvoje Niksic
You are right, it's a bug. -O is implemented in a weird way, which makes it work strangely with features such as timestamping and link conversion. I plan to fix it when I get around to revamping the file name generation support for grokking the Content-Disposition header.

BUG : problem of date with wget

2004-01-27 Thread Olivier RAMIARAMANANA (Ste Thales IS)
** High Priority ** Hi. On my AIX server I use wget with this command: /usr/local/bin/wget http://www.???.?? -O /exploit/log/test.log but when I read my file "test.log", its date is January 30 2003 ??? That's incredible. What's the problem, please? Regards, olivier

Re: wget bug with ftp/passive

2004-01-22 Thread Hrvoje Niksic
don <[EMAIL PROTECTED]> writes: > I did not specify the "passive" option, yet it appears to have been used > anyway Here's a short transcript: > > [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip > --21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip >

wget bug with ftp/passive

2004-01-21 Thread don
Hello, I think I've come across a little bug in wget when using it to get a file via FTP. I did not specify the "passive" option, yet it appears to have been used anyway. Here's a short transcript: [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.z

Re: wget bug

2004-01-12 Thread Hrvoje Niksic
Kairos <[EMAIL PROTECTED]> writes: > $ cat wget.exe.stackdump [...] What were you doing with Wget when it crashed? Which version of Wget are you running? Was it compiled for Cygwin or natively for Windows?

wget bug

2004-01-06 Thread Kairos
$ cat wget.exe.stackdump Exception: STATUS_ACCESS_VIOLATION at eip=77F51BAA eax= ebx= ecx=0700 edx=610CFE18 esi=610CFE08 edi= ebp=0022F7C0 esp=0022F74C program=C:\nonspc\cygwin\bin\wget.exe cs=001B ds=0023 es=0023 fs=0038 gs= ss=0023 Stack trace: Frame Function

bug report

2003-12-30 Thread Vlada Macek
Hi again, I found something that could be called a bug. The command line and the output (shortened): $ wget -k www.seznam.cz --14:14:28-- http://www.seznam.cz/ => `index.html' Resolving www.seznam.cz... done. Connecting to www.seznam.cz[212.80.76.18]:80... connected. HTTP

Re: Maybe a bug?

2003-12-28 Thread Jens Rösner
Hi! Well, the message you got really tells you to have a look at the user agreement. So I did. http://www.quickmba.com/site/agreement/ clearly explains why your download failed, under the point "Acceptable Use". As long as you have wget identifying itself as wget, you probably will not get any fi

Maybe a bug?

2003-12-28 Thread James Li-Chung Chen
I'm playing around with the wget tool and I ran into this website where I don't believe "-e robots=off" works. http://www.quickmba.com/ Any idea why? I've tried a few combinations and I keep on getting this message in the response. We're sorry, but the way that you have attempted to acc

isn't it a little bug?

2003-12-23 Thread piotrek
Hi, I've just noticed a weird behavior of wget 1.8.2 while downloading a partial file with the command: wget http://ardownload.adobe.com/pub/adobe/acrobatreader/unix/5.x/linux-508.tar.gz -c The connection was very unstable, so it had to reconnect many times. What I noticed is not a big thing, just i

bug? different behavior of "wget" and "lwp-request (GET)"

2003-12-17 Thread Diego Puppin
/group/sammydavisjr/message/56 retrieves a standard page (HTTP 200). Is this a bug (of GET, wget?) or a "feature"? I realized this problem when testing two different Java programs that download pages from a URL. One uses a Java socket, the other uses Java URLConnection. Well, **even if the r

Bug in 1.9.1? ftp not following symlinks

2003-12-09 Thread Manfred Schwarb
Hi, I tried to download the following: wget ftp://ftp.suse.com/pub/suse/i386/7.3/full-names/src/traceroute-nanog_6.1.1-94.src.rpm This is a symbolic link. Downloading just this single file, wget should follow the link, but it only creates a symbolic link. Excerpt from "man wget", section --retr-sy

Re: non-subscribers have to confirm each message to bug-wget

2003-11-18 Thread Hrvoje Niksic
Dan Jacobson <[EMAIL PROTECTED]> writes: >>> And stop making me have to confirm each and every mail to this list. > > Hrvoje> Currently the only way to avoid confirmations is to > Hrvoje> subscribe to the list. I'll try to contact the list owners > Hrvoje> to see if the mechanism can be improved.

Re: non-subscribers have to confirm each message to bug-wget

2003-11-17 Thread Dan Jacobson
>> And stop making me have to confirm each and every mail to this list. Hrvoje> Currently the only way to avoid confirmations is to subscribe to the Hrvoje> list. I'll try to contact the list owners to see if the mechanism can Hrvoje> be improved. subscribe me with the "nomail" option, if it can

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
"Kempston" <[EMAIL PROTECTED]> writes: > Yeah, i understabd that, but lftp hadles it fine even without > specifying any additional option ;) But then lftp is hammering servers when real unauthorized entry occurs, no? > I`m sure you can work something out Well, I'm satisfied with what Wget does

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
The problem is that the server replies with "login incorrect", which normally means that authorization has failed and that further retries would be pointless. Other than having a natural language parser built-in, Wget cannot know that the authorization is in fact correct, but that the server happe

Wget Bug

2003-11-10 Thread Kempston
Here is debug output :/FTPD# wget ftp://ftp.dcn-asu.ru/pub/windows/update/winxp/xpsp2-1224.exe -d DEBUG output created by Wget 1.8.1 on linux-gnu. --13:25:55-- ftp://ftp.dcn-asu.ru/pub/windows/upd

old patch to ".netrc quote parsing bug"

2003-10-25 Thread Noèl Köthe
Hello, upgrading to 1.9 I found an old unapplied patch that fixes a parsing problem with .netrc. "The cause of this behavior is wget's .netrc parser failing to reset its quote flag after seeing a quote at the end of a token. That caused problems for lines with unquoted tokens following quoted ones.
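
(As an illustration of the kind of .netrc that triggers it -- hostnames and credentials made up -- a line ending in a quoted token followed by a line of unquoted tokens:
  machine ftp.example.org login joe password "s3cret"
  machine mirror.example.net login anonymous password guest
If the quote flag is never reset after "s3cret", the unquoted tokens on the following line are mis-parsed.)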

Re: Bug: Support of charcters like '\', '?', '*', ':' in URLs

2003-10-21 Thread Hrvoje Niksic
"Frank Klemm" <[EMAIL PROTECTED]> writes: > Wget don't work properly when the URL contains characters which are > not allowed in file names on the file system which is currently > used. These are often '\', '?', '*' and ':'. > > Affected are at least: > - Windows and related OS > - Linux when usin

Bug: Support of charcters like '\', '?', '*', ':' in URLs

2003-10-21 Thread Frank Klemm
Wget doesn't work properly when the URL contains characters which are not allowed in file names on the file system which is currently used. These are often '\', '?', '*' and ':'. Affected are at least: - Windows and related OS - Linux when using FAT or Samba as the file system Possibility to solve: On

RE: Wget 1.8.2 bug

2003-10-20 Thread Sergey Vasilevsky
e Niksic [mailto:[EMAIL PROTECTED] > Sent: Friday, October 17, 2003 7:18 PM > To: Tony Lewis > Cc: Wget List > Subject: Re: Wget 1.8.2 bug > > > "Tony Lewis" <[EMAIL PROTECTED]> writes: > > > Hrvoje Niksic wrote: > > > >> Incidentally, Wget i

Re: Wget 1.8.2 bug

2003-10-17 Thread Hrvoje Niksic
"Tony Lewis" <[EMAIL PROTECTED]> writes: > Hrvoje Niksic wrote: > >> Incidentally, Wget is not the only browser that has a problem with >> that. For me, Mozilla is simply showing the source of >> , because >> the returned content-type is t

Re: Wget 1.8.2 bug

2003-10-17 Thread Tony Lewis
Hrvoje Niksic wrote: > Incidentally, Wget is not the only browser that has a problem with > that. For me, Mozilla is simply showing the source of > , because > the returned content-type is text/plain. On the other hand, Internet Explorer

Re: Wget 1.8.2 bug

2003-10-17 Thread Hrvoje Niksic
"??? ??" <[EMAIL PROTECTED]> writes: >> I've seen pages that do that kind of redirections, but Wget seems >> to follow them, for me. Do you have an example I could try? >> > [EMAIL PROTECTED]:~/> /usr/local/bin/wget -U > "All.by" -np -r -N -nH --header="Accept-Charset: cp1251, window

Re: bug in 1.8.2 with

2003-10-14 Thread Hrvoje Niksic
You're right -- that code was broken. Thanks for the patch; I've now applied it to CVS with the following ChangeLog entry: 2003-10-15 Philip Stadermann <[EMAIL PROTECTED]> * ftp.c (ftp_retrieve_glob): Correctly loop through the list whose elements might have been deleted.
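
(For readers hitting the same class of crash: the usual shape of such a fix is to take the next pointer before the current element can be freed, so the loop never dereferences deleted memory. A generic C sketch of the pattern -- illustrative only, not the ftp_retrieve_glob() code itself:)

    #include <stdlib.h>
    #include <string.h>

    struct fileinfo {
      char *name;
      struct fileinfo *next;
    };

    /* Remove every entry whose name equals `reject' from a singly-linked
       list, saving the successor before the current node is freed. */
    static struct fileinfo *
    prune_list (struct fileinfo *head, const char *reject)
    {
      struct fileinfo **link = &head;
      while (*link)
        {
          struct fileinfo *cur = *link;
          struct fileinfo *next = cur->next;   /* grab it before freeing cur */
          if (strcmp (cur->name, reject) == 0)
            {
              free (cur->name);
              free (cur);
              *link = next;                    /* unlink; loop re-examines *link */
            }
          else
            link = &cur->next;
        }
      return head;
    }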

bug in 1.8.2 with

2003-10-14 Thread Noèl Köthe
Hello, with this download you will get a segfault. wget --passive-ftp --limit-rate 32k -r -nc -l 50 \ -X */binary-alpha,*/binary-powerpc,*/source,*/incoming \ -R alpha.deb,powerpc.deb,diff.gz,.dsc,.orig.tar.gz \ ftp://ftp.gwdg.de/pub/x11/kde/stable/3.1.4/Debian Philip Stadermann <[EMAIL PROTECT

Re: Wget 1.8.2 bug

2003-10-14 Thread Hrvoje Niksic
"Sergey Vasilevsky" <[EMAIL PROTECTED]> writes: > I use wget 1.8.2. When I try recursive download site site.com where > site.com/ first page redirect to site.com/xxx.html that have first > link in the page to site.com/ then Wget download only xxx.html and > stop. Other links from xxx.html not fo

Wget 1.8.2 bug

2003-10-14 Thread Sergey Vasilevsky
I use wget 1.8.2. When I try to recursively download site.com, where the first page site.com/ redirects to site.com/xxx.html, which has as its first link a link back to site.com/, Wget downloads only xxx.html and stops. The other links from xxx.html are not followed!

Re: subtle bug? or opportunity of avoiding multiple nested directories

2003-10-10 Thread Hrvoje Niksic
Stephen Hewitt <[EMAIL PROTECTED]> writes: > Attempting to mirror a particular web site, with wget 1.8.1, I got > many nested directories like .../images/images/images/images etc For > example the log file ended like this: [...] Thanks for the detailed report and for taking the time to find the p

subtle bug? or opportunity of avoiding multiple nested directories

2003-10-09 Thread Stephen Hewitt
Attempting to mirror a particular web site, with wget 1.8.1, I got many nested directories like .../images/images/images/images etc For example the log file ended like this: --08:16:37-- http://www.can-online.org.uk/SE/images/images/images/images/images/images/images/images/images/images/images/im

RE: Bug in Windows binary?

2003-10-06 Thread Herold Heiko
> From: Gisle Vanem [mailto:[EMAIL PROTECTED] > "Jens Rösner" <[EMAIL PROTECTED]> said: > ... > I assume Heiko didn't notice it because he doesn't have that function > in his kernel32.dll. Heiko and Hrvoje, will you correct this ASAP? > > --gv Probably. Currently I'm compiling and testing on

Re: Bug in Windows binary?

2003-10-05 Thread Hrvoje Niksic
"Gisle Vanem" <[EMAIL PROTECTED]> writes: > --- mswindows.c.org Mon Sep 29 11:46:06 2003 > +++ mswindows.c Sun Oct 05 17:34:48 2003 > @@ -306,7 +306,7 @@ > DWORD set_sleep_mode (DWORD mode) > { >HMODULE mod = LoadLibrary ("kernel32.dll"); > - DWORD (*_SetThreadExecutionState) (DWORD) =

Re: Bug in Windows binary?

2003-10-05 Thread Gisle Vanem
c 0x8000 > > I disabled my wgetrc as well and the output was exactly the same. > > I then tested > wget 1.9 beta 2003/09/18 (earlier build!) > from the same place and it works smoothly. > > Can anyone reproduce this bug? Yes, but the MSVC version crashed on my machin

Bug in Windows binary?

2003-10-05 Thread Jens Rösner
reproduce this bug? System is Win2000, latest Service Pack installed. Thanks for your assistance and sorry if I missed an earlier report of this bug, I know a lot has been done over the last weeks and I may have missed something. Jens

Re: BUG in --timeout (exit status)

2003-10-02 Thread Manfred Schwarb
OK, I see. But I do not agree. And I don't think it is a good idea to treat the first download specially. In my opinion, exit status 0 means "everything during the whole retrieval went OK". My preferred solution would be to set the final exit status to the highest exit status of all individual downl
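
(The proposed aggregation is easy to state in code; a toy sketch, unrelated to wget's real internals, where each URL reports its own status and the process exits with the worst one seen:)

    #include <stdio.h>

    int main (void)
    {
      int statuses[] = { 0, 0, 1, 0 };   /* hypothetical per-URL results, 0 = OK */
      int final_status = 0;              /* stays 0 only if every download was OK */
      for (size_t i = 0; i < sizeof statuses / sizeof statuses[0]; i++)
        if (statuses[i] > final_status)
          final_status = statuses[i];
      printf ("final exit status: %d\n", final_status);
      return final_status;
    }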

Re: BUG in --timeout (exit status)

2003-10-02 Thread Hrvoje Niksic
This problem is not specific to timeouts, but to recursive download (-r). When downloading recursively, Wget expects some of the specified downloads to fail and does not propagate that failure to the code that sets the exit status. This unfortunately includes the first download, which should prob

BUG in --timeout (exit status)

2003-10-02 Thread Manfred Schwarb
Hi, doing the following: # /tmp/wget-1.9-beta3/src/wget -r --timeout=5 --tries=1 http://weather.cod.edu/digatmos/syn/ --11:33:16-- http://weather.cod.edu/digatmos/syn/ => `weather.cod.edu/digatmos/syn/index.html' Resolving weather.cod.edu... 192.203.136.228 Connecting to weather.cod.ed

Re: dificulty with Debian wget bug 137989 patch

2003-09-30 Thread Hrvoje Niksic
g', which is not available on many platforms that Wget supports. The issue will likely be addressed in 1.10. Having said that: > I tried the patch Debian bug report 137989 and didnt work. Can > anybody explain: > 1 - why I have to make to directories for patch work: one > wget

dificulty with Debian wget bug 137989 patch

2003-09-29 Thread jayme
I tried the patch Debian bug report 137989 and didnt work. Can anybody explain: 1 - why I have to make to directories for patch work: one wget-1.8.2.orig and one wget-1.8.2 ? 2 - why after compilation the wget still cant download the file > 2GB ? note : I cut the patch for debian use ( the fi

Re: wget bug

2003-09-26 Thread Hrvoje Niksic
Jack Pavlovsky <[EMAIL PROTECTED]> writes: > It's probably a bug: bug: when downloading wget -mirror > ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, > but when downloading wget ftp://somehost.org/somepath/3*, wget saves > the files as 3acv14%7Eani

Re: wget bug

2003-09-26 Thread DervishD
Hi Jack :) * Jack Pavlovsky <[EMAIL PROTECTED]> dixit: > It's probably a bug: > bug: when downloading > wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, > wget saves it as-is, but when downloading > wget ftp://somehost.org/somepath/3*, wget sa

wget bug

2003-09-26 Thread Jack Pavlovsky
It's probably a bug: bug: when downloading wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg -- The human knowledge belongs to the world

RE: bug maybe?

2003-09-23 Thread Matt Pease
how do I get off this list? I tried a few times before & got no response from the server. thank you- Matt > -Original Message- > From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] > Sent: Tuesday, September 23, 2003 8:53 PM > To: Randy Paries > Cc: [EMAIL PROTECTED]

Re: bug maybe?

2003-09-23 Thread Hrvoje Niksic
"Randy Paries" <[EMAIL PROTECTED]> writes: > Not sure if this is a bug or not. I guess it could be called a bug, although it's no simple oversight. Wget currently doesn't support large files.

bug maybe?

2003-09-23 Thread Randy Paries
Not sure if this is a bug or not. I cannot get a file over 2GB (I get a MAX file Exceeded error message). This is on a Red Hat 9 box. GNU Wget 1.8.2. Thanks, Randy

Re: bug in wget 1.8.1/1.8.2

2003-09-16 Thread Hrvoje Niksic
e a new limit of lines? No, there's no built-in line limit; what you're seeing is a bug. I cannot see anything wrong inspecting the code, so you'll have to help by providing a gdb backtrace. You can get it by doing this: * Compile Wget with `-g' by running `make CFLAGS=-g' i
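
(For reference, the cut-off recipe is the standard one for any segfaulting C program: rebuild with debugging symbols via make CFLAGS=-g, start the same failing command under gdb -- for example gdb --args wget -i urls.txt, where urls.txt is only a placeholder for the real list file -- type run at the (gdb) prompt, and when it crashes type bt to print the backtrace.)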

bug in wget 1.8.1/1.8.2

2003-09-16 Thread Dieter Drossmann
Hello, I use an extra file with a long list of HTTP entries. I included this file with the -i option. After 154 downloads I got an error message: Segmentation fault. With wget 1.7.1 everything works well. Is there a new limit on the number of lines? Regards, Dieter Drossmann

Re: possible bug in exit status codes

2003-09-15 Thread Aaron S. Hawley
I can verify this in the cvs version. it appears to be isolated to the recursive behavior. /a On Mon, 15 Sep 2003, Dawid Michalczyk wrote: > Hello, > > I'm having problems getting the exit status code to work correctly in > the following scenario. The exit code should be 1 yet it is 0

possible bug in exit status codes

2003-09-14 Thread Dawid Michalczyk
Hello, I'm having problems getting the exit status code to work correctly in the following scenario. The exit code should be 1 yet it is 0 [EMAIL PROTECTED]:~$ wget -d -t2 -r -l1 -T120 -nd -nH -R gif,zip,txt,exe,wmv,htmll,*[1-99] www.cnn.com/foo.html DEBUG output created by Wget 1.8.2 on linux

Re: bug in wget - wget break on time msec=0

2003-09-13 Thread Hrvoje Niksic
"Boehn, Gunnar von" <[EMAIL PROTECTED]> writes: > I think I found a bug in wget. You did. But I believe your subject line is slightly incorrect. Wget handles 0 length time intervals (see the assert message), but what it doesn't handle are negative amounts. And

bug in wget - wget break on time msec=0

2003-09-13 Thread Boehn, Gunnar von
Hello, I think I found a bug in wget. My GNU wget version is 1.8.2. My system is GNU/Debian unstable. I use wget to replay our Apache logfiles against a test webserver to try different tuning parameters. Wget fails to run through the logfile and gives out the error message that "msec >=

Maybe a bug in wget?

2003-09-09 Thread n_fujikawa
Dear Sir; We are using wget-1.8.2 and it's very convenient for our routine program. By the way, we now have trouble with the return code from wget when trying to use it with the -r option. When wget with the -r option fails on an FTP connection, wget returns code 0. If no -r option, it r

Re: *** Workaround found ! *** (was: Hostname bug in wget ...)

2003-09-05 Thread Hrvoje Niksic
[EMAIL PROTECTED] writes: > I found a workaround for the problem described below. > > Using option -nh does the job for me. > > As the subdomains mentioned below are on the same IP > as the "main" domain wget seems not to compare their > names but the IP only. I believe newer versions of Wget don

*** Workaround found ! *** (was: Hostname bug in wget ...)

2003-09-05 Thread webmaster
ce weekend ! Regards Klaus --- Forwarded message follows --- From: [EMAIL PROTECTED] To: [EMAIL PROTECTED] Date sent: Thu, 4 Sep 2003 12:53:39 +0200 Subject: Hostname bug in wget ... Priority: normal ... or a silly

Hostname bug in wget ...

2003-09-04 Thread webmaster
... or a silly sleepless webmaster !? Hi, Version == I use the GNU wget version 1.7 which is found on OpenBSD Release 3.3 CD. I use it on i386 architecture. How to reproduce == wget -r coolibri.com (adding the "span hosts" option did not improve) Problem category =

recursive & no-parent bug in 1.8.2

2003-09-01 Thread John Wilkes
I recently upgraded to wget 1.8.2 from an unknown earlier version. In doing recursive HTTP retrievals, I have noticed inconsistent behavior. If I specify a directory without the trailing slash in the URL, the "--no-parent" option is ignored, but if the trailing slash is present, it works as expec

RE: Bug in total byte count for large downloads

2003-08-26 Thread Herold Heiko
> From: Stefan Recksiegel > [mailto:[EMAIL PROTECTED] > Sent: Monday, August 25, 2003 6:49 PM > To: [EMAIL PROTECTED] > Subject: Bug in total byte count for large downloads > > > Hi, > > this may be known, but > > [EMAIL PROTECTED]:/scratch/suse82> wge

Bug in total byte count for large downloads

2003-08-25 Thread Stefan Recksiegel
Hi, this may be known, but [EMAIL PROTECTED]:/scratch/suse82> wget --help GNU Wget 1.5.3, a non-interactive network retriever. gave me FINISHED --18:32:38-- Downloaded: -1,713,241,830 bytes in 5879 files while [EMAIL PROTECTED]:/scratch/suse82> du -c 6762560 total would be correct. Best wishes

WGET 1.9 bug? it doesn't happen in 1.8.2!!!!!

2003-08-19 Thread jmsbc
Well, I had replaced 1.8.2 with 1.9 because of the timeout fix, which was nice. Now I've come across a problem that does not occur in 1.8.2. If I give the command... ./wget -T 15 -r -l 15 -D edu http://www.psu.edu in wget 1.9, it will download stuff for a couple of seconds and then stop at this

bug: no check accept domain when server redirect

2003-08-14 Thread Василевский Сергей
I use wget 1.8.2: -r -nH -P /usr/file/somehost.com somehost.com http://somehost.com Bug description: If some script http://somehost.com/cgi-bin/rd.cgi returns an HTTP header with status 302 and redirects to http://anotherhost.com, then the first page http://anotherhost.com/index.html is accepted and

Re: bug in --spider option

2003-08-14 Thread Aaron S. Hawley
On Mon, 11 Aug 2003, dEth wrote: > Hi everyone! > > I'm using wget to check if some files are downloadable, I also use to > determine the size of the file. Yesterday I noticed that wget > ignores --spider option for ftp addresses. > It had to show me the filesize and other parameters, but it began

Bug when continuing download and requesting non-existent file over proxy

2003-08-14 Thread Kilian Hagemann
Hi there, I'm pretty sure that I found a bug in the latest (at least according to the FreeBSD ports tree) version of wget. It occurs upon continuing a partially retrieved file using a proxy. I set my ftp_proxy environment variable appropriately. The FreeBSD ports mechanism, in case you

Bug, feature or my fault?

2003-08-14 Thread DervishD
Hi all :)) After asking in the wget list (with no success), and after having a look at the sources (a *little* look), I think that this is a bug, so I've decided to report here. Let's go to the matter: when I download, thru FTP, some hierarchy, the spaces are translat

Wget 1.8.2 timestamping bug

2003-08-11 Thread Angelo Archie Amoruso
Hi All, I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with an Athlon 550 MHz CPU and 128 MB RAM. I've encountered a strange issue, which seems very much like a bug, using the timestamping option. I'm trying to retrieve the http://www.nic.it/index.html page. The HEAD HTTP method returns th

bug in --spider option

2003-08-11 Thread dEth
Hi everyone! I'm using wget to check whether some files are downloadable, and I also use it to determine the size of the file. Yesterday I noticed that wget ignores the --spider option for FTP addresses. It was supposed to show me the filesize and other parameters, but it began to download the file :( That's too bad. Can

Wget 1.8.2 timestamping bug

2003-08-10 Thread Angelo Archie Amoruso
Hi All, I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with an Athlon 550 MHz CPU and 128 MB RAM. I've encountered a strange issue, which seems very much like a bug, using the timestamping option. I'm trying to retrieve the http://www.nic.it/index.html page. The HEAD HTTP method returns th

Re: Bug, feature or my fault?

2003-08-08 Thread Aaron S. Hawley
On Wed, 6 Aug 2003, DervishD wrote: > Hi all :)) > > After asking in the wget list (with no success), and after having > a look at the sources (a *little* look), I think that this is a bug, > so I've decided to report here. note, the bug and the help lists are cur

RE: Wget 1.8.2 timestamping bug

2003-08-06 Thread Post, Mark K
To: [EMAIL PROTECTED] Subject: Wget 1.8.2 timestamping bug Hi All, I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with Athlon 550 MHz cpu, 128 MB Ram. I've encountered a strange issue, which seem really a bug, using the timestamping option. I'm trying to retrieve the http://www.

Timeout bug (1.8.2)

2003-08-03 Thread Andrey Sergeev
wget -gON -t3 -N -w60 -T10 -c --passive-ftp ftp://[EMAIL PROTECTED]/lastday/*.* Wget: BUG: unknown command `timeout', value `10' Sometimes wget can fall asleep. It would be nice to have a normal timeout.

Re: wget bug: mirror doesn't delete files deleted at the source

2003-08-01 Thread Aaron S. Hawley
On Fri, 1 Aug 2003, Mordechai T. Abzug wrote: > I'd like to use wget in mirror mode, but I notice that it doesn't > delete files that have been deleted at the source site. Ie.: > > First run: the source site contains "foo" and "bar", so the mirror now > contains "foo" and "bar". > > Before

wget bug: mirror doesn't delete files deleted at the source

2003-07-31 Thread Mordechai T. Abzug
I'd like to use wget in mirror mode, but I notice that it doesn't delete files that have been deleted at the source site. Ie.: First run: the source site contains "foo" and "bar", so the mirror now contains "foo" and "bar". Before second run: the source site deletes "bar" and replaces it

wget for win32; small bug

2003-06-21 Thread Mark
Although this is a Windows bug, it affects this program. When leeching files with names like "prn" or "com1", e.g. prn.html, wget will freeze up because Windows will not allow it to save a file with that name. A possible solution: saving the file as "prn_.html". Just a suggestion. -pionig
