directory-prefix bug in Win32

2006-12-11 Thread Denis Golovan
Hi! When using the -P or --directory-prefix command-line switches in v1.11 Beta 1, and later in v1.11 Beta 1 (with spider patch), wget pays attention to neither of them. It saves files in the current directory. Wget v1.10.2 worked correctly. Hope this bug won't live long :).

More detail on bug

2006-12-11 Thread Denis Golovan
When using the -P or --directory-prefix command-line switches in v1.11 Beta 1, and later in v1.11 Beta 1 (with spider patch), wget pays attention to neither of them. It saves files in the current directory. Wget v1.10.2 worked correctly. Such incorrect behaviour appears only if the server http

WGet Bug: Local URLs containing colons do not work

2006-12-10 Thread Peter Fletcher
character using --restrict-file-names=windows, but unfortunately this does not fix the problem because the browser will un-escape the URL and will still continue to look for a file with a colon in it. I am not sure of the best way to address this bug, because I am not sure if it possible to escape

WGet Bug: Local URLs containing colons do not work

2006-12-10 Thread Peter Fletcher
, then it will create a link like this: a href=Category:Fish title=Category:FishFish/a Unfortunately, this is not a valid URL, because the browser interprets the 'Category:' as the protocol Category, not the local filename 'Category:' I am not sure of the best way to address this bug, because I am

Re: Wget auth-md5 bug

2006-12-07 Thread Eugene Y. Vasserman
Hello, I was wondering what the status of this report is: has it even been received? I've gotten no acknowledgement in two weeks or so. Thanks, Eugene Thus spake Eugene Y. Vasserman on Mon, 20 Nov 2006: From: Eugene Y. Vasserman [EMAIL PROTECTED] Subject: Wget auth-md5 bug Date: Mon, 20 Nov

Bug in 1.10.2 vs 1.9.1

2006-12-03 Thread Juhana Sadeharju
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1. First, the bin/wgetdir is defined as wget -p -E -k --proxy=off -e robots=off --passive-ftp -o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50 --waitretry=10 $@ The download command is wgetdir http

Re: wget bug in finding files after disconnect

2006-11-18 Thread Georg Schulte Althoff
Paul Bickerstaff [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]: I'm using wget version GNU Wget 1.10.2 (Red Hat modified) on a fedora core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b on a WinXP laptop. Both display the same faulty behaviour which I don't believe

wget bug

2006-11-01 Thread lord maximus
Well, this really isn't a bug per se... but whenever you set -q for no output, it still creates a wget log file on the desktop.

BUG - .listing has sprung into existence

2006-10-30 Thread xxx
Hi, I am using Wget 1.10.2 under Windows XP. If I run (sensitive data is replaced by ##): wget --dont-remove-listing -b -o %temp%\0.log -P %temp%\result\ ftp://##.##.##.##/result/*0.* everything works fine. If I execute the same command again I get the following error: ##/result/.listing

Bug ?

2006-10-30 Thread Necach
Hello, At night I wanted to download the new Fedora Core 6 DVD, and wget downloaded it all, closed the connection and then tried to retry the download. See below: ~ $ wget ftp://ftp.uninett.no/pub/linux/Fedora/core/6/i386//iso/FC-6-i386-DVD.iso --00:06:42--

Re: BUG - .listing has sprung into existence

2006-10-30 Thread Steven M. Schweda
From: Sebastian "Doctor, it hurts when I do this." "Don't do that." Steven M. Schweda [EMAIL PROTECTED] 382 South Warwick Street (+1) 651-699-9818 Saint Paul MN 55105-2547

Re: new wget bug when doing incremental backup of very large site

2006-10-21 Thread Steven M. Schweda
From dev: I checked and the .wgetrc file has continue=on. Is there any way to suppress the sending of the byte-range request? I will read through the email and see if I can gather some more information that may be needed. Remove continue=on from .wgetrc? Consider: -N, --timestamping

bug in --delete-after

2006-10-17 Thread Janek Pankiewicz
Hello, this is what happened when I tried to cache the www.nytimes.com web page with the command wget -nd -p --delete-after http://www.nytimes.com (wget 1.10.2). With regards, Jan Pankiewicz write(2, Connecting to 192.168.0.4:3128., 34Connecting to 192.168.0.4:3128... ) = 34 socket(PF_INET,

bug in --delete-after

2006-10-17 Thread Janek Pankiewicz
Hello, this happened when I tried to cache www.nytimes.com web page with command wget -nd -p --delete-after http://www.nytimes.com With regards Jan Pankiewicz write(2, Connecting to 192.168.0.4:3128., 34Connecting to 192.168.0.4:3128... ) = 34 socket(PF_INET, SOCK_STREAM, IPPROTO_IP) = 3

--reject downloads then deletes - bug/feature?

2006-10-16 Thread Morgan Read
Hello I'm running --reject expecting that the files are skipped from downloading, but instead they're downloaded then deleted. The whole point of using the option was to avoid downloading a database which runs to over 12,000 files (before I terminated wget!). Is this correct behaviour? If it

new wget bug when doing incremental backup of very large site

2006-10-15 Thread dev
when wget already finds a local file with the same name and sends a range request. Maybe there is some data structure that keeps getting added to so it exhausts the memory on my test box which has 2GB. There were no other programs running on the test box. This may be a bug. To get around

new wget bug when doing incremental backup of very large site

2006-10-15 Thread dev
wget already finds a local file with the same name and sends a range request. Maybe there is some data structure that keeps getting added to, so it exhausts the memory on my test box, which has 2GB. There were no other programs running on the test box. This may be a bug. To get around

Re: new wget bug when doing incremental backup of very large site

2006-10-15 Thread Steven M. Schweda
be a bug, but finding it could be difficult. Steven M. Schweda [EMAIL PROTECTED] 382 South Warwick Street (+1) 651-699-9818 Saint Paul MN 55105-2547

I'm not sure this is a bug or feature... (2GB limit?)

2006-10-14 Thread Tima Dronenko
Hello :) I'm not sure this is a bug or feature... I can't download files bigger than 2GB using wget. Timofey. p.s. my log= /// wget -c http://uk1x1.fileplanet.com/%5E1530224706/ftp1/052006

Re: I'm not sure this is a bug or feature... (2GB limit?)

2006-10-14 Thread Fred Holmes
What operating system are you using? It may be a feature of your operating system. At 02:19 AM 10/14/2006, Tima Dronenko wrote: Hello :) I'm not sure this is a bug or feature... I can't download files bigger than 2GB using wget. Timofey. p.s. my log

Re: I'm not sure this is a bug or feature... (2GB limit?)

2006-10-14 Thread Steven M. Schweda
From: Tima Dronenko I'm not sure this is a bug or feature... wget -V If your wget version is before 1.10, it's a feature. At or after 1.10, it's a bug. (In some cases, the bug is in the server.) Steven M

wget --delete-after parameter bug

2006-10-12 Thread Golam Sarwar
I'm using Wget 1.10.2 (on Windows) with the following parameters: wget -r -l 1 -nd -p -T 10 --delete-after http://www.google.com/ wget crashes each time when trying to delete the files with the --delete-after parameter. But, with the same parameters for other websites like,

Re: Bug

2006-09-15 Thread Mauro Tortonesi
Reece wrote: Found a bug (sort of). When trying to get all the images in the directory below: http://www.netstate.com/states/maps/images/ It gives 403 Forbidden errors for most of the images even after setting the agent string to firefox's, and setting -e robots=off After a packet

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-28 Thread Mauro Tortonesi
. I meanwhile found, however, another new problem with time-stamping, which mainly occurs in connection with a proxy-cache, I will report that in a new thread. Same for a small problem with the SSL configuration. thank you very much for the useful bug reports you keep sending us ;-) -- Aequam

Bug

2006-08-26 Thread Reece
Found a bug (sort of). When trying to get all the images in the directory below: http://www.netstate.com/states/maps/images/ It gives 403 Forbidden errors for most of the images even after setting the agent string to firefox's, and setting -e robots=off After a packet capture, it appears

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-21 Thread Mauro Tortonesi
Jochen Roderburg wrote: Quoting Jochen Roderburg [EMAIL PROTECTED]: Quoting Hrvoje Niksic [EMAIL PROTECTED]: Mauro, you will need to look at this one. Part of the problem is that Wget decides to save to index.html.1 although -c is in use. That is solved with the patch attached

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-20 Thread Jochen Roderburg
Quoting Jochen Roderburg [EMAIL PROTECTED]: Quoting Hrvoje Niksic [EMAIL PROTECTED]: Mauro, you will need to look at this one. Part of the problem is that Wget decides to save to index.html.1 although -c is in use. That is solved with the patch attached below. But the other part is

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-17 Thread Mauro Tortonesi
(copia locale) @@ -1,3 +1,9 @@ +2006-08-16 Mauro Tortonesi [EMAIL PROTECTED] + + * http.c: Fixed bug which broke --continue feature. Now if -c is + given, http_loop sends a HEAD request to find out the destination + filename before resuming download. + 2006-08-08 Hrvoje Niksic

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-17 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: you're right, of course. the patch included in attachment should fix the problem. since the new HTTP code supports Content-Disposition and delays the decision of the destination filename until it receives the response header, the best solution i could

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-17 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: you're right, of course. the patch included in attachment should fix the problem. since the new HTTP code supports Content-Disposition and delays the decision of the destination filename until it receives the response header,

Re: RES: BUG

2006-08-16 Thread Mauro Tortonesi
-feira, 10 July 2006 07:04 To: Tony Lewis Cc: 'Junior + Suporte'; [EMAIL PROTECTED] Subject: Re: BUG Tony Lewis wrote: Run the command with -d and post the output here. in this case, -S can provide more useful information than -d. be careful to obfuscate passwords, though!!! hi

Bug report: backup files missing when using wget -K

2006-08-14 Thread Ken Kubota
). See attached files for details. The script calling wget is WGET. There was no .wgetrc file. You probably know the bug described at: http://www.mail-archive.com/wget@sunsite.dk/msg07686.html Remove the two ./CLEAN commands in the script to test recursive re-download with it. I cannot

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-09 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Noèl Köthe [EMAIL PROTECTED] writes: a wget -c problem report with the 1.11 alpha 1 version (http://bugs.debian.org/378691): I can reproduce the problem. If I have already 1 MB downloaded wget -c doesn't continue. Instead it starts to download again: Mauro, you will

wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-08 Thread Noèl Köthe
Hello, a wget -c problem report with the 1.11 alpha 1 version (http://bugs.debian.org/378691): I can reproduce the problem. If I have already 1 MB downloaded, wget -c doesn't continue. Instead it starts to download again: Forwarded message [EMAIL PROTECTED]:~$ strace

Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]

2006-08-08 Thread Jochen Roderburg
Quoting Hrvoje Niksic [EMAIL PROTECTED]: Mauro, you will need to look at this one. Part of the problem is that Wget decides to save to index.html.1 although -c is in use. That is solved with the patch attached below. But the other part is that hstat.local_file is a NULL pointer when

Re: bug/feature request

2006-07-27 Thread Marc Schoechlin
] Subject: bug/feature request To: [EMAIL PROTECTED] Hi, I'm not sure if this is a feature request or a bug. Wget does not collect all page requisites of a given URL. Many sites reference components of these sites in cascading style sheets, but wget does not collect these components

bug/feature request

2006-07-25 Thread Marc Schoechlin
Hi, I'm not sure if this is a feature request or a bug. Wget does not collect all page requisites of a given URL. Many sites reference components of these sites in cascading style sheets, but wget does not collect these components as page requisites. An example: --- $ wget -q -p -k -nc -x

Maybe a bug

2006-07-21 Thread eduardo martins

Re: Bug in wget 1.10.2 makefile

2006-07-17 Thread Mauro Tortonesi
Daniel Richard G. wrote: Hello, The MAKEDEFS value in the top-level Makefile.in also needs to include DESTDIR='$(DESTDIR)'. fixed, thanks. -- Aequam memento rebus in arduis servare mentem... Mauro Tortonesi http://www.tortonesi.com University of Ferrara -

Re: Documentation (manpage) bug

2006-07-17 Thread Mauro Tortonesi
Linda Walsh wrote: FYI: On the manpage, where it talks about no-proxy, the manpage says: --no-proxy Don't use proxies, even if the appropriate *_proxy environment variable is defined. For more information about the use of proxies with Wget,

Re: I got one bug on Mac OS X

2006-07-17 Thread HUAZHANG GUO
one bug on Mac OS X Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them. I think I know where the problem is: if the link is broken into two lines in the index.html: <P>Lecture

Re: I got one bug on Mac OS X

2006-07-16 Thread Hrvoje Niksic
of an A tag. Unrelated to this particular bug, please note that rfc1866 is not the place to look for an up-to-date HTML specification. HTML has been maintained by W3C for many years, so it's best to look there, e.g. HTML 4.01 spec, or possibly XHTML.

I got one bug on Mac OS X

2006-07-15 Thread HUAZHANG GUO
Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them. I think I know where the problem is: if the link is broken into two lines in the index.html: <P>Lecture 1 (Jan 17): Exploring

RE: I got one bug on Mac OS X

2006-07-15 Thread Tony Lewis
[mailto:[EMAIL PROTECTED] Sent: Tuesday, July 11, 2006 7:48 AM To: [EMAIL PROTECTED] Subject: I got one bug on Mac OS X Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them

Re: I got one bug on Mac OS X

2006-07-15 Thread Steven P. Ulrick
of line within the HREF attribute of an A tag. Tony _ From: HUAZHANG GUO [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 11, 2006 7:48 AM To: [EMAIL PROTECTED] Subject: I got one bug on Mac OS X Dear Sir/Madam, while I was trying to download using the command: wget -k -np

Bug in wget 1.10.2 makefile

2006-07-14 Thread Daniel Richard G.
Hello, The MAKEDEFS value in the top-level Makefile.in also needs to include DESTDIR='$(DESTDIR)'. (build log excerpt) + make install DESTDIR=/tmp/wget--1.10.2.build/__dest__ cd src make CC='cc' CPPFLAGS='-D__EXTENSIONS__ -D_REENTRANT -Dsparc' ... install.bin

Re: BUG

2006-07-10 Thread Mauro Tortonesi
Tony Lewis wrote: Run the command with -d and post the output here. in this case, -S can provide more useful information than -d. be careful to obfuscate passwords, though!!! -- Aequam memento rebus in arduis servare mentem... Mauro Tortonesi

BUG

2006-07-03 Thread Junior + Suporte
Dear, I'm using wget to send a login request to a site. When wget is saving the cookies, the following error message appears: Error in Set-Cookie, field `Path'. Syntax error in Set-Cookie: tu=661541|802400391 @TERRA.COM.BR; Expires=Thu, 14-Oct-2055 20:52:46 GMT; Path= at position 78. Location:

RE: BUG

2006-07-03 Thread Tony Lewis
Run the command with -d and post the output here. Tony _ From: Junior + Suporte [mailto:[EMAIL PROTECTED]] Sent: Monday, July 03, 2006 2:00 PM To: [EMAIL PROTECTED] Subject: BUG Dear, I'm using wget to send a login request

RE: Bug in GNU Wget 1.x (Win32)

2006-06-22 Thread Herold Heiko
, con, prn). Maybe it is possible to query the OS about the currently active device names and rename the output files if necessary? I reproduced the bug with Win32 versions 1.5.dontremember, 1.10.1 and 1.10.2. I did also test version 1.6 on Linux but it was not affected. That is since

Bug in GNU Wget 1.x (Win32)

2006-06-21 Thread Þröstur
Hello there, I have to say that Wget is one of the most useful tools out there (from my point of view, of course). I'm using the Win32 version of it to make life with XP a little more bearable (faking Internet Explorer like mad, all over the place). Well, on to the thing I call a bug. The bug only

[WGET BUG] - Cannot retrieve image from cacti

2006-06-19 Thread Thomas GRIMONET
Hello, We are using version 1.10.2 of wget under Ubuntu and Debian. We have many scripts that get some images from a cacti site. These scripts ran perfectly with version 1.9 of wget but they cannot get images with version 1.10.2 of wget. Here you can find an example of our

Re: Buffer overflow bug in base64_encode

2006-06-19 Thread Hrvoje Niksic
[EMAIL PROTECTED] writes: I discovered a buffer overflow bug in the base64_encode() function, located at line 1905 in file src\utils.c. Note that this bug is in the latest version of the program (version 1.10.2). The bug appears to be that the function is assuming that the input data

Documentation (manpage) bug

2006-06-17 Thread Linda Walsh
FYI: On the manpage, where it talks about no-proxy, the manpage says: --no-proxy Don't use proxies, even if the appropriate *_proxy environment variable is defined. For more information about the use of proxies with Wget,

wget 1.11 alpha1 - bug with timestamping option

2006-06-17 Thread Jochen Roderburg
Hi, I have tried out the wget alpha under Linux and found that the timestamping option (which I usually have defined) does not work correctly. First thing I saw was that on *every* download I got the line Remote file is newer, retrieving. in the output, even when there was no local file. That

Bug in GNU Wget 1.x (Win32)

2006-06-15 Thread Þröstur
Hello there, I have to say that Wget is one of the most useful tools out there (from my point of view, of course). I'm using the Win32 version of it to make life with XP a little more bearable (faking Internet Explorer like mad, all over the place). Well, on to the thing I call a bug. The bug only

A bug in wget 1.10.2

2006-06-07 Thread Joaquim Andrade
Hello, I'm using wget 1.10.2 in Windows, the Windows binary version, and it has a bug when downloading with -c and with an input file. If the first file of the list is the one to be continued, wget does it fine; if not, wget tries to download the files from the beginning, and it says

BUG: wget with option -O creates empty files even if the remote file does not exist

2006-06-01 Thread Eduardo M KALINOWSKI
I'm using wget version 1.10.2. If I try to download a nonexistent file with a command like this: wget http://www.somehost.com/nonexistant.html and the file does not exist, wget reports a 404 error and no file is created. However, if I specify the file where to place the output,

Re: BUG: wget with option -O creates empty files even if the remote file does not exist

2006-06-01 Thread Steven M. Schweda
From: Eduardo M KALINOWSKI wget http://www.somehost.com/nonexistant.html -O localfile.html then file localfile.html will always be created, and will have length of zero even if the remote file does not exist. Because with -O, Wget opens the output file before it does any network

Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]

2006-05-19 Thread Mauro Tortonesi
Noèl Köthe wrote: Hello, a forwarded report from http://bugs.debian.org/366434 could this behaviour be added to the doc/manpage? i wonder if it makes sense to add generic support for multiple headers in wget, for instance by extending the --header option like this: wget --header=Pragma:

Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]

2006-05-19 Thread Hrvoje Niksic
headers. According to HTTP, a duplicate header field is equivalent to a single header with multiple values joined using the ',' separator -- which the bug report mentions.

RE: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]

2006-05-19 Thread Herold Heiko
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] i wonder if it makes sense to add generic support for multiple headers in wget, for instance by extending the --header option like this: wget --header=Pragma: xxx --header=dontoverride,Pragma: xxx2 someurl That could be a problem if you

Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]

2006-05-19 Thread Mauro Tortonesi
Herold Heiko wrote: From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] i wonder if it makes sense to add generic support for multiple headers in wget, for instance by extending the --header option like this: wget --header=Pragma: xxx --header=dontoverride,Pragma: xxx2 someurl That could be

bug?

2006-05-16 Thread yy :)
Hi, I ran "wget -P /tmp/.test http://192.168.1.10" on a SUSE system (SLES 9) and found that it saved the file in /tmp/_test. This command works fine in RedHat; is it a bug? wget version: wget-1.9.1-45.12 Thanks, Vanessa

Re: bug?

2006-05-16 Thread Hrvoje Niksic
yy :) [EMAIL PROTECTED] writes: I ran wget -P /tmp/.test http://192.168.1.10 on a SUSE system (SLES 9) and found that it saved the file in /tmp/_test. This command works fine in RedHat; is it a bug? I believe the bug is introduced by SuSE in an attempt to protect the user. Try reporting

[Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]

2006-05-14 Thread Noèl Köthe
Hello, a forwarded report from http://bugs.debian.org/366434 could this behaviour be added to the doc/manpage? thx. Package: wget Version: 1.10.2-1 It's meaningful to have multiple 'Pragma:' headers within an http request, but wget will silently issue only a single one of them if they

bug ?

2006-05-05 Thread Marcel . Maas
Hello, great program but I am having a problem with it. The debug says: The sizes do not match (local 16668160) -- retrieving. ftp://ftp.invetech.com.au/Project%20Simon/bMX_Project_S_Invetech_Capability_%20Summary_a2.ppt Windows: 15.8 MB (16,668,160 bytes) cmd: 16,668,160

Re: Wget Bug: recursive get from ftp with a port in the url fails

2006-04-13 Thread Hrvoje Niksic
Jesse Cantara [EMAIL PROTECTED] writes: A quick resolution to the problem is to use the -nH command line argument, so that wget doesn't attempt to create that particular directory. It appears as if the problem is with the creation of a directory with a ':' in the name, which I cannot do

Wget Bug: recursive get from ftp with a port in the url fails

2006-04-12 Thread Jesse Cantara
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the url, such as ftp.somesite.com:1234. An example of the command I am typing is: wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/* where Directory contains multiple subdirectories, all

Re: Bug in ETA code on x64

2006-04-03 Thread Thomas Braby
- Original Message - From: Hrvoje Niksic [EMAIL PROTECTED] Date: Tuesday, March 28, 2006 7:23 pm in progress.c line 880: eta_hrs = (int)(eta / 3600, eta %= 3600); eta_min = (int)(eta / 60, eta %= 60); eta_sec = (int)(eta); This is weird. Did you compile the code

Re: Bug in ETA code on x64

2006-04-03 Thread Hrvoje Niksic
Thomas Braby [EMAIL PROTECTED] writes: eta_hrs = (int) (eta / 3600), eta %= 3600; Yes that also works. The cast is needed on Windows x64 because eta is a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a warning is issued. The same is the case on 32-bit Windows, and also

Bug report

2006-04-01 Thread Gary Reysa
Hi, I don't really know if this is a Wget bug, or some problem with my website, but, either way, maybe you can help. I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred pages (260MB of storage total). Someone did a Wget on my site, and managed to log 111,000 hits

Re: Bug report

2006-04-01 Thread Frank McCown
Gary Reysa wrote: Hi, I don't really know if this is a Wget bug, or some problem with my website, but, either way, maybe you can help. I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred pages (260MB of storage total). Someone did a Wget on my site, and managed to log

A very little bug.

2006-03-30 Thread fRoGGz SecuBox Labs
Tested on: GNU Wget 1.9.1 (Win32). Tested on: GNU Wget 1.10.2 (Win32). Example: wget http://Check.Your.CPU.Usage/con or wget http://Check.Your.CPU.Usage/con.txt. You can also use aux, prn, con, lpt1, lpt2, com1, com2, ... Regards, fRoGGz ([EMAIL PROTECTED]) SecuBox Labs -

Re: Bug in ETA code on x64

2006-03-29 Thread Greg Hurrell
On 28/03/2006, at 20:43, Tony Lewis wrote: Hrvoje Niksic wrote: The cast to int looks like someone was trying to remove a warning and botched operator precedence in the process. I can't see any good reason to use , here. Why not write the line as: eta_hrs = eta / 3600; eta %=

Re: Bug in ETA code on x64

2006-03-29 Thread Hrvoje Niksic
Thomas got his version of progress.c because it seems that the change has introduced the bug.

Re: Bug in ETA code on x64

2006-03-28 Thread Hrvoje Niksic
Thomas Braby [EMAIL PROTECTED] writes: With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64 I was getting no ETA until late in the transfer, when I'd get things like: 49:49:49 then 48:48:48 then 47:47:47 etc. So I checked the eta value in seconds and it was correct, so

RE: Bug in ETA code on x64

2006-03-28 Thread Tony Lewis
Hrvoje Niksic wrote: The cast to int looks like someone was trying to remove a warning and botched operator precedence in the process. I can't see any good reason to use , here. Why not write the line as: eta_hrs = eta / 3600; eta %= 3600; This makes it much less likely that someone

Possible bug

2006-03-08 Thread Lawrence E Schwartz
Hello, Sometimes passwords contain @s. When they do, it seems to cause wget problems if the URL has the password encoded in it (for example, ftp://username:[EMAIL PROTECTED]@/directory). The same sort of URL encoding works fine in wput. Thank you for the fine software, Larry

Wget 1.10.2 bug

2006-03-08 Thread cerise
Running this command: rm *.jpg ; wget -O usscole_90.jpg -nc --random-wait --referer=http://www.pianoladynancy.com/recovery_usscole.htm -- http://www.pianoladynancy.com/images/usscole_90.jpg generates the error: File `usscole_90.jpg' already there; not retrieving. However: rm

Re: Wget 1.10.2 bug

2006-03-08 Thread Steven M. Schweda
It seems to me that the -O option has wget touching the file which wget then detects. Close enough. With -O, Wget opens the output file before it does any transfers, so when the program gets serious about the transfer, the file will exist, and that will confuse the -nc processing.

Bug in TOLOWER macro when STANDALONE (?)

2006-03-06 Thread Beni Serfaty
I think I found a bug when STANDALONE is defined on hash.c. I hope I'm not missing something here... (Please cc me the replies) @@ -63,7 +63,7 @@ if not enough memory */ # define xfree free # define countof(x) (sizeof (x) / sizeof ((x)[0])) -# define TOLOWER(x) ('A' <= (x) && (x) <= 'Z' ? (x) - 32 : (x

Re: Bug in TOLOWER macro when STANDALONE (?)

2006-03-06 Thread Hrvoje Niksic
Beni Serfaty [EMAIL PROTECTED] writes: I Think I found a bug when STANDALONE is defined on hash.c I hope I'm not missing something here... Good catch, thanks. I've applied a slightly different fix, appended below. By the way, are you using hash.c in a project? I'd like to hear if you're

wget bug: doesn't CWD after ftp failure

2006-03-05 Thread Nate Eldredge
Hi folks, I think I have found a bug in wget where it fails to change the working directory when retrying a failed ftp transaction. This is wget 1.10.2 on FreeBSD-6.0/amd64. I was trying to use wget to get files from a broken ftp server which occasionally sends garbled responses, causing

Bug? -k not compatible with -O

2006-03-02 Thread Greg McCann
for the default output filename processcandquicksearch rather than the filename that I specified with the -O option. This seems to be a bug, though I can work around it with... wget -k blah blah blah mv default filename my filename (Note: I am using wget version 1.9.1) Best regards, Greg McCann

BUG: timestamping and output file do not work together

2006-02-16 Thread Martin Kos
hi, i've just posted my comments on the mailinglist [1]. wget doesn't behave the right way if I use the --output-document option and --timestamping together. wget tries to compare the url-file with the original file instead of with the --output-document file. Why I got to this problem was

Buffer overflow bug in base64_encode

2006-01-06 Thread rick
Hello all, I discovered a buffer overflow bug in the base64_encode() function, located at line 1905 in file src\utils.c. Note that this bug is in the latest version of the program (version 1.10.2). The bug appears to be that the function is assuming that the input data is a size that is an even

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Hrvoje Niksic
[EMAIL PROTECTED] (Steven M. Schweda) writes: and adding it fixed many problems with FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my two-step CWD for a relative path? That is: [...] That should work too. On Unix-like FTP servers,

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Steven M. Schweda
From: Hrvoje Niksic [...] On Unix-like FTP servers, the two methods would be equivalent. Right. So I resisted temptation, and kept the two-step CWD method in my code for only a VMS FTP server. My hope was that someone would look at the method, say "That's a good idea", and change the if

wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Hello, current wget seems to have the following bug in the ftp retrieval code: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Arne Caspari [EMAIL PROTECTED] writes: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD into the directory first. I think the correct

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Arne Caspari [EMAIL PROTECTED] writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET A/B/F.X first, then: CWD A/B GET F.X if

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Thank you all for your very fast response. As a further note: When this error occurs, wget bails out with the following error message: No such directory foo/bar. I think it should instead be Could not access foo/bar: Permission denied or similar in such a situation. /Arne Mauro Tortonesi

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: Hrvoje Niksic wrote: Arne Caspari [EMAIL PROTECTED] writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Hrvoje Niksic [EMAIL PROTECTED] writes: That might work. Also don't prepend the necessary prepending of $CWD to those paths. Oops, I meant don't forget to prepend

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic Also don't [forget to] prepend the necessary [...] $CWD to those paths. Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. As you might recall from my changes for VMS FTP servers (if you had ever looked at them), this scheme causes no end

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Daniel Stenberg
On Fri, 25 Nov 2005, Steven M. Schweda wrote: Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. I agree. What good would prepending do? It will most definitely add problems such as those Steven describes. -- -=- Daniel Stenberg -=-

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic Prepending is already there, Yes, it certainly is, which is why I had to disable it in my code for VMS FTP servers. and adding it fixed many problems with FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my

FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James \(Mission Systems\)
According to the wget release notes for 1.10 *** Talking to SSL/TLS servers over proxies now actually works. Previous versions of Wget erroneously sent GET requests for https URLs. Wget 1.10 utilizes the CONNECT method designed for this purpose. However, I have tried versions 1.10, 1.10.1, and
