RE: file numbering bug
PROTECTED] Sent: Thursday, March 08, 2007 11:50 AM To: WGET@sunsite.dk Cc: [EMAIL PROTECTED] Subject: Re: file numbering bug From: Robert Dick > When serializing successive copies of a page, the serial number appears > at the end of the extension, i.e., what should be file1.html is called >
Re: file numbering bug
From: Robert Dick > When serializing successive copies of a page, the serial number appears > at the end of the extension, i.e., what should be file1.html is called > file.html.1 I'm using wget ver. 1.10.2 with the default options on > Windows ME ... I can see how that might annoy a Windows user
file numbering bug
Hi! When serializing successive copies of a page, the serial number appears at the end of the extension, i.e., what should be file1.html is called file.html.1. I'm using wget ver. 1.10.2 with the default options on Windows ME ... Cheers, rd
ntlm "already authenticated" bug and fix.
Hi Mauro (I'm guessing here - got this from the web page) Here is a patch against 1.10.2 which fixes an issue I found when using NTLM with Microsoft's Internet Information Server (IIS). The issue is not with wget, but rather a bug in IIS. Nevertheless, here is the fix and a desc
Re: wget-1.10.2 cookie expiry bug
Thanks for the report and the (correct) analysis. This patch fixes the problem in the trunk. 2007-01-23 Hrvoje Niksic <[EMAIL PROTECTED]> * cookies.c (parse_set_cookie): Would erroneously discard cookies with unparsable expiry time. Index: src/cookies.c ==
wget-1.10.2 cookie expiry bug
(Resend as I've received no reply to the original message.) Kind wget maintainers, I believe I found a bug in the wget cookie expiry handling. Recently I was using wget receiving back a cookie with an expiration of "Sun, 20-Sep-2043 19:37:28 GMT". This fits inside a 32-bi
Re: Possibly bug
From: Yuriy Padlyak > Have been downloading slackware-11.0-install-dvd.iso, but it seems wget > downloaded more than the filesize and I found: > > "-445900K .. .. .. .. ..119% > 18.53 KB/s" > > in wget-log. As usual, it would help if you provided some b
Re: Possibly bug
The file was probably still being uploaded when you started downloading it, so the HTTP server continued sending data even beyond the initially reported filesize. Just stop wget, and start it again with option -c to resume the download. MT On Wednesday 17 January 2007 at 18:16 +0200, Yuriy Padlyak wrote:
Possibly bug
Hi, I have been downloading slackware-11.0-install-dvd.iso, but it seems wget downloaded more than the filesize and I found: "-445900K .. .. .. .. ..119% 18.53 KB/s" in wget-log. Regards, Yuriy Padlyak
Re: Bug in 1.10.2 vs 1.9.1
Juhana Sadeharju wrote: Hello. Wget 1.10.2 has the following bug compared to version 1.9.1. First, the bin/wgetdir is defined as wget -p -E -k --proxy=off -e robots=off --passive-ftp -o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50 --waitretry=10 $@ The download command
More detail on bug
> When using -P or --directory-prefix in v1.11 Beta 1 and later v1.11 Beta > 1 (with spider patch) command-line > switches, wget pays attention to neither of them. It saves files in > the current directory. Wget v1.10.2 worked right. Such incorrect behaviour appears only if the server http
directory-prefix bug in Win32
Hi! When using the -P or --directory-prefix command-line switches in v1.11 Beta 1 and later v1.11 Beta 1 (with spider patch), wget pays attention to neither of them. It saves files in the current directory. Wget v1.10.2 worked right. Hope this bug won't live long :).
WGet Bug: Local URLs containing colons do not work
, then it will create a link like this: Fish Unfortunately, this is not a valid URL, because the browser interprets the 'Category:' as the protocol "Category", not the local filename 'Category:' I am not sure of the best way to address this bug, because I am not sur
WGet Bug: Local URLs containing colons do not work
er using --restrict-file-names=windows, but unfortunately this does not fix the problem because the browser will un-escape the URL and will still continue to look for a file with a colon in it. I am not sure of the best way to address this bug, because I am not sure if it is possible to escape the &
Re: Wget auth-md5 bug
Hello, I was wondering what the status of this report is: has it even been received? I've gotten no acknowledgement in two weeks or so. Thanks, Eugene Thus spake Eugene Y. Vasserman on Mon, 20 Nov 2006: > From: "Eugene Y. Vasserman" <[EMAIL PROTECTED]> > Subject: Wg
Bug in 1.10.2 vs 1.9.1
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1. First, the bin/wgetdir is defined as wget -p -E -k --proxy=off -e robots=off --passive-ftp -o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50 --waitretry=10 $@ The download command is wgetdir http
Re: wget bug in finding files after disconnect
Paul Bickerstaff <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]: > I'm using wget version "GNU Wget 1.10.2 (Red Hat modified)" on a fedora > core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b > on a WinXP laptop. Both display the same faulty behaviour which I don't > bel
wget bug in finding files after disconnect
I'm using wget version "GNU Wget 1.10.2 (Red Hat modified)" on a fedora core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b on a WinXP laptop. Both display the same faulty behaviour which I don't believe was present in earlier versions of wget that I've used. When the internet
wget-1.10.2 cookie expiry bug
Kind wget maintainers, I believe I found a bug in the wget cookie expiry handling. Recently I was using wget receiving back a cookie with an expiration of "Sun, 20-Sep-2043 19:37:28 GMT". This fits inside a 32-bit unsigned long but unfortunately overflows a 32-bit signed long
wget bug
well this really isn't a bug per se... but whenever you set -q for no output, it still makes a wget log file on the desktop.
Re: BUG - .listing has sprung into existence
From: Sebastian "Doctor, it hurts when I do this." "Don't do that." Steven M. Schweda [EMAIL PROTECTED] 382 South Warwick Street (+1) 651-699-9818 Saint Paul MN 55105-2547
Bug ?
Hello, At night I wanted to download new Fedora Core 6 DVD and wget downloaded it all, closed connection and then tried to retry download. See below: ~ $ wget ftp://ftp.uninett.no/pub/linux/Fedora/core/6/i386//iso/FC-6-i386-DVD.iso --00:06:42-- ftp://ftp.uninett.no/pub/linux/Fedora/cor
BUG - .listing has sprung into existence
Hi, I am using Wget 1.10.2 under Windows XP. If I run (sensitive data is replaced by ##): wget --dont-remove-listing -b -o %temp%\0.log -P %temp%\result\ ftp://##.##.##.##/result/*0.* Everything works fine. If I execute the same command again I get the following error: ##/result/.listing h
Re: new wget bug when doing incremental backup of very large site
From dev: > I checked and the .wgetrc file has "continue=on". Is there any way to > suppress the sending of the byte-range request? I will read through the > email and see if I can gather some more information that may be needed. Remove "continue=on" from ".wgetrc"? Consider: -N, --tim
Re: new wget bug when doing incremental backup of very large site
server, and an actual URL may be needed to pursue the diagnosis from there. The memory allocation failure could be a bug, but finding it could be difficult. Steven M. Schweda
bug in --delete-after
Hello, this happened when I tried to cache the www.nytimes.com web page with the command wget -nd -p --delete-after http://www.nytimes.com With regards Jan Pankiewicz write(2, "Connecting to 192.168.0.4:3128.."..., 34Connecting to 192.168.0.4:3128... ) = 34 socket(PF_INET, SOCK_STREAM, IPPROTO_IP) = 3
bug in --delete-after
Hello, this is what happened when I tried to cache the www.nytimes.com web page with the command wget -nd -p --delete-after http://www.nytimes.com wget 1.10.2 With regards Jan Pankiewicz write(2, "Connecting to 192.168.0.4:3128.."..., 34Connecting to 192.168.0.4:3128... ) = 34 socket(PF_INET, SOCK_STREA
--reject downloads then deletes - bug/feature?
Hello I'm running --reject expecting that the matching files are skipped rather than downloaded, but instead they're downloaded then deleted. The whole point of using the option was to avoid downloading a database which runs to over 12,000 files (before I terminated wget!). Is this correct behaviour? If it is
Re: new wget bug when doing incremental backup of very large site
the fault may lie with the server, and an actual URL may be needed to pursue the diagnosis from there. The memory allocation failure could be a bug, but finding it could be difficult. Steven M. Schweda
new wget bug when doing incremental backup of very large site
when wget already finds a local file with the same name and sends a "range" request. Maybe there is some data structure that keeps getting added to so it exhausts the memory on my test box which has 2GB. There were no other programs running on the test box. This may be a bug. To
new wget bug when doing incremental backup of very large site
t is run when wget already finds a local file with the same name and sends a "range" request. Maybe there is some data structure that keeps getting added to so it exhausts the memory on my test box which has 2GB. There were no other programs running on the test box. This may be a
Re: Im not sure this is a bug or feature... (2GB limit?)
From: Tima Dronenko > I'm not sure this is a bug or feature... wget -V If your wget version is before 1.10, it's a feature. At or after 1.10, it's a bug. (In some cases, the bug is in the server.)
Re: Im not sure this is a bug or feature... (2GB limit?)
What operating system are you using? It may be a "feature" of your operating system. At 02:19 AM 10/14/2006, Tima Dronenko wrote: >Hello :) > >I'm not sure this is a bug or feature... > >I can't download files bigger than 2GB using wget. >
Im not sure this is a bug or feature... (2GB limit?)
Hello :) I'm not sure this is a bug or feature... I can't download files bigger than 2GB using wget. Timofey. p.s. my log= /// wget -c http://uk1x1.fileplanet.com/%5E1530224706/ftp1/052006
wget --delete-after parameter bug
I'm using Wget 1.10.2 (on Windows) with the following parameters: wget -r -l 1 -nd -p -T 10 --delete-after http://www.google.com/ wget crashes each time when trying to delete the files with the --delete-after parameter. But, with the same parameter for other websites like, http://ww
Re: Bug
Reece wrote: Found a bug (sort of). When trying to get all the images in the directory below: http://www.netstate.com/states/maps/images/ It gives 403 Forbidden errors for most of the images even after setting the agent string to firefox's, and setting -e robots=off After a p
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
excellent. I meanwhile found, however, another new problem with time-stamping, which mainly occurs in connection with a proxy-cache; I will report that in a new thread. Same for a small problem with the SSL configuration. thank you very much for the useful bug reports you keep sending us ;-)
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Quoting Mauro Tortonesi <[EMAIL PROTECTED]>: > > > > The timestamping issues I reported in the above mentioned message are now also > > repaired by the patch you mailed last week here. > > Only the small *cosmetic* issue remains that it *always* says: > >Remote file is newer, retrieving. > > eve
Bug
Found a bug (sort of). When trying to get all the images in the directory below: http://www.netstate.com/states/maps/images/ It gives 403 Forbidden errors for most of the images even after setting the agent string to firefox's, and setting -e robots=off After a packet capture, it appears
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Jochen Roderburg wrote: Quoting Jochen Roderburg <[EMAIL PROTECTED]>: Quoting Hrvoje Niksic <[EMAIL PROTECTED]>: Mauro, you will need to look at this one. Part of the problem is that Wget decides to save to index.html.1 although -c is in use. That is solved with the patch attached
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Quoting Jochen Roderburg <[EMAIL PROTECTED]>: > Quoting Hrvoje Niksic <[EMAIL PROTECTED]>: > > > Mauro, you will need to look at this one. Part of the problem is that > > Wget decides to save to index.html.1 although -c is in use. That is > > solved with the patch attached below. But the ot
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Hrvoje Niksic wrote: Mauro Tortonesi <[EMAIL PROTECTED]> writes: you're right, of course. The patch included in the attachment should fix the problem. Since the new HTTP code supports Content-Disposition and delays the decision of the destination filename until it receives the response header
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Mauro Tortonesi <[EMAIL PROTECTED]> writes: > you're right, of course. the patch included in attachment should fix > the problem. since the new HTTP code supports Content-Disposition > and delays the decision of the destination filename until it > receives the response header, the best solution i
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
; ((hstat.len == hstat.contlen) || Index: ChangeLog === --- ChangeLog (revision 2178) +++ ChangeLog (local copy) @@ -1,3 +1,9 @@ +2006-08-16 Mauro Tortonesi <[EMAIL PROTECTED]> + + * http.c: Fixed bug which broke
Re: RES: BUG
7.32 MB/s) - `arte.jsp' saved [3416] Luiz Carlos Zancanella Junior -Original Message- From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] Sent: Monday, July 10, 2006 07:04 To: Tony Lewis Cc: 'Junior + Suporte'; [EMAIL PROTECTED] Subject: Re: BUG Tony L
Bug report: backup files missing when using "wget -K"
LOG.IRIX64.recursive.short). See attached files for details. The script calling wget is "WGET". There was no ".wgetrc" file. You probably know the bug described at: http://www.mail-archive.com/wget@sunsite.dk/msg07686.html Remove the two "./CLEAN" commands in the script to test
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Hrvoje Niksic wrote: Noèl Köthe <[EMAIL PROTECTED]> writes: a wget -c problem report with the 1.11 alpha 1 version (http://bugs.debian.org/378691): I can reproduce the problem. If I have already 1 MB downloaded wget -c doesn't continue. Instead it starts to download again: Mauro, you will n
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Quoting Hrvoje Niksic <[EMAIL PROTECTED]>: > Mauro, you will need to look at this one. Part of the problem is that > Wget decides to save to index.html.1 although -c is in use. That is > solved with the patch attached below. But the other part is that > hstat.local_file is a NULL pointer when
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't workwith HTTP]
Noèl Köthe <[EMAIL PROTECTED]> writes: > a wget -c problem report with the 1.11 alpha 1 version > (http://bugs.debian.org/378691): > > I can reproduce the problem. If I have already 1 MB downloaded wget -c > doesn't continue. Instead it starts to download again: Mauro, you will need to look at th
wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]
Hello, a wget -c problem report with the 1.11 alpha 1 version (http://bugs.debian.org/378691): I can reproduce the problem. If I have already 1 MB downloaded wget -c doesn't continue. Instead it starts to download again: Forwarded message > [EMAIL PROTECTED]:~$ strace
Re: bug/feature request
PROTECTED]> > Subject: bug/feature request > To: [EMAIL PROTECTED] > > Hi, > > I'm not sure if that is a feature request or a bug. > Wget does not collect all page requisites of a given URL. > Many sites are referencing components of these sites in cascading style &
bug/feature request
Hi, I'm not sure if that is a feature request or a bug. Wget does not collect all page requisites of a given URL. Many sites are referencing components of these sites in cascading style sheets, but wget does not collect these components as page requisites. An example: --- $ wget -q -p -k -nc -x
Re: Maybe a bug
On Sat, 22 Jul 2006, eduardo martins wrote: > hxxp://vp.video.google.com/videodownload?version=0&secureurl=swAAAEyWHDQ1BZdFGnJOurGFQ8XzwUdnC05S7sJSVvYH2QieipSgdZMMjfCy6CMF4XCGLAuqXc6egyRSj4rckwDLEC5i7VNUeJiDFMb-6UzrQcsYT4Y_hWfCGMxVBi9C2AMCuIwO2AmgoQ39OqHp6HglLe905loQ8H5ZMjC4KAB8J4xKeJim-uYnNL1d6RF
Maybe a bug
hxxp://vp.video.google.com/videodownload?version=0&secureurl=swAAAEyWHDQ1BZdFGnJOurGFQ8XzwUdnC05S7sJSVvYH2QieipSgdZMMjfCy6CMF4XCGLAuqXc6egyRSj4rckwDLEC5i7VNUeJiDFMb-6UzrQcsYT4Y_hWfCGMxVBi9C2AMCuIwO2AmgoQ39OqHp6HglLe905loQ8H5ZMjC4KAB8J4xKeJim-uYnNL1d6RFDhhbXZzj3xRfgOiY5b2-KD10kcEbhP6laPI3wXNJd67SJZ
Re: I got one bug on Mac OS X
o: [EMAIL PROTECTED] Subject: I got one bug on Mac OS X Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them. I think I know where the problem is: if the link is broken i
Re: Documentation (manpage) "bug"
Linda Walsh wrote: FYI: On the manpage, where it talks about "no-proxy", the manpage says: --no-proxy Don't use proxies, even if the appropriate *_proxy environment variable is defined. For more information about the use of proxies with Wget,
Re: Bug in wget 1.10.2 makefile
Daniel Richard G. wrote: Hello, The MAKEDEFS value in the top-level Makefile.in also needs to include DESTDIR='$(DESTDIR)'. fixed, thanks. -- Aequam memento rebus in arduis servare mentem... Mauro Tortonesi http://www.tortonesi.com University of Ferrara - Dept
RE: I got one bug on Mac OS X
Hrvoje Niksic wrote: > HTML has been maintained by W3C for many years I knew that (but forgot) -- just went to ietf.org out of habit looking for Internet specifications. Tony
Re: I got one bug on Mac OS X
; provision for end of line within the HREF attribute of an A tag. Unrelated to this particular bug, please note that rfc1866 is not the place to look for an up-to-date HTML specification. HTML has been maintained by W3C for many years, so it's best to look there, e.g. HTML 4.01 spec, or possibly XHTML.
Re: I got one bug on Mac OS X
ext. > > I don't see any provision for end of line within the HREF attribute > of an A tag. > > Tony > _ > > From: HUAZHANG GUO [mailto:[EMAIL PROTECTED] > Sent: Tuesday, July 11, 2006 7:48 AM > To: [EMAIL PROTECTED] > Subject: I got one bug on Mac O
RE: I got one bug on Mac OS X
From: HUAZHANG GUO [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 11, 2006 7:48 AM To: [EMAIL PROTECTED] Subject: I got one bug on Mac OS X Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lo
I got one bug on Mac OS X
Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them. I think I know where the problem is: if the link is broken into two lines in the index.html: Lecture 1 (Jan 17): Exploring Confor
Bug in wget 1.10.2 makefile
Hello, The MAKEDEFS value in the top-level Makefile.in also needs to include DESTDIR='$(DESTDIR)'. (build log excerpt) + make install DESTDIR=/tmp/wget--1.10.2.build/__dest__ cd src && make CC='cc' CPPFLAGS='-D__EXTENSIONS__ -D_REENTRANT -Dsparc' ... install.bin /tg/freeport/src/wget/w
Re: BUG
Tony Lewis wrote: Run the command with -d and post the output here. in this case, -S can provide more useful information than -d. Be careful to obfuscate passwords, though!!! -- Aequam memento rebus in arduis servare mentem... Mauro Tortonesi http://www.torto
RE: BUG
Run the command with -d and post the output here. Tony _ From: Junior + Suporte [mailto:[EMAIL PROTECTED]] Sent: Monday, July 03, 2006 2:00 PM To: [EMAIL PROTECTED] Subject: BUG Dear, I am using wget to send
BUG
Dear, I am using wget to send a login request to a site; when wget is saving the cookies, the following error message appears: Error in Set-Cookie, field `Path' Syntax error in Set-Cookie: tu=661541|802400391 @TERRA.COM.BR; Expires=Thu, 14-Oct-2055 20:52:46 GMT; Path= at position 78. Location: http://ww
RE: Bug in GNU Wget 1.x (Win32)
ard comx, lptx, con, prn). Maybe it is possible to query the OS about the currently active device names and rename the output files if necessary? > I reproduced the bug with Win32 versions 1.5.dontremember, > 1.10.1 and 1.10.2. I did also test version 1.6 on Linux but it > was not affec
Bug in GNU Wget 1.x (Win32)
Hello there, I have to say that Wget is one of the most useful tools out there (from my point of view of course). I'm using the Win32 version of it to make life with XP a little more bearable. (faking Internet Explorer like mad, all over the place) Well, on to the thing I call a bug. The bug
Re: Buffer overflow bug in base64_encode
[EMAIL PROTECTED] writes: > I discovered a buffer overflow bug in the base64_encode() function, > located at line 1905 in file src\utils.c. Note that this bug is in the > latest version of the program (version 1.10.2). The bug appears to be that > the function is assuming that the i
Re: [WGET BUG] - Can not retreive image from cacti
From Thomas GRIMONET: > [...] > File is created but it is empty. That's normal with "-O" if Wget fails for some reason. It might help the diagnosis to see the actual Wget command instead of the code which generates the Wget command. If that doesn't show you anything, then adding "-d" to
[WGET BUG] - Can not retreive image from cacti
Hello, We are using version 1.10.2 of wget under Ubuntu and Debian. We have many scripts that get images from a cacti site. These scripts ran perfectly with version 1.9 of wget but they cannot get images with version 1.10.2 of wget. Here you can find an example of our scripts:
wget 1.11 alpha1 - bug with timestamping option
Hi, I have tried out the wget alpha under Linux and found that the timestamping option (which I usually have defined) does not work correctly. The first thing I saw was that on *every* download I got a line Remote file is newer, retrieving. in the output, even when there was no local file. That look
Documentation (manpage) "bug"
FYI: On the manpage, where it talks about "no-proxy", the manpage says: --no-proxy Don't use proxies, even if the appropriate *_proxy environment variable is defined. For more information about the use of proxies with Wget,
A bug in wget 1.10.2
Hello, I'm using wget 1.10.2 on Windows, the Windows binary version, and it has a bug when downloading with -c and with an input file. If the first file of the list is the one to be continued, wget does it fine; if not, wget tries to download the files from the beginning, and it says th
Re: BUG: wget with option -O creates empty files even if the remote file does not exist
From: Eduardo M KALINOWSKI > wget http://www.somehost.com/nonexistant.html -O localfile.html > > then file "localfile.html" will always be created, and will have length > of zero even if the remote file does not exist. Because with "-O", Wget opens the output file before it does any network a
BUG: wget with option -O creates empty files even if the remote file does not exist
I'm using wget version 1.10.2. If I try to download a nonexistent file with a command like this wget http://www.somehost.com/nonexistant.html and the file does not exist, wget reports a 404 error and no file is created. However, if I specify the file where to place the output, with
Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not suppor ted]
Herold Heiko wrote: From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] i wonder if it makes sense to add generic support for multiple headers in wget, for instance by extending the --header option like this: wget --header="Pragma: xxx" --header=dontoverride,"Pragma: xxx2" someurl That could
RE: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not suppor ted]
> From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] > i wonder if it makes sense to add generic support for > multiple headers > in wget, for instance by extending the --header option like this: > > wget --header="Pragma: xxx" --header=dontoverride,"Pragma: > xxx2" someurl That could be a proble
Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]
fact keep) "append" was that HTTP pretty much disallows duplicate headers. According to HTTP, a duplicate header field is equivalent to a single header with multiple values joined using the "," separator -- which the bug report mentions.
Re: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]
Noèl Köthe wrote: Hello, a forwarded report from http://bugs.debian.org/366434 could this behaviour be added to the doc/manpage? i wonder if it makes sense to add generic support for multiple headers in wget, for instance by extending the --header option like this: wget --header="Pragma: x
Re: bug?
"yy :)" <[EMAIL PROTECTED]> writes: > I ran "wget -P /tmp/.test http://192.168.1.10" in a SUSE system (SLES 9) > and found that it saved the file in /tmp/_test. > This command works fine in RedHat, is it a bug? I believe the bug is introduced by SuSE in an
bug?
Hi, I ran "wget -P /tmp/.test http://192.168.1.10" in a SUSE system (SLES 9) and found that it saved the file in /tmp/_test. This command works fine in RedHat, is it a bug? wget version: wget-1.9.1-45.12 Thanks, Vanessa
[Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not supported]
Hello, a forwarded report from http://bugs.debian.org/366434 could this behaviour be added to the doc/manpage? thx. > Package: wget > Version: 1.10.2-1 > It's meaningful to have multiple 'Pragma:' headers within an http > request, but wget will silently issue only a single one of them if > t
bug ?
Hello, great program but I am having a problem with it. The debug says: The sizes do not match (local 16668160) -- retrieving. ftp://ftp.invetech.com.au/Project%20Simon/bMX_Project_S_Invetech_Capability_%20Summary_a2.ppt Windows: 15.8 MB (16,668,160 bytes) cmd: 16,668,160 bMX_Project_S_In
Re: Wget Bug: recursive get from ftp with a port in the url fails
"Jesse Cantara" <[EMAIL PROTECTED]> writes: > A quick resolution to the problem is to use the "-nH" command line > argument, so that wget doesn't attempt to create that particular > directory. It appears as if the problem is with the creation of a > directory with a ':' in the name, which I cannot
Wget Bug: recursive get from ftp with a port in the url fails
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the url, such as ftp.somesite.com:1234. An example of the command I am typing is: "wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/*" Where "Directory" contain
Re: Bug in ETA code on x64
Thomas Braby <[EMAIL PROTECTED]> writes: >> eta_hrs = (int) (eta / 3600), eta %= 3600; > > Yes that also works. The cast is needed on Windows x64 because eta is > a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a > warning is issued. The same is the case on 32-bit Windows, an
Re: Bug in ETA code on x64
- Original Message - From: Hrvoje Niksic <[EMAIL PROTECTED]> Date: Tuesday, March 28, 2006 7:23 pm > > in progress.c line 880: > > > >eta_hrs = (int)(eta / 3600, eta %= 3600); > >eta_min = (int)(eta / 60, eta %= 60); > >eta_sec = (int)(eta); > > This is weird. Did you compi
Re: Bug report
Gary Reysa wrote: Hi, I don't really know if this is a Wget bug, or some problem with my website, but, either way, maybe you can help. I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred pages (260MB of storage total). Someone did a Wget on my site, and managed t
Bug report
Hi, I don't really know if this is a Wget bug, or some problem with my website, but, either way, maybe you can help. I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred pages (260MB of storage total). Someone did a Wget on my site, and managed to log 111,000 hit
Re: Bug in ETA code on x64
On 29/03/2006, at 14:39, Hrvoje Niksic wrote: I can't see any good reason to use "," here. Why not write the line as: eta_hrs = eta / 3600; eta %= 3600; Because that's not equivalent. Well, it should be, because the comma operator has lower precedence than the assignment operato
A very little bug.
Tested on: GNU Wget 1.9.1 (Win32) Tested on: GNU Wget 1.10.2 (Win32) Example: wget "http://Check.Your.CPU.Usage/con" Or wget "http://Check.Your.CPU.Usage/con.txt" You can also use aux, prn, con, lpt1, lpt2, com1, com2, ... Regards, fRoGGz ([EMAIL PROTECTED]) SecuBox Labs - http://secubox.sh
Re: Bug in ETA code on x64
(see http://tinyurl.com/evo5a, http://tinyurl.com/ff4pp and numerous other locations). I'd still like to know where Thomas got his version of progress.c because it seems that the change has introduced the bug.
Re: Bug in ETA code on x64
On 28/03/2006, at 20:43, Tony Lewis wrote: Hrvoje Niksic wrote: The cast to int looks like someone was trying to remove a warning and botched operator precedence in the process. I can't see any good reason to use "," here. Why not write the line as: eta_hrs = eta / 3600; eta %
RE: Bug in ETA code on x64
Hrvoje Niksic wrote: > The cast to int looks like someone was trying to remove a warning and > botched operator precedence in the process. I can't see any good reason to use "," here. Why not write the line as: eta_hrs = eta / 3600; eta %= 3600; This makes it much less likely that someone
Re: Bug in ETA code on x64
Thomas Braby <[EMAIL PROTECTED]> writes: > With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64 > I was getting no ETA until late in the transfer, when I'd get things > like: > > 49:49:49 then 48:48:48 then 47:47:47 etc. > > So I checked the eta value in seconds and it was corre
Bug in ETA code on x64
Hello, With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64 I was getting no ETA until late in the transfer, when I'd get things like: 49:49:49 then 48:48:48 then 47:47:47 etc. So I checked the eta value in seconds and it was correct, so the code in progress.c line 880: e