-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Micah,
Many thanks for all your very timely help. I have had no issues since
following your instructions to upgrade to 1.11.4 and installing it in the /opt
directory. I used:
$ ./configure --prefix=/opt/wget
And pointed to it specifically:
Brock Murch wrote:
I try to keep a mirror of NASA atteph ancillary data for MODIS processing. I
know that means little, but I have a cron script that runs 2 times a day.
Sometimes it works, and others, not so much. The sh script is listed at the
Micah Cowan wrote:
I believe we made some related fixes more recently. You provided a great
amount of useful information, but one thing that seems to be missing (or
I missed it) is the Wget version number. Judging from the log, I'd say
it's 1.10.2
Micah,
Thanks for your quick attention to this. Yes, I probably forgot to include
the version #
[EMAIL PROTECTED] atteph]# wget --version
GNU Wget 1.10.2 (Red Hat modified)
Copyright (C) 2005 Free Software Foundation, Inc.
This program is distributed in the hope that it will be useful,
but
Hello.
During a download with wget I've redirected the output into a file with the
following command:
$ LC_ALL=C wget -o output
'ftp://mirror.yandex.ru/gentoo-distfiles/distfiles/OOo_3.0.0rc4_20080930_LinuxIntel_langpack_en-GB.tar.gz'
I've set LC_ALL and LANG explicitly to be sure that this is not
I try to keep a mirror of NASA atteph ancillary data for MODIS processing. I
know that means little, but I have a cron script that runs 2 times a day.
Sometimes it works, and others, not so much. The sh script is listed at the
end of this email
that to a minimum.
Anyway, I've been researching unicode and utf-8 recently, so I'm gonna try
to tackle bug #21793 https://savannah.gnu.org/bugs/?21793.
-David A Coon
down
the line about best practices and such, though I'll try to keep that to
a minimum.
Anyway, I've been researching unicode and utf-8 recently, so I'm gonna
try to tackle bug #21793 https://savannah.gnu.org/bugs/?21793.
Hi David, and welcome!
If you haven't already, please see
http
vinothkumar raman wrote:
We need to send the timestamp of the local file in the request
header; for that we need to pass the local file's timestamp from
http_loop() to get_http(). The only way to pass this on without
altering the signature
pages if it is not returning a 304 response
Is it so?
On Fri, Aug 29, 2008 at 11:06 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Follow-up Comment #4, bug #20329 (project wget):
verbatim-mode's not all that readable.
The gist is, we should go ahead and use If-Modified-Since, perhaps even now
Hi all,
We need to send the timestamp of the local file in the request
header; for that we need to pass the local file's timestamp from
http_loop() to get_http(). The only way to pass this on without
altering the signature of the function is to add a field to struct url
in url.h
Could we
This means we should remove the previous HEAD request code, use
If-Modified-Since by default, and have it handle all the requests,
storing pages when the response is not a 304.
Is it so?
On Fri, Aug 29, 2008 at 11:06 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Follow-up Comment #4, bug
Micah Cowan wrote: The thing is, though, those two threads should be running
wgets under separate processes
Yes, the two threads are running wgets under separate processes with system().
What operating system are you running? Vista?
mipsel-linux with kernel v2.4 built from gcc v3.3.5
Best
kuang-cheng chao wrote:
Dear Micah:
Thanks for your work of wget.
There is a question about two wgets run simultaneously.
In the resolve_bind_address method, wget assumes that it is called once.
However, this will cause two domain names with
Micah Cowan wrote:
Have you reproduced this, or is this in theory? If the latter, what has led
you to this conclusion? I don't see anything in the code that would cause
this behavior.
I reproduced this, but I can't be sure the real problem is in
resolve_bind_address.
In the attached
k.c. chao wrote:
Micah Cowan wrote:
Have you reproduced this, or is this in theory? If the latter, what has
led you to this conclusion? I don't see anything in the code that would
cause this behavior.
I reproduced this, but I can't be sure
HARPREET SAWHNEY wrote:
Hi,
I am getting a strange bug when I use wget to download a binary file
from a URL versus when I manually download.
The attached ZIP file contains two files:
05.upc --- manually downloaded
dum.upc
HARPREET SAWHNEY wrote:
Hi,
Thanks for the prompt response.
I am using
GNU Wget 1.10.2
I tried a few things on your suggestion but the problem remains.
1. I exported the cookies file in Internet Explorer and specified
that in the Wget
Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b>wget
ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/
-P c:\Downloads\
--- command end ---
wget can't convert the .listing file into an HTML file
regards
Sir Vision wrote:
Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b>wget
ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/
-P c:\Downloads\
--- command
wget-1.11.1 (and 1.10/1.10.1) don't handle the .listing file properly when -c
is used.
It just appends to that file instead of replacing it, which means that wget
tries to download each file twice when you run the same command twice. Have a
look at this log:
wget -m -nd -c
:
Hi,
I posted this bug over two years ago:
http://marc.info/?l=wget&m=113252747105716&w=4
From the release notes I see that this is still not resolved. Are
there any plans to fix this any time soon?
I'm not sure that's a bug. It's more of an architectural choice.
Wget currently
Hi,
I posted this bug over two years ago:
http://marc.info/?l=wget&m=113252747105716&w=4
From the release notes I see that this is still not resolved. Are
there any plans to fix this any time soon?
Thanks
Mark
Micah Cowan [EMAIL PROTECTED] writes:
The new Wget flags empty Set-Cookie as a syntax error (but only
displays it in -d mode; possibly a bug).
I'm not clear on exactly what's possibly a bug: do you mean the fact
that Wget only calls attention to it in -d mode?
That's what I meant.
I
Hrvoje Niksic wrote:
Generally, if Wget considers a header to be in error (and hence
ignores it), the user probably needs to know about that. After all,
it could be the symptom of a Wget bug, or of an unimplemented
extension the server
Hi,
I got a bug on wget when executing:
wget -a log -x -O search/search-1.html --verbose --wait 3
--limit-rate=20K --tries=3
http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
Segmentation fault (core dumped)
I created directory search.
The above creates a file search/search
Diego Campo wrote:
Hi,
I got a bug on wget when executing:
wget -a log -x -O search/search-1.html --verbose --wait 3
--limit-rate=20K --tries=3
http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
Segmentation fault (core
-Cookie headers. That got
fixed when I converted the Set-Cookie parser to use extract_param.
The new Wget flags empty Set-Cookie as a syntax error (but only
displays it in -d mode; possibly a bug).
to release soon as
version 1.11.*
I think the old Wget crashed on empty Set-Cookie headers. That got
fixed when I converted the Set-Cookie parser to use extract_param.
The new Wget flags empty Set-Cookie as a syntax error (but only
displays it in -d mode; possibly a bug).
I'm not clear on exactly
Hello,
I'm wondering if I've found a bug in the excellent wget.
I'm not asking for help, because it turned out not to be the reason
one of my scripts was failing.
The possible bug is in the derivation of the filename from a URL which
contains UTF-8.
The case is:
wget http://en.wikipedia.org
On 10/4/07, Brian Keck [EMAIL PROTECTED] wrote:
I would have sent a fix too, but after finding my way through http.c
retr.c I got lost in url.c.
You and me both. A lot of the code needs rewriting... there's a lot of
spaghetti code in there. I hope Micah chooses to do a complete
rewrite for
Josh Williams wrote:
On 10/4/07, Brian Keck [EMAIL PROTECTED] wrote:
I would have sent a fix too, but after finding my way through http.c
retr.c I got lost in url.c.
You and me both. A lot of the code needs rewriting... there's a lot of
Brian Keck wrote:
Hello,
I'm wondering if I've found a bug in the excellent wget.
I'm not asking for help, because it turned out not to be the reason
one of my scripts was failing.
The possible bug is in the derivation of the filename from
Micah Cowan [EMAIL PROTECTED] writes:
It is actually illegal to specify byte values outside the range of
ASCII characters in a URL, but it has long been historical practice
to do so anyway. In most cases, the intended meaning was one of the
latin character sets (usually latin1), so Wget was
Hi, I am using wget 1.10.2 on Windows 2003, and have the same problem as Cantara.
The file system is NTFS.
Well I find my problem is, I wrote the command in schedule tasks like this:
wget -N -i D:\virus.update\scripts\kavurl.txt -r -nH -P
d:\virus.update\kaspersky
well, after
Hrvoje Niksic wrote:
Subject:
Re: Wget Bug: recursive get from ftp with a port in the url fails
From:
baalchina [EMAIL PROTECTED]
Date:
Mon, 17 Sep 2007 19:56:20 +0800
To:
[EMAIL PROTECTED]
Message-ID:
[EMAIL PROTECTED]
MIME-Version:
1.0
Content-Type
Hello,
What the heck was this code supposed to do in ftp-ls.c? If there is only a
single
space between the previous token and the filesize, then t points at the
NULL
character, and the filesize is taken to be 0, resulting in a mismatch
every time.
ptok is already pointing at the start of the
On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
sprintf(filecopy, "\"%.2047s\"", file);
This fix breaks the FTP protocol, making wget instantly stop working
with many conforming servers, but apparently start working with yours;
the RFCs are very clear that the file name argument starts
On 7/15/07, Rich Cook [EMAIL PROTECTED] wrote:
I think you may well be correct. I am now unable to reproduce the
problem where the server does not recognize a filename unless I give
it quotes. In fact, as you say, the server ONLY recognizes filenames
WITHOUT quotes and quoting breaks it. I
Rich Cook wrote:
On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
sprintf(filecopy, "\"%.2047s\"", file);
This fix breaks the FTP protocol, making wget instantly stop working
with many conforming servers, but apparently start working with
Rich Cook wrote:
On OS X, if a filename on the FTP server contains spaces, and the remote
copy of the file is newer than the local, then wget gets thrown into a
loop of "No such file or directory" endlessly. I have changed the
following in
Mauro Tortonesi wrote:
Micah Cowan ha scritto:
Update of bug #20323 (project wget):
Status: Ready For Test => In Progress
Follow-up Comment #3
Joshua David Williams wrote:
URL:
http://savannah.gnu.org/bugs/?20466
...
Details:
This patch forces the --no-directories option if we're not actually keeping
the files we're downloading (as in the --delete-after and --spider options).
The following bug was submitted to Debian's bug tracker.
I'm curious what people think about this suggestion.
Don't we already check for something like redirected output (and force
the progress indicator to dots)? It seems to me
On Mon, 9 Jul 2007 15:06:52 +1200
[EMAIL PROTECTED] wrote:
wget under win2000/win XP
I get "No such file or directory" error messages when using the following
command line.
wget -s --save-headers
http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc
%1 = 212BI
Any ideas?
hi
Mauro Tortonesi schrieb:
On Mon, 9 Jul 2007 15:06:52 +1200
[EMAIL PROTECTED] wrote:
wget under win2000/win XP
I get "No such file or directory" error messages when using the following
command line.
wget -s --save-headers
http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc
%1 = 212BI
Matthew Woehlke wrote:
Micah Cowan wrote:
The wget-notify mailing list
(http://addictivecode.org/mailman/listinfo/wget-notify) will now also be
receiving notifications of bug updates from GNU Savannah, in addition to
subversion commits
Micah Cowan wrote:
Matthew Woehlke wrote:
Micah Cowan wrote:
...any reason to not CC bug updates here also/instead? That's how e.g.
kwrite does thing (also several other lists AFAIK), and seems to make
sense. This is 'bug-wget' after all :-).
It is; but it's also 'wget'.
Hmm, so it is; my
wget under win2000/win XP
I get "No such file or directory" error messages when using the following
command line.
wget -s --save-headers
http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc
%1 = 212BI
Any ideas?
thank you
Dr Nikolaus Hermanspahn
Advisor (Science)
National Radiation
Tony Lewis wrote:
The “Report a Bug” section of http://www.gnu.org/software/wget/ should
encourage submitters to send as much relevant information as possible
including wget version, operating system, and command line. The
submitter should also
Micah Cowan wrote:
This information is currently in the bug submitting form at Savannah:
That looks good.
I think perhaps such things as the wget version and operating system
ought to be emitted by default anyway (except when -q is given).
I'm not convinced that wget should ordinarily emit
Micah Cowan wrote:
Tony Lewis wrote:
The “Report a Bug” section of http://www.gnu.org/software/wget/ should
encourage submitters to send as much relevant information as possible
including wget version, operating system, and command line
The wget-notify mailing list
(http://addictivecode.org/mailman/listinfo/wget-notify) will now also be
receiving notifications of bug updates from GNU Savannah, in addition to
subversion commits.
--
Micah J. Cowan
Programmer, musician
From various:
[...]
char filecopy[2048];
if (file[0] != '"') {
sprintf(filecopy, "\"%.2047s\"", file);
} else {
strncpy(filecopy, file, 2047);
}
[...]
It should be:
sprintf(filecopy, "\"%.2045s\"", file);
[...]
I'll admit to being old and grumpy, but am I the
Steven M. Schweda wrote:
From various:
[...]
char filecopy[2048];
if (file[0] != '"') {
sprintf(filecopy, "\"%.2047s\"", file);
} else {
strncpy(filecopy, file, 2047);
}
[...]
It should be:
sprintf(filecopy,
: Wednesday, July 04, 2007 10:18 AM
To: [EMAIL PROTECTED]
Subject: bug and patch: blank spaces in filenames causes looping
On OS X, if a filename on the FTP server contains spaces, and the
remote copy of the file is newer than the local, then wget gets
thrown into a loop of No such file
);
It should be:
sprintf(filecopy, "\"%.2045s\"", file);
in order to leave room for the two quotes.
Tony
-Original Message-
From: Rich Cook [mailto:[EMAIL PROTECTED]
Sent: Wednesday, July 04, 2007 10:18 AM
To: [EMAIL PROTECTED]
Subject: bug and patch: blank spaces in filenames causes looping
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Tony Lewis [EMAIL PROTECTED] writes:
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
arbitrary limits on file name length.
If it uses the heap, then
Tony Lewis [EMAIL PROTECTED] writes:
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and arbitrary limits on file
name
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the string.
Virden, Larry W. [EMAIL PROTECTED] writes:
Tony Lewis [EMAIL PROTECTED] writes:
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
arbitrary limits on file name length.
If it uses the heap, then doesn't that open a hole where a
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it? I'd use asprintf, but I'm afraid to
suggest that here as it may not be portable.
On Jul 5, 2007, at 10:45 AM, Hrvoje Niksic wrote:
Tony Lewis [EMAIL PROTECTED] writes:
There is a buffer overflow
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the
Please remove me from this list. thanks,
John Bruso
From: Rich Cook [mailto:[EMAIL PROTECTED]
Sent: Thu 7/5/2007 12:30 PM
To: Hrvoje Niksic
Cc: Tony Lewis; [EMAIL PROTECTED]
Subject: Re: bug and patch: blank spaces in filenames causes looping
On Jul 5, 2007
Rich Cook [EMAIL PROTECTED] writes:
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)? Or do I need to do more to make that happen, or...? Thanks!
On Jul 5, 2007, at 12:52 PM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Rich Cook wrote:
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the future)?
Or do I need to do more to make that happen, or...? Thanks!
Well, I need a chance to
Thanks for the follow up. :-)
On Jul 5, 2007, at 3:52 PM, Micah Cowan wrote:
Rich Cook wrote:
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)?
Or do
Hello,
using Wget 1.10.2 I noticed that the man page description for
--no-proxy says:
For more information about the use of proxies with Wget,
... and that's all. The original contains an @xref, which gets
swallowed by texi2pod.
I don't know how/if it should be repaired, but I thought
Mario Ander schrieb:
Hi everybody,
I think there is a bug storing cookies with wget.
See this command line:
C:\Programme\wget\wget --user-agent="Opera/8.5 (X11;
U; en)" --no-check-certificate --keep-session-cookies
--save-cookies=cookie.txt --output-document=-
--debug --output-file
Matthias Vill schrieb:
Mario Ander schrieb:
Hi everybody,
I think there is a bug storing cookies with wget.
See this command line:
C:\Programme\wget\wget --user-agent="Opera/8.5 (X11;
U; en)" --no-check-certificate --keep-session-cookies
--save-cookies=cookie.txt --output-document
Hi everybody,
I think there is a bug storing cookies with wget.
See this command line:
C:\Programme\wget\wget --user-agent="Opera/8.5 (X11;
U; en)" --no-check-certificate --keep-session-cookies
--save-cookies=cookie.txt --output-document=-
--debug --output-file=debug.txt
--post-data=name
To: recipient-removed
Subject: RE: File issue using WGET
Your FTP server must have changed the output of the listing format or,
more precisely, the string representation of some of the components has
changed such that only one space separates the group name from the
file-size. The bug is, of course
Highlord Ares wrote:
it tries to download web pages named similar to
http://site.com?variable=yes&mode=awesome
Since & is a reserved character in many command shells, you need to quote
the URL on the command line:
wget
when I run wget on certain sites, it tries to download web pages named
similar to http://site.com?variable=yes&mode=awesome. However, wget isn't
saving any of these files, no doubt because of some file naming issue? This
problem exists in both the Windows & unix versions.
hope this helps
This does not look like a valid URL to me - shouldn't there be a slash at the
end of the domain name?
Also, when talking about a bug (or anything else), it is always helpful if you
specify the wget version (number).
From: [EMAIL PROTECTED] [mailto:[EMAIL
Greetings,
Stumbled across a bug yesterday reproduced in both v1.8.2 and 1.10.2.
Apparently, recursive get tries to open the file for reading after
downloading, to download subsequent files. Problem is, when used with
-O - to deliver to stdout, it cannot open that file, so you get
A quick search at http://www.mail-archive.com/wget@sunsite.dk/ for
-O found:
http://www.mail-archive.com/wget@sunsite.dk/msg08746.html
http://www.mail-archive.com/wget@sunsite.dk/msg08748.html
The way -O is implemented, there are all kinds of things which are
incompatible with
Neil wrote:
When giving it some thought I think a
valid argument could be made that the string in the CSS document is not
exactly
an URL but it is certainly URL-like.
The URL-like strings in CSS are actually standard URLs, either absolute or
relative, so they shouldn't be a big deal to
J.F.Groff wrote:
Amazingly I found this feature request in a 2003 message to this very
mailing
list. Are there only a few lunatics like me who think this should be
included?
Wget is written and maintained by volunteers. What you need to find is a
lunatic willing to volunteer to write the code
Hi Tony,
Amazingly I found this feature request in a 2003 message to this very
mailing
list. Are there only a few lunatics like me who think this should be
included?
Wget is written and maintained by volunteers. What you need to find is a
lunatic willing to volunteer to write the code to
Oh wait. Somebody already did the patch!
http://www.mail-archive.com/[EMAIL PROTECTED]/msg09502.html
http://article.gmane.org/gmane.comp.web.wget.patches/1867
I guess it's up to maintainers to decide whether to include this in
the standard wget distribution. In the meantime, hearty thanks to
Hi
If I connect with wget 1.10.2 (Debian Etch & Ubuntu Feisty Fawn) to a
secure host that uses multiple CNAMEs in the certificate, I get the
following error:
[EMAIL PROTECTED]:~$ wget https://host.domain.tld
--10:18:55-- https://host.domain.tld/
=> `index.html'
Resolving
I think I found a bug in CSS processing. This was auto-generated and I'm far
from a CSS expert (quite the opposite). But, as far as I can tell (see
snippet below), it is supposed to be loaded from a directory named - that
is off of the main URL. For example, if the origination site is
http
Hrvoje Niksic [EMAIL PROTECTED] writes:
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It's starting to look like a consensus. A Google search for:
wget DONE_CWD
finds:
http://www.mail-archive.com/wget@sunsite.dk/msg08741.html
That bug is fixed in subversion, revision 2194.
I
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It's starting to look like a consensus. A Google search for:
wget DONE_CWD
finds:
http://www.mail-archive.com/wget@sunsite.dk/msg08741.html
That bug is fixed in subversion, revision 2194.
Hello,
If wget cannot connect to the FTP server the first time,
it fails to CD properly after checking the path with PWD.
Here is a -d listing when connecting after failing. Thanks!
Jason
$cmd = "wget -d --limit-rate=999k --tries=0 --no-remove-listing -N
$ftp/*.rpm";
--11:06:12--
I downloaded 1.10.2 source code.
u->cmd goes from 0x1B to 0x19, dropping DO_CWD on the second call
to ftp.c:getftp() after connection failure. I'm trying to debug the loop.
Jason
This is inverted in ftp.c:
if (con->csock != -1)
con->st &= ~DONE_CWD;
else
con->st |= DONE_CWD;
If not error, request cwd?
If error, cwd done?
It's backwards. Changing != to == solves the bug.
Thanks!
Jason
It's starting to look like a consensus. A Google search for:
wget DONE_CWD
finds:
http://www.mail-archive.com/wget@sunsite.dk/msg08741.html
Steven M. Schweda [EMAIL PROTECTED]
382 South
From: Robert Dick
When serializing successive copies of a page, the serial number appears
at the end of the extension, i.e., what should be file1.html is called
file.html.1. I'm using wget ver. 1.10.2 with the default options on
Windows ME ...
I can see how that might annoy a Windows user,
: [EMAIL PROTECTED]
Subject: Re: file numbering bug
From: Robert Dick
When serializing successive copies of a page, the serial number appears
at the end of the extension, i.e., what should be file1.html is called
file.html.1. I'm using wget ver. 1.10.2 with the default options on
Windows ME
Hi Mauro (I'm guessing here - got this from the web page)
Here is a patch against 1.10.2 which fixes an issue I found when using
NTLM with Microsoft's Intermittent Information Server (IIS).
The issue is not with wget, but rather a bug in IIS. Nevertheless, here
is the fix and a description
Thanks for the report and the (correct) analysis. This patch fixes
the problem in the trunk.
2007-01-23 Hrvoje Niksic [EMAIL PROTECTED]
* cookies.c (parse_set_cookie): Would erroneously discard cookies
with unparsable expiry time.
Index: src/cookies.c
(Resend as I've received no reply to the original message.)
Kind wget maintainers,
I believe I found a bug in the wget cookie expiry handling. Recently
I was using wget receiving back a cookie with an expiration of Sun,
20-Sep-2043 19:37:28 GMT.
This fits inside a 32-bit unsigned long
Hi,
Have been downloading slackware-11.0-install-dvd.iso, but it seems wget
downloaded more than the file size, and I found:
-445900K .. .. .. .. ..119%
18.53 KB/s
in wget-log.
Regards,
Yuriy Padlyak
The file was probably still being uploaded when you started downloading it,
so the HTTP server continued sending data beyond the initially reported
file size.
Just stop wget, and start it again with option -c to resume download.
MT
Le mercredi 17 janvier 2007 à 18:16 +0200, Yuriy Padlyak a écrit :
From: Yuriy Padlyak
Have been downloading slackware-11.0-install-dvd.iso, but it seems wget
downloaded more than the file size, and I found:
-445900K .. .. .. .. ..119%
18.53 KB/s
in wget-log.
As usual, it would help if you provided some basic
Juhana Sadeharju wrote:
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1.
First, the bin/wgetdir is defined as
wget -p -E -k --proxy=off -e robots=off --passive-ftp
-o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50
--waitretry=10 $@
The download command
1 - 100 of 678 matches