Your message dated Wed, 18 Jul 2007 15:04:36 +0100 (WEST)
with message-id <[EMAIL PROTECTED]>
and subject line #178467 wget does not honour --delete-after for some sites
has caused the attached Bug report to be marked as done.
This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.
(NB: If you are a system administrator and have no idea what I am
talking about this indicates a serious mail system misconfiguration
somewhere. Please contact me immediately.)
Debian bug tracking system administrator
(administrator, Debian Bugs database)
--- Begin Message ---
Package: wget
Version: 1.8.2-8
Severity: normal
When using wget as:
wget --spider --force-html --delete-after \
    -e robots=off \
    --timeout 4 --tries 2 \
    --user-agent 'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)' \
    "$URL"
it leaves HTML files lying around, not honouring the --delete-after
option. It does this, for example, for the following sites:
ftp://ftp.belnet.be
ftp://wilma.cs.brown.edu/pub/comp.lang.postscript/
ftp://incoming.redhat.com/
ftp://rtfm.mit.edu/
ftp://ftp.lysator.liu.se/pub/sf-texts/html_index/sf_full.html
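For reference, the expected behaviour of --delete-after can be checked
against a local server instead of the FTP sites above. This is only a
sketch: the loopback URL, port 8123, and file names are illustrative,
and it assumes wget and python3 are available locally.

```shell
set -e
tmp=$(mktemp -d); out=$(mktemp -d)
echo '<html><body>hello</body></html>' > "$tmp/page.html"
# Serve the page on the loopback interface in the background.
python3 -m http.server 8123 --bind 127.0.0.1 --directory "$tmp" >/dev/null 2>&1 &
srv=$!
sleep 1
# Download the page into $out; --delete-after should remove it afterwards.
wget --quiet --delete-after -P "$out" "http://127.0.0.1:8123/page.html"
kill "$srv"
# With --delete-after honoured, nothing remains in $out.
ls -A "$out"
```

The bug reported here is that, for the sites listed above, the
equivalent check would instead show leftover index.html files.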
An example of the last lines of the wget command output:
13:22:33 (427.73 KB/s) - `.listing' saved [438]
Removed `.listing'.
Wrote HTML-ized index to `index.html.3' [843].
Another example of the last lines of the wget command output:
13:36:18 (530.27 KB/s) - Data connection: Connection timed out; Giving up.
Removing the --force-html option has no effect on the bug.
-- System Information
Debian Release: testing/unstable
Architecture: i386
Kernel: Linux snob 2.4.20vbc02 #1 Tue Jan 21 23:52:38 WET 2003 i586
Locale: LANG=C, LC_CTYPE=pt_PT
Versions of packages wget depends on:
ii libc6 2.3.1-10 GNU C Library: Shared libraries an
ii libssl0.9.6 0.9.6g-6 SSL shared libraries
--- End Message ---
--- Begin Message ---
Version: 1.10.2
Noèl Köthe wrote:
> ...
> Can you still reproduce this problem with wget 1.10.2?
> ...
No, I can't. I've tried the same command on the same sites
(those still up, anyway), and it worked as expected.
Actually, the problem must have been with the --spider
option, and it has apparently been corrected.
I'm closing the bug, since the problem appears to have
gone away.
Thanks.
--
Carlos Sousa
http://vbc.dyndns.org/
--- End Message ---