wget alpha: -r --spider, number of broken links

2006-07-20 Thread Stefan Melbinger

Hi again,

I don't think that non-existent robots.txt files should be reported as 
broken links (as long as they are not referenced by any page).


Current output when spanning two hosts (e.g., -D 
www.domain1.com,www.domain2.com):


-------------------------------------------------------
Found 2 broken links.

http://www.domain1.com/robots.txt referred by:
(null)
http://www.domain2.com/robots.txt referred by:
(null)
-------------------------------------------------------
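
For reference, a report like the one above should be reproducible with an 
invocation along these lines (the starting URL is only an example; -H 
enables host spanning and -D restricts the span to the two listed domains):

  wget -r -H --spider -D www.domain1.com,www.domain2.com http://www.domain1.com/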

What do you think?

Greets,
  Stefan



Re: wget alpha: -r --spider, number of broken links

2006-07-20 Thread Mauro Tortonesi

Stefan Melbinger wrote:

I don't think that non-existent robots.txt files should be reported as 
broken links (as long as they are not referenced by any page).


Current output when spanning two hosts (e.g., -D 
www.domain1.com,www.domain2.com):


-------------------------------------------------------
Found 2 broken links.

http://www.domain1.com/robots.txt referred by:
(null)
http://www.domain2.com/robots.txt referred by:
(null)
-------------------------------------------------------

What do you think?


Hi Stefan,

Of course you're right, but you are also late ;-)

In fact, this bug is already fixed in the current version of wget, which 
you can retrieve from our source code repository:


http://www.gnu.org/software/wget/wgetdev.html#development

Thank you very much for your report anyway.

--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi  http://www.tortonesi.com

University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it