Robots = off directive

2004-02-16 Thread chatiman
Hello,


I'm trying to download a robots.txt-protected directory and I'm having the
following problem:

- wget downloads the files but deletes them after they are downloaded, with
the following message (translated from French):
Destroyed because it must be rejected

How can I prevent this?

Thanks

PS: I'm using wget 1.8.1-6
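
The thread title points at the usual answer: turning off wget's robots
support, which is honoured by default during recursive retrieval. A minimal
sketch, assuming the directory URL is http://site/dir/ (a placeholder, not
taken from the message):

  wget -e robots=off -r http://site/dir/

The -e switch passes a wgetrc-style command on the command line; the same
setting can live in ~/.wgetrc as "robots = off". For what it's worth, the
quoted deletion message is also what wget prints when a downloaded HTML file
fails an accept/reject (-A/-R) rule, so those options are worth checking too.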




Strange behavior

2004-02-04 Thread chatiman
I'm trying to download a "directory" recursively with 
wget -rL http://site/dir/script.php
wget retrieves all the pages looking like
http://site/dir/script.php?param1=value1

but not the following:
http://site/dir/script.php?param1=value1&page=pageno


What's wrong?

Please reply to me directly, as I'm not on the list

Thanks
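
No reply is recorded here, but one hedged guess: -L restricts the crawl to
relative links, so if the page=pageno links are emitted as absolute URLs in
the HTML, they would be skipped. A sketch that drops -L and bounds the crawl
with --no-parent instead (same placeholder URL as the message):

  wget -r --no-parent "http://site/dir/script.php"

Quoting the URL also matters whenever a start URL carries a literal &, since
an unquoted & would background the command in the shell.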




skip robots

2004-02-04 Thread chatiman
I once used the "skip robots" directive in the wgetrc file,
but I can't find it anywhere in the wget 1.9.1 documentation.
Did it disappear from the doc or from the program?

Please answer me, as I'm not subscribed to this list
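
In the wget 1.9.x manual the wgetrc command is spelled "robots = on/off",
with -e robots=off as the one-shot command-line form; whether an earlier
release ever accepted a literal "skip robots" line I can't confirm from this
thread. A minimal ~/.wgetrc sketch:

  # disable robots.txt processing during recursive downloads
  robots = off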