Dear GNU Developers,
We just ran into a situation where we had to spider a site of our
own on an outsourced service because the company was going out of
business. Because wget respects the robots.txt file, however, we
could not get an archive made until we had the outsourced company
delete the robots.txt file.
-e robots=off
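For reference, a sketch of how the flag is used: `-e` passes a wgetrc-style command on the command line, and `robots=off` tells wget to ignore robots.txt. The URL and the companion mirroring flags below are illustrative, not part of the original report.

```shell
# Archive a site you own, ignoring robots.txt.
# Use this only on sites you have permission to spider.
wget -e robots=off \
     --mirror \
     --page-requisites \
     --convert-links \
     https://example.com/
```

`--mirror` enables recursion with timestamping, `--page-requisites` fetches images and stylesheets, and `--convert-links` rewrites links for local browsing.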
Jon W. Backstrom wrote:
> Dear GNU Developers,
> We just ran into a situation where we had to spider a site of our
> own on an outsourced service because the company was going out of
> business. Because wget respects the robots.txt file, however, we
> could not get an archive made