a.sm...@ukgrid.net wrote:
Hi,
I'd like to have a robots.txt on a site that has the following Apache
httpd config:
<Location />
SetHandler perl-script
PerlHandler RT::Mason
</Location>
But if I install a robots.txt in the DocumentRoot and test it via wget, I
just get the front page of the site, because the request is handled by
perl-script. Is it possible to have a robots.txt in this situation?
thanks for any tips, Andy.
Ideas:
1) Try a <FilesMatch ^robots\.txt$> section inside the above <Location>
section to reset the handler to the Apache default (that may not be so
easy); see the first sketch below.
2) Create a Mason handler for the URL "robots.txt" that returns the file
"as is"; see the second sketch below.
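For idea 1, a rough sketch (untested; it assumes Apache accepts a
FilesMatch section nested inside Location here, and default-handler is
Apache's built-in static-file handler):

<Location />
SetHandler perl-script
PerlHandler RT::Mason

# hand robots.txt back to Apache's static file handler
<FilesMatch ^robots\.txt$>
SetHandler default-handler
</FilesMatch>
</Location>

If the nested section is rejected or has no effect, a separate
<Location /robots.txt> block containing "SetHandler default-handler",
placed after the <Location /> block, should do the same job, since later
<Location> sections override earlier ones.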
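For idea 2, a minimal sketch of such a Mason component, assuming it is
dropped into RT's local Mason component tree (e.g. local/html/robots.txt;
the path and the rules below are only placeholders, and RT's autohandler
or content-type handling may still get in the way):

<%init>
# $r is the mod_perl request object Mason exposes to components;
# switch from RT's default text/html to plain text.
$r->content_type('text/plain');
</%init>
User-agent: *
Disallow: /

A quick "wget -S http://yoursite/robots.txt" then shows whether the
expected Content-Type and body come back.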