On Monday, March 14, 2011 14:53:35 André Warnier wrote:
> > I'd like to have a robots.txt on a site that has the following apache
> >
> > httpd config:
> >
> > <Location />
> > SetHandler perl-script
> > PerlHandler RT::Mason
> > </Location>
> >
> >
> > But if I install a robots.txt in the DocumentRoot and test it via wget, I
> > just download the front page of the site, as it's handled by perl-script.
> > Is it possible to have a robots.txt in this situation?
> >
> > thanks for any tips, Andy.
>
> Ideas:
> 1) Try a <FilesMatch ^robots\.txt$> section inside the above section, to
> reset the handler to the Apache default (that may not be so easy)
SetHandler None (imho)
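As a minimal sketch of that idea (untested, and using a second <Location> block rather than <FilesMatch>, since later <Location> sections override earlier ones for matching URLs):

```apache
<Location />
    SetHandler perl-script
    PerlHandler RT::Mason
</Location>

# Appears later in the config, so it wins for /robots.txt only;
# SetHandler None undoes the earlier SetHandler and lets the
# default file handler serve robots.txt from the DocumentRoot.
<Location /robots.txt>
    SetHandler None
</Location>
```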
> 2) Create a Mason handler to handle the URL "robots.txt" and return the
> file "as is"
or instead of the Location block:
PerlMapToStorageHandler "sub { \
use Apache2::Const -compile=>DECLINED; \
use Apache2::RequestRec (); \
use Apache2::RequestUtil (); \
unless( $_[0]->uri eq '/robots.txt' ) { \
$_[0]->add_config(['SetHandler perl-script', \
'PerlHandler RT::Mason']); \
} \
return Apache2::Const::DECLINED; \
}"
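The same logic may be easier to maintain as a named handler in its own module (a sketch, untested; the package name MyMapToStorage is made up):

```perl
package MyMapToStorage;
use strict;
use warnings;
use Apache2::Const -compile => 'DECLINED';
use Apache2::RequestRec ();
use Apache2::RequestUtil ();

sub handler {
    my $r = shift;
    # Hand every URI except /robots.txt to Mason; robots.txt
    # falls through to the default file handler.
    unless ($r->uri eq '/robots.txt') {
        $r->add_config(['SetHandler perl-script',
                        'PerlHandler RT::Mason']);
    }
    return Apache2::Const::DECLINED;
}

1;
```

loaded in httpd.conf with:

```apache
PerlMapToStorageHandler MyMapToStorage
```

Either way, `wget http://yoursite/robots.txt` should then fetch the file itself rather than the Mason front page.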
Torsten Förtsch
--
Need professional modperl support? Hire me! (http://foertsch.name)
Like fantasy? http://kabatinte.net