On Fri, 12 Jul 2013 15:34:22 +0000 (UTC)
mrl <m...@psfc.mit.edu> wrote:
> Is there a way to block .php files from being indexed by crawlers, but
> allow other types of files to be indexed?  When the crawlers access the
> .php files, they are executed, creating lots of error messages (and
> taking up CPU cycles).  Thanks. 

Google for "robots.txt".
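
For example, a minimal sketch of a robots.txt placed at the document
root (the * and $ wildcards are extensions honored by major crawlers
such as Googlebot and Bingbot, not part of the original robots.txt
standard):

    # /robots.txt -- ask crawlers not to fetch any URL ending in .php
    User-agent: *
    Disallow: /*.php$

Keep in mind robots.txt is only a request; well-behaved crawlers honor
it, but it does not enforce anything, so misbehaving bots would still
have to be blocked in the server configuration.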

-- 
D'Arcy J.M. Cain
System Administrator, Vex.Net
http://www.Vex.Net/ IM:da...@vex.net
Voip: sip:da...@vex.net
