> From: Mathias Walter [mailto:[EMAIL PROTECTED]
> How can I prevent crawler and bots according to their user agent?
>
> I've put a robots.txt in webapps/ROOT, but it does not seem to be
> read.

So, to check: the crawlers are not reading your robots.txt and are crawling 
your site anyway?  Note that robots.txt is purely advisory; well-behaved 
crawlers honor it, but nothing forces a bot to.

> I'd like to stop crawlers by their user agent string.

What do you mean by "stop"?  Do you want to return a 404 or similar when a 
request with a particular user agent string is received?  If so, the obvious 
approach would be to write a Filter placed in front of your webapp, or a 
Valve placed in the request processing chain, that examines the User-Agent 
header of the request and returns an appropriate response if you don't 
like the agent.
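To make that concrete, here is a minimal, stdlib-only sketch of the check such 
a Filter (or Valve) would perform.  The class name, the blocked-agent list, and 
the isBlocked helper are illustrative assumptions, not part of any Tomcat API; 
the comments show where the real servlet calls would go.

```java
import java.util.Arrays;
import java.util.List;

/**
 * Sketch of a user-agent check for a blocking Filter.  The blocked
 * substrings below are made-up examples, not a real bot list.
 */
public class UserAgentBlocker {

    // Hypothetical substrings to match against the User-Agent header.
    private static final List<String> BLOCKED =
            Arrays.asList("BadBot", "EvilCrawler");

    /** Returns true if the given User-Agent value should be rejected. */
    public static boolean isBlocked(String userAgent) {
        if (userAgent == null) {
            return false; // no header at all; choose your own policy here
        }
        for (String marker : BLOCKED) {
            if (userAgent.contains(marker)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // In a real Filter's doFilter(), you would read the header with
        // request.getHeader("User-Agent") and, on a match, call
        // response.sendError(HttpServletResponse.SC_NOT_FOUND) and
        // return without invoking the rest of the chain.
        System.out.println(isBlocked("Mozilla/5.0 (compatible; BadBot/1.0)"));
        System.out.println(isBlocked("Mozilla/5.0 (Windows NT 10.0)"));
    }
}
```

Returning 404 rather than 403 has the side benefit of not confirming to the 
bot that it is being singled out.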

                - Peter

---------------------------------------------------------------------
To start a new topic, e-mail: users@tomcat.apache.org
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]