Do you mean filtering them out of your logs, or actually stopping/reducing the webbots?

Dean

Peter Chubb wrote:
Hi,
        I'm looking at my website logs (using the webalizer package) and
most of the hits seem to be some sort of robot. Google tops the bill
at 24% of the hits, with 9.7 GB downloaded this month so far... but
then Google also shows up as the chief external referring site.


How can I easily filter out all the 'bots and get an estimate of how
many *real* people are using the website?  It used to be relatively
easy when there were fewer search engines, but now there seem to be dozens.
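If counting humans in the logs is enough (rather than blocking the bots), a rough first pass is to drop any request whose User-Agent contains common crawler strings and then count distinct client IPs. A minimal sketch follows, assuming the Apache/nginx "combined" log format and a log path of /var/log/apache2/access.log; the bot substrings and regex are illustrative and will need tuning for your own logs:

#!/usr/bin/env python3
"""Rough visitor count: skip requests whose User-Agent looks like a
crawler, then count distinct client IPs as an estimate of real people.
Assumes the Apache/nginx "combined" log format."""
import re
import sys

# combined format: IP ident user [time] "request" status size "referer" "agent"
LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

# Substrings seen in most crawler User-Agents -- extend as new bots appear.
BOT_HINTS = ("bot", "crawl", "spider", "slurp", "archiver", "wget", "curl")

def looks_like_bot(agent: str) -> bool:
    agent = agent.lower()
    return any(hint in agent for hint in BOT_HINTS)

def main(path: str) -> None:
    humans = set()
    bot_requests = 0
    with open(path, errors="replace") as log:
        for line in log:
            m = LINE.match(line)
            if not m:
                continue          # skip lines that aren't in combined format
            ip, agent = m.group(1), m.group(2)
            if looks_like_bot(agent):
                bot_requests += 1
            else:
                humans.add(ip)
    print(f"requests flagged as bots:    {bot_requests}")
    print(f"distinct non-bot client IPs: {len(humans)}")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "/var/log/apache2/access.log")

Counting distinct IPs undercounts users behind NAT and overcounts users on dynamic addresses, so treat the number as an estimate, not a measurement.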



--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html