You should be able to filter based on the user agent; it depends on how you are logging.

I use mod_log_sql to log accesses to MySQL rather than to files. It keeps
all the data in one place, which also makes it easy to dump out and bzip2
up for archiving. It also means you can use SQL to crunch out custom
stats really quickly; that, or a little Perl, and it's done.
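
To give a concrete idea of the SQL crunching, here's a sketch of a
daily-hits query. It assumes mod_log_sql's default access_log layout
(time_stamp stored as a unix epoch, plus agent and remote_host columns);
your column names may differ, so check your schema first. The archiving
side is then just mysqldump piped through bzip2.

    -- daily hits and unique hosts, skipping obvious robots
    -- (assumes the default access_log schema: time_stamp is a
    --  unix epoch INT, agent and remote_host are varchars)
    SELECT FROM_UNIXTIME(time_stamp, '%Y-%m-%d') AS day,
           COUNT(*)                    AS hits,
           COUNT(DISTINCT remote_host) AS unique_hosts
    FROM access_log
    WHERE agent NOT LIKE '%bot%'
      AND agent NOT LIKE '%spider%'
      AND agent NOT LIKE '%crawl%'
    GROUP BY day
    ORDER BY day;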

The downside is that awstats and other tools don't seem to be able to
feed straight from mod_log_sql. modlogan does, but it's now defunct,
and I can't get it to talk to MySQL on anything other than port 3306
or /tmp/mysql.sock. Bah.

Please note: mod_log_sql will feed into more databases than just
MySQL. Also note that it's worth running a second instance of MySQL
just for the logs, *especially* if, like me, you run replication
across your entire 'application' MySQL database.
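
Roughly what I mean by a second instance, as a sketch: a separate
my.cnf, and then mod_log_sql pointed at it. The port, socket, paths
and credentials below are made up for illustration, and the directive
names are as I remember them from the mod_log_sql docs, so verify
both against your own install.

    # /etc/mysql-logs/my.cnf -- a second mysqld just for the logs
    # (port/socket/datadir here are examples, not defaults)
    [mysqld]
    port    = 3307
    socket  = /var/run/mysqld/mysqld-logs.sock
    datadir = /var/lib/mysql-logs

    # httpd.conf -- point mod_log_sql at that instance
    # (check these directive names against your mod_log_sql version)
    LogSQLLoginInfo mysql://logger:secret@127.0.0.1:3307/apachelogs
    LogSQLTransferLogTable access_log

That way a burst of logging traffic never touches the application
database or its replication stream.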

If you are using awstats or something similar, you should be able
to filter out the agents there as well.
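
In awstats the relevant knob is SkipUserAgents in the per-site config
file; something like the sketch below (the patterns are only examples,
so check the REGEX[] syntax against the awstats config reference for
your version). Note too that awstats already recognises a good number
of known robots on its own and reports them separately from real
visitors.

    # awstats.example.com.conf -- drop robot hits from the stats
    # (patterns illustrative only)
    SkipUserAgents="REGEX[bot] REGEX[spider] REGEX[crawl]"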

Dean

Peter Chubb wrote:
"Dean" == Dean Hamstead <[EMAIL PROTECTED]> writes:

Dean> are you meaning in your logs or actually stopping/reducing the
Dean> webbots?  Dean

I mean filtering the logs so I can see who's using the website.  I'm
not particularly concerned with stopping the robots, but continued funding to
keep the site up depends partly on showing that real people use the site.

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
