Dear All,

Having tested sessions (via Embperl and Apache::Session), I have
discovered an awful lot of rows in my database being created by robots.
My assumption is that the robots don't support cookies, so every time
they hit a page on the site a new session is created in the database.
When they trawl a discussion board, for example, they can very quickly
create hundreds of sessions.

How do people deal with this?  The only thing I can think to do is to
set a separate cookie on every page of the site, and then only create
session data if that cookie is actually set.  The disadvantage here is
that it's a pain to do, and even then I couldn't start a session on the
first page a visitor looks at.

Any ideas gratefully received, as always.

Mike



---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
