>
> Having tested sessions (via Embperl and Apache::Session) I have
> discovered an awful lot of rows in my database being created by robots.
> My assumption is that the robots don't support cookies so every time
> they hit a page on the site a new session is created in the database.
> When they trawl a discussion board, for example, they can very quickly
> create hundreds of sessions.
>
> How do people deal with this? The only thing I can think to do is to
> set a separate cookie on every page on the site, and then only try to
> create session data if that cookie is actually set. The disadvantage
> here is that it's a pain to do and even then I couldn't start a session
> on the first page they look at.
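The cookie-gating idea above could be sketched roughly like this. This is a minimal sketch under my own assumptions: the cookie name `seen` and the plain-header interface are illustrative, not part of Embperl or Apache::Session:

```perl
# Hypothetical sketch: only create a session once the client has
# proven it returns cookies, i.e. it sent back a cookie we set on
# an earlier response.
sub should_create_session {
    my ($cookie_header) = @_;    # raw "Cookie:" header value, may be undef
    # "seen" is an arbitrary marker cookie name chosen for this example.
    return (defined $cookie_header && $cookie_header =~ /\bseen=1\b/) ? 1 : 0;
}

# On the first request there is no cookie yet, so we would skip
# session creation and just send "Set-Cookie: seen=1"; every later
# request from a cookie-capable client then passes the test.
```

As the original poster notes, this still cannot start a session on the very first page a cookie-capable visitor sees, since the marker cookie only arrives on the second request.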
>
One solution would be to delete the session if the user agent is a known
robot, but then you have to add code like
[-
$r = shift;
# Drop the session that was just created if the client looks like a robot
$r->DeleteSession if ($ENV{HTTP_USER_AGENT} =~ /robot|otherrobot/i);
-]
to the end of each of your pages. We could add something like this to the
Embperl core so that sessions are disabled for certain user agents. I'm not
sure whether that is really a solution, though.
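For what it's worth, the user-agent test could be kept in one reusable helper instead of being repeated on every page. The list of substrings below is illustrative only; real crawler detection would need a maintained list:

```perl
# Illustrative helper: case-insensitive match of the User-Agent
# string against substrings commonly found in crawler identifiers.
my @robot_patterns = qw(robot crawler spider slurp googlebot);

sub is_robot {
    my ($ua) = @_;
    return 0 unless defined $ua;         # no User-Agent header at all
    my $re = join '|', @robot_patterns;  # build one alternation once
    return $ua =~ /$re/i ? 1 : 0;
}
```

A page could then end with a single `$r->DeleteSession if is_robot($ENV{HTTP_USER_AGENT});` line, keeping the pattern list in one place.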
Gerald
-------------------------------------------------------------
Gerald Richter ecos electronic communication services gmbh
Internetconnect * Webserver/-design/-datenbanken * Consulting
Post: Tulpenstrasse 5 D-55276 Dienheim b. Mainz
E-Mail: [EMAIL PROTECTED] Voice: +49 6133 925151
WWW: http://www.ecos.de Fax: +49 6133 925152
-------------------------------------------------------------