Most search engine bots/crawlers don't store cookies when they crawl
your site. One way to solve this is to keep a list of known bots and
their user-agent strings. When a visitor hits your site, check whether
it's a bot, and if it is, look in your sessions table to see whether it
has been there before so you can reuse that session. Bear in mind that
this solves the problem for most bots; there are, however, spam bots
that behave the same way and are harder to track, since they tend to
use 'normal' browser user-agent strings.
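
Roughly something like this (plain PHP, just a sketch -- the bot list,
the 'cake_sessions' table name and the lookup shown in the comments are
assumptions you would adapt to your own schema):

// Detect known crawlers by a substring match on the user-agent,
// then reuse a previously stored session row instead of creating a new one.
$knownBots = array('googlebot', 'slurp', 'msnbot', 'bingbot', 'yandex');

function isKnownBot($userAgent, $knownBots) {
    foreach ($knownBots as $bot) {
        // Case-insensitive substring check against each known bot signature
        if (stripos($userAgent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (isKnownBot($userAgent, $knownBots)) {
    // Look up an existing session for this bot (e.g. by user-agent and/or IP)
    // in your sessions table and reuse its id instead of starting a new one.
    // The exact query depends on how you store sessions; for example with PDO:
    //   $stmt = $pdo->prepare('SELECT id FROM cake_sessions WHERE user_agent = ? LIMIT 1');
    //   $stmt->execute(array($userAgent));
}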

On Mar 13, 9:35 am, wowfka <a.lic...@gmail.com> wrote:
> Hi,
>
>   I have a little problem with search engine bots :) I am storing
> sessions in the database and also tracking visitors on the site using
> records from that database. Recently I saw multiple tracked records
> with the same IP address and found that they were search engine bots
> (Google, Yahoo, etc.); there were many records with the same IP, and
> each URL generates a separate session id. Should it behave that way?
> Does each bot access to a URL create a new session id? Maybe I'm
> missing something.
>    Another question: is it a good solution to track users from the
> Cake session database? I could create another database and store
> visiting users' information there, but I don't want to create
> unnecessary duplicate code.
>
> Thanks