I was thinking about the database option too, but I wasn't sure whether it would waste database connections that are already being used to serve page data and MogileFS images (though those are mostly cached). I have no idea yet how many connections a dedicated server can handle, so I am trying to spare them. The second way is also an option, but it doesn't scale well once you grow to a few hundred thousand database entries: I can't imagine walking through all of them just to find the ~1000 IDs that got a click/hit in the last few minutes (with 5-minute cron updates). Such stats updates would have to run at longer intervals, e.g. once or twice a day, to avoid hitting memcache so hard. I will probably try flat files instead.
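Roughly what I have in mind for the flat-file variant (just a sketch; the log path, table and column names are made up, and I'm assuming mysqli):

    <?php
    // On each hit: append the document key to a local log file.
    // A single short fwrite() in append mode is effectively atomic
    // on a local filesystem, which is all this needs.
    function record_hit($key) {
        $fp = fopen('/var/log/app/hits.log', 'a');
        if ($fp) {
            fwrite($fp, $key . "\n");
            fclose($fp);
        }
    }

    // Cron, every 5 minutes: rotate the log so writers start a fresh
    // file, tally the keys, and apply the counts in one pass. Only
    // the ~1000 keys that actually got hits show up here, so there
    // is no scan over the whole table.
    function flush_hits(mysqli $db) {
        if (!@rename('/var/log/app/hits.log', '/var/log/app/hits.work')) {
            return; // no hits since the last run
        }
        $counts = array();
        foreach (file('/var/log/app/hits.work') as $line) {
            $key = trim($line);
            if ($key !== '') {
                $counts[$key] = isset($counts[$key]) ? $counts[$key] + 1 : 1;
            }
        }
        $stmt = $db->prepare('UPDATE documents SET hits = hits + ? WHERE id = ?');
        foreach ($counts as $key => $n) {
            $stmt->bind_param('is', $n, $key);
            $stmt->execute();
        }
        unlink('/var/log/app/hits.work');
    }
    ?>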
Goodwill

______________________________________________________________

> insert into foo (`key`, activity) values ( $key, 1 )
>     on duplicate key update activity = activity + 1;
>
> And I have a cron job that flushes this memory table to a disk-based
> table every few minutes, to mitigate the potential window of loss.
>
> I have considered doing stats another way: recording document keys to
> memcache, and then running pollers across the memcache instances to
> collect all potential instances of the keys.
>
> php:  memcache->set( key ); memcache->incr( key );
> cron: foreach keys do memcache->get( key ); db->insert/update( key );
>
> I have a few hundred thousand documents, so I'd probably run a few
> cron pollers to expedite the polling. I don't need to dump keys from
> memcache because the keys are already PKs in the database.
>
> Review your requirements and see to what degree data loss is an issue.
> If your traffic recording permits lossiness, this could fit the bill.
>
> Jed
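If I read the memory-table approach right, it would be something like the following; the table and column names (hit_counts, hit_totals, doc_key) are my guesses, and I'm assuming MySQL's MEMORY engine plus mysqli:

    <?php
    // Counter table held in the MEMORY engine, so the per-hit
    // upsert never touches disk:
    //
    //   CREATE TABLE hit_counts (
    //       doc_key  VARCHAR(64)  NOT NULL PRIMARY KEY,
    //       activity INT UNSIGNED NOT NULL
    //   ) ENGINE=MEMORY;

    // Per hit: one statement, no read-modify-write race.
    function count_hit(mysqli $db, $key) {
        $stmt = $db->prepare(
            'INSERT INTO hit_counts (doc_key, activity) VALUES (?, 1)
             ON DUPLICATE KEY UPDATE activity = activity + 1');
        $stmt->bind_param('s', $key);
        $stmt->execute();
    }

    // Cron, every few minutes: fold the counters into the disk-based
    // table and reset them. MEMORY tables are not transactional, so
    // lock both tables to avoid losing hits between the copy and the
    // delete; a crash still loses at most the current window.
    function flush_counts(mysqli $db) {
        $db->query('LOCK TABLES hit_counts WRITE, hit_totals WRITE');
        $db->query(
            'INSERT INTO hit_totals (doc_key, total)
             SELECT doc_key, activity FROM hit_counts
             ON DUPLICATE KEY UPDATE total = total + VALUES(total)');
        $db->query('DELETE FROM hit_counts');
        $db->query('UNLOCK TABLES');
    }
    ?>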

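And the memcache variant, as I understand it (assuming the pecl Memcache extension, where the counter methods are increment()/decrement() rather than incr(); the 'hits:' prefix and the documents table are mine):

    <?php
    // Per hit: bump a counter in memcache. add() is a no-op when the
    // key already exists, so the pair is safe to call on every hit;
    // if the key is evicted between the two calls, one hit is lost,
    // which is the lossiness Jed mentions.
    function record_hit(Memcache $mc, $key) {
        $mc->add('hits:' . $key, 0);
        $mc->increment('hits:' . $key);
    }

    // Cron poller: walk the known document keys (they are already the
    // PKs in the database, so no key dump is needed), read each
    // counter, fold it into the table, and subtract what was read so
    // hits arriving mid-poll are kept for the next run.
    function poll_hits(Memcache $mc, mysqli $db, array $doc_keys) {
        $stmt = $db->prepare('UPDATE documents SET hits = hits + ? WHERE id = ?');
        foreach ($doc_keys as $key) {
            $n = $mc->get('hits:' . $key);
            if ($n === false || $n == 0) {
                continue; // no traffic since the last poll
            }
            $mc->decrement('hits:' . $key, $n);
            $stmt->bind_param('is', $n, $key);
            $stmt->execute();
        }
    }
    ?>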