On 4 Feb 2005, at 14:16, James Smith wrote:

On Fri, 4 Feb 2005, Denis Banovic wrote:
I have a very similar app running under mod_perl with about half a million hits a day. I need to do some optimisation, so I'm interested in which of the optimisations you used brought the best improvements.
Was it preloading modules in startup.pl, caching the 1x1 gif image, or maybe tuning the database cache (I'm using MySQL)?
I'm sure you also have usage peaks, so it would be interesting to know roughly how many hits (inserts) per hour a single server machine can handle.

Simplest thing to do is hijack the referer logs and then parse them at the end. You just need to add a unique ID for each session (via a cookie or in the URL) which is added to the logs [or placed in a standard logged variable].
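A minimal sketch of the parsing step James describes, assuming the session ID ends up as the last field of each access-log line (the log format and field position here are assumptions, not anyone's actual setup):

```perl
#!/usr/bin/perl
# Sketch: tally hits per session from an access log whose last
# whitespace-separated field is the session ID logged via a
# cookie or URL parameter. Run nightly over the rotated log.
use strict;
use warnings;

my %hits;
while (my $line = <>) {
    chomp $line;
    # Assumption: session ID is the last field; "-" means no session.
    my ($session_id) = $line =~ /(\S+)\s*$/;
    next unless defined $session_id && $session_id ne '-';
    $hits{$session_id}++;
}

# Report sessions by hit count, busiest first.
for my $id (sort { $hits{$b} <=> $hits{$a} } keys %hits) {
    print "$id\t$hits{$id}\n";
}
```

Because the counting happens offline, the request path does no database work at all.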


I totally agree with James. I'm thinking of switching to a plain log file for this rather than keeping it live (as I only generate reports once a day). I'm actually using a log-based system for user tracking, which was implemented after this counter. The counter system counts how many times a product appears in search results and how many times someone views it in detail. One good tip: if you have 20 products on the page, don't call the counter for each one, just pass all the IDs in at once. Obvious, but if you're implementing it in a rush you might miss it! The counter used to be part of the main search code, but that prevented caching.
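The "pass all the IDs in at once" tip might look like this in DBI; the table and column names are made up for illustration:

```perl
# Sketch: count all the impressions on a results page in one
# statement instead of one counter call per product.
# Table and column names are assumptions.
use strict;
use warnings;
use DBI;

sub count_impressions {
    my ($dbh, @product_ids) = @_;
    return unless @product_ids;
    my $placeholders = join ',', ('?') x @product_ids;
    # One round trip for the whole page, not one per product.
    $dbh->do(
        "UPDATE product_counter
            SET impressions = impressions + 1
          WHERE product_id IN ($placeholders)",
        undef, @product_ids,
    );
}
```

For a 20-product page this turns 20 requests and 20 statements into one of each.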


The optimisations I did were:

Preloaded modules in startup.pl (and read the image in as a global from a BEGIN block).

Used Apache::DBI->connect_on_init() so the DBH is a cached persistent connection rather than a new one on each request (I'm using MySQL as well).
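A startup.pl sketch combining both of the above; the DSN, credentials, and image path are placeholders, not the poster's actual configuration:

```perl
# startup.pl -- sketch only; DSN, credentials, and paths are assumptions.
use strict;
use warnings;
use Apache::DBI ();

# Open a cached DB connection in each child as it starts,
# rather than on its first request.
Apache::DBI->connect_on_init(
    'dbi:mysql:database=counters;host=dbhost',
    'counter_user', 'secret',
    { RaiseError => 1, AutoCommit => 1 },
);

# Slurp the 1x1 gif once at server start into a global, so the
# handler can send it without touching the filesystem per request.
our $GIF;
BEGIN {
    open my $fh, '<', '/var/www/images/1x1.gif' or die "1x1.gif: $!";
    binmode $fh;
    local $/;    # slurp mode
    $GIF = <$fh>;
    close $fh;
}

1;
```

Modules loaded here before the parent forks are shared copy-on-write across all children, which is where most of the memory saving comes from.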

I have a light (non-mod_perl) Apache at the front, which proxies to a mod_perl Apache that runs the module, and the database is on a third machine.
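The two-tier setup can be sketched with mod_proxy directives like these in the front-end httpd.conf; the path and port are assumptions:

```apache
# Front-end (light, non-mod_perl) Apache: forward only the dynamic
# counter requests to the heavy mod_perl backend on port 8080.
ProxyPass        /counter http://localhost:8080/counter
ProxyPassReverse /counter http://localhost:8080/counter

# Static files are served directly by this light server, so the
# large mod_perl children are never tied up feeding slow clients.
```

The payoff is that a pool of small front-end processes handles keep-alives and slow connections, while the expensive mod_perl children turn requests around quickly and go back to work.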

I've not got to the point of it overloading the system, so I haven't investigated the actual hit rate.

Cheers

Leo


