Hi,

I have built a highly configurable content management system in mod_perl for
building websites. It consists of several modules: some are preloaded when
Apache starts, some are loaded when a request begins, and some only when needed.
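
To give an idea of the loading strategy, the split is roughly like the sketch
below (the module names are placeholders, not the real ones):

    # startup.pl, pulled in at server start via
    #   PerlRequire /path/to/startup.pl
    use strict;

    # Preloaded once in the Apache parent, so all children share the
    # compiled code via copy-on-write:
    use MyCMS::Core     ();
    use MyCMS::Template ();

    1;

    # Inside a handler, heavier pieces are only pulled in on demand:
    #   require MyCMS::ImageTools;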

I had the opportunity to test the system with 2,000 LAN-connected users to do
some performance checking, and found that with each request the server
received, the child process handling it used a little more memory. In this
test I had around 500,000 hits per day, and Apache/mod_perl used up all of the
server's 1 GB of memory in a very short time. I checked and rechecked my code
and could not find any memory leak in it, so in the end I had to configure
Apache to kill child processes after they had answered 100 requests; otherwise
they consumed too much memory.
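
The workaround amounts to something like this in httpd.conf (100 is simply the
value I ended up with):

    # Recycle each child after it has served 100 requests, so the slow
    # per-request growth never gets the chance to eat all the RAM.
    MaxRequestsPerChild 100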

Has anybody else experienced this kind of problem?

The system runs on a dual 800 MHz PIII with 1 GB of RAM on FreeBSD 4.2, and all
content is fetched from and stored in a MySQL database running on the same server.


Another problem I encountered: when handling HTTP uploads, Apache/mod_perl uses
7-8 times the size of the uploaded file in memory. If I uploaded a 10 MB file,
the child answering the request would typically use up to 170 MB, and it would
not free all of that memory once the request had been handled. Can this really
be true?
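
For reference, I would have expected a chunked read via Apache::Request
(libapreq), roughly like the sketch below, to keep the child's memory flat; the
field name, paths and buffer size are only illustrative, and error handling is
omitted:

    use Apache::Request   ();
    use Apache::Constants qw(OK);

    sub handler {
        my $r   = shift;
        my $apr = Apache::Request->new($r, TEMP_DIR => '/tmp');

        my $upload = $apr->upload('file') or return OK;
        my $fh     = $upload->fh;          # spooled to a temp file by libapreq

        open(my $out, '>', '/tmp/stored.bin') or return OK;
        my $buf;
        while (read($fh, $buf, 8192)) {    # copy 8 KB at a time
            print {$out} $buf;
        }
        close $out;

        return OK;
    }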

Yours,

// Per Moeller
