Michael,

I would probably simply run a script something like

'while true; do date >> ps.log; ps axv | egrep "perl|tomcat|apache" >> ps.log; sleep 15; done'
Then compare the results over time (probably using a perl script to parse
and accumulate the data you need).
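
Something along these lines is the sort of parsing script I mean (a rough,
untested sketch; it assumes the log format produced by the loop above, and
that RSS is the 8th column of ps axv output):

  #!/usr/bin/perl
  # Untested sketch: average RSS per process group from ps.log.
  # Assumes a `date` line starts each sample, followed by matching
  # `ps axv` lines, and that RSS is the 8th column of ps axv output.
  use strict;
  use warnings;

  my (%rss_total, %seen);
  my $sample = 0;

  open my $fh, '<', 'ps.log' or die "ps.log: $!";
  while (my $line = <$fh>) {
      if ($line =~ /^\w{3}\s+\w{3}\s+\d/) {    # a `date` line starts a new sample
          $sample++;
          next;
      }
      for my $group (qw(perl tomcat apache)) {
          next unless $line =~ /$group/i;
          my @f = split ' ', $line;
          next unless defined $f[7] && $f[7] =~ /^\d+$/;
          $rss_total{$group} += $f[7];         # RSS in kB
          $seen{$group}{$sample} = 1;
      }
  }
  close $fh;

  for my $group (sort keys %rss_total) {
      my $n = keys %{ $seen{$group} };
      printf "%-7s avg %.0f kB over %d samples\n",
          $group, $rss_total{$group} / $n, $n;
  }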

sar is also nice for logging and giving overall stats, e.g.

[EMAIL PROTECTED]:~$  sar -r 2 5
Linux 2.6.15-27-386 (reepy)     14/02/07

10:26:16    kbmemfree kbmemused  %memused kbbuffers  kbcached kbswpfree kbswpused  %swpused  kbswpcad
10:26:18        48964    142516     74.43     15372     46796    984856     19164      1.91         4
10:26:20        48964    142516     74.43     15372     46796    984856     19164      1.91         4
10:26:22        48964    142516     74.43     15376     46796    984856     19164      1.91         4
10:26:24        48964    142516     74.43     15376     46796    984856     19164      1.91         4
10:26:26        48836    142644     74.50     15376     46796    984856     19164      1.91         4
Average:        48938    142542     74.44     15374     46796    984856     19164      1.91         4


Regards, Martin

On 2/14/07, Michael Lake <[EMAIL PROTECTED]> wrote:
Amos Shapira wrote:
> On 13/02/07, Mike Lake <[EMAIL PROTECTED]> wrote:
>
>>
>> Googling for 'memory profiler web applications' and things brings up
>> things that you use to find memory leaks in apps, which I don't want.
>> Naturally top just gives me instantaneous values which don't mean much
>> when a web app is only getting a few hits a minute or even less.
>> That's why I want to get an average over a few hours or so.
>>
>> Also I don't have Gnome or any gui thing on this server so it has to be
>> command line or a perl or bash or other program that can be run from
>> command line. Output to file would be perfect.
>>
>> Does anyone have suggestions? What do people here use for getting stats
>> on programs like this?
>
>
> I'm not sure there is anything special about web applications - after all,
> to the system they should look like just another process, although it
> usually generates lots of network traffic.
Yes, that's correct. I would just look at the sum total of all the Perl
processes (= the Perl app that is running) vs the sum of all the Java
stuff (= Tomcat).
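Something quick like this would do that per-group sum (again an untested
sketch, assuming RSS is the 8th column of ps axv output):

  #!/usr/bin/perl
  # Untested sketch: sum the current RSS (kB) of all perl processes vs all
  # java/tomcat processes from one snapshot of `ps axv`.
  use strict;
  use warnings;

  my %total;
  open my $ps, '-|', 'ps axv' or die "ps: $!";
  while (<$ps>) {
      my @f = split ' ';
      next unless defined $f[7] && $f[7] =~ /^\d+$/;   # skip the header line
      $total{perl} += $f[7] if /perl/i;
      $total{java} += $f[7] if /java|tomcat/i;
  }
  close $ps;
  printf "%-5s %d kB\n", $_, $total{$_} for sort keys %total;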

> "exmap" seems to be something about this, I haven't used it but from its
> Debian package dependencies it looks like it depends on GTK2 so it must be
> some sort of a GUI-based application. But maybe you can run it remotely
> with its window opened on your local $DISPLAY.
I found that using apt-cache on my laptop, but installing it will pull in GTK.
Since you mention it, I have just looked it up on the web.
It's using GTK for display only, and it's a perl script underneath that does
the analysis. See its homepage at: http://www.berthels.co.uk/exmap/

"Exmap is a tool to allow the real memory usage of a collection of
processes to be examined. A linux kernel loadable module is used to
export information to userspace, which is examined by a perl/gtk
application to build a picture of how pages are shared amongst
processes and their shared libraries."

BUT!

"Exmap is linux-specific, since it uses a linux kernel loadable module. 
Additionally,
the kernel module requires a fairly recent kernel (2.6.8 works, as may some 
earlier
2.6) in order to successfully compile or run."

The server I have is a vserver running a 2.4.22 kernel, so exmap is out anyway.
Thanks for the suggestion.

Mike
--
Michael Lake
Computational Research Support Unit
Science Faculty, UTS
Ph: 9514 2238





--
Regards, Martin

Martin Visser
--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
