Do you have enough RAM/SWAP? I use Analog on Solaris and have
never had a problem with large log files. My current logfile
is over 750MB. I just ran an analysis using Analog 3.0 on the
750+ MB file and it only took 11 minutes 39 seconds.
One thing I have found in general is that if you cut the log,
gzip it, and then zcat it into analog, it runs much faster.
You can write a script to copy the logfile to a safe place,
gzip it, then zcat it into analog and produce an output file.
You may or may not want to zero out the live log file after you
copy it (but before you gzip the copy), so new hits go into a
fresh file.
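Something along these lines would do it. This is only a rough
sketch: the paths and filenames are made up, and the analog
command-line options are my assumptions, so check them against
your own setup and the analog documentation before relying on it.

    #!/bin/sh
    # Rough sketch -- assumed paths; adjust for your server.
    LOG=/var/log/httpd/access_log       # live web server log
    SAFE=/var/log/httpd/archive         # safe place for copies
    STAMP=`date +%Y%m%d`

    # 1. Copy the live log to a safe place.
    cp $LOG $SAFE/access_log.$STAMP

    # 2. Optionally zero out the live log so new hits start fresh.
    #    (Any hits arriving between the copy and this step are lost.)
    # : > $LOG

    # 3. Compress the copy.
    gzip $SAFE/access_log.$STAMP

    # 4. Feed the compressed copy to analog on standard input and
    #    write a report.  "LOGFILE -" should make analog read stdin;
    #    this assumes your analog.cfg does not already name a LOGFILE.
    zcat $SAFE/access_log.$STAMP.gz | \
        analog +C"LOGFILE -" +C"OUTFILE $SAFE/report.$STAMP.html"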
Regards,
mark
From: "otto j. simon" <[EMAIL PROTECTED]> AT internet on 12/03/98
12:53 PM
To: [EMAIL PROTECTED] AT internet@ccMTA-GEMS-CO-01
Subject: [analog-help] Getting slow
Hi all!
I am using Analog 3.11 on a Linux system.
Everything works fine until I try to process larger quantities
of log lines.
After processing about 500,000 lines the whole system gets
*really* slow, almost halting my web server - even a
nice/renice doesn't help. DNS lookups are already turned off.
Any idea what to do?
- Otto -
---------------------------------------------------------
Otto J. Simon CEO of simon media GmbH
Andreas-Hofer-Platz 9 A-8010 Graz / Austria, Europe
vox: +43/316/813 8240 fax: +43/316/813 8246
[EMAIL PROTECTED] http://www.sime.com
---------------------------------------------------------
--------------------------------------------------------------------
This is the analog-help mailing list. To unsubscribe from this
mailing list, send mail to [EMAIL PROTECTED]
with "unsubscribe analog-help" in the main BODY OF THE MESSAGE.
--------------------------------------------------------------------