In an effort to reduce the number of log files that need to be run through
for log analysis, I'm attempting to set up a routine to process the
previous day's logs in the early AM and save them to a file for that
specific day. I can get it to run through ALL of my logs and output for
the previous day (all
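A minimal cron-style sketch of such a routine (the paths are hypothetical, and the use of analog's relative-date FROM/TO commands is my assumption, not something stated above):

```shell
#!/bin/sh
# Nightly sketch: run analog over yesterday's entries only and save a
# dated report. LOGDIR/OUTDIR are made-up paths; adjust to your setup.
YESTERDAY=$(date -d yesterday +%Y-%m-%d)   # GNU date; on BSD: date -v-1d +%Y-%m-%d
LOGDIR=/var/log/httpd
OUTDIR=/var/reports
REPORT="$OUTDIR/report-$YESTERDAY.html"

# +C passes inline configuration commands to analog; -00-00-01 is
# analog's relative-date form for "one day before today".
if command -v analog >/dev/null 2>&1; then
    analog +C"FROM -00-00-01" +C"TO -00-00-01" \
           +C"LOGFILE $LOGDIR/access_log" +C"OUTFILE $REPORT"
else
    echo "analog not found; would have written $REPORT"
fi
```

Run from cron in the early AM, e.g. `5 1 * * * /usr/local/bin/daily-report.sh`.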
On Thu, 20 Jan 2000, Jim Sander wrote:
> > My solution only needs 9 SUBDOMAIN commands, which is a bit more efficient
> > (in terms of memory and speed).
>
>But of course it will match on too many things- for instance:
> '1cust224.tnt1.san-bernardino.ca.da.uu.net'
>
Errr, good point. I did
> My solution only needs 9 SUBDOMAIN commands, which is a bit more efficient
> (in terms of memory and speed).
But of course it will match on too many things- for instance:
'1cust224.tnt1.san-bernardino.ca.da.uu.net'
This is a real example from a real logfile, so it's quite likely that
every
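The objection can be checked quickly with shell glob patterns, which behave like analog's `*` wildcard for this case in that `*` can match dots (a sketch of the matching behaviour, not analog itself):

```shell
#!/bin/sh
# Show that a hostname beginning with a digit slips through patterns
# intended only for dotted-quad IP addresses.
host='1cust224.tnt1.san-bernardino.ca.da.uu.net'
case $host in
    [0-9]*.*.*) echo "matched by the numeric patterns" ;;
    *)          echo "no match" ;;
esac
# prints "matched by the numeric patterns"
```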
At 15.29 19/01/2000 +, Stephen Turner wrote:
>Hmmm, it is a bit complicated.
> SUBDOMAIN *.*.*.*
>will do it, but that's not very satisfactory if you have other domains as
>well as the numerical ones. So the alternative is
> SUBDOMAIN 1*.*.*,2*.*.*,3*.*.*,... etc.
Thanks a lot!
As Ken Tho
On Thu, 20 Jan 2000, Marco Bernardini wrote:
> At 15.29 19/01/2000 +, Stephen Turner wrote:
>
> >Hmmm, it is a bit complicated.
> > SUBDOMAIN *.*.*.*
> >will do it, but that's not very satisfactory if you have other domains as
> >well as the numerical ones. So the alternative is
> > SUBDOM
Hi,
I am having a problem with our server. Our daily logfiles are approximately 9.5 Meg
(compressed).
I am trying to get some sort of setup that will just keep a 24-month 'Monthly Report'
and a 30-day 'Daily Report'.
It seems the cachefile causes analog to run out of memory after about 2 mont