Sheesh.

Will y'all just help a man with a Perl problem instead of battering him with
other ways to do it?

Sometimes people like to pose a challenge to themselves and see if it can be
done.

Instead of being counterproductive and referring people to other things, help
the man!

I wish I could, but at the moment I'm trying to get some sleep.

Good Night,
Dennis Stout


> I am not handling this at the Apache level because I have about 40+
> different sites running on this one box. I want to be able to run server
> wide statistics from the 'master' log file, then I want to strip it apart
> into the separate pieces, pipe that piece to Analog and dump the results
> into the vhost folder for the site owner to view.
>
> I don't want to have to collect and run through 40 individual files, I want
> to have the system do the work creating temporary vhost log files and
> processing them.
>
> That mod to my shell script is kind of what I am looking for. I guess I
> wasn't sure if I could do that with Perl as well.
> ---------------------------------------
> Jeremy Schwartz                Starmark
> Interactive Developer
> [EMAIL PROTECTED]        954-761-1600
>
> > From: [EMAIL PROTECTED]
> > Date: Mon, 3 Feb 2003 12:32:05 -0600
> > To: "Jeremy Schwartz" <[EMAIL PROTECTED]>, "Andy Lester" <[EMAIL PROTECTED]>
> > Cc: [EMAIL PROTECTED], [EMAIL PROTECTED]
> > Subject: Re: Search for a string
> >
> >
> > ------------------------------------------------
> > On Mon, 03 Feb 2003 13:09:47 -0500, Jeremy Schwartz <[EMAIL PROTECTED]>
> > wrote:
> >
> >> Not trying to reinvent the wheel.
> >>
> >> I am using Analog for the analysis.
> >>
> >> I am trying to split the server combined log into individual vhost logs.
> >> I can then run each through Analog to produce individual reports.
> >
> >>> Don't reinvent the wheel.  There are a number of fine log analysis
> >>> utilities, such as analog.
> >>>
> >>> xoa
> >
> > Out of curiosity, is there a reason why you are not handling this at the
> > Apache level?  Each vhost can have its own set of logs from the start,
> > which then would not need to be pulled apart.  Is this a possible scenario
> > for you going forward (granted, it doesn't help now)?  It would seem your
> > task would be better handled with a shell script, since you already have
> > the command line for creating the file(s) from the main log; just wrap
> > that command in a loop that takes your directory names as input.
> >
> > Something along the lines of:
> >
> > #!/bin/sh
> >
> > # One output log per directory under /webroot/, assuming the directory
> > # name appears in the matching log lines.
> > for dir in /webroot/*/; do
> >     vhost=$(basename "$dir")
> >     grep "$vhost" /var/log/httpd/access_log \
> >         > "/var/log/httpd/access_log_$vhost"
> > done
> >
> > I am no shell hacker and the above is untested, but you get the idea.  In
> > general Perl would not be a good choice for performing something so simple
> > that already has a command line solution available.
> >
> > If you were going to do it in Perl, rather than scanning the log file once
> > for each vhost, you would be better off unpacking or splitting each log
> > line, storing the line in an array associated with that line's vhost, and
> > then printing each vhost's array to a file.  Alternatively, you could open
> > a filehandle for each vhost at the beginning of the script and print each
> > line to whichever filehandle is associated with its vhost.  Stepping
> > through every line of the log file once for each of the vhosts would be a
> > really bad way to handle things in Perl.
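A minimal Perl sketch of the filehandle-per-vhost approach described above. The field layout is an assumption not stated in the thread: a default combined-format log does not contain the vhost name, so this assumes each line begins with the vhost (e.g. a `%v`-prefixed LogFormat); all paths and the `split_log` name are placeholders.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Split a "%v ..."-style access log into one file per vhost, keeping a hash
# of open output filehandles so each line is read and written exactly once.
sub split_log {
    my ($in, $outdir) = @_;   # $in: readable filehandle; $outdir: target dir
    my %fh;                   # vhost name => open output filehandle
    while (my $line = <$in>) {
        my ($vhost) = split ' ', $line;   # first whitespace-separated field
        next unless defined $vhost;
        unless ($fh{$vhost}) {            # open each output file only once
            open $fh{$vhost}, '>', "$outdir/access_log_$vhost"
                or die "open $vhost: $!";
        }
        print { $fh{$vhost} } $line;
    }
    close $_ for values %fh;
}

# Usage (paths are examples): perl split_log.pl /var/log/httpd/access_log /var/log/httpd
if (@ARGV == 2) {
    open my $in, '<', $ARGV[0] or die "open $ARGV[0]: $!";
    split_log($in, $ARGV[1]);
}
```

This is the single-pass version the poster recommends: the whole log is read once, and lines are fanned out to whichever handle matches, instead of grepping the full log once per vhost.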
> >
> > I would still suggest letting Apache do the splitting by not storing one
> > main log with all vhost content; it is much easier to put the logs back
> > together to get a complete picture than it is to dissect them after the
> > fact.
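For reference, the Apache-side split suggested above is one `CustomLog` directive per vhost; the directive names are standard Apache, but the ServerName and paths here are placeholders:

```apache
<VirtualHost *:80>
    ServerName www.example.com
    # Per-vhost access log; nothing to pull apart after the fact.
    CustomLog /var/log/httpd/www.example.com-access_log combined
</VirtualHost>
```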
> >
> > http://danconia.org
> >
> > --
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> >
>
>
>
