Hi,

On Jan 7, 2008 2:37 PM, G T Smith
<[EMAIL PROTECTED]> wrote:
> Marcin Floryan wrote:
> > Hi!
> >
> > Anyone have any view over the best way to watch for changes in log
> > files to do some analysis?
...
> > * to use Perl File::Tail to listen on a file and process any text that 
> > arrives
> > * to use tail -f and pipe the output to my software
...

> Personal view is that it would be best to use a Perl module, if one
> exists, inside a script daemon, rather than use a command-line call
> and pipe data to a Perl script.
>
> Not on performance grounds, but more because one can design the script
> to handle unusual events and manage processing accordingly (especially
> if you are backending with a database that in itself may be adding to
> the logs you are monitoring).
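A minimal File::Tail loop along those lines might look something like
this (untested sketch; the log path and the polling interval are just
examples):

    use strict;
    use warnings;
    use File::Tail;

    # follow a single log file; the path is just an example
    my $tail = File::Tail->new(name        => "/var/log/messages",
                               maxinterval => 5);   # poll at most every 5s

    while (defined(my $line = $tail->read)) {
        # do the actual analysis here, e.g. match a pattern and react
        print "got: $line";
    }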

I guess it depends on what one wants to do with the logs.
I used "tail" in a set of tests and found it very flexible and
convenient. For example, I did not want to know exactly which log file
the message I was looking for would appear in. I did something like

"tail -n 0 -F /var/log/messages /var/log/secure ... |  tee <some
file>" and captured the result of tee within perl script with
"expect".

So when the message appeared, I had <some file> containing all the log
messages up to that moment and could grep it or do anything else I
wanted.

(-F works even if the log file is not present when you start "tail",
or if it gets rotated by logrotate.)

Regards,
-- 
Mark Goldstein