Hi,

On Sat, Apr 13, 2013 at 4:50 PM, Asad <asad.hasan2...@gmail.com> wrote:

> Hi All ,
>
>          Greetings !
>
>        I have completed O'Reily first book Learning Perl . Started writing
> small perl programs . However I basically do log file analysis , hence was
> thinking is there any perl code around in any of the book or if anyone
> already developed for the following requirement :
>
> Nowadays I manually read the log files to read the errors which are either
> fatal or warning or ignorable . The purpose is I want to develop a web page
> as in I upload this log file and it searches for those keywords (fatal or
> warning or ignorable) and if found those keywords display the 10line before
> it and after it .
>
> Please share your thoughts. Unable to start .
>
>
>
I agree with David: using a "cache" works very well.
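
In case it helps, here is a minimal sketch of my reading of that "cache" idea: keep only the last few lines in a small buffer, so even a huge log never has to sit in memory all at once. The sub name `grep_context` and the keyword list are just placeholders:

[code]
use warnings;
use strict;

## Sketch of the "cache" approach: buffer the last $limit lines so
## the context *before* a match is still around when the match shows up.
sub grep_context {
    my ( $lines, $limit ) = @_;    ## array ref of lines, context size
    my ( @out, @before );
    my $after = 0;                 ## trailing-context lines still owed

    for my $line (@$lines) {
        if ( $line =~ m{\b(fatal|warning|ignorable)\b}i ) {
            push @out, @before, $line;    ## buffered context + the hit
            @before = ();
            $after  = $limit;
        }
        elsif ( $after > 0 ) {
            push @out, $line;
            $after--;
        }
        else {
            push @before, $line;
            shift @before if @before > $limit;    ## keep buffer small
        }
    }
    return @out;
}
[/code]

The same loop structure works reading line by line with while ( my $line = <$fh> ) { ... } instead of an array ref, which is the whole point for big files.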

But if your log files are not too large ( that I don't know :) ), you can
consider reading the file into an array ( of course, there are several
modules that do it for you, if you want ), then looping through the array;
once you hit the line you want with a regex, take a range of 10 lines
before it and 10 lines after it.

For Example:
[code]
use warnings;
use strict;
use File::Slurp qw(read_file);
use constant LIMIT => 3;    ## Number of context lines before and after

my $file = $ARGV[0];

my @lines = read_file($file);

for my $i ( 0 .. $#lines ) {
    if ( $lines[$i] =~ m{^\s*\b(it|mary|to)\b}i ) {
        ## Clamp the range: near the top of the file $i - LIMIT would go
        ## negative and index from the *end* of the array, and near the
        ## bottom $i + LIMIT would run past it.
        my $start = $i - LIMIT < 0       ? 0       : $i - LIMIT;
        my $end   = $i + LIMIT > $#lines ? $#lines : $i + LIMIT;
        print '* ', @lines[ $start .. $end ], $/;
    }
}
[/code]

On the CLI:
Usage: perl_script.pl Mary_has_a_little_lamp_file.txt

You have to install the module `File::Slurp` if you don't have it.
You might also consider the module `Tie::File`, which comes with your
Perl installation.
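
Since Tie::File came up, here is a hedged sketch of the same search done through it. Tie::File presents the file as an array without slurping it, so the index arithmetic stays the same while memory use stays low; the sub name and the keyword list are mine, not anything standard:

[code]
use warnings;
use strict;
use Tie::File;    ## ships with Perl itself since 5.8, nothing to install

## Sketch: the tied array is backed by the file on disk, so we can use
## the same "match then take a range" approach on large logs.
sub context_from_file {
    my ( $file, $limit ) = @_;
    tie my @lines, 'Tie::File', $file or die "Cannot tie $file: $!";
    my @out;
    for my $i ( 0 .. $#lines ) {
        next unless $lines[$i] =~ m{\b(fatal|warning|ignorable)\b}i;
        my $start = $i - $limit < 0       ? 0       : $i - $limit;
        my $end   = $i + $limit > $#lines ? $#lines : $i + $limit;
        push @out, @lines[ $start .. $end ];
    }
    untie @lines;
    return @out;    ## Tie::File strips the trailing newlines for you
}
[/code]

One caveat: Tie::File can be noticeably slower than a plain read, since each element access goes back to the file, so for a one-shot script the slurp version is usually fine.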

-- 
Tim
