Folks,

As I emailed earlier, I am working on a log parsing program. This program
needs to be able to handle ~1500 logs totalling ~100MB in size. Currently
we are only pulling certain information out of the logs, but this can
change at any time. I am trying to figure out a clean way to opendir, read
all the files in, and then split them into 3 - 5 separate lists so I can
fork children to parse out the required errors. I am not yet certain how
to accomplish the splitting of the files or the forking.
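Roughly, this is what I am picturing, though I have not tested it and the
directory path and child count are just placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $logdir    = '/var/log/myapp';   # placeholder path
    my $nchildren = 4;                  # 3 - 5 children, say

    # Read every plain file in the log directory.
    opendir my $dh, $logdir or die "Cannot opendir $logdir: $!";
    my @files = grep { -f "$logdir/$_" } readdir $dh;
    closedir $dh;

    # Deal the files round-robin into $nchildren buckets.
    my @buckets;
    for my $i (0 .. $#files) {
        push @{ $buckets[ $i % $nchildren ] }, "$logdir/$files[$i]";
    }

    # Fork one child per bucket; each child parses its own list.
    my @pids;
    for my $bucket (@buckets) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            parse_files(@$bucket);    # child does the actual parsing
            exit 0;
        }
        push @pids, $pid;
    }
    waitpid $_, 0 for @pids;          # parent waits for all children

    sub parse_files {
        my @list = @_;
        # ... open each file and count the errors here ...
    }

Does that look like a sane way to split the work, or is there a better
approach?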

In the meantime I am thinking of ways to cleanly parse the logfiles while
counting the individual errors. I was thinking of an if (/pattern/i) { $var++; }
per error, but this seems kludgy and like it would take a great deal of time
to parse the quantity of logs I have. Could you recommend some clean ways to
parse 300 - 400 logfiles, pulling out certain information with little system
impact? I do not think parsing each file in a foreach (@arrayOfFiles) is the
right choice, do you?
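For the counting itself, the best I have come up with so far is a hash of
precompiled patterns instead of a separate $var for each error. The error
names and regexes below are made up; the real ones would come from whatever
we need to pull out this week:

    use strict;
    use warnings;

    # Placeholder error patterns -- swap in the real ones.
    my %patterns = (
        timeout   => qr/connection timed out/i,
        disk_full => qr/no space left on device/i,
        auth_fail => qr/authentication failure/i,
    );

    my %count;
    foreach my $file (@ARGV) {
        open my $fh, '<', $file or do { warn "Cannot open $file: $!"; next };
        while (my $line = <$fh>) {
            # Bump the counter for every pattern the line matches.
            for my $name (keys %patterns) {
                $count{$name}++ if $line =~ $patterns{$name};
            }
        }
        close $fh;
    }

    printf "%-12s %d\n", $_, $count{$_} || 0 for sort keys %patterns;

At least that way adding or removing an error is just editing the hash, but
I am not sure how it will hold up against 100MB of logs.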

Regards,
Ron

Paul,
any word on that skeleton code you mentioned?

