Deb said:
> re: Unix
>
> From the command-line, I'm currently running a find command piped to
> xargs grep:
>
> find . -type f -print | xargs egrep "some string to look for"
>
> There is an occasional requirement that this be done, and it must
> traverse a hierarchy of directories and files numbering in the many
> thousands.
>
> Run from the command line it takes a long, long time.  I even re-niced
> the command to -10 as root, but it still takes hours.
>
> Would perl be able to optimize the search and grep better than what I
> am currently doing?


It's possible, since Perl is somewhat optimised for this sort of thing. 
Of course, it also depends on how you write your Perl.  You might find a
Perl solution is slower, too.
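Before reaching for Perl, it may be worth tuning the pipeline itself.  A
job like this is usually I/O-bound rather than CPU-bound, which is likely
why renicing made no difference.  A sketch of a slightly tuned version,
assuming GNU find/xargs/grep (the search string is just the placeholder
from your post):

```shell
# Same traversal as the original, but:
#   -print0 / -0 : null-delimited filenames, safe for spaces and newlines
#   grep -F      : match a fixed string, skipping the regex engine
find . -type f -print0 | xargs -0 grep -F "some string to look for"
```

Adding grep's -l flag would also stop the scan at the first match in each
file, which can save a lot of reading on large files.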

> Ideas, jokes and rants are appreciated...


Bang "tcgrep" into Google and see if that brings you any joy.


-- 
Paul Johnson - [EMAIL PROTECTED]
http://www.pjcj.net



