You might do better to build an alternation:
my $keywords = join "|", map quotemeta, @keywords;  # quotemeta, in case a keyword has regex metacharacters
if ( $wholefile =~ /^\s*($keywords)\s*$/mo ) {   # do you want 'i' ?

rather than the loop.  Worth a benchmark, but one search for multiple 
matches is probably faster than multiple searches once the files get to 
any real size. 
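If you want numbers, Benchmark's cmpthese will give a quick comparison. 
A rough sketch, assuming @keywords and $wholefile are already populated 
as above (the two subs don't do identical work, so treat the results as 
ballpark):

use Benchmark qw(cmpthese);

my $pat = join "|", map quotemeta, @keywords;

cmpthese( -3, {                  # run each sub for about 3 CPU seconds
    loop => sub {
        my $hits = 0;
        for my $kw (@keywords) {
            $hits++ if $wholefile =~ /^\s*\Q$kw\E\s*$/m;
        }
        $hits;
    },
    alternation => sub {
        my $hits = 0;
        $hits++ while $wholefile =~ /^\s*($pat)\s*$/gm;
        $hits;
    },
} );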

You also might do better to process the file once, pull out the keyword 
chunks, and put them in a hash:
my %keywords;
while ( $wholefile =~ /^\s*([A-Z]+)\s*$/gm ) {  # scalar context lets /g walk the matches
    $keywords{$1}++;
}

...

if ( $keywords{ uc($lookup_keyword) } ) {
    # found one
}

Depends, again, on how often you need to search.  If, say, you've got 
50 files and 100 searches, it may save time to read each file once, 
collect the keywords, and then make 100 hash lookups (you could append 
the filename to $keywords{$keyword} instead of just keeping a counter) 
as opposed to 50 full reads times 100.  If the files are static, you'll 
win big by writing the index info to a file (you can even use the 
Storable module to save the whole hash) and just opening that the next time.
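A sketch of that static-file case, assuming the files are listed in 
@files and using a made-up index filename; Storable's store/retrieve 
do the serialization:

use Storable qw(store retrieve);

my $index_file = 'keyword_index.sto';   # made-up name
my %keywords;

if ( -e $index_file ) {
    %keywords = %{ retrieve($index_file) };    # reuse the saved index
}
else {
    for my $file (@files) {
        open my $fh, '<', $file or die "can't open $file: $!";
        my $wholefile = do { local $/; <$fh> };   # slurp the file
        while ( $wholefile =~ /^\s*([A-Z]+)\s*$/gm ) {
            push @{ $keywords{$1} }, $file;   # filename instead of a counter
        }
    }
    store( \%keywords, $index_file );   # save the index for next run
}

# then: my @where = @{ $keywords{ uc $lookup_keyword } || [] };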

Or, look at htdig (www.htdig.org).

a

Andy Bach, Sys. Mangler
Internet: [EMAIL PROTECTED] 
VOICE: (608) 261-5738  FAX 264-5932

Rule #8 Don't close the latch on an empty DLT drive