Hi list,

I have a Perl script I wrote. It's very simple: it lists the contents of a file line by line. With only 19 lines to read in, when I watch the program in top as it is called from the web, it hogs 296 MB of RAM during its execution. That is way too much for a script; Windows only requires 64 MB to operate semi-normally. :)

The flat-file database it reads in looks like this:

|xxx|xxx|xxx|xxx|xxx|xxx|\n

Nothing major here. Here is the basic script in a nutshell, minus the HTML and such:

$user = $input{'userid'};
$pass = $input{'password'};

print &PrintHeader;
print <<EOM;
some html here
EOM

open(GFAV, "reminder/reminder$user");   # grabs the user's note file
@FAV = <GFAV>;
close(GFAV);

open(DATA, "online.dat");
flock(DATA, LOCK_SH);
seek(DATA, 0, 0);

while (<DATA>) {
    chop($_);
    @add = split(/\|/, $_);

    my $exist   = "no";
    my $xxexist = 0;
    my $sinote  = "xx/si$add[1]";
    my $note    = "xx/$add[1]";
    $exist = "yes" if -e $note;

    if ($add[1] ne "") {   # make sure to skip any entries that have a blank first element
        my $cnt = 1;
        print "<td>$add[4]</td><td>$add[5]</td><td>$add[6]</td><td>$add[7]</td>";

        foreach $favline (@FAV) {
            chop($favline);
            @fa = split(/\|/, $favline);
            if ($add[1] eq $fa[1]) {
                print "Some additional html on reminder if found for this entry";
                last;
            }
        }

        flock(DATA, LOCK_UN);
        close(DATA);
        exit;
    }
}

This is the big memory hogger? I have another script that does roughly the same thing, except it scans a file of around 10,000 lines, checks a date, and if the date matches it grabs the detail for that keyword from another file and prints the line to the browser. Much, much more opening and scanning of files, etc., and that script barely takes any memory at all.

For some reason (maybe tired eyes?) I cannot see where the script above should be consuming that much memory: 296 MB for a 19-line data file?? I would appreciate any insight anyone may have into this.

Thanks a lot,
John
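
P.S. In case it helps anyone try to reproduce this, here is a rough, stripped-down command-line version of the same read loop; it is only a sketch, not the exact production code. There is no CGI and no HTML, the userid comes from the command line instead of %input, I pulled in Fcntl for the LOCK_* constants, and it walks every line rather than bailing out after the first entry. It reads the same reminder/reminder<userid> and online.dat files as above:

#!/usr/bin/perl
# Stripped-down test harness for the read loop above: no CGI, no HTML output.
# Uses the same files as the real script: reminder/reminder<userid> and online.dat.
use strict;
use warnings;
use Fcntl qw(:flock);

my $user = shift @ARGV or die "usage: $0 userid\n";

# Slurp the user's reminder file into memory, same as the real script does.
open(my $gfav, '<', "reminder/reminder$user") or die "reminder/reminder$user: $!";
my @fav = <$gfav>;
close($gfav);

# Walk the pipe-delimited data file one line at a time.
open(my $data, '<', 'online.dat') or die "online.dat: $!";
flock($data, LOCK_SH);
while (my $line = <$data>) {
    chomp $line;
    my @add = split /\|/, $line;
    next unless defined $add[1] && $add[1] ne '';   # skip entries with a blank first element

    print join(' | ', grep { defined } @add[4 .. 7]), "\n";

    # Check the reminder file for a matching entry.
    foreach my $favline (@fav) {
        chomp $favline;
        my @fa = split /\|/, $favline;
        if (defined $fa[1] && $add[1] eq $fa[1]) {
            print "  (reminder found for this entry)\n";
            last;
        }
    }
}
flock($data, LOCK_UN);
close($data);

Running that directly with something like "perl readtest.pl someuser" and watching it in top should show whether the memory use comes from the read loop itself or from the CGI/web side of things.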