Re: maximum file size for while(FILE) loop? - maybe HASH problem?

2007-01-20 Thread Tom Phoenix

On 1/19/07, Bertrand Baesjou [EMAIL PROTECTED] wrote:


While running my script it seems to use around a gigabyte of memory
(there is 1GB of RAM and 1GB of swap in the system), might this be the
problem?


If you're running low on memory, unless you're working on an
inherently large problem, your algorithm is probably wasting some
memory.


foreach $line (<INFILE>) {
$somestorage{$linecounter}=$value;
$linecounter++;
}


Well, that builds a big hash for nothing. Unless you're trying to waste memory?
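Since the hash is written but never read, the line count can be kept without storing anything per line. A minimal sketch of that idea (the `count_lines` name is mine, not from the original script):

```perl
use strict;
use warnings;

# Count lines without keeping them: memory use stays flat no matter
# how large the file is, because each line is discarded after reading.
sub count_lines {
    my ($path) = @_;
    open my $fh, '<', $path or die "Could not open $path: $!";
    my $count = 0;
    $count++ while <$fh>;    # scalar-context read: one line at a time
    close $fh;
    return $count;
}
```

Called as count_lines($ARGV[0]), this replaces the hash, the counter, and the loop in one pass.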


print $linecounter;


You should probably put a newline at the end of your output.


system("pwd") == 0 or die "system failed: $?";



5198365system failed: 0 at ./sample1.pl line 22.


You're trying to run the command pwd, which seems to have failed.
The value of $? is zero though, which would normally indicate success.
Perhaps it's zero because the command couldn't be executed at all?
(Maybe low memory?) Does the pwd command normally work from
system()? (You're not comparing this pwd to the shell built-in, are
you?)
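For what it's worth, system's return value can distinguish "ran and failed" from "never started": perlfunc documents that system returns -1 when the program could not be started at all (a fork/exec failure, e.g. under memory pressure), with the OS error in $!. A sketch:

```perl
use strict;
use warnings;

# Separate the two failure modes of system():
#   -1      -> the command never started; $! holds the OS error
#   nonzero -> the command ran and exited with a failure status
my $rc = system('pwd');
if ($rc == -1) {
    die "failed to start pwd: $!\n";
}
elsif ($rc != 0) {
    die sprintf "pwd exited with status %d\n", $? >> 8;
}
```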

Hope this helps!

--Tom Phoenix
Stonehenge Perl Training

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/




Re: maximum file size for while(FILE) loop? - maybe HASH problem?

2007-01-19 Thread Bertrand Baesjou

Ken Foskey wrote:

On Fri, 2007-01-19 at 13:16 +0100, Bertrand Baesjou wrote:
  

Hi,

I am trying to read data from a file; I do this by using the while 
(<FILE>) { $line } construction.
However, with files roughly bigger than 430MB it seems to 
crash the script. The syntax seems fine (perl -wc reports syntax OK).


I was thinking that maybe it was overflowing a 32-bit counter 
(but that would be 536 MB, right?). Can anybody offer another solution 
to work with such large files in Perl?



No idea; a little script sample might be good.

  
While running my script it seems to use around a gigabyte of memory 
(there is 1GB of RAM and 1GB of swap in the system), might this be the 
problem?


The script below gives the error:
#!/usr/local/bin/perl
##
#

use POSIX;

my $inputFile = $ARGV[0];                    # file we are reading from
my $outputFile = "./tmp/overall-memory.dat";
my %somestorage;
my $linecounter = 0;
my $value = 40;
my $bool = 0;

open(INFILE, $inputFile) or die("Could not open log file.");  # open for input
open(OUTFILE, ">$outputFile");

foreach $line (<INFILE>) {
    $somestorage{$linecounter} = $value;
    $linecounter++;
}
close(INFILE);
close(OUTFILE);
print $linecounter;
system("pwd") == 0 or die "system failed: $?";

#wc -l samples/1169209055.trcxml
5198365 samples/1169209055.trcxml
# ./sample1.pl samples/1169209055.trcxml
5198365system failed: 0 at ./sample1.pl line 22.

Any ideas how to solve my problem?
Tnx!
   Bertrand



Re: maximum file size for while(FILE) loop? - maybe HASH problem?

2007-01-19 Thread Paul Johnson
On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:

 foreach $line (<INFILE>) {

See, this isn't a while loop, as you have in the subject.

That is the cause of your problems.
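The reason this matters: foreach imposes list context on `<INFILE>`, so the entire file is read into an in-memory list before the first iteration runs, while `while` reads in scalar context, one line at a time. A sketch of the constant-memory form (the `count_lines_while` name is mine):

```perl
use strict;
use warnings;

# foreach my $line (<$fh>) { ... }   # list context: slurps ALL lines first
# while  (my $line = <$fh>) { ... }  # scalar context: one line in memory
sub count_lines_while {
    my ($path) = @_;
    open my $fh, '<', $path or die "Could not open $path: $!";
    my $n = 0;
    while (my $line = <$fh>) {
        $n++;                # memory stays flat for any file size
    }
    close $fh;
    return $n;
}
```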

-- 
Paul Johnson - [EMAIL PROTECTED]
http://www.pjcj.net
