Re: memory issues?

2007-01-19 Thread Bertrand Baesjou

Paul Johnson wrote:

On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:

  

foreach $line (<INFILE>) {



See, this isn't a while loop, as you have in the subject.

That is the cause of your problems.
  
Damn, not very awake today, I think. I also left an old subject line 
in... But if that is the cause, where lies the solution?


Tnx.


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/


Re: memory issues?

2007-01-19 Thread Octavian Rasnita

From: Bertrand Baesjou [EMAIL PROTECTED]


Paul Johnson wrote:

On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:

  

foreach $line (<INFILE>) {



See, this isn't a while loop, as you have in the subject.

That is the cause of your problems.
  
Damn, not very awake today, I think. I also left an old subject line 
in... But if that is the cause, where lies the solution?




Use while() instead of foreach().

Octavian
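
For the archives, a minimal sketch of the line-at-a-time idiom Octavian suggests. It uses an in-memory filehandle so it runs standalone; with a real file you would pass a path to open instead:

```perl
use strict;
use warnings;

my $text = "alpha\nbeta\ngamma\n";
open my $fh, '<', \$text or die "open: $!";   # in-memory handle, demo only

my $count = 0;
while ( my $line = <$fh> ) {    # scalar context: one line per iteration
    chomp $line;
    $count++;                   # process $line here
}
close $fh;

print "$count\n";               # prints 3
```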







Re: memory issues?

2007-01-19 Thread Bertrand Baesjou

Octavian Rasnita wrote:

From: Bertrand Baesjou [EMAIL PROTECTED]


Paul Johnson wrote:

On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:

 

foreach $line (<INFILE>) {



See, this isn't a while loop, as you have in the subject.

That is the cause of your problems.
  
Damn, not very awake today, I think. I also left an old subject line 
in... But if that is the cause, where lies the solution?




Use while() instead of foreach().

Octavian


Thank you very much, this is indeed the solution.








Re: memory issues?

2007-01-19 Thread Xavier Noria

On Jan 19, 2007, at 5:53 PM, Bertrand Baesjou wrote:


Thank you very much, this is indeed the solution.


The explanation is that when you process lines this way

  foreach my $line (<FH>) { ... }

the readline operator is evaluated in list context, so the file is
slurped into a single list containing all the lines. The file is fully
read into memory first, and only then does foreach start.


In the line-oriented while loop

  while (my $line = <FH>) { ... }

you fetch one line at a time[*], so memory use stays under control
(bounded by the length of the lines in the file; that wouldn't solve
the problem if the file were GBs of data on a single line, but you see
how it works).


-- fxn

[*] Under the hood there are actually some buffers, but that's the idea.
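
Xavier's point can be observed directly; a small demonstration using in-memory handles so it is self-contained:

```perl
use strict;
use warnings;

my $data = "one\ntwo\nthree\n";

# List context: readline returns every remaining line at once.
open my $fh, '<', \$data or die "open: $!";
my @all = <$fh>;            # all three lines now live in memory together
close $fh;

# Scalar context: readline returns a single line per call.
open my $in, '<', \$data or die "open: $!";
my $first = <$in>;          # just "one\n"
close $in;

print scalar(@all), "\n";   # prints 3
print $first;               # prints "one"
```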






Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy

On Thu, 7 Feb 2002, Brian Hayes wrote:

 Hello all.  I need to read through a large (150 MB) text file line by
 line.  Does anyone know how to do this without my process swelling to
 300 megs?

As long as you aren't reading that file into an array (which would be a
foolish thing to do, IMHO), I don't see why the process would swell to 300
megs.

-- Brett
  http://www.chapelperilous.net/

-  longf_ffree;/* free file nodes in fs */
+  longf_ffree;/* freie Dateiknoten im Dateisystem */
-- Seen in a translation






Re: memory issues reading large files

2002-02-07 Thread Brian Hayes

 You should be using something like
 
 open(FILE, $file) or die "$!\n";
 while(<FILE>){
   ## do something
   }
 close FILE;
 __END__

This is what I am doing, but before any of the file is processed, the
whole text file is moved into memory.  The only solution I can think of
is to break apart the text file and read through each smaller part...but I
would like to avoid this.  I was hoping someone knew how Perl interacts
with memory and how to trick it into not reading the whole file at
once.

My main concern is scalability.  This file will continue to grow daily,
and in a year I don't want my app taking up a gig of mem.

If nothing can be done about reading a file, what about piping output from
another program?  I use artsases (anyone familiar with the arts++
package?) to generate the 150 MB file, but I could easily pipe the
results to my Perl program instead of building the file.  As far as
memory and filehandles are concerned, what differences are there between
reading from a text file vs. another program?
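
On the pipe question: a handle opened on another program's output behaves just like a file handle, and while still reads it line by line. A sketch (the child command here is only a stand-in for the real producer):

```perl
use strict;
use warnings;

# '-|' opens a read pipe from the child's stdout.  The child here is
# just perl printing two lines; substitute the real command.
open my $pipe, '-|', $^X, '-e', 'print "a\nb\n"'
    or die "cannot fork: $!";

my $n = 0;
while ( my $line = <$pipe> ) {   # one line at a time, nothing slurped
    $n++;
}
close $pipe;

print "$n\n";                    # prints 2
```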

Thanks for your help,
Brian






Re: memory issues reading large files

2002-02-07 Thread Brian Hayes

It appears the problem was using the foreach statement instead of while.
I have not tested this extensively, but using foreach the whole text
file (or output of pipe) is read into memory before continuing, but
using while (and probably for) each line is processed as it is read.  

Thanks for all your help,
Brian





Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy

On Thu, 7 Feb 2002, Brian Hayes wrote:

 It appears the problem was using the foreach statement instead of while.
 I have not tested this extensively, but using foreach the whole text
 file (or output of pipe) is read into memory before continuing, but
 using while (and probably for) each line is processed as it is read.

Yes indeed it does, because foreach operates on a list, whereas while
operates on a boolean value returned from an expression (which means
scalar context).  So if you do

foreach(<FILE>) { }

it will put <FILE> into list context and slurp the entire thing up
into a list.  'for' will do the same thing; it also operates on a list
(foreach and for are aliases for each other).
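
One way to see the context difference is to check eof() after bailing out of each loop early; a self-contained demo with an in-memory handle:

```perl
use strict;
use warnings;

my $data = "x\ny\nz\n";

# foreach: list context builds the whole list before the loop body runs.
open my $fh, '<', \$data or die "open: $!";
foreach my $line (<$fh>) {
    last;                        # bail out after the first iteration
}
my $foreach_drained = eof($fh);  # true: the handle was slurped up front
close $fh;

# while: scalar context reads only as the loop iterates.
open $fh, '<', \$data or die "open: $!";
while ( my $line = <$fh> ) {
    last;                        # bail out after the first line
}
my $while_drained = eof($fh);    # false: two lines are still unread
close $fh;

print $foreach_drained ? "drained\n" : "not drained\n";   # prints "drained"
print $while_drained  ? "drained\n" : "not drained\n";    # prints "not drained"
```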

-- Brett
  http://www.chapelperilous.net/

As you grow older, you will still do foolish things, but you will do them
with much more enthusiasm.
-- The Cowboy

