Re: memory issues?
Paul Johnson wrote:
> On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:
> > foreach $line (<FH>) {
>
> See, this isn't a while loop, as you have in the subject. That is the
> cause of your problems.

Damn, not very awake today I think. I also left an old subject line in.
But if that is the cause, where lies the solution? Tnx.

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/
Re: memory issues?
From: "Bertrand Baesjou" <[EMAIL PROTECTED]>

> Paul Johnson wrote:
> > On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:
> > > foreach $line (<FH>) {
> >
> > See, this isn't a while loop, as you have in the subject. That is the
> > cause of your problems.
>
> Damn, not very awake today I think. I also left an old subject line in.
> But if that is the cause, where lies the solution?

Use while() instead of foreach().

Octavian
Re: memory issues?
Octavian Rasnita wrote:
> From: "Bertrand Baesjou" <[EMAIL PROTECTED]>
> > Paul Johnson wrote:
> > > On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:
> > > > foreach $line (<FH>) {
> > >
> > > See, this isn't a while loop, as you have in the subject. That is
> > > the cause of your problems.
> >
> > Damn, not very awake today I think. I also left an old subject line in.
> > But if that is the cause, where lies the solution?
>
> Use while() instead of foreach().
>
> Octavian

Thank you very much, this is indeed the solution.
Re: memory issues?
On Jan 19, 2007, at 5:53 PM, Bertrand Baesjou wrote:

> Thank you very much, this is indeed the solution.

The explanation is that when you process lines this way

    foreach my $line (<FH>) {
        ...
    }

the readline operator is evaluated in list context and, thus, the file is slurped into a single list holding all the lines. So first the file is fully read into memory, and only then does foreach start iterating.

In the line-oriented while loop

    while (my $line = <FH>) {
        ...
    }

you fetch line by line[*], so memory is kept under control (bounded by the length of the longest line in the file; that wouldn't help if the file were GBs of data on a single line, but you see how it works).

-- fxn

[*] Under the hood there are actually some buffers, but that's the idea.
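A minimal sketch of the two reading styles fxn describes; the file name and its contents are made up for illustration, but the list-context slurp vs. scalar-context streaming behavior is exactly as explained above:

```perl
use strict;
use warnings;

# Build a small sample file (name is just for illustration).
my $path = 'sample.txt';
open(my $out, '>', $path) or die "open: $!";
print $out "line $_\n" for 1 .. 3;
close $out;

# List context: the readline operator returns ALL lines at once,
# so foreach walks a list that already holds the whole file.
open(my $in, '<', $path) or die "open: $!";
my @all = <$in>;
close $in;

# Scalar context: while fetches one line per iteration, so memory
# stays bounded by the longest line rather than the file size.
open($in, '<', $path) or die "open: $!";
my $count = 0;
while (my $line = <$in>) {
    $count++;                  # process $line here
}
close $in;

print scalar(@all), " lines slurped, $count lines streamed\n";
unlink $path;
```

Both loops see the same three lines; the difference is only in when they arrive in memory.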
memory issues reading large files
Hello all. I need to read through a large (150 MB) text file line by line. Does anyone know how to do this without my process swelling to 300 megs? I have not been following the list, so sorry if this question has recently come up. I did not find it answered in the archives.

Thanks,
Brian
Re: memory issues reading large files
On Thu, 7 Feb 2002, Brian Hayes wrote:

> Hello all. I need to read through a large (150 MB) text file line by
> line. Does anyone know how to do this without my process swelling to
> 300 megs?

As long as you aren't reading that file into an array (which would be a foolish thing to do, IMHO), I don't see why the process would swell to 300 megs.

-- Brett
http://www.chapelperilous.net/

- long f_ffree;  /* free file nodes in fs */
+ long f_ffree;  /* freie Dateiknoten im Dateisystem */
        -- Seen in a translation
RE: memory issues reading large files
You should be using something like

    open(FILE, $file) or die "$!\n";
    while (<FILE>) {
        ## do something
    }
    close FILE;
    __END__

If you use something like

    local $/;
    $contents = <FILE>;
    __END__

then you are mistaken... my perl scripts go up to almost a gig of mem sometimes (foolish, yes, but quick to write!) ;)

-----Original Message-----
From: Brett W. McCoy [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 07, 2002 3:49 PM
To: Brian Hayes
Cc: [EMAIL PROTECTED]
Subject: Re: memory issues reading large files

On Thu, 7 Feb 2002, Brian Hayes wrote:

> Hello all. I need to read through a large (150 MB) text file line by
> line. Does anyone know how to do this without my process swelling to
> 300 megs?

As long as you aren't reading that file into an array (which would be a foolish thing to do, IMHO), I don't see why the process would swell to 300 megs.

-- Brett
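The `local $/;` idiom mentioned above is what turns readline into a whole-file slurp. A small sketch of how it behaves (file name and contents are assumptions for illustration); note how `local` confines the slurp mode to one block:

```perl
use strict;
use warnings;

my $path = 'demo.txt';
open(my $out, '>', $path) or die "open: $!";
print $out "a\nb\nc\n";
close $out;

my $contents;
{
    # Undefining the input record separator turns readline into
    # slurp mode: one call returns the entire file as a string.
    local $/;
    open(my $in, '<', $path) or die "open: $!";
    $contents = <$in>;
    close $in;
}   # $/ is restored here; later reads are line-by-line again

print length($contents), " bytes slurped\n";
unlink $path;
```

This is convenient for small files, but as the poster notes, on a large file it costs as much memory as the file is big.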
Re: memory issues reading large files
On Thu, 7 Feb 2002, Brian Hayes wrote:

> > You should be using something like
> >
> >     open(FILE, $file) or die "$!\n";
> >     while (<FILE>) {
> >         ## do something
> >     }
> >     close FILE;
> >     __END__
>
> This is what I am doing, but before any of the file is processed, the
> whole text file is moved into memory. The only solution I can think of
> is to break apart the text file and read through each smaller part...
> but I would like to avoid this. I was hoping someone knew how perl
> interacts with memory and knew how to trick it into not reading the
> whole file at one time.

Can you show the code you have? The entire file shouldn't be loaded into memory before you start reading it line by line, should it?

-- Brett
http://www.chapelperilous.net/

Hors d'oeuvres -- a ham sandwich cut into forty pieces. -- Jack Benny
Re: memory issues reading large files
> You should be using something like
>
>     open(FILE, $file) or die "$!\n";
>     while (<FILE>) {
>         ## do something
>     }
>     close FILE;
>     __END__

This is what I am doing, but before any of the file is processed, the whole text file is moved into memory. The only solution I can think of is to break apart the text file and read through each smaller part... but I would like to avoid this. I was hoping someone knew how Perl interacts with memory and knew how to trick it into not reading the whole file at one time.

My main concern is scalability. This file will continue to grow daily, and in a year I don't want my app taking up a gig of mem. If nothing can be done reading a file, what about piping output from another program? I use artsases (anyone familiar with the arts++ package?) to generate the 150 MB file, but I could easily pipe the results to my perl program instead of building the file. As far as memory and file handles are concerned, what differences are there between reading from a text file vs. another program?

Thanks for your help,
Brian
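To the piping question: reading line by line from a pipe looks exactly like reading from a file, and has the same memory behavior. A sketch, using perl itself ($^X) as a stand-in producer since artsases isn't available here:

```perl
use strict;
use warnings;

# Open a pipe instead of a file; '-|' runs the command and gives
# us a read handle on its stdout. The producer here just prints
# the numbers 1..5, one per line.
open(my $pipe, '-|', $^X, '-e', 'print "$_\n" for 1 .. 5')
    or die "pipe: $!";

my $sum = 0;
while (my $line = <$pipe>) {   # scalar context: one line at a time
    chomp $line;
    $sum += $line;
}
close $pipe;

print "sum of piped lines: $sum\n";
```

As long as the loop is a scalar-context while, only one line is in memory at a time, whether the handle is a file or a pipe.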
Re: memory issues reading large files
It appears the problem was using the foreach statement instead of while. I have not tested this extensively, but with foreach the whole text file (or output of the pipe) is read into memory before continuing, whereas with while (and probably for) each line is processed as it is read.

Thanks for all your help,
Brian
Re: memory issues reading large files
On Thu, 7 Feb 2002, Brian Hayes wrote:

> It appears the problem was using the foreach statement instead of while.
> I have not tested this extensively, but with foreach the whole text
> file (or output of the pipe) is read into memory before continuing,
> whereas with while (and probably for) each line is processed as it is
> read.

Yes indeed it does, because foreach operates on a list, whereas while operates on a boolean value returned from an expression (which means scalar context). So if you do

    foreach (<FILE>) {
    }

it will put <FILE> into list context and slurp the entire thing up into a list. 'for' will do the same thing, not process line by line: it also operates on a list (foreach and for are aliases for each other).

-- Brett
http://www.chapelperilous.net/

As you grow older, you will still do foolish things, but you will do them with much more enthusiasm. -- The Cowboy
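A quick sketch confirming the aliasing point above: `for` slurps exactly like `foreach`, since they are the same keyword (file name and contents are made up for the demonstration):

```perl
use strict;
use warnings;

my $path = 'ctx.txt';
open(my $out, '>', $path) or die "open: $!";
print $out "x\ny\nz\n";
close $out;

# foreach imposes list context: the whole file is read before
# the first iteration begins.
open(my $in, '<', $path) or die "open: $!";
my @seen_foreach;
foreach my $line (<$in>) {
    push @seen_foreach, $line;
}
close $in;

# 'for' is literally the same keyword, so it slurps too.
open($in, '<', $path) or die "open: $!";
my @seen_for;
for my $line (<$in>) {
    push @seen_for, $line;
}
close $in;

print scalar(@seen_foreach), " via foreach, ", scalar(@seen_for), " via for\n";
unlink $path;
```

Only a scalar-context `while (my $line = <$fh>)` loop reads one line at a time.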