On 7/16/05, Chris Devers <[EMAIL PROTECTED]> wrote:
> On Thu, 14 Jul 2005, Beast wrote:
> 
> > I have a prototype that parses big log files (680MB) and presents
> > the results in a nice GUI app. It's not nice if the machine totally
> > freezes during testing. (Linux, 512MB RAM / 2GB swap.)
> 
> Are you trying to read the whole file in at once, or are you trying to
> read through it a piece at a time, using the data as you walk along?
> 
> Hint: the latter approach can be far more memory efficient.
> 
> There are OS tricks that can help accommodate a misbehaving program,
> but they rarely work as well as reconsidering the algorithms in that
> program.
> 
> 
> --
> Chris Devers

Exactly.
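
To make Chris's point concrete, here's a minimal sketch of the two
approaches (the filename 'big.log' is made up; substitute your own):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Slurping: pulls all 680MB (plus Perl's per-string overhead)
    # into RAM at once. On a 512MB box this is exactly what sends
    # the machine into swap and "freezes" it:
    #
    #   open my $fh, '<', 'big.log' or die "Can't open big.log: $!";
    #   my @lines = <$fh>;            # whole file in memory
    #
    # Streaming: only one line is in memory at a time, so memory use
    # stays flat no matter how big the log grows.
    open my $fh, '<', 'big.log' or die "Can't open big.log: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        # ... do your per-line parsing here ...
    }
    close $fh;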

The short answer here is "yes", but that isn't very helpful, so why
don't you tell us what you mean by "limit", and what your goal is.
'ulimit' and its clones and relatives will keep a system from
crashing by causing the program to abort if it uses more than its
allotted memory. That saves your system, but your program will never
finish executing.

What you probably want to do is examine the approach you're taking.
Figure out why you're using so much memory, and refactor the code.
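
For example, if the GUI only needs summaries, keep running totals
while you stream instead of saving every parsed record. This is a
hypothetical sketch: the "first field is a hostname" layout is
invented, since you haven't posted your log format.

    #!/usr/bin/perl
    use strict;
    use warnings;

    open my $fh, '<', 'big.log' or die "Can't open big.log: $!";

    my %hits_per_host;
    while (<$fh>) {
        my ($host) = split ' ', $_, 2;   # first whitespace field
        next unless defined $host;       # skip blank lines
        $hits_per_host{$host}++;         # memory grows with distinct
                                         # hosts, not with file size
    }
    close $fh;

    # Hand this small summary to the GUI, not 680MB of raw lines.
    printf "%-20s %d\n", $_, $hits_per_host{$_}
        for sort { $hits_per_host{$b} <=> $hits_per_host{$a} }
            keys %hits_per_host;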

I'm sure that, with some specific examples, people on this list will
be happy to help you figure out how to do that.

In other words, if you want a useful answer from this list, submit the
problematic code.

HTH

-- jay
--------------------------------------------------
This email and attachment(s): [  ] blogable; [ x ] ask first; [  ]
private and confidential

daggerquill [at] gmail [dot] com
http://www.tuaw.com  http://www.dpguru.com  http://www.engatiki.org

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

