That's pretty much it. But in your example, you could also read the file
line by line with a while (<FILE>) loop instead of reading the entire FILE
into @SOMEVAR. (Careful with foreach (<FILE>) -- the readline is in list
context there, so it still slurps the whole file first.) That should help
too.
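A minimal sketch of that rewrite, streaming the file one line at a time
(the pipe-delimited layout matches your sub below; the file contents here
are made up purely for illustration):

```shell
# Create a small pipe-delimited sample file (made-up data, for illustration).
printf 'alice|100\nbob|200\n' > user.data

# Stream the file line by line: memory stays bounded by one line, and the
# lexical @fields goes out of scope (and is freed) each iteration -- no
# undef bookkeeping needed.
perl -e '
    open(my $fh, "<", "user.data") or die "cannot open user.data: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /\|/, $line;
        print "$fields[0] => $fields[1]\n";
    }
    close $fh;
'

rm -f user.data
```

The key point is that nothing in the loop holds the whole file at once, so
the process footprint stays small no matter how large user.data grows.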
-Pete
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
> Sent: 04 September 2001 02:13
> To: Erick Nelson; perlunix
> Subject: [Perl-unix-users] Freeing Up Memory
>
>
>
> Hi Perl Hackers! I have a set of scripts in use on my system that I would
> like to start tweaking. They are all hand-written and not someone else's
> work. I would like to start reducing the memory it takes to run the
> scripts. I want to start adding undef calls to the scripts; is there a
> method to this madness? Do you actually have to declare variables before
> undef'ing them? A simple example of memory hogging is:
>
> # Opening a file
> sub somesub {
>     open(FILE, "user.data");
>     @SOMEVAR = <FILE>;
>     foreach $_ (@SOMEVAR) {
>         chomp($_);
>         @SPLITTED = split(/\|/, $_);
>         # ... some code here
>         undef @SPLITTED;
>     }
>     undef @SOMEVAR;
>     close(FILE);
> }
>
> Is this the correct format for reducing memory usage? This sample routine
> pulls in a large file, splits it up, then analyzes it and spits it to the
> screen via the web. The typical size of the perl process viewed via top
> during usage is around 125 to 150 megs per execution, which, when 15-20
> people are running it, means you need a real mean machine. Typically my
> machine just hangs and starts to queue the processes, and people get
> overly anxious and hit reload, doubling (and so on) the number of
> executions of these scripts.
>
> Many Thanks,
> John
>
> _______________________________________________
> Perl-Unix-Users mailing list. To unsubscribe go to
> http://listserv.ActiveState.com/mailman/subscribe/perl-unix-users