On 27 Feb 2002, at 10:55, Adam Kennedy wrote:

> If I might ask, how is it affected by the following for you.
> 
> 1. Explicitly finish the statement ( may or may not have an effect )
>     $sth->finish;

No effect. And calling finish() in this context is 'officially' 
discouraged by Tim Bunce himself (over and over again).

> 
> 2. Explicitly undef'ing the variables after the disconnect.
>     undef $sth;
>     undef $dbh;

No effect. Which seems logical to me, because i) with the *all_*ref() 
methods and large result sets, memory is mainly consumed by the data 
structure the results are stuffed into, not by $dbh and $sth, and ii) 
at least on Win2K (which I am on), memory once allocated to a process 
is *never* returned to the system as long as the process lives. This 
is a general problem, not only Perl's.

> While forking might save you some memory, it's a huge overhead if you
> wanted to do it regularly.

The original question was about saving memory. I have heard that 
forking is highly efficient on *NIX systems and does not create much 
overhead. But, as you can see from my previous post, for large result 
sets I prefer row-by-row processing on the fly or caching result sets 
on disk (a minimal sketch of the former follows below).
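
For illustration, a minimal row-by-row sketch; the DSN and table name 
are placeholders, and the per-row work is just a print:

#!/usr/bin/perl

use warnings;
use strict;
use DBI;

my $dbh = DBI->connect('<your_dsn_here>', undef, undef,
    {RaiseError => 1}) or die $DBI::errstr;
my $sth = $dbh->prepare('SELECT * FROM <a_large_table_here>');
$sth->execute;
# Only one row is held in memory at a time; note that DBI reuses the
# arrayref on each call, so copy the values if you want to keep them.
while (my $row = $sth->fetchrow_arrayref) {
    print join("\t", map { defined $_ ? $_ : 'NULL' } @$row), "\n";
}
$dbh->disconnect;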

> Adam
> ----- Original Message -----
> From: "Bodo Eing" <[EMAIL PROTECTED]>
> To: "Shao, Chunning" <[EMAIL PROTECTED]>
> Cc: <[EMAIL PROTECTED]>
> Sent: Friday, February 22, 2002 8:07 PM
> Subject: Re: Freeing memory [getting even more OT]
> 
> 
> > chunning,
> >
> > see below
> >
> > > >AFAIK unused memory is not returned to the system from Perl
> > > >processes. A workaround is to fork() out the memory consuming
> > > >steps into a child process, the memory of which is freed when the
> > > >child exits.
> >
> > > >HTH
> >
> > > >Bodo
> >
> >
> > >Could you give us examples how to do it?  I have the same problem.
> >
> > >Thanks
> >
> > >chunning
> >
> > Below find an *extremely simple* example. If you watch your system
> > monitor, you will see that the eaten memory stays in the perl
> > process's stomach even after $ary = '' and disconnecting. Memory is
> > freed only after keyboard input (when the whole program terminates).
> > If you run this script with the commented-out lines uncommented,
> > everything is forked out into a child process. Watching your monitor
> > now, you can see that memory is freed *before* keyboard input.
> > Search the archive of this list for more information about sharing
> > (or sometimes better not sharing) database and statement handles
> > between processes and related stuff.
> >
> > #!/usr/bin/perl
> >
> > use warnings;
> > use strict;
> > use DBI;
> >
> > # Uncomment the fork lines below to run the memory-hungry part in a
> > # child process, whose memory is returned to the system on exit.
> > # if (my $pid = fork()) {
> > #     wait;
> > # }
> > # else {
> > my $dbh = DBI->connect('<your_dsn_here>', undef, undef,
> >     {RaiseError => 1}) or die $DBI::errstr;
> > my $sql = "SELECT * FROM <a_large_table_here>";
> > my $sth = $dbh->prepare($sql);
> > $sth->execute;
> > my $ary = $sth->fetchall_arrayref();
> > print scalar @$ary, "\n";
> > $ary = '';
> > $dbh->disconnect;
> > #     exit;
> > # }
> > print "Done!\n";
> > my $cmd = <STDIN>;
> >
> > 1;
> > __DATA__
> >
> >
> > In my experience, the *all_*ref() methods are very handy for small
> > to medium amounts of data (below 10 MB, or better less), but for
> > large amounts of data, not only is memory usage heavy, but your
> > program will also be slowed down. If you don't know in advance how
> > much data you will get returned, and everything between zero and
> > terabytes is possible, it is advisable to i) process your results
> > row by row on the fly or ii) cache them on disk and decide then how
> > to proceed further (now knowing how many rows were returned).
> >
> > HTH
> >
> > Bodo
> >
> >
> 
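
To make option ii) from the quoted text concrete, here is a minimal 
sketch that spools the rows into a temporary file first. The DSN and 
table are placeholders, and the tab-separated format and File::Temp 
usage are just assumptions for illustration (tabs or newlines inside 
the data would break this simple format):

#!/usr/bin/perl

use warnings;
use strict;
use DBI;
use File::Temp qw(tempfile);

my $dbh = DBI->connect('<your_dsn_here>', undef, undef,
    {RaiseError => 1}) or die $DBI::errstr;
my $sth = $dbh->prepare('SELECT * FROM <a_large_table_here>');
$sth->execute;

# Spool the result set to a temp file, one tab-separated line per
# row, counting rows as we go.
my ($fh, $filename) = tempfile(UNLINK => 1);
my $rows = 0;
while (my $row = $sth->fetchrow_arrayref) {
    print $fh join("\t", map { defined $_ ? $_ : '' } @$row), "\n";
    $rows++;
}
$dbh->disconnect;

# Now we know the size and can decide how to proceed.
print "Cached $rows rows in $filename\n";
seek $fh, 0, 0;
while (my $line = <$fh>) {
    # process each cached line here
}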

Bodo

P.S.: In case of further questions please email me directly, since I 
have unsubscribed temporarily from all lists.
