> -----Original Message-----
> From: G.W. Haywood [mailto:[EMAIL PROTECTED]]
> Sent: Monday, October 30, 2000 10:06 AM
> To: Geoffrey Young
> Cc: [EMAIL PROTECTED]
> Subject: RE: maximum (practical) size of $r->notes
> 
> 
> Hi all,
> 
> On Mon, 30 Oct 2000, Geoffrey Young wrote:
> 
> > > From: G.W. Haywood [mailto:[EMAIL PROTECTED]]
> > > If it's a huge amount of data and you don't want to bloat your
> > > processes, why not pass a tempfile name/pointer/handle in
> > > $r->notes
> > 
> > or (easier) just place a reference to a variable containing your data
> > in pnotes instead of notes - that way a reference, and not the data,
> > is passed around.  the data has to exist somewhere, but now you only
> > have one copy of it...
> 
> If the data is in a Perl variable somewhere Perl will already have
> grabbed enough memory to store it.  Won't Perl then just keep that
> memory until the child dies, even if you undef the variable containing
> the data?  

that is my understanding... I guess my point was that if the data is going
to live in a Perl variable somewhere, that memory is going to be taken
anyway (for example, writing it to a tempfile only to local $/ and slurp it
back in later).  pnotes allows passing by reference, so it really doesn't
matter when you read it in or where you use it - you still only have one
copy...
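a rough sketch of what I mean (package names and the 'big_data' key are
made up for illustration - this assumes mod_perl 1.x and two handlers
running in different phases of the same request).  note that plain
$r->notes is an Apache table, so it stringifies whatever you store in it;
pnotes is a Perl-side hash, so the reference survives intact:

```perl
# Sketch only -- My::Stash, My::Consumer, and the 'big_data' note name
# are hypothetical.  An early phase stashes a *reference* to the data:
package My::Stash;
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;
    my $big = read_big_thing();        # the only copy of the data
    $r->pnotes(big_data => \$big);     # pnotes preserves the reference
    return OK;
}

# A later phase dereferences it -- no second copy is ever made:
package My::Consumer;
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;
    my $ref = $r->pnotes('big_data');
    my $len = length $$ref;            # works on the same single copy
    # ... use $$ref here ...
    return OK;
}
```

pnotes is also cleared at the end of the request, so the reference (and
whatever it points to, if nothing else holds it) can go away with it.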

but then again, it's monday morning and my coffee was weak today...



> My idea was to avoid this cause of process bloat.
> 
> If the data isn't in a Perl data structure is this always safe?

I dunno

--Geoff

> 
> 73,
> Ged.
> 
> 
