Well... here's an update: I've found that it's actually the $tar->write() calls
that are slowing the whole operation down. I'm guessing write() actually writes
and then re-reads the tar, so as the archive grows in size it takes longer and
more CPU resources to read it EACH time... which could be MANY times, depending
on the number of 'flushes' being called. So... I removed the flush altogether
and, surprisingly, everything worked as it should. Funny... I can't remember why
I first put the flush in there to begin with.
So everything seems tip-top. Sorry for the false alarm. (Hopefully this isn't a
double false alarm... but I'll be sure to let everyone know!! hehe)
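
For the record, the shape that works for me now is just the obvious one: add
everything first, then write once at the end. A minimal sketch of that approach
(untested; the source directory and output name are made up, and this is written
against the documented Archive::Tar API, so double-check it against version 0.22):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Archive::Tar;
use File::Find;

# Hypothetical paths, just for illustration.
my $src_dir = '/path/to/backup';
my $out     = 'backup.tar';

my $tar = Archive::Tar->new();

# Collect every plain file under the source tree.
my @files;
find(sub { push @files, $File::Find::name if -f }, $src_dir);
$tar->add_files(@files);

# A single write() at the very end -- no intermediate "flushes",
# so the growing archive is never re-read mid-run.
$tar->write($out, 0);    # second argument 0 = no compression
```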

        Thanks for listening!
        -Chris



On Saturday 14 December 2002 08:01 pm, Dr. Poo wrote:
> My name is Chris, HI!
>
>	I'm working on a backup application, and I'm using your Archive::Tar
> Perl module (version 0.22). My question to you is: how can I "flush" the
> in-memory archive once it reaches a certain condition... say a certain
> size?
>
>	My problem is this: I'm trying to tar a directory of unknown size (with
> sub-directories, sub-sub-directories, etc.), and it can be anywhere from
> under one megabyte to almost a GIG! But I don't exactly have that much RAM
> (memory), if that even matters, and after only 5 seconds or so when
> processing a directory that is about 250 megs, perl is already taking up
> HUGE!!!! amounts of CPU resources (99%)... though strangely it seems to be
> taking up only about 3-5% of available memory (I have only 64 MB
> installed). ?? Why ?? I guess I'm not understanding how the module handles
> the tar in memory. I'm getting the CPU and memory stats by watching the
> unix 'top' utility.
>
>	I've tried to flush the tar object by calling the following when the
> 'buffer' reaches a certain size:
>		$tar->write('foo.tar', 0);
>		$tar->remove($tar->list_files());
> ...then start adding the remaining files to the emptied $tar object (and
> flush again if the buffer exceeds the max size again).
>
>	That makes everything go REALLY fast, as I want... BUT it actually
> removes the contents of the tar *file*, not just from memory. That doesn't
> make sense to me, because the write() method (I thought) wrote what was in
> memory to the file specified. So as I see it, my 'flush' above would first
> update the tar file with the contents in memory, then remove the files
> from memory, then add the new files to memory, and if need be flush again.
>
>	Should I flush the 'buffer' into multiple spanning archives and combine
> them at the end of the whole process? Do you have any ideas?
>
>
>       Thanks a bundle!
>               -Chris
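
The spanning-archive idea from the quoted question could look something like the
sketch below: instead of write()-then-remove() on one object, write out a
numbered volume whenever a size threshold is crossed and start over with a
fresh, empty Archive::Tar object. (Untested sketch: the directory, the 50 MB
threshold, and the volume naming are all made up, and the byte counter only
tracks raw file sizes, not tar header overhead.)

```perl
use strict;
use warnings;
use Archive::Tar;
use File::Find;

my $src_dir  = '/path/to/backup';   # hypothetical
my $max_size = 50 * 1024 * 1024;    # flush threshold: 50 MB (arbitrary)

my $volume = 0;
my $tar    = Archive::Tar->new();
my $bytes  = 0;

find(sub {
    return unless -f;
    $tar->add_files($File::Find::name);
    $bytes += -s _;    # reuse the stat from the -f test
    if ($bytes >= $max_size) {
        # Flush this volume to disk, then continue with a brand-new
        # object instead of remove()-ing entries from the same one.
        $tar->write(sprintf('backup-%03d.tar', $volume++), 0);
        $tar   = Archive::Tar->new();
        $bytes = 0;
    }
}, $src_dir);

# Write out the final partial volume, if any.
$tar->write(sprintf('backup-%03d.tar', $volume), 0) if $bytes;
```

If a single archive is needed at the end, the volumes can be joined afterwards
with GNU tar's --concatenate (-A) option.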
