On Thu, 28 Apr 2005, Artem Avetisyan wrote:
> Thank you! That works.
> 
> However, as the zip gets bigger, the performance suffers badly. Which
> is no surprise, since the program has to reread and rewrite the zip on
> every cycle.

A downside to the zip format is that there's a "central directory" that
needs to be updated when files are added to the archive.  The tar format
doesn't have anything like that, and it should be possible to append new
files to the end of a tar archive efficiently.  Unfortunately Archive::Tar
doesn't appear to support that sort of usage; as far as I can see, the
only options for I/O are to read an entire archive into memory, or to
write a complete in-memory archive back to disk.
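To illustrate, here is a minimal sketch of the read-everything /
write-everything pattern Archive::Tar imposes (the file names are made
up); every "append" pays the full cost of the archive:

```perl
use strict;
use warnings;
use Archive::Tar;

# Appending one file means re-processing the whole archive:
my $tar = Archive::Tar->new;
$tar->read('backup.tar');          # slurps the entire archive into memory
$tar->add_files('new_entry.txt');  # cheap; the data just lands in memory
$tar->write('backup.tar');         # rewrites every entry back to disk
```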

If I had to do this, and scaling to really large numbers of files was
critical, I'd try using Compress::Zlib directly.  You can open a file
handle to a new gzip file and then stream data to it, which is compressed
and flushed to the disk file on the fly.  You'd have to come up with your
own way of distinguishing one file from the next and you'd have to write
your own reader for the resulting archive, but the performance should be
great and writing the archive ought to be quite simple.
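A rough sketch of that approach, with a length-prefixed record format I
made up to separate one file from the next (the archive name and framing
are my own invention, not anything Compress::Zlib prescribes):

```perl
use strict;
use warnings;
use Compress::Zlib;

# Stream files into one growing gzip archive.  Each record is:
#   4-byte name length, name, 4-byte data length, data
# Opening in 'ab' mode appends a new gzip member, so existing
# records are never reread or rewritten.
my $gz = gzopen('archive.gz', 'ab')
    or die "Cannot open archive.gz: $gzerrno";
for my $file (@ARGV) {
    open my $fh, '<:raw', $file or die "Cannot read $file: $!";
    my $data = do { local $/; <$fh> };
    close $fh;
    my $record = pack('N/a* N/a*', $file, $data);  # length-prefixed fields
    $gz->gzwrite($record) == length $record
        or die "gzwrite failed: " . $gz->gzerror;
}
$gz->gzclose;
```

A matching reader would walk the archive with gzread, pulling a 4-byte
length, then that many bytes, and repeating until EOF.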


TomP

-------------------------------------------------------------------------
Tom Pollard                                       [EMAIL PROTECTED]
Schrodinger, Inc.                                    646-366-9555 x102
-------------------------------------------------------------------------


_______________________________________________
Perl-Win32-Users mailing list
Perl-Win32-Users@listserv.ActiveState.com
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs