> On Wed, 30 Jul 2003, Gordan wrote:
>
> > Here is what I propose. ZIP archives to be replaced with tar archives.

Why? Wouldn't an uncompressed/compressed zip be easier to produce, since the
required code is already present in Java?

> > Then, on a lower, "node" level, at insert time, the file being inserted
> > is checked. If its MIME type or extension is indicative of compressed
> > content, we simply insert the file as is. We do the same if the node is
> > explicitly told not to compress the file, e.g. via a flag.
> >
> > If the file still appears to be compressible after that (e.g. html,
> > text, tar), bzip2 compression is applied to the file. The file is
> > compressed and the pre/post compression size is checked. All files in
> > Freenet are padded to the next power-of-2 limit, so if the file after
> > compression remains in the same size bracket (padded to the same power
> > of 2 size), there is no point in compressing it. We just throw away the
> > compressed file and insert it uncompressed.
>
> You say there's no point in compressing it, but at that point you've
> already compressed it.  I mean, I guess you're saving that little bit of
> processing power on the eventual receiving end.

And bzip2 isn't the fastest compressor...
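The padding-bracket test Gordan describes is simple enough to sketch. Something like the following (class and method names are mine, not anything in the Freenet tree): compute the power-of-2 bracket for both sizes and keep the compressed form only if it lands in a smaller bracket.

```java
// Sketch of the proposed heuristic: since every insert is padded up to
// the next power of 2, compression only pays off when the compressed
// size falls into a strictly smaller bracket than the raw size.
public class PaddingBracket {

    // Smallest power of two >= n, i.e. the padded size of an n-byte file.
    static long bracket(long n) {
        long p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    // True only if storing the compressed form actually saves a bracket.
    static boolean worthCompressing(long rawSize, long compressedSize) {
        return bracket(compressedSize) < bracket(rawSize);
    }
}
```

So a 1500-byte file that bzip2 shrinks to 1100 bytes still pads to 2048 either way and would be inserted uncompressed, as proposed.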

> > This means that compression/decompression overhead is avoided where no
> > space/bandwidth would be saved due to other design issues. Best of all
> > worlds, I think.
>
> Well, you're not saving "compression/decompression overhead".  Just
> "decompression overhead".  Your point remains, though.
>
> > However, there is something else I have been thinking about. Tar and
> > bzip2 would require extra Java libraries. They exist, but they would
> > have to be bundled with Freenet. This would increase the download size,
> > which is not desirable. ZIP handling is already built in, and would
> > require no additional libraries. The downside is that it generally
> > doesn't seem to compress as well as bz2, and sometimes the difference
> > is quite noticeable.

What about compression/decompression speed?
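For what it's worth, the "no additional libraries" point is easy to demonstrate: deflate has shipped in the core JDK (java.util.zip) since Java 1.1, so raw deflate compression needs nothing bundled. A minimal sketch (names are illustrative, not Freenet code):

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;

// Demonstrates compressing a byte array with only core-JDK classes;
// no tar/bzip2 library would need to be bundled for this path.
public class BuiltInDeflate {

    static byte[] deflate(byte[] input) {
        Deflater d = new Deflater(Deflater.BEST_COMPRESSION);
        d.setInput(input);
        d.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        while (!d.finished()) {
            out.write(buf, 0, d.deflate(buf));
        }
        d.end();
        return out.toByteArray();
    }
}
```

Deflate is also considerably faster than bzip2 in both directions, which bears on the speed question, though it usually compresses less tightly.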

> Ugh.  Bundling issues with the JVM are already a pain in the ass.
> Bundling sucks.  (Although I think there may be a Third Way with regards
> to this.)
>
> > Thoughts please. :-)

Do not bundle; add the needed source if anything...

> The benefits of zipping data are apparent.  I'm just proposing thoughts on
> how to do it.

I am not so sure about that... I think that the bulk of Freenet data
(size-wise, not item-count-wise) is already compressed pretty well. It would
seem to me that the only stuff in Freenet that isn't already compressed is
the HTML pages/text items displayed by fproxy.

/N

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
