Why not create a sample dataset and copy some large .txt files into it to
see what happens?  That way you'll know for certain whether you're hitting
a bug, or whether compression simply isn't effective on your current data.
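You can also see the expected outcome in miniature without touching the pool. The sketch below uses Python's zlib at its fastest level as a rough stand-in for lzjb (an assumption; it is not the same algorithm) to compare repetitive text against dense, random-like data of the kind found in uncompressed .tif and CD audio files:

```python
import os
import zlib

# Repetitive text compresses well; random bytes stand in for
# already-dense payloads (.tif images, .bin audio tracks).
text = b"the quick brown fox jumps over the lazy dog\n" * 1000
dense = os.urandom(len(text))  # effectively incompressible

def ratio(raw: bytes) -> float:
    # Raw size over compressed size, like zfs's compressratio.
    return len(raw) / len(zlib.compress(raw, 1))

print(f"text ratio:  {ratio(text):.2f}")   # well above 1.00
print(f"dense ratio: {ratio(dense):.2f}")  # about 1.00, no space saved
```

If the real data behaves like the second case, a compressratio of 1.00 is the expected result, not a bug.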



On 1/27/08, Joachim Pihl <[EMAIL PROTECTED]> wrote:
>
> On Sat, 26 Jan 2008 23:33:16 +0100, Toby Thain <[EMAIL PROTECTED]> wrote:
>
> > On 26-Jan-08, at 2:24 AM, Joachim Pihl wrote:
>
> >> So far so good, "zfs get all" reports compression to be active. Now for
> >> the problem: After adding another 300GB of uncompressed .tif and
> >> .bin/.cue
> >> (audio CD) files,
> >
> > I wouldn't expect very much compression on that kind of data.
>
> I didn't expect miracles, but since WinRAR gave 13% compression on a
> representative data set with ZIP compression at the fastest setting, I was
> expecting to see a compression ratio > 1.05 at least, not == 1.00. Getting
> 5-10% more space on a drive never hurts.
>
> Could it be that I should specify the compression algorithm explicitly?
> The documentation says lzjb is used by default, so I did not set it.
>
>
> --
> Joachim
>
> Why use foreign words when there are adequate Norwegian substitutes?
> _______________________________________________
> zfs-discuss mailing list
> zfs-discuss@opensolaris.org
> http://mail.opensolaris.org/mailman/listinfo/zfs-discuss
>
