> Keep in mind that you can pass extra
> options to any compression program you want by using the custom compression
> support and a wrapper script like this:
>
>     #!/bin/bash
>     /path/to/program --options "$@"
>
> If you can get it on your distribution, I'd suggest looking into zstandard
> [1] for compression.  The default settings for it compress both better _and_
> faster than the default gzip settings.
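A minimal sketch of such a wrapper, with a round-trip check appended.  The
path and the choice of gzip are illustrative (gzip is assumed to be
installed everywhere; swap in zstd or lz4 and their flags as needed) -- the
key detail is forwarding all arguments with "$@" so the -d decompression
flag still reaches the real program:

```shell
# Generate a wrapper that forwards every argument (including -d for
# decompression) to the real compressor.  Path and flags are illustrative.
cat > /tmp/comp-wrapper <<'EOF'
#!/bin/sh
exec gzip --fast "$@"
EOF
chmod +x /tmp/comp-wrapper

# Round-trip check: compress, then decompress, a sample string.
printf 'sample data' | /tmp/comp-wrapper | /tmp/comp-wrapper -d
```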

According to their own website, https://facebook.github.io/zstd/, zstd
offers the best compression ratios, while lz4 provides the fastest
compression and decompression with still-competitive ratios.
The point is:  optimize for the attribute *you* need more.  A faster
algorithm means you spend less time compressing; a higher ratio means
you spend less space on disk (obviously).  So pick the algorithm with
the right balance...
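A quick way to find that balance is to benchmark candidates on a sample of
your own data.  A rough sketch below, using two gzip levels as stand-ins
since gzip is assumed to be available everywhere; substitute the zstd and
lz4 command lines to compare the algorithms discussed above:

```shell
# Rough benchmark sketch: compare output size and wall time for each
# candidate compressor on a sample file.  Replace the gzip entries with
# e.g. 'zstd -c' and 'lz4 -c' to test those instead.
f=/tmp/sample.dat
head -c 1000000 /dev/urandom | base64 > "$f"   # compressible-ish test data

for cmd in 'gzip -1' 'gzip -9'; do
    start=$(date +%s)
    size=$($cmd -c "$f" | wc -c)
    end=$(date +%s)
    echo "$cmd: $size bytes in $((end - start))s"
done
```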

Also bear in mind that some data types (images, audio, video, etc.)
are largely incompressible.  I don't recall if you've said what you're
backing up, but in those cases it's usually better to take one
super-fast pass to zip up the metadata and not dwell on ratios much.
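This is easy to demonstrate: random bytes (a stand-in for JPEG/MP3/H.264
payloads, which are already entropy-coded) don't shrink even at maximum
effort, so the extra CPU time buys nothing:

```shell
# Sketch: already-compressed data doesn't shrink again.  gzip -9 on
# random bytes produces output at least as large as the input, since
# deflate falls back to stored blocks plus header overhead.
head -c 100000 /dev/urandom > /tmp/media.bin
orig=$(wc -c < /tmp/media.bin)
comp=$(gzip -9 -c /tmp/media.bin | wc -c)
echo "original: $orig bytes  compressed: $comp bytes"
```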

Finally, if you have mixed DLEs, for example one storing computed
tomography results and another storing raw patient data, you can use
different algorithms on them:  lz4 for improved speed on the CT
images, and zstd for higher compression on the patient data.
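Since each DLE can point at its own custom-compression program, one way to
set that up is a pair of wrapper scripts, one per policy.  The paths,
flags, and compressor choices below are illustrative (the scripts are only
created here, not run, so neither lz4 nor zstd needs to be installed to
try this):

```shell
# Sketch: two wrappers so different DLEs can use different compressors.
# Point each DLE's custom-compression setting at the matching wrapper.
cat > /tmp/compress-fast <<'EOF'
#!/bin/sh
exec lz4 "$@"        # speed first: largely incompressible CT images
EOF
cat > /tmp/compress-tight <<'EOF'
#!/bin/sh
exec zstd -19 "$@"   # ratio first: highly compressible patient records
EOF
chmod +x /tmp/compress-fast /tmp/compress-tight
```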
