There may be some academic projects that compress particular types of data 
in particularly clever ways, but you have listed all the main general-purpose 
lossless tools. If it is an option, you could use lossy compression, 
which would mean re-encoding content at lower quality settings, but 
that really only applies to photographic images, video and audio.
Most of the command-line compressors have a default setting that is a 
reasonable trade-off between speed and compression. Have a look at the 
man pages to see if there are flags you can flip to scrunch things up a 
bit tighter.
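For instance, the usual maximum-effort flags look something like this (a sketch; the exact flags and their speed/memory costs vary by tool and version, so do check the man pages):

```shell
# Make a small sample file, then compress it at each tool's highest setting.
printf 'Once upon a time...\n' > tale.txt

gzip  -9  -k tale.txt   # -9: highest compression level, -k: keep the original
bzip2 -9  -k tale.txt   # -9: largest block size (900 kB)
xz    -9e -k tale.txt   # level 9 plus the "extreme" preset (slower, tighter)

ls -l tale.txt*         # compare the resulting sizes
```

On a file this tiny the compressed versions will actually be larger than the original because of header overhead; the higher settings only pay off on real data.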
Finally, the reason things are .tar.gz is that the files are bundled 
together first with tar, then the whole archive is compressed with gzip. 
This is more efficient than compressing each file and then archiving 
(.gz.tar) because the compressor can exploit similarities between 
different files. For example, if you had a directory of fairy tales that 
all started with "Once upon a time", the compressor could store that 
string once for the whole collection and re-insert it at the start of 
each file on decompression.
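The fairy-tale example above can be tried directly (file names here are made up for illustration):

```shell
# Two files sharing a common prefix, bundled then compressed as one stream.
mkdir -p tales
printf 'Once upon a time there was a frog.\n'  > tales/frog.txt
printf 'Once upon a time there was a troll.\n' > tales/troll.txt

tar -cf tales.tar tales/   # tar concatenates the files into one archive
gzip -9 tales.tar          # gzip then compresses across file boundaries

# The common one-liner does both steps at once:
#   tar -czf tales.tar.gz tales/
ls -l tales.tar.gz
```

Because both files pass through gzip as a single stream, the repeated "Once upon a time" can be encoded as a back-reference the second time it appears, which is exactly what compressing each file separately would miss.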


Isaac Close wrote:
> hello all,
>
> I'm (trying) to find the 'best' compression software/algorithm about. 
> Best as in best compression ratio; CPU time and memory footprint are not 
> a problem.
>
> I'm already well aware of 7zip, bzip2, rzip and of course gzip, but looking on 
> Google is not putting me beyond those.
>
> tar very much,
>
> Isaac.


-- 
Please post to: Hampshire@mailman.lug.org.uk
Web Interface: https://mailman.lug.org.uk/mailman/listinfo/hampshire
LUG URL: http://www.hantslug.org.uk
--------------------------------------------------------------
