On Tue, Mar 30, 2021 at 05:57:55PM +0500, ???? ??????? wrote:
> also, I read that slz stops compressing when CPU level reaches some
> threshold. is it related to all gzip, including zlib ?

It's neither; it's haproxy itself which does this, based on the
"maxcompcpuusage" setting (which defaults to 100, i.e. unlimited).
Similarly you can cap zlib's memory usage with "maxzlibmem", which
makes haproxy refrain from compressing new streams once the limit is
reached. For me the memory is the real problem here: 30k concurrent
streams will consume about 9 GB.
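As a rough sanity check of that figure (a sketch, not haproxy's actual
accounting): zlib's own documentation gives deflate's state size as
(1 << (windowBits+2)) + (1 << (memLevel+9)) bytes, i.e. 256 KiB with the
default windowBits=15 and memLevel=8. That raw state alone is ~7.3 GiB
for 30k streams; haproxy's per-stream buffers and overhead bring the
total near the 9 GB mentioned above.

```python
# Back-of-the-envelope check of the "30k streams ~ 9 GB" figure.
# Formula is from zlib's documentation (zconf.h): deflate needs
#   (1 << (windowBits + 2)) + (1 << (memLevel + 9)) bytes per stream.

def deflate_mem(window_bits=15, mem_level=8):
    """Per-stream deflate state size in bytes, zlib defaults."""
    return (1 << (window_bits + 2)) + (1 << (mem_level + 9))

per_stream = deflate_mem()                    # 262144 bytes = 256 KiB
streams = 30_000
total_gib = streams * per_stream / (1 << 30)  # raw deflate state only
print(f"{per_stream // 1024} KiB per stream, "
      f"~{total_gib:.1f} GiB for {streams} streams")
```

The gap between ~7.3 GiB and 9 GB is the per-stream buffering on top of
the compressor state, which this sketch deliberately ignores.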

> if so, we can safely stick with any compression lib (but I agree that
> having benchmarks would help people)

There are a few benchmarks on the site and in the README, and some can
be found in dedicated tools. We could indeed produce some more detailed
tests on cheap dedicated servers with limited bandwidth and unlimited
data to see which one is better for which case, but I recall rules of
thumb such as "use zlib below a 50 Mbps output link on Atoms if you've
got lots of RAM, otherwise use slz".
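For reference, the choice between slz and zlib is made when haproxy is
built, and the limits discussed above are global settings. A sketch
(the numeric values are illustrative, not recommendations):

```
# Build against one compression library or the other:
#   make TARGET=linux-glibc USE_SLZ=1    (stateless, low memory)
#   make TARGET=linux-glibc USE_ZLIB=1   (better ratio, heavy per-stream state)

global
    # stop compressing when process CPU usage exceeds this percentage
    maxcompcpuusage 85
    # with zlib only: stop compressing new streams past this many megabytes
    maxzlibmem 1024
```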

Willy
