On 02/12/2015 11:04 PM, Quanah Gibson-Mount wrote:
Has anyone done any real testing of Redis as a bayes backend?  Talking
with one of our customers, with a trivial <60,000 accounts, they are seeing:

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
22452 redis     20   0 28.7g  28g  740 S  9.6 72.4 1139:38 redis-server

28GB purely for a bayes DB for < 60k accounts is insanity.


2015-02-12 23:19 Axb wrote:
Note: you could see 28GB of memory usage for 10 users if you
configure it to do so.

My production Redis bayes with about 75k users looks like:
[...]
# Memory
used_memory_human:3.59G
used_memory_peak_human:3.68G
[...]
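
Those figures come straight from Redis' INFO output; to check your own
instance the same way, something along these lines should work (host and
port here are only placeholders, adjust to your setup):

  $ redis-cli -h 127.0.0.1 -p 6379 INFO memory | grep -E 'used_memory(_peak)?_human'
  used_memory_human:3.59G
  used_memory_peak_human:3.68G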
local_bayes_redis.cf:
bayes_seen_ttl 1d
bayes_token_ttl 5d


Right, keeping the _ttl values low is the main memory control mechanism.

In particular, the bayes_seen_ttl can be kept very low with
no ill effects. Then use the bayes_token_ttl for adjusting
memory usage - keep it as high as can be comfortably afforded.

The other valuable control is:
  bayes_auto_learn_on_error 1
which restricts auto-learning to messages where the Bayes classifier
disagreed with the overall verdict, so far fewer tokens end up stored.
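
For reference, a minimal local.cf sketch tying these knobs together could
look roughly like this - the dsn, database number and TTL values below are
only illustrative, not taken from the setups above:

  bayes_store_module        Mail::SpamAssassin::BayesStore::Redis
  bayes_sql_dsn             server=127.0.0.1:6379;database=2
  bayes_token_ttl           21d
  bayes_seen_ttl            1d
  bayes_auto_expire         1
  bayes_auto_learn          1
  bayes_auto_learn_on_error 1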


Mark
