I agree with Scott that Dan has asked some very good questions, not only in
this thread but in others as well. And unlike me when I first started with
Declude, he is asking the *right* questions. Keep at it, Dan, since you are
on the right track. Your questions are helping others as well (myself
included).
The average spam/ham ratio for logs reported to Message Sniffer is
70%-75%; that is, on average 70%-75% of messages are spam. This is a
small sample (about 20 systems on average), but the range has been very
consistent.
_M
We see a constant value of 10-12% during workdays and around 30% on
weekends. This is because on weekends there are not as many legitimate
messages. These values are based on a system with 350 domains, 650
mailboxes, and 2,500 incoming messages/day.
Reporting from 01/01/2003 to yesterday on my
Hey, Scott,
The first thing to remember is that no single spam test is perfect; all
spam tests will have some false positives (which is what the weighting
system really helps with).
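The weighting idea above can be sketched in a few lines: no single test decides the outcome; the sum of the weights of all tests that fire does. This is a minimal illustration only, not Declude's actual implementation, and the test names, weights, and threshold here are hypothetical examples:

```python
# Sketch of weighted spam scoring: each test contributes a weight,
# and only the combined total determines the verdict. All names and
# values below are illustrative, not Declude defaults.
TEST_WEIGHTS = {
    "SNIFFER": 12,      # signature-based test
    "BADHEADERS": 5,
    "SPAMCOP": 6,
    "IPNOTINMX": -3,    # negative weight lowers the score
}

def spam_score(fired_tests):
    """Total weight of all tests that fired on a message."""
    return sum(TEST_WEIGHTS[t] for t in fired_tests if t in TEST_WEIGHTS)

# One test firing (even a strong one) can be offset by a negative test:
score = spam_score(["SNIFFER", "IPNOTINMX"])  # 12 - 3 = 9
is_spam = score >= 15  # example hold threshold
```

This is why a single false positive on one test rarely mislabels a message: it takes several tests agreeing to cross the threshold.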
Yes, that concept is pretty easy for me to understand on an intellectual
level. I mean, if there was a
Howdy, Scott,
SPAM-NONE weightrange x x 0 4
SPAM-VLOW weightrange x x 5 9
SPAM-LOW weightrange x x 10 14
SPAM-MID weightrange x x 15 19
SPAM-HIGH weightrange x x 20 29
SPAM-VHIGH weight x x 30 0
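The weightrange lines above can be read as bands that classify a message's total score. The following sketch (an assumption about how the bands apply, not Declude's code) uses the exact ranges from the config above, with SPAM-VHIGH treated as open-ended at 30 and up:

```python
# Bands copied from the weightrange config in this message.
# Each band matches a total score between its low and high values.
BANDS = [
    ("SPAM-NONE", 0, 4),
    ("SPAM-VLOW", 5, 9),
    ("SPAM-LOW", 10, 14),
    ("SPAM-MID", 15, 19),
    ("SPAM-HIGH", 20, 29),
]

def band_for(score):
    """Return the band name a total weight falls into, or None below 0."""
    for name, low, high in BANDS:
        if low <= score <= high:
            return name
    return "SPAM-VHIGH" if score >= 30 else None

band_for(17)  # -> "SPAM-MID"
```

Note the ranges are contiguous and non-overlapping, so exactly one band fires for any non-negative score.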
So if I am understanding what you have written above correctly, it looks
like you
Dan,
Sniffer has made a huge difference for us. We weight the test at 12 and flag
emails as spam at 15. We only ran for a couple of months without it, but I
watch our logs very closely, and the benefit of using Sniffer is significant.
Sniffer is an entirely different type of test from Declude.
Believe it or not, I actually think I do understand what's going on with the
weighting definitions in the tests, like these two default tests from
GLOBAL.CFG...
MAILFROM envfrom x x 12 0
IPNOTINMX ipnotinmx x x 0 -3
The first field is the name of the test. The second field is the type