Quoting /dev/rob0 <r...@gmx.co.uk>:

On Wed, Apr 18, 2012 at 04:33:31AM +0300, Henrik K wrote:
Still, is it too much to ask to look at
things from many angles, or to back up claims with any kind of
statistics or science instead of personal gut feelings?

Where/how would one collect such data? My mail stream differs from
yours, as does my spam problem. The best, meticulously gathered
statistics from one site won't be applicable to another site.

Unfortunately the gut is what we have. My gut feeling is that SPF
lookups are the surest way to make this scheme work without causing
some kind of problem. Yes, my MX is also the outbound relay, but at
bigger sites this is less likely.

Another gut feeling: greylisting is past its prime. I do it using
postscreen, but I sometimes consider disabling the deep protocol
tests. The DNSBL scoring system is what blocks most of my spam.
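For context, a setup like the one described above might look roughly like
this in main.cf; this is only a sketch, and the DNSBL names, weights, and
threshold are illustrative placeholders, not anyone's actual settings:

    postscreen_access_list = permit_mynetworks
    postscreen_dnsbl_sites = zen.spamhaus.org*3, bl.spamcop.net*1
    postscreen_dnsbl_threshold = 3
    postscreen_dnsbl_action = enforce
    postscreen_greet_action = enforce
    # the "deep protocol tests"; setting these to "no" disables them
    postscreen_pipelining_enable = no
    postscreen_non_smtp_command_enable = no
    postscreen_bare_newline_enable = no

The deep tests run after the 220 greeting, so a client that passes them
still has to disconnect and come back before mail is accepted, which is
where the greylisting-like delay comes from.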

And that's how the "gut feelings" differ. On our site, greylisting is by far the most effective spam block. For a long time we had problems because the RBL listings for spam sources only appeared after they had already dropped their spam on us, so pure RBL/DNSBL checking is near useless for us. With greylisting, a big share of the spam bots never come back at all, and the ones that operate longer are finally listed in the RBLs by the time they would pass greylisting. Combined with a big automatic whitelist, the negative impact of greylisting is near zero, because all business partners and the like are whitelisted.
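A minimal sketch of that combination, assuming postgrey as the policy
server (the implementation isn't named above); the port, delay, and
auto-whitelist count are just the usual defaults, not our actual values:

    # main.cf: hand unknown clients to the greylisting policy server
    smtpd_recipient_restrictions =
        permit_mynetworks,
        permit_sasl_authenticated,
        reject_unauth_destination,
        check_policy_service inet:127.0.0.1:10023

    # postgrey invocation: after 5 successful deliveries a client is
    # auto-whitelisted and no longer delayed, which is what keeps the
    # impact on regular correspondents near zero
    postgrey --inet=10023 --delay=300 --auto-whitelist-clients=5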

Regards

Andreas

