List of urls

2010-10-25 Thread Richard Smits

Hello,

Does anyone know if it's possible to have a list of URLs and define a 
score for all of them in one line?



Right now I do it like this:

uri url_1 /www.domain1.com/
uri url_2 /www.domain2.com/
uri url_3 /www.domain3.com/
uri url_4 /www.domain4.com/

score url_1 10
score url_2 10
score url_3 10
score url_4 10


But I want just one line to define the score. Are there other ways to 
do this?


Greetings .. Richard
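
One way to avoid the repetition (a sketch; the rule name URL_LIST is just 
an illustration, not an existing rule) is to match all the domains in a 
single uri rule using regex alternation, so only one score line is needed:

```
# one uri rule matching any of the four domains
uri      URL_LIST  /www\.domain[1234]\.com/
describe URL_LIST  URL from our local blocklist
score    URL_LIST  10
```

For unrelated domain names, a plain alternation such as
/www\.(?:domain1|otherdomain)\.com/ works the same way.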



Russian spam

2010-01-25 Thread Richard Smits

Hello,

Does anyone know any tricks to fight Russian spam? We have been getting 
a lot of it over the last weeks.


I am looking at the RelayCountry plugin, but I am worried that our 
Russian customers will get more false positives.


It is difficult because SA does not recognize the Russian charset, but 
are there some secret tricks I should know about?


Greetings, Richard ...
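
One common (if blunt) knob for this is SpamAssassin's locale settings, 
which make mail in unexpected charsets hit rules such as CHARSET_FARAWAY. 
A sketch only — the language list is an assumption and would need tuning 
so the Russian customers are not hit:

```
# local.cf -- expect e.g. English/Dutch mail; other charsets score higher
ok_locales   en
ok_languages en nl   # requires the TextCat plugin to be loaded

# nudge the "faraway charset" rule up a little
score CHARSET_FARAWAY 3.0
```

Per-user preferences (or a separate policy bank for the Russian customers) 
would be one way to keep their false-positive rate down.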


Re: Returned mail spam

2008-04-17 Thread Richard Smits
How safe is it to raise the score for ANY_BOUNCE_MESSAGE?
Is it reliable enough that I can give it 5 or 10 points?

Is anyone doing this? (Maybe a step too far?)

Greetings Richard

Matus UHLAR - fantomas wrote:
>> Graham Murray wrote:
>>> If you publish a suitable SPF record then you will not receive any
>>> backscatter (which is the subject of this thread) from sites which
>>> correctly implement SPF checking.
> 
> On 16.04.08 18:06, mouss wrote:
>> without spf, you will not receive any backscatter from sites which do 
>> not accept-then-bounce.
> 
> even with SPF ... SPF changes nothing here.
> 
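
For reference, ANY_BOUNCE_MESSAGE comes from SpamAssassin's VBounce 
plugin, which only behaves sensibly once it knows which relays 
legitimately send bounces back to you. A minimal sketch — the relay 
hostname is an example, not from this thread:

```
# in a .pre file (e.g. v312.pre): enable the plugin
loadplugin Mail::SpamAssassin::Plugin::VBounce

# in local.cf: relays that legitimately handle our outbound mail;
# genuine bounces coming back through them are not penalized
whitelist_bounce_relays mail.example.com

# raise the score cautiously rather than jumping straight to 10
score ANY_BOUNCE_MESSAGE 5.0
```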


Re: Advice on MTA blacklist

2007-10-09 Thread Richard Smits
> 
> Hello,
> 
> Which spam blacklists do you use in your MTA config. (postfix)
> smtpd_client_restrictions
> 
> Currently we only use : reject_rbl_client list.dsbl.org
> 
> 
> We let spamassassin fight the rest of the spam. But the load of spam is
> getting too high for our organisation. Which list is safe enough to block
> senders at MTA level?
> 
> Spamhaus, or spamcop ?
> 
> I would like to hear some advice or maybe your current setup ?
> 
> Thank you for any advice we can use .
> 
> Greetings Richard
> 
> 
> I would use Spamhaus for the MTA reject and SpamCop in SA.   I've also been
> evaluating a very interesting new RBL for several weeks called
> "ivmSIP".  It's designed to work after RBLs like Spamhaus to catch
> what they miss, and it works quite well so far.  It's catching about 30%
> of the mail that makes it past both Spamhaus and SpamCop (and of course
> some of that mail is actually not spam :).  The web site for the new
> list isn't ready yet, but you can ask for a trial feed by emailing  Mr.
> Rob McEwen at [EMAIL PROTECTED] 

Thanks for all the advice. I think we will be using Spamhaus. I am
running a test and it blocks a lot of spam. Currently I use
sbl.spamhaus and pbl.spamhaus.
Is this wise, or should I also use the XBL and switch to zen.spamhaus?
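
A typical Postfix setup along those lines (a sketch; zen.spamhaus.org 
combines the SBL, XBL and PBL, so it replaces the separate lookups):

```
# main.cf -- reject listed clients at SMTP time
smtpd_client_restrictions =
    permit_mynetworks,
    reject_rbl_client zen.spamhaus.org
```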




New distribution rule not working ?

2007-09-21 Thread Richard Smits
Hi,

In a spam mail I found this rule:
RCVD_IN_DNSWL_MED=-4

But it is spam, and I have never seen this rule before. It looks like a
DNS whitelist?

Greetings... Richard Smits
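
If the dnswl.org rules are whitelisting spam in a given environment, 
their negative scores can be overridden locally; a sketch:

```
# local.cf -- neutralize the dnswl.org whitelist rules
score RCVD_IN_DNSWL_LOW 0
score RCVD_IN_DNSWL_MED 0
score RCVD_IN_DNSWL_HI  0
```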


Re: How to decrease the bayes database size

2007-06-13 Thread Richard Smits

Stéphane LEPREVOST wrote:
 
Thanks Theo for these useful answers.


As we're using auto_learn and never run sa-learn by hand, is there any
particular risk if we simply delete the file?

Here's the configuration we use about Bayes :

use_bayes 1
use_bayes_rules 1
bayes_auto_learn 1

-----Original Message-----
From: Theo Van Dinter [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, 12 June 2007 17:06
To: users@spamassassin.apache.org
Subject: Re: How to decrease the bayes database size

On Tue, Jun 12, 2007 at 10:07:15AM +0200, Stéphane LEPREVOST wrote:
Thanks for this tip, but what about the efficiency of the Bayes 
database after this operation?


The _seen database just tracks which mails have been learned from, and has
no effect on the ratings coming out of the Bayes system.

Is there a way to export the real records of the file before deleting 
it and then re-import them back into it? Should we use something similar 
to the check_whitelist and trim_whitelist tools?


There'd be no point to that; entries are only rarely deleted (whenever you
do a "sa-learn --forget"), otherwise they're just added.

If you're not worried about relearning the same mail, then just delete the
seen DB file.

--
Randomly Selected Tagline:
Last year we drove across the country...  We switched on the driving...
 every half mile.  We had one cassette tape to listen to on the entire trip.
 I don't remember what it was.
-- Steven Wright




Thank you all for these useful answers. I have deleted the bayes_seen 
file and things are looking better now, though not perfect.
Sometimes I get an amavisd process with a memory load of 2 GB. This 
seems really out of proportion.


17581 amavis25   0 2549M 2.1G   444 R21.9 72.1   3:15   1 amavisd

This process goes away, but it really slows things down. Could this be 
a corrupt database, or should I look at it from a different angle?


Greetings... Richard


How to decrease the bayes database size

2007-06-12 Thread Richard Smits

Hello,

We really need some help here. It has come to our attention that our 
Bayes database is 2.4 GB. It is really slowing down our servers and 
putting them under a heavy CPU load.


Now we have tried the trick with sa-learn --force-expire, and it 
deletes a lot of entries, but the file is not getting any smaller.


79K  Jun 12 09:26 bayes_journal
20M  Jun 12 09:26 bayes_toks
2.5G Jun 12 09:26 bayes_seen*

Does anyone have some tricks to help us out?

Greetings... Richard Smits


0.000  0          3  0  non-token data: bayes db version
0.000  0   14201082  0  non-token data: nspam
0.000  0    7760360  0  non-token data: nham
0.000  0     916962  0  non-token data: ntokens
0.000  0 1181559955  0  non-token data: oldest atime
0.000  0 1181633069  0  non-token data: newest atime
0.000  0 1181633115  0  non-token data: last journal sync atime
0.000  0 1181604237  0  non-token data: last expiry atime
0.000  0      43200  0  non-token data: last expire atime delta
0.000  0     360013  0  non-token data: last expire reduction count
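
Note that sa-learn --force-expire only prunes bayes_toks; bayes_seen 
entries are never expired (only removed by "sa-learn --forget"), so 
deleting that file is the way to shrink it. To keep the token database 
itself bounded, the usual settings are (a sketch; the size shown is the 
documented default):

```
# local.cf -- let Bayes expire old tokens automatically
bayes_auto_expire 1
# target number of tokens in bayes_toks (default 150000)
bayes_expiry_max_db_size 150000
```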


--