Jeff Chan wrote:
> On Wednesday, December 8, 2004, 9:06:26 AM, Daryl O'Shea wrote:
>>It doesn't cause more lookups for anyone.  A local white list file would
>>reduces lookups at the expense of process size (and time if the white
>>list is very large).
>
>
> The SA developers chose an appropriately small exclusion list
> to hard code as the top 125 most often hit whitelist entries.
> Those top hits are largely invariant and would represent a
> large portion of the DNS queries if not excluded.  It doesn't
> make much sense to serve up a small, nearly invariant list
> with a DNS list, long TTLs or not.
>
> Jeff C.

Yes, as I noted later in the thread.

"There's got to be a reason why SpamAssassin currently only includes the top 100 or whatever excluded domains... either the rest of the data wasn't useful or it wasn't worth the performance hit having them in memory."


I only suggested another solution to what Chris was suggesting (Rules-du-jour-style, presumably massive, .cf file exclusion lists)... which in my opinion aren't appropriate (massive lists, that is) due to the memory overhead.
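For context, a .cf exclusion list of the kind being discussed would just be a series of skip directives in SpamAssassin's config syntax. A minimal sketch (the domains here are only illustrative examples, not the actual shipped list):

    # Hypothetical local exclusion list, e.g. in /etc/mail/spamassassin/local.cf
    # Each listed domain is skipped before any URIBL/SURBL DNS query is made.
    uridnsbl_skip_domain example.com example.org
    uridnsbl_skip_domain example.net

Every entry loaded this way sits in each SpamAssassin process's memory, which is why a very large list trades RAM for the DNS lookups it avoids.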


I'm fully aware, as I think everyone is now, of the exclusion list included with 3.0.


Daryl
