https://bz.apache.org/SpamAssassin/show_bug.cgi?id=7689

Olivier Coutu <[email protected]> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |[email protected]

--- Comment #2 from Olivier Coutu <[email protected]> ---
(In reply to RW from comment #1)
> FWIW it was always linear. It's possible to create contrived rule sets that
> are non-linear, e.g. where each new rule depends on all the previous rules.
> For real-world rule sets there's no reason to think that the mean amount of
> work per rule grows unbounded as new rules are added.

I agree that when rules are mostly independent, lint time is close to linear.
We just happened to create rules that are highly dependent on one another, with
hundreds of total dependencies on some of them. Performance at runtime is fine,
but lint is getting unbearable.
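
As a contrived illustration (made-up rule names, not our actual rules), a
ruleset where each meta rule references every earlier one piles up
dependencies quickly -- with N such rules there are roughly N*(N-1)/2 of them
in total, so the work per rule keeps growing as rules are added:

  body __LOCAL_1   /first pattern/
  meta __LOCAL_2   ( __LOCAL_1 )
  meta __LOCAL_3   ( __LOCAL_1 && __LOCAL_2 )
  meta __LOCAL_4   ( __LOCAL_1 && __LOCAL_2 && __LOCAL_3 )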

There are some rules in the standard ruleset with a high number of
dependencies, such as MONEY_FRAUD_3, which currently has 519, but half of them
are duplicates. As an added bonus, the patch cuts that number to 196.
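
For illustration only -- this is just a sketch of the general idea, not the
actual patch, and the rule names are invented -- an order-preserving dedup of
a rule's dependency list in Perl could look like this:

  use strict;
  use warnings;

  # Drop duplicate dependencies while keeping the original order,
  # so each sub-rule is only counted once per meta rule.
  sub dedup_deps {
      my %seen;
      return grep { !$seen{$_}++ } @_;
  }

  my @deps = qw(__RULE_A __RULE_B __RULE_A __RULE_C __RULE_B);
  print join(" ", dedup_deps(@deps)), "\n";   # __RULE_A __RULE_B __RULE_C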

-- 
You are receiving this mail because:
You are the assignee for the bug.
