On Sunday, March 13, 2005, 5:36:55 AM, Raymond Dijkxhoorn wrote:
> Hi!

>>> Perhaps some kind person could write a reporting function in
>>> SpamAssassin for this?

>> Hmm, perhaps if we could extract *all* URI domains from messages
>> sent through XBLed senders, then prioritize those, say, by frequency
>> of appearance, we could create a new SURBL list of spamvertised
>> domains sent through exploited hosts.  That would pretty directly
>> address the use of zombies, etc. and put a penalty on advertising
>> sites through them.  Even with volume weighting, such a list of
>> sites could be attacked by a major joe job unless we took
>> additional countermeasures, but does anyone else think this might
>> be a useful type of data source for SURBLs?
[...]

> Spamtraps are bad news if you use them 1:1; you need to parse out a
> LOT.  Did you run polluted spamtraps?  I have been running two
> proxypots, I still might have some tars, and most of it was really
> useless.  What helps more is wider coverage.  I'd rather see some
> automated system set up like SpamCop, so people can report and we
> auto-parse it with Joe's tool, for example.  With a larger footprint
> we also get spam earlier.  It's not like they first send to the
> spamtraps and only then to 'real' users.

> I understand you want to cover new areas, but please don't rely on
> other RBLs too much; I think doing our own checks does much more in
> the end.  If SBL picks it up, we can pick it up faster.  But we also
> want to pick up ones NOT listed by any RBL, don't we?

I think you're not understanding what I'm asking for.  :-)

I'm not asking for trap data.  I'm asking to look for XBL hits,
then take the URIs from messages whose senders hit XBL.  In other
words, I want to get the sites that are being advertised through
exploited hosts.

Nothing to do with traps or SBL.  ;-)
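
To make it concrete, here's a rough Python sketch of the kind of
collection I have in mind.  It's purely illustrative: the DNSBL zone
name, the regex URI extraction, and the (sender IP, body) message
format are assumptions for the sketch, not how SpamAssassin or any
SURBL feed actually works.

  # Sketch: collect URI domains from messages whose sending IP is
  # listed on the XBL, and count how often each domain appears.
  # Zone name and parsing here are illustrative assumptions.
  import re
  import socket
  from collections import Counter

  XBL_ZONE = "xbl.spamhaus.org"   # assumed lookup zone

  def listed_on_xbl(ip):
      """True if the IPv4 address has an A record in the XBL zone."""
      query = ".".join(reversed(ip.split("."))) + "." + XBL_ZONE
      try:
          socket.gethostbyname(query)   # any answer means "listed"
          return True
      except socket.gaierror:
          return False

  # Very rough URI-domain extractor; SpamAssassin's real URI parsing
  # is far more careful than this regex.
  URI_DOMAIN = re.compile(r"https?://([A-Za-z0-9.-]+)", re.IGNORECASE)

  def uri_domains(body):
      return [d.lower().rstrip(".") for d in URI_DOMAIN.findall(body)]

  def tally(messages):
      """messages: iterable of (sender_ip, body) pairs."""
      counts = Counter()
      for sender_ip, body in messages:
          if listed_on_xbl(sender_ip):
              counts.update(uri_domains(body))
      return counts

Domains that accumulate high counts across many XBL-listed senders
would be the candidates for the new list; the volume weighting of
those counts is also where the joe-job countermeasures would hook in.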

Jeff C.
-- 
Jeff Chan
mailto:[EMAIL PROTECTED]
http://www.surbl.org/
