James E. Pratt wrote:

> 
>> I can confirm too that Trojan.Downloader.JS.Agent-2 (and 1) hit a
>> load of legitimate sites.
> 
> Hello. I ran into this "Trojan.Downloader.JS.Agent-2" issue yesterday
> on our web server. When notified, the webmaster replied with "these are
> coming from compressed js files using Dean Edwards' javascript "packer"
> [http://dean.edwards.name/packer/], which compresses js and usually
> reduces the file size by 30-40 percent."
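
For context, the reason a raw pattern match fires so broadly is that every
script run through the packer shares the same bootstrap preamble. A minimal
sketch of such a naive check (the regex, function name, and sample string are
my own, and the exact preamble parameters can vary between packer versions):

```python
import re

# Output from Dean Edwards' packer conventionally begins with this
# bootstrap preamble (an assumption based on typical packer output;
# the last parameter name can vary, e.g. "d" or "r"):
PACKER_PREAMBLE = re.compile(r"eval\(function\(p,a,c,k,e,[dr]\)")

def looks_packed(js_source: str) -> bool:
    """Naive check: does the script contain the packer bootstrap?"""
    return bool(PACKER_PREAMBLE.search(js_source))

sample = "eval(function(p,a,c,k,e,d){/* unpacker stub */}('...',62,0,''.split('|')))"
print(looks_packed(sample))        # -> True (packed-style input)
print(looks_packed("var x = 1;"))  # -> False (ordinary script)
```

The point of the sketch is exactly the false-positive problem above: this
check cannot tell a legitimate compressed library from a packed downloader.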

If the principal users of this service are spammers trying to obfuscate 
their content, then I see no reason not to use a tool that blocks that 
content. A lesson that has been hard to teach is that when legitimate 
users create content indistinguishable from common spam, it will be 
blocked. That said, consider the source: sales and marketing types in 
any corporation have a particular problem, since almost everything they 
create could be considered spam by someone. Best-effort rules apply. 
I've never had a manager reverse me on this.

However, without some kind of scoring system that weighs the various 
parts of the content, it cannot be determined whether the content as a 
whole is acceptable, and making that decision based only on the presence 
of compressed javascript patterns is probably unreliable. Well, the 
pattern is gone now, so that seems to be a widely accepted notion :)

This pattern might work better in a milter that does scoring and is 
capable of considering a wider range of criteria.
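
The scoring idea could be sketched roughly as follows (all rule names,
weights, and the threshold here are hypothetical, chosen only to illustrate
that the packed-js pattern alone should not be enough to block a message;
a real milter would weigh many more criteria):

```python
# Each suspicious trait contributes a weight; content is rejected
# only when the total crosses a threshold. Rules and weights below
# are invented for illustration.
RULES = [
    ("packed_js",      2.0, lambda m: "eval(function(p,a,c,k,e," in m),
    ("hidden_iframe",  2.5, lambda m: 'style="display:none"' in m),
    ("exe_link",       1.5, lambda m: ".exe" in m),
]
THRESHOLD = 4.0

def score(message: str) -> float:
    """Sum the weights of all rules that match the message."""
    return sum(weight for _, weight, test in RULES if test(message))

def acceptable(message: str) -> bool:
    return score(message) < THRESHOLD

# Packed js alone scores 2.0 and passes; packed js plus a hidden
# iframe scores 4.5 and is rejected.
print(acceptable("eval(function(p,a,c,k,e,d){...}"))  # -> True
```

Under a scheme like this, the compressed-javascript pattern becomes one
signal among several instead of a standalone verdict.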

dp
_______________________________________________
Help us build a comprehensive ClamAV guide: visit http://wiki.clamav.net
http://lurker.clamav.net/list/clamav-users.html
