On Tuesday 28 January 2003 03.25, Kwan Chee Kin wrote:
>
> The infected host will try to make at least 100 hits/minute to the
> bogus URL through the Squid. This affects the Squid logs -
> access.log and store.log. They grew to a few Gigs within hours.
>
>         My question is: is there any solution to this type of
> problem where Squid will just drop requests that have made more than
> 30 hits to a bogus or unreachable URL, and not write them to the logs?

I have been thinking a bit about how this could be addressed, and it 
is not easy to find a generic method that works. Just dropping the 
requests won't help, as this will still put a large strain on Squid: 
the virus/worm will simply retry the request again, only a little 
quicker..

But maybe a design can be made where such requests are suspended 
rather than dropped, thereby trying to slow down the virus. Care must 
however be taken not to tie up too many file descriptors on 
"suspended" requests. This works well for single-threaded attacks, 
but if the virus/worm is multithreaded and makes many requests in 
parallel then this won't help either, I am afraid; it would probably 
only make the situation even worse..
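To make the idea concrete, here is a rough sketch (in Python, not actual Squid code; the class name, thresholds and delay values are all made up for illustration) of how a proxy could count per-URL hits over a sliding window and return a penalty delay once a URL exceeds the threshold, instead of dropping the request outright:

```python
import time
from collections import defaultdict, deque


class Tarpit:
    """Hypothetical sketch: track per-URL request rates and suggest a
    delay for URLs that are being hammered, to slow an attacker down."""

    def __init__(self, threshold=30, window=60.0, delay=10.0):
        self.threshold = threshold  # hits allowed per window before penalizing
        self.window = window        # sliding window length in seconds
        self.delay = delay          # seconds to suspend an offending request
        self.hits = defaultdict(deque)  # url -> timestamps of recent hits

    def penalty(self, url, now=None):
        """Record a hit on url; return 0.0 (serve normally) or a delay
        in seconds by which the request should be suspended."""
        now = time.monotonic() if now is None else now
        q = self.hits[url]
        q.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        return self.delay if len(q) > self.threshold else 0.0
```

The caller would sleep (or park the connection) for the returned number of seconds before servicing the request. Note this is exactly where the file-descriptor concern above bites: every suspended request holds its descriptor open for the full delay.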

The most reliable approach is as suggested: Firewall the offending 
stations from using network resources until cleaned.

Regards
Henrik
