Hi,
How can Squid slow down clients (browsers) making a lot of hits (over
100,000 per day) by staying connected all day and night to web sites,
refreshing pages up to 30 times per minute (which also inflates
access_log)?
The same question applies if a client downloads over 20 GB per day.
delay_pools
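For example, a minimal class-1 delay pool in squid.conf might look like the sketch below. The ACL name, address range, and 64 KB/s rate are illustrative placeholders, not tuned values:

```
# Hypothetical sketch: one class-1 (aggregate) delay pool that caps
# matched clients at roughly 64 KB/s total (values are examples only).
acl heavy_clients src 192.168.1.0/24    # assumed client address range
delay_pools 1
delay_class 1 1
delay_parameters 1 65536/65536          # restore rate / bucket size, in bytes
delay_access 1 allow heavy_clients
delay_access 1 deny all
```

A class-1 pool throttles all matched traffic through a single bucket; classes 2 and 3 let you limit per-host or per-network instead if individual clients need separate caps.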
On Wednesday 15 May 2013 at 10:55:16, C. Pelissier wrote:
> Hi,
> How can squid slow down clients (Browsers) making a lot of hits (over
> 100 000 hits per day) by staying connected all day and night to web
> sites doing page refresh up to 30 per minute (increasing also the size
> of access_log) ?
You might find this easier to achieve with IPtables rules than Squid:
http://www.debian-administration.org/articles/187
You'd want to rate-limit new connections to your Squid port (probably 3128)
as a compromise: allow a burst of several connections within a few seconds,
so normal parallel browser requests still work, while throttling clients
that keep opening connections continuously.
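A sketch of that compromise using the iptables "recent" match, in the spirit of the article above. The 60-second window, 30-hit threshold, and list name are illustrative assumptions to be tuned for your traffic:

```
# Hypothetical sketch: record each new connection to port 3128, then
# drop clients that have opened more than 30 new connections in the
# last 60 seconds (thresholds are examples, not recommendations).
iptables -A INPUT -p tcp --dport 3128 -m state --state NEW \
         -m recent --set --name SQUID
iptables -A INPUT -p tcp --dport 3128 -m state --state NEW \
         -m recent --update --seconds 60 --hitcount 30 --name SQUID -j DROP
```

Unlike delay_pools, which shapes bandwidth inside Squid, this blocks excessive connection churn before it reaches Squid at all, so it also keeps the refresh storms out of access_log.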