From: Henrik Nordstrom hen...@henriknordstrom.net
Date: Tue, 13 Oct 2009 12:54:30 +0200
To: Ross Kovelman rkovel...@gruskingroup.com
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Bad url sites
On Mon, 2009-10-12 at 23:12 -0400, Ross Kovelman wrote:
I use a file called bad_url.squid to hold the sites I want blocked. I
think I have reached a limit to what it can hold, as a reconfigure can
take a few minutes while the data is scanned, and processing power
gets sucked up. I know there is dansguardian and a few other ways to