Re: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-18 Thread Henrik Nordstrom
Wed 2006-07-12 at 15:22 +0100, Brian Gregory wrote:

> Squid is set up to run 5 squidGuard processes. When we boot Suse it 
> takes 15-20 minutes with lots of disk thrashing for the 5 squidGuards to 
> read in the blacklists and build their tables.

This will be much faster if you let squidGuard build its lookup db.

> Much of the time it works fine but every now and then for no obvious 
> reason, squid decides it needs to start more squidGuard processes which 
> effectively cuts off all web access.

Helper processes are restarted:

  when "squid -k rotate" is run
  when "squid -k reconfigure" is run
  when more than 50% of the helpers have crashed
  if Squid crashes or is restarted

>  I'm not sure exactly what happens, 

See cache.log for information on why the helpers were restarted.
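A quick way to spot those events is to grep cache.log for helper start/exit messages. This is only a sketch: the exact log wording varies between Squid versions, and the sample file below is fabricated for illustration; point the grep at your real cache.log (often /var/log/squid/cache.log) instead.

```shell
# Fabricated sample lines in the style of cache.log entries; real
# wording differs between Squid versions.
cat > /tmp/cache.log.sample <<'EOF'
2006/07/12 15:22:01| helperOpenServers: Starting 5 'squidGuard' processes
2006/07/12 15:40:13| WARNING: redirector #3 (FD 12) exited
EOF

# Pull out helper start/exit events to see why helpers were restarted.
grep -iE 'helper|redirector|squidGuard' /tmp/cache.log.sample
```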

Regards
Henrik




Re: AW: AW: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-13 Thread Brian Gregory

[EMAIL PROTECTED] wrote:

Define the location of the pre-built database in the configuration file of 
squidGuard.

Example:

destination porn {
    domainlist     porn/domains
    urllist        porn/urls
    expressionlist porn/expressions
    log            porn.log
}
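For reference, a fuller squidGuard.conf sketch along those lines (the dbhome/logdir paths and the acl block are assumptions; adjust to your layout). Once `squidGuard -C all` has been run, .db files appear next to the plain-text lists and squidGuard should load those instead:

```
# Hypothetical squidGuard.conf fragment; paths are assumptions.
dbhome /var/lib/squidGuard/db
logdir /var/log/squidGuard

destination porn {
    domainlist     porn/domains
    urllist        porn/urls
    expressionlist porn/expressions
    log            porn.log
}

acl {
    default {
        pass !porn all
        redirect http://localhost/blocked.html
    }
}
```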


Mit freundlichem Gruß/Yours sincerely
Werner Rost
GMT-FIR - Netzwerk
 
 ZF Boge Elastmetall GmbH

 Friesdorfer Str. 175
 53175 Bonn
 Deutschland/Germany 
 Telefon/Phone +49 228 3825 - 420

 Telefax/Fax +49 228 3825 - 398
 [EMAIL PROTECTED]


I think I've got it working now; it certainly starts up much more quickly, 
even when I configure 10 squidGuard processes.


I have set up the following, run from a weekly cron job as root, to 
download new blacklists and rebuild the database once a week (watch 
out for line wraps):



# This is Brian's blacklist update script

cd ~

rm -f bl.tar.gz

wget -O bl.tar.gz \
  http://ftp.tdcnorge.no/pub/www/proxy/squidGuard/contrib/blacklists.tar.gz

tar --ungzip --extract --exclude='*.diff' \
  --directory=/var/lib/squidGuard/db --verbose -f bl.tar.gz

rm -f bl.tar.gz

wget -O bl.tar.gz \
  ftp://ftp.univ-tlse1.fr/pub/reseau/cache/squidguard_contrib/blacklists.tar.gz

tar --ungzip --extract --exclude='*.diff' \
  --directory=/var/lib/squidGuard/db --verbose -f bl.tar.gz

rm -f bl.tar.gz

chown -R squid:nogroup /var/lib/squidGuard/db

/usr/sbin/squidGuard -C all

chown -R squid:nogroup /var/lib/squidGuard/db

/usr/sbin/squid -k reconfigure

# Script ends
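For what it's worth, the extract step can be sanity-checked on a throwaway archive before trusting it with the real blacklists. Everything under /tmp here is made up for the demonstration; note the quoted `--exclude='*.diff'`, which stops the shell expanding the pattern before tar sees it:

```shell
# Build a tiny fake blacklist tarball with one list file and one .diff.
mkdir -p /tmp/bl-src/porn /tmp/bl-db
echo example.com > /tmp/bl-src/porn/domains
echo patch > /tmp/bl-src/porn/domains.diff
tar --create --gzip --directory=/tmp/bl-src --file=/tmp/bl.tar.gz porn

# Same flags as the update script: the .diff file should be skipped.
tar --ungzip --extract --exclude='*.diff' \
    --directory=/tmp/bl-db --file=/tmp/bl.tar.gz
```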

The squid.conf file seems to be fine exactly as it was, and the squidGuard 
processes appear to pick up the databases rather than the text files.



Does this look reasonable?

--

Brian Gregory.
[EMAIL PROTECTED]

Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.


AW: AW: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-13 Thread Werner.Rost
Define the location of the pre-built database in the configuration file of 
squidGuard.

Example:

destination porn {
    domainlist     porn/domains
    urllist        porn/urls
    expressionlist porn/expressions
    log            porn.log
}





-----Original Message-----
From: Brian Gregory [mailto:[EMAIL PROTECTED]]
Sent: Thursday, 13 July 2006 13:11
To: squid-users@squid-cache.org
Subject: Re: AW: [squid-users] Squid and SquidGuard restarting. Why?


[EMAIL PROTECTED] wrote:
> Please read the documentation for squidguard.
> 
> In short: You should build a squidGuard database containing your 
> blacklists once. After that, squidGuard should start within a few 
> seconds.
> 
> Mit freundlichem Gruß/Yours sincerely
> Werner Rost
> GMT-FIR - Netzwerk
>  
>  ZF Boge Elastmetall GmbH
>  Friesdorfer Str. 175
>  53175 Bonn
>  Deutschland/Germany
>  Telefon/Phone +49 228 3825 - 420
>  Telefax/Fax +49 228 3825 - 398
>  [EMAIL PROTECTED]
> 
> 
> 


OK, I found some documentation saying that the -C listfile parameter builds 
the database, but there doesn't seem to be any info on how to use a 
pre-built database. Maybe all will become clear if I experiment a bit.



Re: AW: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-13 Thread Brian Gregory

[EMAIL PROTECTED] wrote:

Please read the documentation for squidGuard.

In short: You should build a squidGuard database containing your blacklists 
once. After that, squidGuard should start within a few seconds.







OK, I found some documentation saying that the -C listfile parameter builds 
the database, but there doesn't seem to be any info on how to use a 
pre-built database. Maybe all will become clear if I experiment a bit.




Re: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-13 Thread Brian Gregory

Dwayne Hottinger wrote:

Quoting Brian Gregory <[EMAIL PROTECTED]>:


We have a Linux box running Suse 10.0 set up as a router and web proxy
with filtering sharing our DSL connection between 7 Windows XP
computers. It's running squid and squidGuard with a very large blacklist
of forbidden URLs and phrases.

Because we basically have no money the Suse box is an old 400MHz Pentium
II PC with only 256MB of RAM and this isn't likely to change in the near
future, except that I might be able to get some more RAM if necessary.

Squid is set up to run 5 squidGuard processes. When we boot Suse it
takes 15-20 minutes, with lots of disk thrashing, for the 5 squidGuards to
read in the blacklists and build their tables. During this time the web
proxy is non-functional, so we usually leave the Suse box running 24/7 to
avoid having to wait for it.

Much of the time it works fine, but every now and then, for no obvious
reason, squid decides it needs to start more squidGuard processes, which
effectively cuts off all web access. I'm not sure exactly what happens;
maybe sometimes it just kills the existing squidGuards and starts new
ones, but it sometimes seems to end up running 10 squidGuards, thrashing
the disk hard for ages and leaving the users with no web access.

When it's all running properly, free -m seems to indicate that there is
enough memory:

             total   used   free  shared  buffers  cached
Mem:           250    246      3       0       51     126
-/+ buffers/cache:     68    181
Swap:          400      2    397



Does anyone know what's going on and how to stop it happening?

--

Brian Gregory.
[EMAIL PROTECTED]

Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.



How big are your access.log files?  There is a 2 GB file size limit in
Squid (on 32-bit builds without large file support).  I would definitely
think about adding more memory to the box, though.  You should be able
to pick up PC100 memory fairly cheaply.
--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools



Part of the problem may be log file rotation, which appears to be set to 
restart squid at the moment.

However, this does not explain why I sometimes find it running 10 
squidGuard processes when my squid.conf specifies 5.
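If it is the distribution's logrotate doing a full restart, a postrotate hook calling `squid -k rotate` instead would avoid killing the daemon (though note it still restarts the helpers once). A hypothetical /etc/logrotate.d/squid fragment; the log path and options are assumptions for your SuSE setup:

```
/var/log/squid/*.log {
    weekly
    rotate 4
    compress
    postrotate
        /usr/sbin/squid -k rotate
    endscript
}
```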


--

Brian Gregory.
[EMAIL PROTECTED]

Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.


Re: [squid-users] Squid and SquidGuard restarting. Why?

2006-07-12 Thread Dwayne Hottinger
Quoting Brian Gregory <[EMAIL PROTECTED]>:

> We have a Linux box running Suse 10.0 set up as a router and web proxy
> with filtering sharing our DSL connection between 7 Windows XP
> computers. It's running squid and squidGuard with a very large blacklist
> of forbidden URLs and phrases.
>
> Because we basically have no money the Suse box is an old 400MHz Pentium
> II PC with only 256MB of RAM and this isn't likely to change in the near
> future, except that I might be able to get some more RAM if necessary.
>
> Squid is set up to run 5 squidGuard processes. When we boot Suse it
> takes 15-20 minutes, with lots of disk thrashing, for the 5 squidGuards to
> read in the blacklists and build their tables. During this time the web
> proxy is non-functional, so we usually leave the Suse box running 24/7 to
> avoid having to wait for it.
>
> Much of the time it works fine, but every now and then, for no obvious
> reason, squid decides it needs to start more squidGuard processes, which
> effectively cuts off all web access. I'm not sure exactly what happens;
> maybe sometimes it just kills the existing squidGuards and starts new
> ones, but it sometimes seems to end up running 10 squidGuards, thrashing
> the disk hard for ages and leaving the users with no web access.
>
> When it's all running properly, free -m seems to indicate that there is
> enough memory:
>
>              total   used   free  shared  buffers  cached
> Mem:           250    246      3       0       51     126
> -/+ buffers/cache:     68    181
> Swap:          400      2    397
>
>
>
> Does anyone know what's going on and how to stop it happening?
>
> --
>
> Brian Gregory.
> [EMAIL PROTECTED]
>
> Computer Room Volunteer.
> Therapy Centre.
> Prospect Park Hospital.
>

How big are your access.log files?  There is a 2 GB file size limit in
Squid (on 32-bit builds without large file support).  I would definitely
think about adding more memory to the box, though.  You should be able
to pick up PC100 memory fairly cheaply.
--
Dwayne Hottinger
Network Administrator
Harrisonburg City Public Schools


[squid-users] Squid and SquidGuard restarting. Why?

2006-07-12 Thread Brian Gregory
We have a Linux box running Suse 10.0 set up as a router and web proxy 
with filtering sharing our DSL connection between 7 Windows XP 
computers. It's running squid and squidGuard with a very large blacklist 
of forbidden URLs and phrases.


Because we basically have no money the Suse box is an old 400MHz Pentium 
II PC with only 256MB of RAM and this isn't likely to change in the near 
future, except that I might be able to get some more RAM if necessary.


Squid is set up to run 5 squidGuard processes. When we boot Suse it 
takes 15-20 minutes, with lots of disk thrashing, for the 5 squidGuards to 
read in the blacklists and build their tables. During this time the web 
proxy is non-functional, so we usually leave the Suse box running 24/7 to 
avoid having to wait for it.


Much of the time it works fine, but every now and then, for no obvious 
reason, squid decides it needs to start more squidGuard processes, which 
effectively cuts off all web access. I'm not sure exactly what happens; 
maybe sometimes it just kills the existing squidGuards and starts new 
ones, but it sometimes seems to end up running 10 squidGuards, thrashing 
the disk hard for ages and leaving the users with no web access.


When it's all running properly, free -m seems to indicate that there is 
enough memory:

             total   used   free  shared  buffers  cached
Mem:           250    246      3       0       51     126
-/+ buffers/cache:     68    181
Swap:          400      2    397



Does anyone know what's going on and how to stop it happening?

--

Brian Gregory.
[EMAIL PROTECTED]

Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.