RE: [squid-users] sometimes the users can't visit any webpage

2009-09-02 Thread Amos Jeffries
On Wed, 2 Sep 2009 16:33:49 -0500, Jesus Angeles
jange...@confidesolutions.com.pe wrote:
 Hi, thanks for your interest.
 
 Well, today I had the same problem; this is an extract from my cache.log.
 The problem happened at about 15:30 hrs, the users reported it to me at
 about 16:00 hrs, and I had to restart the squid service.
 
 Any idea? What does "httpReadReply: Excess data from..." mean?

The server at www.paginasamarillas.com.pe is pushing more data into Squid
after the objects it is supposed to be sending have finished. This indicates
either a broken web server or a malicious attack. Not good either way.

 
 2009/09/02 06:32:19| storeDirWriteCleanLogs: Starting...
 2009/09/02 06:32:19| 65536 entries written so far.
 2009/09/02 06:32:19| 131072 entries written so far.
 2009/09/02 06:32:19|   Finished.  Wrote 132412 entries.
 2009/09/02 06:32:19|   Took 0.0 seconds (4109238.7 entries/sec).
 2009/09/02 06:32:19| logfileRotate: /var/log/squid/store.log
 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/store.log
 2009/09/02 06:32:19| logfileRotate: /var/log/squid/access.log
 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/access.log
 2009/09/02 06:32:19| logfileRotate: /var/log/squid/access1.log
 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/access1.log

 2009/09/02 15:32:49| httpReadReply: Excess data from GET
 http://www.paginasamarillas.com.pe/js/scriptTagHead.js.jsp;
 2009/09/02 15:32:49| httpReadReply: Excess data from GET
 http://www.paginasamarillas.com.pe/js/scriptHome.js.jsp;
 2009/09/02 15:32:49| httpReadReply: Excess data from GET
 http://www.paginasamarillas.com.pe/searchBarLocality.do?stateId=cityId=suburbId=

The server at www.paginasamarillas.com.pe told Squid it was sending objects of
size X, then sent X + N bytes of data down the link. A response-splitting
attack is probably underway. Squid drops those connections.
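Conceptually, the check behind that log message boils down to comparing the declared Content-Length against the bytes that actually arrive. A minimal Python sketch of the idea (this is illustrative only, not Squid's actual C code, and it ignores chunked encoding and other framing rules):

```python
def check_excess(raw_response: bytes):
    """Split a raw HTTP response into the declared body and any excess bytes.

    A well-behaved server sends exactly Content-Length body bytes;
    anything beyond that is the "Excess data" Squid warns about.
    """
    header, _, body = raw_response.partition(b"\r\n\r\n")
    declared_len = 0
    for line in header.split(b"\r\n"):
        if line.lower().startswith(b"content-length:"):
            declared_len = int(line.split(b":", 1)[1])
    declared = body[:declared_len]
    excess = body[declared_len:]   # non-empty -> server overran its declaration
    return declared, excess

# Simulated overrun: 5 declared bytes followed by trailing junk,
# e.g. the start of a smuggled second response.
raw = (b"HTTP/1.1 200 OK\r\n"
       b"Content-Length: 5\r\n"
       b"\r\n"
       b"helloHTTP/1.1 200 OK ...")
body, excess = check_excess(raw)
```

Excess bytes like these are exactly what an attacker needs for response splitting: if a cache believed the trailing junk was a new response, it could poison the cache, so dropping the connection is the safe reaction.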

 2009/09/02 15:59:54| Preparing for shutdown after 337998 requests
 2009/09/02 15:59:54| Waiting 30 seconds for active connections to finish

Someone shut down Squid.

 2009/09/02 15:59:54| FD 11 Closing HTTP connection
 2009/09/02 16:00:25| Shutting down...
 2009/09/02 16:00:25| FD 12 Closing ICP connection
 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
 2009/09/02 16:00:25|  http://mail.google.com/mail/images/cleardot.gif?zx=g31q8sija2fo
 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.136 connection due to lifetime timeout
 2009/09/02 16:00:25|  http://kh.google.com/geauth
 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
 2009/09/02 16:00:25|  http://toolbarqueries.clients.google.com/history/feeds/default/subscriptions/browser
 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
 2009/09/02 16:00:25|  http://mail.google.com/mail/images/cleardot.gif?zx=8w46jyqzoqzz

Two clients had their active connections (four in total) closed on them.

snip
 2009/09/02 16:00:25| Squid Cache (Version 2.7.STABLE3): Exiting normally.
 2009/09/02 16:00:26| Starting Squid Cache version 2.7.STABLE3 for
 i386-debian-linux-gnu...
snip
 
 
 -----Original Message-----
 From: Jeff Pang [mailto:pa...@arcor.de] 
 Sent: Monday, 31 August 2009, 08:44 p.m.
 To: squid-users
 Subject: Re: [squid-users] sometimes the users can't visit any webpage
 
 2009/9/1 Jesus Angeles jange...@confidesolutions.com.pe:
 Hi all, I have a problem. Three weeks ago I installed Squid 2.7.STABLE3 +
 Dansguardian 2.10.1.1 on GNU/Linux Ubuntu Server 9.04. The first week was
 ok, but then the service started to fail: sometimes (once or twice per day)
 the users can't visit any webpage; the web browser shows a blank page
 (delay on load). In those moments I check:
 -   The squid service is running.
 -   Dansguardian is ok, because if the users try to visit a prohibited
 web page, it shows the access denied page.
 -   The logfile (access.log) is generating logs (I checked with tail -f).
 -   Memory and HD space are ok (I have configured 256 MB in cache_mem and
 4096 MB in cache_dir).
 Then, in those moments, I have to execute “/etc/init.d/squid reload” to
 solve the problem.

 
 Have you checked cache.log for those specific requests?
 Only the info in cache.log (possibly with a higher debug level) is of value.
 
 Jeff.


Re: [squid-users] sometimes the users can't visit any webpage

2009-09-01 Thread Amos Jeffries

Jesus Angeles wrote:

Hi all, I have a problem. Three weeks ago I installed Squid 2.7.STABLE3 +
Dansguardian 2.10.1.1 on GNU/Linux Ubuntu Server 9.04. The first week was
ok, but then the service started to fail: sometimes (once or twice per day)
the users can't visit any webpage; the web browser shows a blank page
(delay on load). In those moments I check:
-   The squid service is running.
-   Dansguardian is ok, because if the users try to visit a prohibited
web page, it shows the access denied page.
-   The logfile (access.log) is generating logs (I checked with tail -f).
-   Memory and HD space are ok (I have configured 256 MB in cache_mem and
4096 MB in cache_dir).
Then, in those moments, I have to execute “/etc/init.d/squid reload” to
solve the problem.

What could be happening?


Anything could be happening.

init.d 'reload' is also known as 'squid -k reconfigure', which closes 
all network connections, re-reads the configuration file, and restarts 
all of Squid's internal processes.
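
For reference, on a stock Debian/Ubuntu install the init script's reload
action is just a thin wrapper (exact paths may differ per distribution):

```shell
# These two are equivalent on Debian/Ubuntu:
/etc/init.d/squid reload
squid -k reconfigure    # re-read squid.conf and restart internal
                        # processes, dropping current connections

# A full restart additionally terminates and relaunches the master process:
/etc/init.d/squid restart
```

So a "reload" that clears the symptom does not prove a config problem; it
mostly proves that tearing down the current connections helped.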


Look in your cache.log for any useful information. If there is nothing 
there, raise the logging level via the debug_options directive.
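
If it helps, a typical squid.conf setting for that looks like the following
(section numbers are listed in Squid's debugging documentation; higher
levels are very verbose, so revert once the problem is captured):

```
# Default verbosity for all sections, extra detail for section 11 (HTTP):
debug_options ALL,1 11,3
```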


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


[squid-users] sometimes the users can't visit any webpage

2009-08-31 Thread Jesus Angeles
Hi all, I have a problem. Three weeks ago I installed Squid 2.7.STABLE3 +
Dansguardian 2.10.1.1 on GNU/Linux Ubuntu Server 9.04. The first week was
ok, but then the service started to fail: sometimes (once or twice per day)
the users can't visit any webpage; the web browser shows a blank page
(delay on load). In those moments I check:
-   The squid service is running.
-   Dansguardian is ok, because if the users try to visit a prohibited
web page, it shows the access denied page.
-   The logfile (access.log) is generating logs (I checked with tail -f).
-   Memory and HD space are ok (I have configured 256 MB in cache_mem and
4096 MB in cache_dir).
Then, in those moments, I have to execute “/etc/init.d/squid reload” to
solve the problem.

What could be happening?

Regards,

Jesus