Re: [squid-users] limiting connections

2012-03-24 Thread Carlos Manuel Trepeu Pupo
On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

 I need to limit each user to just one connection when downloading files
 with specific extensions, but I don't know how to specify that a user may
 make just one connection per file, rather than just one connection overall
 for every file with these extensions.

 i.e:
 www.google.com # all the connections it requires
 www.any.domain.com/my_file.rar # just one connection to that file
 www.other.domain.net/other_file.iso # just one connection to this file
 www.other_domain1.com/other_file1.rar # just one connection to that file

 I hope you understand me and can help me; my boss is hurrying me!


 There is no easy way to test this in Squid.

 You need an external_acl_type helper which gets given the URI and decides
 whether it is permitted or not. That decision can be made by querying Squid
 cache manager for the list of active_requests and seeing if the URL appears
 more than once.

Hello Amos, following your instructions I made this external_acl_type helper:

#!/bin/bash
# count how many active requests already mention this URI
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ "$result" -eq 0 ]
then
    echo 'OK'
else
    echo 'ERR'
fi

# If the URI is already active then I deny it. I made a few tests and it
works for me. The problem comes when I add the rule to Squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where the extensions file contains:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit
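
Aside: Squid caches external_acl_type results by default, which would defeat
a per-request check like this one. A hypothetical variant of the three lines
above that disables result caching and starts several helper children
(ttl=, negative_ttl= and children= are standard external_acl_type options):

external_acl_type one_conn ttl=0 negative_ttl=0 children=5 %URI /home/carlos/script
acl limit external one_conn
http_access deny extensions limit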


So when I run squid3 -k reconfigure, Squid stops working.

What can be happening?

This is my Squid log:
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #1, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #2, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #1 (FD 15) exited
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #2 (FD 16) exited
Mar 24 09:25:04 test squid[28075]: CACHEMGR: unknown@192.168.19.19
requesting 'active_requests'
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #3, 3 bytes 'OK '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #3 (FD 24) exited
Mar 24 09:25:04 test squid[28075]: helperHandleRead: unexpected read
from one_conn #4, 4 bytes 'ERR '
Mar 24 09:25:04 test squid[28075]: WARNING: one_conn #4 (FD 27) exited
Mar 24 09:25:04 test squid[28075]: Too few one_conn processes are running
Mar 24 09:25:04 test squid[28075]: storeDirWriteCleanLogs: Starting...
Mar 24 09:25:04 test squid[28075]: WARNING: Closing open FD   12
Mar 24 09:25:04 test squid[28075]:   Finished.  Wrote 25613 entries.
Mar 24 09:25:04 test squid[28075]:   Took 0.00 seconds (7740404.96 entries/sec).
Mar 24 09:25:04 test squid[28075]: The one_conn helpers are crashing
too rapidly, need help!



 Amos



[squid-users] Popular log analysis tools? SARG?

2012-03-24 Thread Jack Bates

Which are the most popular log analysis tools? SARG?

The Squid website features a comprehensive list of log analysis tools 
[1]. Which are the most popular?


[1] http://www.squid-cache.org/Misc/log-analysis.html


Re: [squid-users] Re: Need some help about delay_parameters directive

2012-03-24 Thread Muhammad Yousuf Khan
I am using Squid 2.7 stable.

On Fri, Mar 23, 2012 at 6:35 AM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 23/03/2012 9:41 a.m., Muhammad Yousuf Khan wrote:

 OK, I did everything bit by bit and I achieved the goal, but the value as
 per my calculation is wrong. Can you please tell me which value I should
 put to allow downloads up to 10 MB before the bandwidth is limited to the
 specified rate?
 I tried 1024000: it works, a 1 MB file is downloaded at full speed and
 after that things get limited as instructed. But when I use 102400 it
 gives me only a little above 5 MB of download before the bandwidth limit
 activates. I cannot see why it is limiting so early, as I am expecting
 files to be downloaded up to 10 MB. However, I managed it by doubling the
 value and things work well, but the logic is still not clear in my mind.

 Thanks.


 What version of Squid are you using? This matters: the original delay
 pools code was 32-bit until recently.

 Amos
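
Aside: a hedged illustration of the arithmetic under discussion, assuming
Squid 2.7 class-1 delay pools. delay_parameters takes restore-rate/maximum-
bucket values in bytes, and the bucket starts at delay_initial_bucket_level
percent of the maximum (default 50), which by itself can halve the first
full-speed burst:

# hypothetical squid.conf sketch, not the poster's actual configuration
delay_pools 1
delay_class 1 1
delay_access 1 allow all
# roughly 10 MB (10485760 bytes) at full speed, then 64 KB/s
delay_parameters 1 65536/10485760
# start with a full bucket; the 50% default would cut the initial
# burst to about 5 MB
delay_initial_bucket_level 100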


[squid-users] 404s not cached despite being cacheable

2012-03-24 Thread ilyail3 K
Hi.

I have this URL: http://bmtu.livedefinition.com/context/1001/homepage
which returns a 404, but it has max-age and Expires headers.
I've checked the headers with redbot, and it seems to agree that the page
is cacheable:
http://bmtu.livedefinition.com/context/1001/homepage

Still, I only get MISSes from Squid:
ubuntu@ip-10-0-0-144:~$ curl -vv -x localhost:3128
http://bmtu.livedefinition.com/context/1001
* About to connect() to proxy localhost port 3128 (#0)
*   Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 3128 (#0)
> GET http://bmtu.livedefinition.com/context/1001 HTTP/1.1
> User-Agent: curl/7.21.3 (i686-pc-linux-gnu) libcurl/7.21.3 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: bmtu.livedefinition.com
> Accept: */*
> Proxy-Connection: Keep-Alive
>
* HTTP 1.0, assume close after body
< HTTP/1.0 404 Not Found
< Cache-Control: max-age=3600
< Date: Sat, 24 Mar 2012 20:47:13 GMT
< Expires: Sat, 24 Mar 2012 21:47:13 GMT
< Server: Jetty(6.1.25)
< Content-Length: 27
< X-Cache: MISS from localhost
< X-Cache-Lookup: MISS from localhost:80
< Via: 1.0 localhost (squid/3.1.11)
* HTTP/1.0 connection set to keep alive!
< Connection: keep-alive

Here's the related access log entry:
1332622033.136  6 127.0.0.1 TCP_MISS/404 331 GET
http://bmtu.livedefinition.com/context/1001 - DIRECT/176.34.129.228 -

The config is default, and the squid version is 3.1.11.

How can I force Squid to cache those specific 404s?
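
Aside: a hedged note on the knobs involved. In Squid 3.1 the negative_ttl
directive governs caching of error replies, but only those without their
own expiry information, so it likely does not explain a MISS on a 404 that
already carries max-age and Expires; still, it is the first thing to check:

# hypothetical squid.conf fragment; 3.1 defaults negative_ttl to 0
negative_ttl 5 minutes

A quick way to verify whether a repeat request is served from cache is to
watch the X-Cache header:

curl -s -x localhost:3128 -o /dev/null -D - http://bmtu.livedefinition.com/context/1001 | grep X-Cache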


RE: [squid-users] Popular log analysis tools? SARG?

2012-03-24 Thread Jenny Lee

 Date: Sat, 24 Mar 2012 12:07:34 -0700
 From: nwv...@nottheoilrig.com
 To: squid-users@squid-cache.org
 Subject: [squid-users] Popular log analysis tools? SARG?
 
 Which are the most popular log analysis tools? SARG?
 
 The Squid website features a comprehensive list of log analysis tools 
 [1]. Which are the most popular?
 
 [1] http://www.squid-cache.org/Misc/log-analysis.html
 
 
None of those tools will produce anything good-looking.

Any tool for Squid, just like everything else open source, looks... well,
like crap.
 
I use Flowerfire Sawmill. http://sawmill.net
 
It produces awesome graphs with excellent features. I use my custom squid log 
format with it, though.
 
Jenny
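
Aside: a custom format is declared with the logformat directive. A
hypothetical example reproducing the well-known Apache combined style,
which many third-party analyzers can parse out of the box:

# sketch for squid.conf; field codes are from the logformat documentation
logformat combined %>a %ui %un [%tl] "%rm %ru HTTP/%rv" %>Hs %<st "%{Referer}>h" "%{User-Agent}>h" %Ss:%Sh
access_log /var/log/squid/access.log combined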

Re: [squid-users] limiting connections

2012-03-24 Thread Amos Jeffries

On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:

On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:

On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:

I need to limit each user to just one connection when downloading files
with specific extensions, but I don't know how to specify that a user may
make just one connection per file, rather than just one connection overall
for every file with these extensions.

i.e:
www.google.com # all the connections it requires
www.any.domain.com/my_file.rar # just one connection to that file
www.other.domain.net/other_file.iso # just one connection to this file
www.other_domain1.com/other_file1.rar # just one connection to that file

I hope you understand me and can help me; my boss is hurrying me!


There is no easy way to test this in Squid.

You need an external_acl_type helper which gets given the URI and decides
whether it is permitted or not. That decision can be made by querying Squid
cache manager for the list of active_requests and seeing if the URL appears
more than once.

Hello Amos, following your instructions I made this external_acl_type helper:

#!/bin/bash
# count how many active requests already mention this URI
result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
if [ "$result" -eq 0 ]
then
    echo 'OK'
else
    echo 'ERR'
fi

# If the URI is already active then I deny it. I made a few tests and it
works for me. The problem comes when I add the rule to Squid. I did this:

acl extensions url_regex /etc/squid3/extensions
external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn

# where the extensions file contains:
\.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$

http_access deny extensions limit


So when I run squid3 -k reconfigure, Squid stops working.

What can be happening?


* The helper needs to run in a constant loop (a combined sketch follows
these notes). You can find an example at
http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
although that is a re-writer; for an external ACL you need to keep the
OK/ERR replies.


* -eq 0: there should always be 1 request matching the URL, namely the very
request you are testing. You want to deny only the case where *2* or more
requests are in existence.


* Ensure that manager requests from localhost do not go through the ACL
test.
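
Putting those three points together, a minimal sketch of a persistent
helper, assuming the plain line-based external ACL protocol with no
concurrency channel (the manager host and overall behaviour are taken from
Carlos's version and may differ in a real setup):

#!/bin/bash
# Read one URI per line from Squid and answer on stdout; never exit
# between requests, or Squid will log the helper as crashed.
while read uri; do
    count=$(squidclient -h 192.168.19.19 mgr:active_requests | grep -c -F -- "$uri")
    # The request being tested is itself in the active list, so one
    # match is normal; two or more means a second fetch of the same file.
    if [ "$count" -le 1 ]; then
        echo 'OK'
    else
        echo 'ERR'
    fi
done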



Amos



Re: [squid-users] Popular log analysis tools? SARG?

2012-03-24 Thread James Robertson
 The Squid website features a comprehensive list of log analysis tools [1].
 Which are the most popular?


I cannot say which is the most popular, nor do I know what log
information you're actually interested in... but for monitoring user
access to the internet I have been quite satisfied with Cyfin
Reporter. It integrates nicely into Active Directory, if you need
that.