Re: Fwd: [squid-users] Storeurl program,what's wrong with it ?

2012-12-05 Thread Amos Jeffries

On 6/12/2012 3:49 p.m., 金 戈 wrote:

I'm sorry, I didn't mention the version of Squid I use, or the details of my
experiment.

THE SQUID VERSION

root@cache-squid:/usr/local/etc/squid # squid -v
Squid Cache: Version LUSCA_HEAD-r14809


This is not Squid. This is Lusca, a commercial fork of Squid-2.7. You 
had best try to contact Xenion about these issues.


Amos


Fwd: [squid-users] Storeurl program,what's wrong with it ?

2012-12-05 Thread 金 戈
I'm sorry, I didn't mention the version of Squid I use, or the details of my
experiment.

THE SQUID VERSION

root@cache-squid:/usr/local/etc/squid # squid -v
Squid Cache: Version LUSCA_HEAD-r14809
configure options:  '--bindir=/usr/local/sbin' '--sbindir=/usr/local/sbin' 
'--datadir=/usr/local/etc/squid' '--libexecdir=/usr/local/libexec/squid' 
'--localstatedir=/usr/local/squid' '--sysconfdir=/usr/local/etc/squid' 
'--enable-removal-policies=lru heap' '--disable-linux-netfilter' 
'--disable-linux-tproxy' '--disable-epoll' '--enable-auth=basic ntlm digest' 
'--enable-basic-auth-helpers=DB NCSA PAM MSNT SMB YP' 
'--enable-digest-auth-helpers=password' '--enable-external-acl-helpers=ip_user 
session unix_group wbinfo_group' '--enable-ntlm-auth-helpers=SMB' 
'--with-pthreads' '--enable-storeio=aufs null coss' '--enable-snmp' 
'--enable-htcp' '--enable-forw-via-db' '--disable-wccp' '--enable-wccpv2' 
'--disable-ident-lookups' '--enable-referer-log' '--enable-useragent-log' 
'--enable-pf-transparent' '--enable-follow-x-forwarded-for' 
'--with-large-files' '--enable-large-cache-files' 
'--enable-err-languages=English' '--enable-default-err-language=English' 
'--with-maxfd=5' '--with-aufs-threads=255' 'LDFLAGS=-L/usr/locar/local/lib' 
'LIBS=-ltcmalloc' '--prefix=/usr/local' '--mandir=/usr/local/man' 
'--infodir=/usr/local/info/' '--build=amd64-portbld-freebsd9.1' 
'build_alias=amd64-portbld-freebsd9.1' 'CC=cc' 'CFLAGS=-O2 -fno-strict-aliasing 
-pipe -msse3 -I/usr/local/include -Wl,-L/usr/local/lib -msse3 -march=nocona ' 
'CPPFLAGS=' 'CPP=cpp'

00:12:14.3742.527   426 26.8M   GET 200 video/mp4   
http://58.53.215.16/youku/69768E90451347BDEB2885FED/030008030050BE079785AA03BAF2B11A3334A2-BCC9-F277-D1AF-AE5D4966566C.mp4

(Request-Line)  GET 
/youku/69768E90451347BDEB2885FED/030008030050BE079785AA03BAF2B11A3334A2-BCC9-F277-D1AF-AE5D4966566C.mp4
 HTTP/1.1
Host    58.53.215.16
User-Agent  Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:15.0) 
Gecko/20100101 Firefox/15.0.1
Accept  text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language zh-cn,zh;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding gzip, deflate
Proxy-Connection    keep-alive

I captured this from my Firefox. You can see the Referer header of the video/mp4 is
http://static.youku.com/
And I list one of my captured URLs here:
http://58.53.215.16/youku/69768E90451347BDEB2885FED/030008030050BE079785AA03BAF2B11A3334A2-BCC9-F277-D1AF-AE5D4966566C.mp4
And what troubles me most is: my store_url_rewrite.py writes the rewritten URL to
cache.log, showing what I want it to store, like

http://118.180.3.36/youku/656D4E0E32378272EBACE5B86/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=18
 
->store as 
http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=18

But I didn't see
http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=18
in store.log.
So I think Squid did not cache my rewritten URL, and I don't know why it didn't
cache it. Does the value returned by my store_url_rewrite.py not match what
storeurl_rewrite_program expects? Or is there some other problem?
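
One quick way to narrow this down is to run the helper by hand and feed it a
line in the same format Squid passes to it (the "URL client/fqdn ident method
..." shape visible in the cache.log extract later in this thread). Below is a
minimal sketch using Python's subprocess module; the helper path and the sample
line are copied from this thread, but treat them as illustrative values to
adapt, not as verified ones:

#!/usr/bin/env python
import subprocess

# Sample input shaped like the ORI: line from cache.log in this thread.
line = ("http://118.180.3.36/youku/6971A9C8A1E348250177A4314B/"
        "030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=17"
        " 192.168.108.14/- - GET - myip=192.168.137.20 myport=3128\n")

proc = subprocess.run(["/var/squid/run/squid/store_url_rewrite.py"],
                      input=line, capture_output=True, text=True)
print(repr(proc.stdout))   # expect exactly one rewritten URL ending in "\n"
print(repr(proc.stderr))   # the REWRITE/NOREWRITE trace the helper writes

If stdout is not a single URL (or a blank line) terminated by a newline, Squid
will not store the object under the URL you expect.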

Re: [squid-users] Problem to access to a specific url (correo-gto.com.mx) with squid 2.7

2012-12-05 Thread Eliezer Croitoru

Hey Ogeid,

No Problem.
As you can see, the problem is a network issue and not the proxy.

To migrate, you should save your current settings into another directory first.

Regards,
Eliezer

On 12/6/2012 12:51 AM, Ogeid Alavaz wrote:

Hi Eliezer, thanks for your advice and response.

I will be more careful next time; this is my first time at a forum like this.

I have my doubts that it is a network issue, because the proxy is behind a
physical gateway, and I do not have control over it; it belongs to another
workmate, and he says everything is correct on his side.

The thing is that if I use the proxy to go out to the internet via another
modem, not the physical gateway, the client can get to
www.correo-gto.com.mx without any problem; it is when I connect it the
way it has to be that I have the problem.

I want to get to the real problem and see if the server has something
wrong. On the other side, I have installed a new server with Squid 3.1,
on the same level as the old proxy (the one with the problem); the new
proxy works fine, and the client can get to the page www.correo-gto.com.mx.

I followed your suggestion to try wget, and here are the results:

(1)
To the web page that I have the problem with, it gives me a timeout:

# wget http://www.correo-gto.com.mx/internacional/index.1.html
--2012-12-05 16:38:16--  http://www.correo-gto.com.mx/internacional/index.1.html
Resolving www.correo-gto.com.mx... 184.154.122.58
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset
by peer) in headers.
Retrying.

--2012-12-05 16:44:42--  (try: 3)
http://www.correo-gto.com.mx/internacional/index.1.html
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response...


(2)
To a different web page, it downloads successfully:

# wget http://curl.haxx.se/docs/manpage.html
--2012-12-05 16:30:23--  http://curl.haxx.se/docs/manpage.html
Resolving curl.haxx.se... 80.67.6.50, 2a00:1a28:1200:9::2
Connecting to curl.haxx.se|80.67.6.50|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 127127 (124K) [text/html]
Saving to: `manpage.html'

100%[==>] 127,127 --.-K/s   in 0.006s

2012-12-05 16:30:25 (20.9 MB/s) - `manpage.html' saved [127127/127127]

I have been looking at how to migrate from Squid 2.7 to Squid 3.1 but have not
found a way that is clear to me; I do not know if just doing a remove and
install will be enough.

Thanks and have a great day.


--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] Problem to access to a specific url (correo-gto.com.mx) with squid 2.7

2012-12-05 Thread Ogeid Alavaz
Hi Eliezer, thanks for your advice and response.

I will be more careful next time; this is my first time at a forum like this.

I have my doubts that it is a network issue, because the proxy is behind a
physical gateway, and I do not have control over it; it belongs to another
workmate, and he says everything is correct on his side.

The thing is that if I use the proxy to go out to the internet via another
modem, not the physical gateway, the client can get to
www.correo-gto.com.mx without any problem; it is when I connect it the
way it has to be that I have the problem.

I want to get to the real problem and see if the server has something
wrong. On the other side, I have installed a new server with Squid 3.1,
on the same level as the old proxy (the one with the problem); the new
proxy works fine, and the client can get to the page www.correo-gto.com.mx.

I followed your suggestion to try wget, and here are the results:

(1)
To the web page that I have the problem with, it gives me a timeout:

# wget http://www.correo-gto.com.mx/internacional/index.1.html
--2012-12-05 16:38:16--  http://www.correo-gto.com.mx/internacional/index.1.html
Resolving www.correo-gto.com.mx... 184.154.122.58
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset
by peer) in headers.
Retrying.

--2012-12-05 16:44:42--  (try: 3)
http://www.correo-gto.com.mx/internacional/index.1.html
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response...


(2)
To a different web page, it downloads successfully:

# wget http://curl.haxx.se/docs/manpage.html
--2012-12-05 16:30:23--  http://curl.haxx.se/docs/manpage.html
Resolving curl.haxx.se... 80.67.6.50, 2a00:1a28:1200:9::2
Connecting to curl.haxx.se|80.67.6.50|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 127127 (124K) [text/html]
Saving to: `manpage.html'

100%[==>] 127,127 --.-K/s   in 0.006s

2012-12-05 16:30:25 (20.9 MB/s) - `manpage.html' saved [127127/127127]

I have been looking at how to migrate from Squid 2.7 to Squid 3.1 but have not
found a way that is clear to me; I do not know if just doing a remove and
install will be enough.

Thanks and have a great day.

On Wed, Dec 5, 2012 at 3:44 PM, Eliezer Croitoru  wrote:
> Hey omzatru,
>
> You indeed gave us a lot of info on config etc.
> The basic thing to check is whether it is a network issue.
> If the client is not able to access the site, try not using the proxy, or use
> a forward mode which is not transparent.
>
> This can help separate a network-level issue from an application-level one.
>
> Did you try to use wget or curl from the squid machine to test
> connectivity?
>
> You are using a very old version of Squid; 2.7 has been out of support for a very
> long time.
>
> I can however point out that Squid 2.7 doesn't support HTTP/1.1, which
> might be the source of the problem.
>
> Also, this server's response can sometimes be very slow, maybe due to a reverse
> proxy or other device on the way.
>
> For next time, please filter the squid.conf, since the diff makes it unreadable.
>
> Kind Regards,
> Eliezer
>
>
> On 12/5/2012 11:23 PM, omzatru wrote:
>>
>> Hi, I have a proxy server with Squid 2.7 installed, and I have a problem
>> with a specific page.
>>
>>   www.correo-gto.com.mx
>>
>> A client can not access via proxy (squid 2.7) to this page.
>>
>> Accessing different pages I do not have this problem; the
>> navigation via proxy works fine.
>>
>> I have attached the config file for Squid.
>>
>> I have the following logs:
>>
>> (1)
>> log in /var/log/squid/access.log:
>> -
>>
>> 1354318142.058 381547 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>> 1354318175.552 378090 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>> 1354318206.135 378088 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>>
>>
>> (2)
>> error in firefox accessing to www.correo-gto.com.mx
>> -
>>
>> ERROR
>> The requested URL could not be retrieved
>> The following error was encountered while trying to retrieve the URL:
>> http://www.correo-gto.com.mx/
>> Read Error
>> The system returned: (104) Connection reset by peer
>> An error condition occurred while reading data from the network.
>> Please retry your request.
>> Your cache administrator is webmaster.
>> Generated Fri, 31 Aug 2012 21:36:31 GMT by webproxy (squid/2.7.STABLE7)
>>
>>
>>
>> (3.a)
>> Testing nslookup from the proxy server:
>> 
>>
>> # nslookup correo-gto.com.mx
>> Server: 10.0.0.2
>> Address:10.0.0.2#53
>>
>> Non-authoritative answer:
>> Name:   correo-gto.com.mx
>> Address: 184.154.122.58
>>
>>
>>
>> (4.a)
>> Making a tracepath to correo-gto.com.mx from the proxy server
>> -
>>
>> # tracepath corr

Re: [squid-users] Storeurl program,what's wrong with it ?

2012-12-05 Thread Amos Jeffries

On 06.12.2012 01:49, 金 戈 wrote:

Hi everyone!

I use squid with our ISP services.

And now we use storeurl_rewrite_program to rewrite some video cache URLs
for our users. But after a few days of struggling and googling, we found it's
very difficult for us to do this. I can't find where the error in my
configuration is, and I hope someone can help.

This is the core of my rewrite setup.
The squid.conf

acl store_rewrite_list referer_regex ^http://static.youku.com/.*$


Eliezer already pointed out the uselessness of ".*$" at the end of a 
pattern. What it means is "anything which ends". And notice that there 
is *always* an end on URLs.
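
To see that concretely, here is a tiny check (a sketch only) with Python's re
module using the same pattern: with or without the trailing ".*$", the ACL
accepts exactly the same Referer values.

import re

with_tail    = re.compile(r"^http://static.youku.com/.*$")
without_tail = re.compile(r"^http://static.youku.com/")

for referer in ("http://static.youku.com/v/swf/player.swf",
                "http://www.youku.com/some/page"):
    # Both patterns agree on every input; the ".*$" adds nothing.
    assert bool(with_tail.search(referer)) == bool(without_tail.search(referer))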


I will also point out that your ACL is checking the Referer: HTTP 
header. From a quick scan of that website I see that 
^http://static.youku.com/ matches only images, CSS and scriptlet files. 
None of them have sub-objects where the Referer: would be set to 
http://static.youku.com/...


You are perhaps wanting:
  acl store_rewrite_list referer_regex ^http://www.youku.com/

or,
  acl store_rewrite_list dstdomain static.youku.com



storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /var/squid/run/squid/store_url_rewrite.py





FYI, there are several major problems on that website which would make
me steer away from storeURL re-writing on this one.


In order for the storeURL feature to work properly without screwing up the
client responses, the CDN being re-written needs to be consistent and
working properly. The Youku site server is presenting incorrect and
inconsistently changing Vary, ETag, and Content-Type headers on a
handful of the objects it is serving up (including some apparently
'static' CSS and images!). If you were to merge any of these objects
using storeURL, the responses sent to the client could become hopelessly
corrupted.
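
A rough way to check that yourself before turning storeURL on (a sketch only;
the object URL is a made-up example and would need to be a real object served
by the site) is to fetch the same object twice and compare the headers
mentioned above:

import urllib.request

url = "http://static.youku.com/path/to/some/object.swf"   # hypothetical object

def headers_of(u):
    req = urllib.request.Request(u, headers={"User-Agent": "header-check"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return {name: resp.headers.get(name)
                for name in ("ETag", "Vary", "Content-Type")}

first, second = headers_of(url), headers_of(url)
for name in first:
    verdict = "same" if first[name] == second[name] else "DIFFERS"
    print(name, verdict, first[name], second[name])

If ETag, Vary or Content-Type differ between two fetches of the same object,
merging responses under one store URL will mix them up exactly as described
above.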


Amos


Re: [squid-users] Problem to access to a specific url (correo-gto.com.mx) with squid 2.7

2012-12-05 Thread Eliezer Croitoru

Hey omzatru,

You indeed gave us a lot of info on config etc.
The basic thing to check is whether it is a network issue.
If the client is not able to access the site, try not using the proxy, or
use a forward mode which is not transparent.


This can help separate a network-level issue from an application-level one.

Did you try to use wget or curl from the squid machine to test
connectivity?


You are using a very old version of Squid; 2.7 has been out of support for a
very long time.


I can however point out that Squid 2.7 doesn't support HTTP/1.1, which
might be the source of the problem.
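
If you want to test that theory from the proxy host, a minimal probe (a
sketch, not anything Squid-specific) is to send a bare HTTP/1.0 request, which
is close to what Squid 2.7 forwards, and see whether the origin resets the
connection the way the 502s in access.log suggest:

import socket

host = "www.correo-gto.com.mx"
with socket.create_connection((host, 80), timeout=30) as s:
    # A plain HTTP/1.0 request; a "connection reset by peer" here would point
    # at the origin or the path to it, not at Squid itself.
    s.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
    print(s.recv(512).decode("latin-1", "replace"))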


Also, this server's response can sometimes be very slow, maybe due to a
reverse proxy or other device on the way.


For next time, please filter the squid.conf, since the diff makes it unreadable.

Kind Regards,
Eliezer

On 12/5/2012 11:23 PM, omzatru wrote:

Hi, I have a proxy server with Squid 2.7 installed, and I have a problem
with a specific page.

  www.correo-gto.com.mx

A client can not access via proxy (squid 2.7) to this page.

Accessing different pages I do not have this problem; the
navigation via proxy works fine.

I have attached the config file for Squid.

I have the following logs:

(1)
log in /var/log/squid/access.log:
-

1354318142.058 381547 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
1354318175.552 378090 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
1354318206.135 378088 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html


(2)
error in firefox accessing to www.correo-gto.com.mx
-

ERROR
The requested URL could not be retrieved
The following error was encountered while trying to retrieve the URL:
http://www.correo-gto.com.mx/
Read Error
The system returned: (104) Connection reset by peer
An error condition occurred while reading data from the network.
Please retry your request.
Your cache administrator is webmaster.
Generated Fri, 31 Aug 2012 21:36:31 GMT by webproxy (squid/2.7.STABLE7)



(3.a)
Testing nslookup from the proxy server:


# nslookup correo-gto.com.mx
Server: 10.0.0.2
Address:10.0.0.2#53

Non-authoritative answer:
Name:   correo-gto.com.mx
Address: 184.154.122.58



(4.a)
Making a tracepath to correo-gto.com.mx from the proxy server
-

# tracepath correo-gto.com.mx
  1:  web.congresogto.gob.mx (10.0.0.8)  0.200ms pmtu 1500
  1:  10.0.0.253 (10.0.0.253)0.230ms
  1:  10.0.0.253 (10.0.0.253)0.183ms
  2:  no reply
  3:  no reply
  4:  no reply
...
30:  no reply
31:  no reply


I have posted the problem at the link below, but I have not had a reply.

http://www.linuxquestions.org/questions/showthread.php?p=4771659#post4771659


I will appreciate it a lot if you can help me with this; I have been looking
for a solution, but I have not succeeded.

On the other hand, I have configured a new test proxy with Squid 3.1,
and it works fine; I can reach the page correo-gto.com.mx without any
problem.

Thanks and have a great day.


squid.conf file, here are the changes that I have made:


diff -purN squid.conf.orig squid.conf

--- squid.conf.orig 2012-03-22 09:30:54.732721143 -0600
+++ squid.conf  2012-12-05 13:02:57.745042191 -0600
@@ -608,7 +608,7 @@ acl to_localhost dst 127.0.0.0/8 0.0.0.0
  # should be allowed
  acl localnet src 10.0.0.0/8# RFC1918 possible internal network
  acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
-acl localnet src 192.168.0.0/16# RFC1918 possible internal network
+acl localnet src 192.168.1.0/24# RFC1918 possible internal network
  #
  acl SSL_ports port 443 # https
  acl SSL_ports port 563 # snews
@@ -626,9 +626,16 @@ acl Safe_ports port 777# multiling htt
  acl Safe_ports port 631# cups
  acl Safe_ports port 873# rsync
  acl Safe_ports port 901# SWAT
+acl Safe_ports port 3201   # SAP
+acl Safe_ports port 82 # isseg
+
  acl purge method PURGE
  acl CONNECT method CONNECT

+# Lista de pAginas denegadas
+acl pages_deny url_regex "/etc/squid/pagesDeny.acl"
+acl pages_acces url_regex "/etc/squid/pagesAcces.acl"
+
  #  TAG: http_access
  #  Allowing or Denying access based on defined access lists
  #
@@ -662,6 +669,11 @@ http_access deny purge
  http_access deny !Safe_ports
  # Deny CONNECT to other than SSL ports
  http_access deny CONNECT !SSL_ports
+
+# Deny pages request
+#http_access deny pages_deny
+#http_access allow pages_acces
+
  #
  # We strongly recommend the following be uncommented to protect innocent
  # web applications running on the proxy server who think the only
@@ -673,7 +685,7 @@ http_access deny CONNECT !SSL_ports
  # Example rule allowing access from your local networks.
  # Adapt localnet in the ACL section t

[squid-users] Problem to access to a specific url (correo-gto.com.mx) with squid 2.7

2012-12-05 Thread omzatru
Hi, I have a proxy server with Squid 2.7 installed, and I have a problem
with a specific page.

 www.correo-gto.com.mx

A client can not access via proxy (squid 2.7) to this page.

Accessing different pages I do not have this problem; the
navigation via proxy works fine.

I have attached the config file for Squid.

I have the following logs:

(1)
log in /var/log/squid/access.log:
-

1354318142.058 381547 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
1354318175.552 378090 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
1354318206.135 378088 10.0.12.51 TCP_MISS/502 1634 GET
http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html


(2)
error in firefox accessing to www.correo-gto.com.mx
-

ERROR
The requested URL could not be retrieved
The following error was encountered while trying to retrieve the URL:
http://www.correo-gto.com.mx/
Read Error
The system returned: (104) Connection reset by peer
An error condition occurred while reading data from the network.
Please retry your request.
Your cache administrator is webmaster.
Generated Fri, 31 Aug 2012 21:36:31 GMT by webproxy (squid/2.7.STABLE7)



(3.a)
Testing nslookup from the proxy server:


# nslookup correo-gto.com.mx
Server: 10.0.0.2
Address:10.0.0.2#53

Non-authoritative answer:
Name:   correo-gto.com.mx
Address: 184.154.122.58



(4.a)
Making a tracepath to correo-gto.com.mx from the proxy server
-

# tracepath correo-gto.com.mx
 1:  web.congresogto.gob.mx (10.0.0.8)  0.200ms pmtu 1500
 1:  10.0.0.253 (10.0.0.253)0.230ms
 1:  10.0.0.253 (10.0.0.253)0.183ms
 2:  no reply
 3:  no reply
 4:  no reply
...
30:  no reply
31:  no reply


I have posted the problem at the link below, but I have not had a reply.

http://www.linuxquestions.org/questions/showthread.php?p=4771659#post4771659


I will appreciate it a lot if you can help me with this; I have been looking
for a solution, but I have not succeeded.

On the other hand, I have configured a new test proxy with Squid 3.1,
and it works fine; I can reach the page correo-gto.com.mx without any
problem.

Thanks and have a great day.


squid.conf file, here are the changes that I have made:


diff -purN squid.conf.orig squid.conf

--- squid.conf.orig 2012-03-22 09:30:54.732721143 -0600
+++ squid.conf  2012-12-05 13:02:57.745042191 -0600
@@ -608,7 +608,7 @@ acl to_localhost dst 127.0.0.0/8 0.0.0.0
 # should be allowed
 acl localnet src 10.0.0.0/8# RFC1918 possible internal network
 acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
-acl localnet src 192.168.0.0/16# RFC1918 possible internal network
+acl localnet src 192.168.1.0/24# RFC1918 possible internal network
 #
 acl SSL_ports port 443 # https
 acl SSL_ports port 563 # snews
@@ -626,9 +626,16 @@ acl Safe_ports port 777# multiling htt
 acl Safe_ports port 631# cups
 acl Safe_ports port 873# rsync
 acl Safe_ports port 901# SWAT
+acl Safe_ports port 3201   # SAP
+acl Safe_ports port 82 # isseg
+
 acl purge method PURGE
 acl CONNECT method CONNECT

+# Lista de pAginas denegadas
+acl pages_deny url_regex "/etc/squid/pagesDeny.acl"
+acl pages_acces url_regex "/etc/squid/pagesAcces.acl"
+
 #  TAG: http_access
 #  Allowing or Denying access based on defined access lists
 #
@@ -662,6 +669,11 @@ http_access deny purge
 http_access deny !Safe_ports
 # Deny CONNECT to other than SSL ports
 http_access deny CONNECT !SSL_ports
+
+# Deny pages request
+#http_access deny pages_deny
+#http_access allow pages_acces
+
 #
 # We strongly recommend the following be uncommented to protect innocent
 # web applications running on the proxy server who think the only
@@ -673,7 +685,7 @@ http_access deny CONNECT !SSL_ports
 # Example rule allowing access from your local networks.
 # Adapt localnet in the ACL section to list your (internal) IP networks
 # from where browsing should be allowed
-#http_access allow localnet
+http_access allow localnet
 http_access allow localhost

 # And finally deny all other access to this proxy
@@ -715,8 +727,8 @@ http_access deny all
 # icp_access deny all
 #
 #Allow ICP queries from local networks only
-icp_access allow localnet
-icp_access deny all
+##icp_access allow localnet
+##icp_access deny all

 #  TAG: htcp_access
 #  Allowing or Denying access to the HTCP port based on defined
@@ -,7 +1123,7 @@ icp_access deny all
 #  visible on the internal address.
 #
 # Squid normally listens to port 3128
-http_port 3128
+http_port 3128 transparent

 #  TAG: https_port
 # Note: This option is only available if Squid is rebuilt with the
@@ -1748,7 +1760,7 @@ hierarchy_stoplist cgi-bin ?
 #  objects.
 #
 #Default:
-# cach

Re: [squid-users] After upgrade from 3.1 to 3.2.3 our parent virusscanner is busy

2012-12-05 Thread Eliezer Croitoru

Hey Dieter,

It seems to me like it's not a Squid problem, but it could be.
I would start with basic checks such as file-descriptor limits on the
system.


ulimit -Ha
ulimit -Sa

Also, what are the Squid build options? (squid -v)

If the antivirus responded with something, there is no problem with
communication or basic proxy function; the problem is on the AV side.


Try the ss tool to see the open connections/FDs between Squid, the AV,
and the clients.


Regards,
Eliezer

On 12/5/2012 3:47 PM, Dieter Bloms wrote:

Hi Eliezer,

On Wed, Dec 05, Eliezer Croitoru wrote:


We will need more information such as squid.conf and other info.
Who claims to be busy?


We use sles11sp2 (64bit) on HP Proliant DL380 G7 hardware with 140G Ram.


The virusscanner is busy? Where do you see that? etc.


Yes, the virusscanner creates an HTTP page with an "AVwebgate is busy"
message.
Even with lower load (150 req/s) the virusscanner generates this message.


A clearer picture can help us try to help you.


Please have a look at:

http://downloads.bloms.de/squid.conf-3.1.20
http://downloads.bloms.de/squid.conf-3.2.3


On 12/5/2012 1:50 PM, Dieter Bloms wrote:

Hi,

we use following constellation:

clients -> squid -> virusscanner -> internet.
the virusscanner is avwebgate from avira configured as parent proxy.

The load is ~400 req/s.

With Squid 3.1.20 we had no problems, but after the upgrade to 3.2.3 our
virusscanner claims it is busy after a few seconds.

Does anybody know of any change regarding connections to a parent proxy
from the 3.1 to the 3.2 series?

The release notes don't mention anything about this (or I can't find
it).


--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il




--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] Storeurl program,what's wrong with it ?

2012-12-05 Thread Eliezer Croitoru

Hey 金 戈,

The Squid storeurl_rewrite feature was introduced in Squid 2.6/2.7, which
are not supported anymore.


I am working on another feature, called StoreID, which replaces
storeurl_rewrite.

It will be integrated into Squid 3.3, which is now in beta.
To find out what your error is, you should first understand better what you
are aiming for.


If you are willing to try the new feature it will be available soon.

Just a note on your small program:
The regex you are using is not optimized; you should use fewer ".*"
and take a stricter approach to the allowed letters and characters.
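
As an illustration only (the exact URL layout here is a guess based on the
URLs quoted in this thread, so adjust it before relying on it), a stricter
pattern for the Youku case could spell out the hex-and-dash structure instead
of chaining ".*":

import re

STRICT = re.compile(
    r"^http://\d{1,3}(?:\.\d{1,3}){3}/youku/[0-9A-Fa-f]+/"
    r"([0-9A-Fa-f-]+\.flv)(?:\?.*)?$"
)

url = ("http://118.180.3.36/youku/6971A9C8A1E348250177A4314B/"
       "030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=17")
m = STRICT.match(url)
print(m.group(1) if m else "no match")   # object name to use in the store URL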


Regards,
Eliezer

On 12/5/2012 2:49 PM, 金 戈 wrote:

Hi everyone!

I use squid with our ISP services.

And now we use storeurl_rewrite_program to rewrite some video cache URLs for our
users. But after a few days of struggling and googling, we found it's very
difficult for us to do this. I can't find where the error in my configuration
is, and I hope someone can help.

This is the core of my rewrite setup.
The squid.conf

acl store_rewrite_list referer_regex ^http://static.youku.com/.*$
storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /var/squid/run/squid/store_url_rewrite.py


this is my store_url_rewrite.py
-
#!/usr/bin/env python
import re
import sys
YOUKU=re.compile("http://\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}/youku/.*/(.*-.*-.*-.*-.*)\?.*")

def modify_url(line):
    list = line.split(' ')
    old_url = list[0]
    new_url = '\n'
    is_match = YOUKU.search(old_url)
    if is_match:
        new_url = 'http://www.youku.com/' + is_match.group(1) + new_url
    return new_url

while True:
    line = sys.stdin.readline().strip()
    new_url = modify_url(line)
    if new_url:
        sys.stdout.write(new_url)
        sys.stderr.write("\n\nREWRITE:" + new_url + '\nORI:' + line)
    else:
        sys.stdout.write(line + '\n')
        sys.stderr.write("NOREWRITE:" + line + '\n')
    sys.stdout.flush()



Through stderr, I can see the new_url is already written to the log file, but
I can't find it in store.log.

The cache.log

REWRITE:http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv
ORI:http://118.180.3.36/youku/6971A9C8A1E348250177A4314B/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=17
 192.168.108.14/- - GET - myip=192.168.137.20 myport=3128

But there is no
object  
http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv
 in store.log


Thanks for your help!

Best wishes!




--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] After upgrade from 3.1 to 3.2.3 our parent virusscanner is busy

2012-12-05 Thread Dieter Bloms
Hi Eliezer,

On Wed, Dec 05, Eliezer Croitoru wrote:

> We will need more information such as squid.conf and other info.
> Who claims to be busy?

We use sles11sp2 (64bit) on HP Proliant DL380 G7 hardware with 140G Ram.

> The virusscanner is busy? Where do you see that? etc.

Yes, the virusscanner creates an HTTP page with an "AVwebgate is busy"
message.
Even with lower load (150 req/s) the virusscanner generates this message.
 
> A clearer picture can help us try to help you.

Please have a look at:

http://downloads.bloms.de/squid.conf-3.1.20
http://downloads.bloms.de/squid.conf-3.2.3

> On 12/5/2012 1:50 PM, Dieter Bloms wrote:
> >Hi,
> >
> >we use following constellation:
> >
> >clients -> squid -> virusscanner -> internet.
> >the virusscanner is avwebgate from avira configured as parent proxy.
> >
> >The load is ~400 req/s.
> >
> >With Squid 3.1.20 we had no problems, but after the upgrade to 3.2.3 our
> >virusscanner claims it is busy after a few seconds.
> >
> >Does anybody know of any change regarding connections to a parent proxy
> >from the 3.1 to the 3.2 series?
> >
> >The release notes don't mention anything about this (or I can't find
> >it).
> 
> -- 
> Eliezer Croitoru
> https://www1.ngtech.co.il
> sip:ngt...@sip2sip.info
> IT consulting for Nonprofit organizations
> eliezer  ngtech.co.il

-- 
Gruß

  Dieter

--
I do not get viruses because I do not use MS software.
If you use Outlook then please do not put my email address in your
address-book so that WHEN you get a virus it won't use my address in the
From field.


[squid-users] Storeurl program,what's wrong with it ?

2012-12-05 Thread 金 戈
Hi everyone!

I use squid with our ISP services.

And now we use storeurl_rewrite_program to rewrite some video cache URLs for our
users. But after a few days of struggling and googling, we found it's very
difficult for us to do this. I can't find where the error in my configuration
is, and I hope someone can help.

This is the core of my rewrite setup.
The squid.conf

acl store_rewrite_list referer_regex ^http://static.youku.com/.*$
storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /var/squid/run/squid/store_url_rewrite.py


this is my store_url_rewrite.py
-
#!/usr/bin/env python
import re
import sys
YOUKU=re.compile("http://\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}/youku/.*/(.*-.*-.*-.*-.*)\?.*")

def modify_url(line):
    list = line.split(' ')
    old_url = list[0]
    new_url = '\n'
    is_match = YOUKU.search(old_url)
    if is_match:
        new_url = 'http://www.youku.com/' + is_match.group(1) + new_url
    return new_url

while True:
    line = sys.stdin.readline().strip()
    new_url = modify_url(line)
    if new_url:
        sys.stdout.write(new_url)
        sys.stderr.write("\n\nREWRITE:" + new_url + '\nORI:' + line)
    else:
        sys.stdout.write(line + '\n')
        sys.stderr.write("NOREWRITE:" + line + '\n')
    sys.stdout.flush()



Through stderr, I can see the new_url is already written to the log file, but
I can't find it in store.log.

The cache.log

REWRITE:http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv
ORI:http://118.180.3.36/youku/6971A9C8A1E348250177A4314B/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv?start=17
 192.168.108.14/- - GET - myip=192.168.137.20 myport=3128

But there is no
object  
http://www.youku.com/030002050050BCAFBB490B03BAF2B1A20A79FD-0282-DEA6-350C-E810E14BAA19.flv
 in store.log


Thanks for your help!

Best wishes!




Re: [squid-users] After upgrade from 3.1 to 3.2.3 our parent virusscanner is busy

2012-12-05 Thread Eliezer Croitoru

Hey Dieter,

We will need more information such as squid.conf and other info.
Who claims to be busy?
The virusscanner is busy? Where do you see that? etc.

A clearer picture can help us try to help you.

Regards,
Eliezer

On 12/5/2012 1:50 PM, Dieter Bloms wrote:

Hi,

we use following constellation:

clients -> squid -> virusscanner -> internet.
the virusscanner is avwebgate from avira configured as parent proxy.

The load is ~400 req/s.

With Squid 3.1.20 we had no problems, but after the upgrade to 3.2.3 our
virusscanner claims it is busy after a few seconds.

Does anybody know of any change regarding connections to a parent proxy
from the 3.1 to the 3.2 series?

The release notes don't mention anything about this (or I can't find
it).


--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


[squid-users] After upgrade from 3.1 to 3.2.3 our parent virusscanner is busy

2012-12-05 Thread Dieter Bloms
Hi,

we use following constellation:

clients -> squid -> virusscanner -> internet.
the virusscanner is avwebgate from avira configured as parent proxy.

The load is ~400 req/s.

With Squid 3.1.20 we had no problems, but after the upgrade to 3.2.3 our
virusscanner claims it is busy after a few seconds.

Does anybody know of any change regarding connections to a parent proxy
from the 3.1 to the 3.2 series?

The release notes don't mention anything about this (or I can't find
it).


-- 
Regards

  Dieter

--
I do not get viruses because I do not use MS software.
If you use Outlook then please do not put my email address in your
address-book so that WHEN you get a virus it won't use my address in the
From field.


Re: [squid-users] WARNING: accept_filter not supported on your OS

2012-12-05 Thread Amos Jeffries

On 5/12/2012 10:54 p.m., Le Trung, Kien wrote:

And finally, my squid-configure

#
# Recommended minimum configuration:
#
acl localhost src A.B.C.D/32
acl purgehost src A.B.C.D/32
acl to_localhost dst A.B.C.D/32


NP: 127.0.0.1 is not confidential information. Every machine on the 
planet has one. If you are defining A.B.C.D to be something other than 
127.0.0.1 then adding it to the "to_localhost" ACL is incorrect for what 
the to_localhost ACL means - it is there to prevent client requests 
looping via the 127.0.0.1 and 0.0.0.0 special "localhost" addresses. 
BTW, you are not using to_localhost anyway.




# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7   # RFC 4193 local private network range
acl localnet src fe80::/10  # RFC 4291 link-local (directly
plugged) machines

acl Safe_ports port 80  # http
acl Safe_ports port 81  # http
acl Safe_ports port 82  # http


You list three ports here. But I only see one http_port line for port 82.


acl CONNECT method CONNECT
acl purge method PURGE

acl invalid_urls url_regex ^someregrex

acl valid_urls url_regex ^someregrex

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
#acl Redirection  http_status 302
#cache allow Redirection

acl RedirectTC url_regex ^needredirect
http_access deny RedirectTC
deny_info ERR_REDIRECT_TC RedirectTC

client_persistent_connections on
connect_timeout 5 seconds
detect_broken_pconn on
accept_filter httpready
accept_filter data
negative_ttl 120 seconds
follow_x_forwarded_for allow localhost

http_access allow manager localhost
http_access allow purge purgehost
http_access allow purge localhost


You defined purgehost as being one of the entries of localhost. You can 
remove the "allow purge purgehost" line entirely.



http_access deny manager
http_access deny purge

# Deny requests to certain unsafe ports
http_access deny !Safe_ports
http_access deny invalid_urls
deny_info ERR_INVALID_URLS invalid_urls
http_access allow valid_urls


Certain requests have unlimited access, depending on a regex pattern 
which you have removed. Unless they have matched either of two other 
removed regex patterns. These would seem to be rather important details.



# Deny CONNECT to other than secure SSL ports
#http_access deny CONNECT !SSL_ports
http_access deny all
deny_info ERR_INVALID_URLS all

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Squid normally listens to port 3128

### One domain
cache_effective_user squid
http_port A.B.C.D:82 accel vhost ignore-cc

cache_peer A1.B1.C1.D1 parent 80 0 no-query originserver name=WEB1 max-conn=25
cache_peer_domain WEB1 domain1 domain2

cache_peer A2.B2.C2.D2 parent 80 0 no-query originserver name=WEB2
max-conn=20 round-robin
cache_peer A3.B3.C3.D3 parent 80 0 no-query originserver name=WEB3
max-conn=20 round-robin

cache_peer_domain WEB2 domain3 domain4
cache_peer_domain WEB3 domain4 domain4


acl web1 dstdomain domain1 domain2
acl web2 dstdomain domain3 domain4
acl web3 dstdomain domain4 domain4

cache_peer_access WEB1 allow web1
cache_peer_access WEB2 allow web2
cache_peer_access WEB3 allow web3

cache_peer_access web1 deny all
cache_peer_access web2 deny all
cache_peer_access web3 deny all

# from where browsing should be allowed
http_access allow localnet
http_access allow localhost


You already put "http_access deny all" above these lines. So these ones 
will never be reached.




# And finally deny all other access to this proxy
http_access deny all


# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?


hierarchy_stoplist prevents requests matching the regex from reaching 
your cache_peers. I think you do not want to use it at all.



#hierarchy_stoplist \?
acl CacheType urlpath_regex \? \.css \.gif \.gif\? \.html \.html\?
\.ico \.jpeg \.jpeg\? \.jpg \.jpg\? \.js \.js\? \.php \.php\? \.png
\.png\? \.swf \.swf\? \-
#cache allow CacheType

# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /opt/squid/var/cache 9216 16 256

# Leave coredumps in the first cache dir
coredump_dir /opt/squid/var/cache

cache_mem 9216 MB
maximum_object_size_in_memory 1024 KB
cache_swap_low 30
cache_swap_high 50


You have 9GB of disk cache and 9GB of memory cache. Whenever they fill 
to 4.5GB of data Squid will schedule ~2GB of data to be purged.
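
For reference, those figures follow directly from the values in the config
above (a quick back-of-the-envelope calculation):

cache_size_mb = 9216                     # cache_dir / cache_mem size above
high_mark = cache_size_mb * 50 // 100    # cache_swap_high 50 -> 4608 MB (~4.5 GB)
low_mark  = cache_size_mb * 30 // 100    # cache_swap_low 30  -> 2764 MB (~2.7 GB)
print(high_mark - low_mark)              # ~1844 MB (~2 GB) purged each time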


You can achieve the equivalent by setting these to the defaul

Re: [squid-users] Squid3 extremely slow for some website cnn.com

2012-12-05 Thread Eliezer Croitoru

I will post it later since I am working on something right now.

Eliezer

On 12/5/2012 11:49 AM, Muhammed Shehata wrote:

Do you have a spec file for this version to build on CentOS 6 x86_64?

Best Regards,
*Muhammad Shehata*
IT Network Security Engineer
TEData
Building A11- B90, Smart Village
Km 28 Cairo - Alex Desert Road, 6th October, 12577, Egypt
T: +20 (2) 33 32 0700 | Ext: 1532
F: +20 (2) 33 32 0800 | M:
E: m.sheh...@tedata.net
On 12/04/2012 02:08 PM, Eliezer Croitoru wrote:


For me it works fine on both 3.2.1-3 and others.

Try to check the headers that are being sent from your proxy, such as
"X-Forward" etc.

Regards,
Eliezer

On 12/4/2012 12:37 PM, Muhammed Shehata wrote:



Dear Amos,
I downgraded to 2.6.STABLE21 and it is still the same issue with the same
websites.

Best Regards,
*Muhammad Shehata*
IT Network Security Engineer
TEData
Building A11- B90, Smart Village
Km 28 Cairo - Alex Desert Road, 6th October, 12577, Egypt
T: +20 (2) 33 32 0700 | Ext: 1532
F: +20 (2) 33 32 0800 | M:
E: m.sheh...@tedata.net
On 12/04/2012 09:27 AM, Muhammed Shehata wrote:

the attachment
Best Regards,
*Muhammad Shehata*
IT Network Security Engineer
TEData
Building A11- B90, Smart Village
Km 28 Cairo - Alex Desert Road, 6th October, 12577, Egypt
T: +20 (2) 33 32 0700 | Ext: 1532
F: +20 (2) 33 32 0800 | M:
E: m.sheh...@tedata.net
On 12/04/2012 08:46 AM, Muhammed Shehata wrote:


Dear Amos,
Kindly find attached a Wireshark capture of the Squid behaviour when accessing
CNN; it takes 4 minutes to load the site. Is there any workaround to avoid
such slowness, or any version of Squid 3 that can efficiently handle such
websites containing ETags?

Best Regards,
*Muhammad Shehata*
IT Network Security Engineer
TEData
Building A11- B90, Smart Village
Km 28 Cairo - Alex Desert Road, 6th October, 12577, Egypt
T: +20 (2) 33 32 0700 | Ext: 1532
F: +20 (2) 33 32 0800 | M:
E: m.sheh...@tedata.net
On 12/03/2012 01:30 PM, Amos Jeffries wrote:


On 3/12/2012 9:12 p.m., Muhammed Shehata wrote:




I notice brighttalk is an HTTPS site, so the issue there is likely
to be bug 3659 which was resolved in 3.2.4.

The CNN problem is not obvious. It has a great many objects in its
pages, all of which have ETag and Vary problems which could do some
very strange things to the responses. Since it is an HTTP-only site
there is no reason Squid should be sending (FIN,ACK).

Amos


Dear Amos,
I have a serious problem after upgrading to Squid 3: it responds
extremely slowly to some specific websites (www.cnn.com,
www.brighttalk.com).
I've tried squid-3.0.STABLE20, 3.1.20, 3.1.21 and 3.2.1, and all of
them have the same slowness problem; however, this issue didn't
happen in squid2.6 with the same configuration and accessing the
same websites.
Some guys suggested a DNS resolving issue but it's not; the
problem is Squid sends a (FIN,ACK) message to the website in the
middle of loading and h

--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] WARNING: accept_filter not supported on your OS

2012-12-05 Thread Le Trung, Kien
And finally, my squid-configure

#
# Recommended minimum configuration:
#
acl localhost src A.B.C.D/32
acl purgehost src A.B.C.D/32
acl to_localhost dst A.B.C.D/32

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7   # RFC 4193 local private network range
acl localnet src fe80::/10  # RFC 4291 link-local (directly
plugged) machines

acl Safe_ports port 80  # http
acl Safe_ports port 81  # http
acl Safe_ports port 82  # http
acl CONNECT method CONNECT
acl purge method PURGE

acl invalid_urls url_regex ^someregrex

acl valid_urls url_regex ^someregrex

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
#acl Redirection  http_status 302
#cache allow Redirection

acl RedirectTC url_regex ^needredirect
http_access deny RedirectTC
deny_info ERR_REDIRECT_TC RedirectTC

client_persistent_connections on
connect_timeout 5 seconds
detect_broken_pconn on
accept_filter httpready
accept_filter data
negative_ttl 120 seconds
follow_x_forwarded_for allow localhost

http_access allow manager localhost
http_access allow purge purgehost
http_access allow purge localhost
http_access deny manager
http_access deny purge

# Deny requests to certain unsafe ports
http_access deny !Safe_ports
http_access deny invalid_urls
deny_info ERR_INVALID_URLS invalid_urls
http_access allow valid_urls
# Deny CONNECT to other than secure SSL ports
#http_access deny CONNECT !SSL_ports
http_access deny all
deny_info ERR_INVALID_URLS all

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Squid normally listens to port 3128

### One domain
cache_effective_user squid
http_port A.B.C.D:82 accel vhost ignore-cc

cache_peer A1.B1.C1.D1 parent 80 0 no-query originserver name=WEB1 max-conn=25
cache_peer_domain WEB1 domain1 domain2

cache_peer A2.B2.C2.D2 parent 80 0 no-query originserver name=WEB2
max-conn=20 round-robin
cache_peer A3.B3.C3.D3 parent 80 0 no-query originserver name=WEB3
max-conn=20 round-robin

cache_peer_domain WEB2 domain3 domain4
cache_peer_domain WEB3 domain4 domain4


acl web1 dstdomain domain1 domain2
acl web2 dstdomain domain3 domain4
acl web3 dstdomain domain4 domain4

cache_peer_access WEB1 allow web1
cache_peer_access WEB2 allow web2
cache_peer_access WEB3 allow web3

cache_peer_access web1 deny all
cache_peer_access web2 deny all
cache_peer_access web3 deny all

# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all


# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
#hierarchy_stoplist \?
acl CacheType urlpath_regex \? \.css \.gif \.gif\? \.html \.html\?
\.ico \.jpeg \.jpeg\? \.jpg \.jpg\? \.js \.js\? \.php \.php\? \.png
\.png\? \.swf \.swf\? \-
#cache allow CacheType

# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /opt/squid/var/cache 9216 16 256

# Leave coredumps in the first cache dir
coredump_dir /opt/squid/var/cache

cache_mem 9216 MB
maximum_object_size_in_memory 1024 KB
cache_swap_low 30
cache_swap_high 50
strip_query_terms off
logformat combined %>a %ui %un [%tl] "%rm %ru HTTP/%rv" %>Hs %<st "%{Referer}>h" "%{User-Agent}>h" %Ss:%Sh
#access_log none
cache_store_log none
access_log stdio:/opt/squid/var/logs/access.log combined
cache_log /opt/squid/var/logs/cache.log
#cache_swap_log /var/log/squid/swap.state
#maximum_object_size 10 MB
#quick_abort_min 0 KB
#quick_abort_max 0 KB
#memory_replacement_policy lru
#cache_replacement_policy heap LFUDA
#store_dir_select_algorithm round-robin
#cache_dir null /tmp

# Add any of your own refresh_pattern entries above these.
#refresh_pattern ^ftp:  144020% 10080
#refresh_pattern ^gopher:   14400%  1440
refresh_pattern -i (^someregrex) ...
refresh_pattern -i (/cgi-bin/) 0 0%  0
refresh_pattern .   0   20% 4320


On Wed, Dec 5, 2012 at 3:52 PM, Eliezer Croitoru  wrote:
> Hey Trung Kien,
>
> We will need more data to try helping you with the problem.
> If you can share the configure options of the Squid build and squid.conf, it will
> give us a good look at why it may be happening.
>
> If you can describe more about your infrastructure it will help.
>
> Note that this is a public list so remove any identifying and confidential
> data from squid.conf.
>
> Best Regards,
> Eliezer
>
>
>
> On 12/5/2012 9:59 AM, Le Trung, Kien w

Re: [squid-users] WARNING: accept_filter not supported on your OS

2012-12-05 Thread Le Trung, Kien
Ok,

Here is my configure log:

./configure --prefix=/opt/squid --with-filedescriptors=100
--enable-removal-policies="lru heap" --enable-linux-netfilter

checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking how to create a ustar tar archive... gnutar
checking whether to enable maintainer-specific portions of Makefiles... no
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking whether gcc and cc understand -c and -o together... yes
checking for g++... g++
checking whether we are using the GNU C++ compiler... yes
checking whether g++ accepts -g... yes
checking dependency style of g++... gcc3
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking simplified host os... linux (version )
checking for ranlib... ranlib
checking how to run the C preprocessor... gcc -E
checking whether ln -s works... yes
checking for egrep... /bin/egrep
checking for sh... /bin/sh
checking for false... /bin/false
checking for true... /bin/true
checking for mv... /bin/mv
checking for mkdir... /bin/mkdir
checking for ln... /bin/ln
checking for chmod... /bin/chmod
checking for tr... /usr/bin/tr
checking for rm... /bin/rm
checking for cppunit-config... false
checking for perl... /usr/bin/perl
checking for pod2man... /usr/bin/pod2man
checking for ar... /usr/bin/ar
configure: strict error checking enabled: yes
checking whether to use loadable modules... yes
checking how to print strings... printf
checking for a sed that does not truncate output... /bin/sed
checking for fgrep... /bin/fgrep
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking the maximum length of command line arguments... 1966080
checking whether the shell understands some XSI constructs... yes
checking whether the shell understands "+="... yes
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for ar... /usr/bin/ar
checking for strip... strip
checking for ranlib... (cached) ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for dlfcn.h... yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... no
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... no
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports
shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking for shl_load... no
checking for shl_load in -ldld... no
checking for dlopen... no
checking for dlopen in -ldl... yes
checking whether a program can dlopen itself... yes
checking whether a statically linked program can dlopen itself... yes
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... yes
checking how to run the C++ preprocessor... g++ -E
checking for ld used by g++... /usr/bin/ld -m elf_x86_64
checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports
shared libraries... yes
checking for g++ option to produce PIC... -fPIC -DPIC
checking if g++ PIC flag -fPIC -DPIC works... yes
checking if g++ static flag -static works... no
checking if g++ supports -c -o file.o... yes
checking if g++ supports -c -o file.o... (cached) yes
checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports
shared libraries... yes
checking dynamic linker characteristics... (cached) GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking which extension is used for runtime loadable modules... .so
checking which variable specifies run-time module search path... LD_LIBRARY_PATH
checking for the default library search path... /lib /usr/lib
/usr/lib64/mysql /usr/lib6

Re: [squid-users] tcp_miss_aborted with squidguard

2012-12-05 Thread Eliezer Croitoru

Hey Ricardo,

Please share your configuration file and relevant debug_options output.

Take a look at http://wiki.squid-cache.org/KnowledgeBase/DebugSections
Sections 61 and 84 should be the relevant ones.
Try at level 3 first and later level 6.
Share the relevant results ONLY.

Regards,
Eliezer

On 12/5/2012 1:19 AM, Ricardo Rios wrote:

All seems to start up well: the squid process with the 10 squidGuard children,
and squidGuard also says in its logs that it is ready for requests. But when I
try to access a website that should be blocked, the website opens anyway, and
when I check the logs, I see this:

TCP_MISS_ABORTED/000 0 GET
http://www.redtube.com/bid/00012244/index.php? - HIER_DIRECT/62.212.83.1-

This happened with: squid-3.3.0.2, squid-3.2.4 and squid-3.2.3

It works OK with the same conf in squid-3.1.22



Regards


--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il


Re: [squid-users] WARNING: accept_filter not supported on your OS

2012-12-05 Thread Eliezer Croitoru

Hey Trung Kien,

We will need more data to try helping you with the problem.
If you can share the configure options of the Squid build and squid.conf, it
will give us a good look at why it may be happening.


If you can describe more about your infrastructure it will help.

Note that this is a public list so remove any identifying and 
confidential data from squid.conf.


Best Regards,
Eliezer


On 12/5/2012 9:59 AM, Le Trung, Kien wrote:

Hi,

Today, I built version 3.1.22, then started Squid with or without the
accept_filter directive in Squid's configuration file, and in both cases
I got NO 500 MISS in the access log.

Moreover, the speed when accessing new (not cached) links is faster than
with version 3.2.3.


Best Regards,
Trung Kien


--
Eliezer Croitoru
https://www1.ngtech.co.il
sip:ngt...@sip2sip.info
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il