[squid-users] Antwort: Re: [squid-users] automatic removing core files

2010-04-20 Thread Martin . Pichlmaier
Hello,

core files are created when squid crashes.
It would make sense to find out why squid writes core dumps.

Some documentation:
http://wiki.squid-cache.org/SquidFaq/BugReporting

To prevent writing of core files, set the core file limit to 0.
It could be ulimit -c 0 or something similar.
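For example (a sketch; the exact syntax depends on your shell and init script):

```shell
# Disable core dumps for processes started from this shell,
# e.g. in the squid init script before the daemon is launched.
ulimit -c 0

# Verify the new limit.
ulimit -c    # prints 0
```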

Also see the coredump_dir directive in squid.conf.


Martin







fedorischev fedorisc...@bsu.edu.ru wrote on 20.04.2010 08:14
To: squid-users@squid-cache.org
Subject: Re: [squid-users] automatic removing core files
In a message dated Tuesday 20 April 2010 09:54:10, Jeff Pang wrote:
 That means squid had a coredump?
 Your squid may be running in an incorrect mode.

 On Tue, Apr 20, 2010 at 1:47 PM, fedorischev fedorisc...@bsu.edu.ru 
wrote:
  Hello.
 
  From time to time I check that the number of core files in
  /var/spool/squid is increasing. Today I checked, and the number of
  core files is 4-6, each of them near 1-1.2G in size. The question is
  simple: is there a way for squid itself to remove these files
  periodically? Maybe an option in squid.conf? Or is a cron job
  required? Or how can I disable core writing absolutely?
 
  Thanks.

Our squid works without significant errors; only certain errors are 
periodically written to cache.log, see below:

parseHttpRequest: Requestheader contains NULL characters.

Could this be a cause of the coredumps?

Squid Cache: Version 2.6.STABLE21 - works fine for us.

WBR.




Re: [squid-users] ACL configuration

2010-04-20 Thread Никоноров Григорий
Hello Amos !

Thanks for your reply, I solved the problem.
It was necessary to remove the 2 lines permitting all authorized users.
All works fine, thanks.

You wrote on 19 April 2010, 18:01:00:
 Никоноров Григорий wrote:
 Hello, Amos
 
 I installed the latest version of squid3 from backports (unfortunately
 I can't find my problem in the squid3 bugs ...)
 dpkg --list |grep squid3
 ii  squid3  3.0.STABLE19-1~bpo50+1   A full 
 featured Web Proxy cache (HTTP proxy)
 ii  squid3-common   3.0.STABLE19-1~bpo50+1   A full 
 featured Web Proxy cache (HTTP proxy) - common files
 
 I also delete two lines about QUERY...
 acl QUERY urlpath_regex cgi-bin \?
 no_cache deny QUERY
 
 ...and modified my refresh_patterns according to your advice
 refresh_pattern \.doc$  0   20% 4320
 refresh_pattern \.zip$  0   20% 4320
 refresh_pattern \.exe$  0   20% 4320
 refresh_pattern \.rar$  0   20% 4320
 refresh_pattern ^ftp:   144020% 10080
 refresh_pattern ^gopher:14400%  1440
 refresh_pattern -i (/cgi-bin/|\?) 0 0%  0
 refresh_pattern .   0   20% 4320
 
 I uploaded my squid.conf to pastebay.com for easier reading:
 http://pastebay.com/94291 (no virus guys... only my squid.conf :)
 
 p.s. replacing the regex with dstdomain did not help
 
 You wrote on 19 April 2010, 13:47:21:
 Никоноров Григорий wrote:
 Hi,

 After the upgrade from 2.7 to 3.0.STABLE8-3+lenny3, squid stopped
 blocking prohibited sites. 
 
 IMO grab the official backport package from 
 http://www.backports.org/debian/pool/main/s/squid3/ if you can.
 
 My Squid3 conf:
 acl ADMIN proxy_auth /etc/squid3/users/users.admin
 acl bad_site url_regex -i  /etc/squid3/bad_site.acl

 bad_site.acl:
 vkontakte\.ru
 odnoklassniki\.ru
 pagewash\.com
 vk\.com
 
 Hmm. Regardless of your squid version those are far better off being 
 configured as a dstdomain ACL type. Regex is Slowww.
 
acl bad_site dstdomain /etc/squid3/bad_site.acl
 
   bad_site.acl:
.vkontakte.ru
.odnoklassniki.ru
.pagewash.com
.vk.com
 
 http_access allow manager localhost
 http_access deny manager
 http_access deny !Safe_ports
 http_access allow ADMIN !bad_site
 acl QUERY urlpath_regex cgi-bin \?
 no_cache deny QUERY
 
 The above two lines about QUERY are no longer very useful.
 
 Remove them and make sure your *final* two refresh_patterns lines match
 the new defaults for squid-3.x:
 
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern .  0 20% 4320
 
 
 http_access deny all


 192.168.164.111 - user from group ADMIN

 Access log:
 1271418317.455    103 192.168.164.111 TCP_MISS/302 494 GET 
 http://vkontakte.ru/id00 user DIRECT/93.186.231.220 text/html
 1271418317.536 71 192.168.164.111 TCP_MISS/200 3767 GET 
 http://vkontakte.ru/login.php? user DIRECT/93.186.231.220 text/html
 1271418317.665  5 192.168.164.111 TCP_MISS/304 347 GET 
 http://vkontakte.ru/images/xhead2.gif user DIRECT/93.186.231.220 -
 1271418317.669  9 192.168.164.111 TCP_MISS/304 347 GET 
 http://vkontakte.ru/images/header_yellow.gif user DIRECT/93.186.231.222 -
 1271418317.674 15 192.168.164.111 TCP_MISS/304 347 GET 
 http://vkontakte.ru/images/header_divider.gif user DIRECT/93.186.231.221 -
 1271418317.690 35 192.168.164.111 TCP_MISS/304 483 GET 
 http://www.tns-counter.ru/V13a***R*vkontakte_ru/ru/CP1251/tmsec=vkontakte_total/
  user DIRECT/217.73.200.219 -
 1271418317.714 55 192.168.164.111 TCP_MISS/200 386 GET 
 http://counter.yadro.ru/hit? user DIRECT/88.212.196.77 image/gif
 1271418321.434 82 192.168.164.111 TCP_MISS/200 5360 GET http://vk.com/ 
 user DIRECT/93.186.231.221 text/html
 1271418321.476    124 192.168.164.111 TCP_MISS/200 719 GET 
 http://sitecheck2.opera.com/? user DIRECT/91.203.99.45 text/xml
 1271418322.588 34 192.168.164.111 TCP_MISS/304 483 GET 
 http://www.tns-counter.ru/V13a***R*vkontakte_ru/ru/CP1251/tmsec=vkontakte_total/
  user DIRECT/217.73.200.220 -
 1271418322.608 54 192.168.164.111 TCP_MISS/200 386 GET 
 http://counter.yadro.ru/hit? user DIRECT/88.212.196.101 image/gif
 1271418324.221   1670 192.168.164.111 TCP_MISS/200 6368 CONNECT 
 certs.opera.com:443 user DIRECT/91.203.99.57 -
 1271418324.358 69 192.168.164.111 TCP_MISS/200 738 GET 
 http://login.vk.com/? user DIRECT/93.186.229.129 text/html
 1271418324.433 56 192.168.164.111 TCP_MISS/200 617 POST 
 http://vk.com/login.php? user DIRECT/93.186.231.222 text/html

 
 
 I can't see any reason why those requests might go through. Is there any
 additional http_access configuration anywhere?
 
 If not, try with the backports package and see if it goes away.
 
 Amos
 

 Wading through that config I find the very first http_access:

   acl ncsa_users proxy_auth REQUIRED
   http_access allow ncsa_users

 ... any user with a valid login has unlimited access through your server.

   The http_access rules following that line apply only to 

Re: [squid-users] Squid No Longer Compiles on RedHat enterprise 5

2010-04-20 Thread John Doe
From: Bradley, Stephen W. Mr. bradl...@muohio.edu
 This is my second time posting this with no answers yet.
 I have probably compiled Squid with various options over 
 100 times in the last two months and after a two week 
 break I tried compiling last night to add SNMP support 
 and it fails with this:
 util.c: In function 'xint64toa':
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'

I just tested and successfully compiled 3.1.1 on an up-to-date CentOS 5.4 
(equivalent to RHEL 5.4)...
...
gcc -DHAVE_CONFIG_H  -I.. -I../include -I../src -I../include -Wall 
-Wpointer-arith -Wwrite-strings -Wmissing-prototypes -Wmissing-declarations 
-Wcomments -Werror -D_REENTRANT -Wall -g -O2 -MT util.o -MD -MP -MF 
.deps/util.Tpo -c -o util.o util.c
mv -f .deps/util.Tpo .deps/util.Po
...

What configure options did you use?

JD





Re: [squid-users] automatic removing core files

2010-04-20 Thread Jeff Pang
On Tue, Apr 20, 2010 at 2:14 PM, fedorischev fedorisc...@bsu.edu.ru wrote:


 parseHttpRequest: Requestheader contains NULL characters.


That seems to be an abnormal request header.


 Squid Cache: Version 2.6.STABLE21 - works fine for us.


That's a really old version; you should consider upgrading
to the latest 2.7 or 3.x release.


-- 
Jeff Pang
http://home.arcor.de/pangj/


[squid-users] SQUID3: Access denied connecting to one site

2010-04-20 Thread Alexandr Dmitriev

Hello,

I have Ubuntu 9.10 running with squid 3.0.STABLE18-1 and squidGuard.

Squid is set up as a transparent proxy - everything is working just 
fine, except I can't access one site (www.airbaltic.lv). Squid gives me 
an error - Access denied.
I tried disabling squidGuard - it did not help, but when I connect 
without squid (disabling transparent access) I can visit airbaltic.lv.


Here are records from access.log:
1271761294.299  5 192.168.1.64 TCP_MISS/403 2834 GET 
http://www.airbaltic.lv/ - DIRECT/87.110.220.160 text/html
1271761305.202  0 192.168.1.64 TCP_NEGATIVE_HIT/403 2842 GET 
http://www.airbaltic.lv/ - NONE/- text/html


And here is my squid.conf:
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl localnet src 192.168.1.0/24
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow localnet
http_access deny all
icp_access deny all
htcp_access deny all
http_port 3128 transparent
hierarchy_stoplist cgi-bin ?
access_log /var/log/squid3/access.log squid
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern (cgi-bin|\?)    0       0%      0
refresh_pattern .               0       20%     4320
coredump_dir /var/spool/squid3
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

Any ideas?

Best regards,

--
Alexandr Dmitrijev
Head of IT Department
Fashion Retail Ltd.
Phone: +371 67560501
Fax:   +371 67560502
GSM:   +371 2771
E-mail:alexandr.dmitr...@mos.lv



Re: [squid-users] automatic removing core files

2010-04-20 Thread fedorischev
In a message dated Tuesday 20 April 2010 15:02:52, Jeff Pang wrote:
 On Tue, Apr 20, 2010 at 2:14 PM, fedorischev fedorisc...@bsu.edu.ru wrote:
  parseHttpRequest: Requestheader contains NULL characters.

 Seems an abnormal request header.

I already found the cause of the core dumps. It may be a password brute-forcer in 
our network that exploits our squid basic authenticator. This forces squid to 
report "too many queued basicauthenticator requests" and generate a coredump.

  Squid Cache: Version 2.6.STABLE21 - works fine for us.
 That's a really old version, you should be considering to upgrade it
 to the latest 2.7 or 3.x release.

3.x is not usable in production for us because of bug 2305:
http://bugs.squid-cache.org/show_bug.cgi?id=2305

Ok, thanks for all. The topic is closed.

WBR. Igor


[squid-users] log users that failed to authenticate

2010-04-20 Thread fedorischev
Good day.

We are using a custom basic authenticator to authenticate our users against 
MySQL. Now I want to log all failed authentication attempts. For this purpose I 
rewrote our authenticator program, and now it logs the user and password 
fields. But I want to log the IP address too. Is there a way to pass the 
authenticator helper not only the user and password, but the IP address as 
well? Or some other way to do this?

auth configuration in squid.conf:

auth_param basic program /usr/lib64/squid/mysql_auth
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 20 second
auth_param basic casesensitive off

WBR. Igor.


Re: [squid-users] Re: Yahoo mail Display problem

2010-04-20 Thread goody goody
Thanks for reply.

Please let me know which version of squid, 2.7 or 3.1.1, is most stable, i.e. bug 
free, because I am going to deploy it in a production environment.

Best Regards,




- Original Message 
From: Kinkie gkin...@gmail.com
To: goody goody think...@yahoo.com
Cc: squid-users@squid-cache.org
Sent: Fri, April 16, 2010 2:21:16 PM
Subject: Re: [squid-users] Re: Yahoo mail Display problem

 - Original Message 
 From: goody goody think...@yahoo.com
 To: squid-users@squid-cache.org
 Sent: Thu, April 15, 2010 12:16:38 PM
 Subject: Yahoo mail Display problem

 Hi,

 I am running squid 2.5 on FreeBSD 5.4-RELEASE; it has been working 
 fine for a number of years.

Hi Goody.
  2.5 is a really OLD version of Squid (as in: YEARS old). The most
up-to-date versions are 2.7 and 3.1.1, and they contain uncountable
improvements and fixes; using those versions you're most likely to get
help. If you can consider upgrading, please do so.


-- 
/kinkie



  


Re: [squid-users] Re: Yahoo mail Display problem

2010-04-20 Thread Amos Jeffries

goody goody wrote:

Thanks for reply.

Please let me know which version of squid, 2.7 or 3.1.1, is most stable, i.e. bug 
free, because I am going to deploy it in a production environment.

Best Regards,



Both the same by that measure. 126 bugs and enhancement requests each.

2.7 is the oldest version still supported. We do recommend trying 3.1 
first.


Coming from 2.5 you will not already be using any of the features that 
have locked people into 2.7 use.


Be careful with the configuration file though, since there are now two 
full versions' worth of changes you have to leap over. If you need any 
help with the conversion, the release notes are available and we are here.



Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1


RE: [squid-users] Squid No Longer Compiles on RedHat enterprise 5

2010-04-20 Thread Bradley, Stephen W. Mr.
./configure  --prefix=/usr --includedir=/usr/include --datadir=/usr/share 
--bindir=/usr/sbin --libexecdir=/usr/lib/squid --localstatedir=/var 
--sysconfdir=/etc/squid --enable-wccpv2 --enable-linux-netfilter 
--enable-default-err-language=English --enable-err-languages=English 
--enable-async-io --enable-removal-policies=lru,heap --disable-auth


It dies the same way on three different systems.

thx

-Original Message-
From: John Doe [mailto:jd...@yahoo.com] 
Sent: Tuesday, April 20, 2010 5:57 AM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid No Longer Compiles on RedHat enterprise 5

From: Bradley, Stephen W. Mr. bradl...@muohio.edu
 This is my second time posting this with no answers yet.
 I have probably compiled Squid with various options over 
 100 times in the last two months and after a two week 
 break I tried compiling last night to add SNMP support 
 and it fails with this:
 util.c: In function 'xint64toa':
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'

I just tested and successfully compiled 3.1.1 on an up-to-date CentOS 5.4 
(equivalent to RHEL 5.4)...
...
gcc -DHAVE_CONFIG_H  -I.. -I../include -I../src -I../include -Wall 
-Wpointer-arith -Wwrite-strings -Wmissing-prototypes -Wmissing-declarations 
-Wcomments -Werror -D_REENTRANT -Wall -g -O2 -MT util.o -MD -MP -MF 
.deps/util.Tpo -c -o util.o util.c
mv -f .deps/util.Tpo .deps/util.Po
...

What configure options did you use?

JD


  


[squid-users] unable to bypass AUP page with local servers

2010-04-20 Thread Johnson, S
Hello,

 I've got a weird issue that I've been running into off and on.  I can finally
duplicate it regularly now.  I'm working with a public network that
we've separated from the local network.  We have web resources that are
on the external side of the squid box.

This is what our network looks like:

public network 65.80.133.x
   |  |
   |   public network
firewall---(nat)DMZ   (192.168.80.x/23)
   |   (192.168.2.0/24)
   |(web servers)
   |
   |
private network
(10.x.x.x)
 The squid server here is configured with an AUP page with a click-through
to continue to the site users originally were trying to reach.
Any page outside of our network altogether works great; they get the AUP
and click through it.  However, if they try to access the local web
server which shares the same external subnet as the squid server, then I
cannot click past the AUP.

 To make this a little more complex, I'm attempting to do this through
transparent proxy.  I've also got DNS configured to provide a WPAD file.
If I use the autoproxy config in the browser then it works just fine
(which is why it was working for me).  Once I turn this off in the
browser I once again cannot get to the local web server but other
outside sites work just fine.  I don't see any hits in the log if I try
to browse the local web server which makes me believe that the traffic
isn't even hitting the proxy.  However, it should since there are no
local routes on the workstation that would do otherwise.  It's like the
proxy server isn't picking up the packets at all...

 Oh one more weird thing... if I set myweb in the acl below at the top
of the ACL list then I'm able to get to the local servers but the AUP
page never shows if their homepage is set to the local web server.  I
guess I would expect this behavior since I've never denied the session.
I've tried moving the myweb acl around the whole list but I don't get
any other results...

This is my config:

#  TAG: acl
#Recommended minimum configuration:
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl to_localbox dst 192.168.80.5/32
acl myweb dst 64.80.132.1/32


follow_x_forwarded_for allow localhost
acl_uses_indirect_client on
delay_pool_uses_indirect_client on
log_uses_indirect_client on


external_acl_type session ttl=10 children=1 negative_ttl=0 concurrency=200 %SRC /usr/lib/squid/squid_session -t 1800

acl session external session

acl localnet src 192.168.80.0/23 # RFC1918 possible internal network
acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

#  TAG: http_access
http_access allow to_localbox
deny_info http://192.168.80.5/index.php?url=%s session
#http_access allow myweb  # trying different locations for the session to be set
http_access deny !Safe_ports
http_access allow session
http_access allow SSL_ports
http_access allow CONNECT SSL_ports
http_access deny !session
http_access allow myweb
http_access deny !Safe_ports

http_access deny all

http_port 3128 transparent


[squid-users] Slow tranfert speed over ADSL internet connection

2010-04-20 Thread francis aubut
Hi, I configured Squid, first on Ubuntu Server and then on CentOS 5;
the problem is the same. I get very slow speed on a network connected
via an ADSL internet connection, but when I bring the computer home,
where I have a cable modem connection, it works well. What could be wrong?

Francis.


[squid-users] ASA 5505, WCCP, Squid - wrong Router Identifier

2010-04-20 Thread Rudie Shahinian
I'm having a problem with my ASA 5505 with WCCP. I have 2 ISPs (a primary with 
IP aaa.aaa.aaa.aaa, and a backup bbb.bbb.bbb.bbb) set up in the ASA. I 
installed squid 2.6 on Ubuntu 8.04 and have followed countless tutorials (which 
say pretty much the same thing). So I believe my configuration is correct; 
however, the problem is that the ASA is using the wrong IP for the Router 
Identifier. It's using bbb.bbb.bbb.bbb instead of aaa.aaa.aaa.aaa. It seems the 
ASA picks the highest IP to use and I have no idea how to change it. I know 
this place is for squid discussions, but I thought someone else here might 
have had a similar problem with their ASA. 

Thank you,

Rudie





Re: [squid-users] Squid No Longer Compiles on RedHat enterprise 5

2010-04-20 Thread John Doe
From: Bradley, Stephen W. Mr. bradl...@muohio.edu
 ./configure  --prefix=/usr --includedir=/usr/include --datadir=/usr/share 
 --bindir=/usr/sbin --libexecdir=/usr/lib/squid --localstatedir=/var 
 --sysconfdir=/etc/squid --enable-wccpv2 --enable-linux-netfilter 
 --enable-default-err-language=English --enable-err-languages=English 
 --enable-async-io --enable-removal-policies=lru,heap 
 --disable-auth
 It dies the same way on three different systems.

Tried your exact configure command and it still works...
Did you check the configure output?
The only things on my side are these warnings:
  configure: WARNING: Auth scheme modules built: None
  configure: WARNING: cppunit does not appear to be installed. squid does not 
require this, but code testing with 'make check' will fail.
  configure: WARNING: Missing needed capabilities (libcap or libcap2) for TPROXY
  configure: WARNING: Linux Transparent Proxy support WILL NOT be enabled
  configure: WARNING: Reduced support to Interception Proxy
  configure: WARNING: Missing needed capabilities (libcap or libcap2) for 
TPROXY v2
  configure: WARNING: Linux Transparent Proxy support WILL NOT be enabled
Strange thing is that libcap and libcap-devel are installed...
And on make:
  WARNING: Translation toolkit was not detected.

$ grep -C1 int64_t /usr/include/stdint.h
# if __WORDSIZE == 64
typedef long int        int64_t;
# else
__extension__
typedef long long int   int64_t;
# endif

I have:
glibc-2.5-42.el5_4.3
glibc-headers-2.5-42.el5_4.3
gcc-4.1.2-46.el5_4.2

JD


  


RE: [squid-users] Reverse Proxy Cluster Issues

2010-04-20 Thread senad.cimic


-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz] 
Sent: Thursday, April 15, 2010 8:55 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Reverse Proxy Cluster Issues

senad.ci...@thomsonreuters.com wrote:
 Thanks Amos, removing hierarchy_stoplist solved my query-string issue. 
 
 However, I'm not sure what you meant by removing cache/no_cache controls. I 
 can't see any such directives in my squid.conf file. Can you please 
 elaborate more?
 

Good. It's just a little bit of trash left over from very old configs 
which might have also been causing you issues.

Amos

 Thanks again.
 
 -Original Message-
 From: Amos Jeffries [mailto:squ...@treenet.co.nz] 
 Sent: Wednesday, April 14, 2010 6:17 PM
 To: squid-users@squid-cache.org
 Subject: Re: [squid-users] Reverse Proxy Cluster Issues
 
 On Wed, 14 Apr 2010 08:13:01 -0500, senad.ci...@thomsonreuters.com
 wrote:
 Hi,

 I am a first-time squid user and was wondering if I could get some help. I
 tried to find answers to these questions on-line, but unsuccessfully... 

 I have 2 squid boxes setup as reverse proxies in a cluster (they're
 using each other as siblings). On the backend I'm using single tomcat
 server that both squid boxes use to retrieve content. The Squid version I'm
 using is 3.0. I'm running into a couple of issues:

 Issue #1:
 Whenever squid box receives request for url that contains querystring
 (e.g. - http://site1:8080/RSSSource/rss/feed?max=1) it does not contact
 sibling cache for that resource, but it retrieves it from the backend
 server right away. What's odd is that it works (sometimes...) when query
 string is not present (e.g. http://site1:8080/RSSSource/rss/feed). 

 Issue #2:
 Let's say squidA receives request for some resource (e.g.
 http://site1:8080/RSSSource/rss/feed). If squidA doesn't have it in its
 cache, it will check if it's available from squidB. However, if squidA
 has expired version of that resource, it doesn't contact squidB but
 retrieves it directly from the backend server, which should not be the
 case (it should check if squidB had valid copy available), correct? 

 Here are relevant squid.conf lines for one of the squids (everything
 else is unchanged, config for the second squid is the same except for
 sibling references):
 
 Nope.
 
 The relevant lines are hierarchy_stoplist (prevent peers being asked for
 query-string URLs).
 and cache/no_cache controls (prevent QUERY ACL matches being stored
 locally.)
 
 Both of which need to be removed from your config.
 
 Amos
 


-- 
Please be using
   Current Stable Squid 2.7.STABLE9 or 3.1.1

I switched to squid 3.1.1 and the issue still remains: instead of contacting the 
sibling for expired resources, squid goes to the origin server right away. 
I have a cluster of only 2 squids; here is the whole squid.conf file for one of 
them (the squid.conf for the other one is the same except for sibling 
references). Is there anything I missed in the config that could be causing 
this issue?

#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl localhost src ::1/128
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl to_localhost dst ::1/128

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7   # RFC 4193 local private network range
acl localnet src fe80::/10  # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on localhost is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow 

[squid-users] SOAP client with no SSL client-certificate features

2010-04-20 Thread D.Veenker
I am running into the following problem and I think Squid might be just 
the solution I am looking for. But I'm not sure about it.


We are developing an application consuming a SOAP-webservice. The 
platform we are developing on (4D) does not support SSL with client 
certificates. It does support the regular HTTPS features though.


So I was wondering if Squid could help me out and proxy a regular 
plain-HTTP (or HTTPS) request from this newly made application to the 
webservice, implementing the SSL connection with client certificates.


Let's say the url of the webservice is: 
https://webservice.domain.com/methods
From this developed 4D-application I'd like to connect to 
http://webservice.domain.com/methods and let Squid do all the SSL 
features using client certificate authorization.


Situation:
Application not capable of SSL with client certificates -> plain 
HTTP request -> Squid (+ client certificate provided by webservice 
company) -> HTTPS request with client certificate -> SSL webservice
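For reference, a hypothetical squid.conf sketch of such a bridge (reverse-proxy style; the port, peer name, and certificate paths are assumptions, not a tested configuration):

```
# Accept plain HTTP from the 4D application and relay it to the
# webservice over SSL, presenting a client certificate. Squid's ssl
# options expect PEM, so a .der certificate would presumably need
# converting first (e.g. "openssl x509 -inform der -in cert.der ...").
http_port 3128 accel defaultsite=webservice.domain.com
cache_peer webservice.domain.com parent 443 0 no-query originserver ssl \
    sslcert=/etc/squid/client-cert.pem sslkey=/etc/squid/client-key.pem
```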


And of course vice-versa, but I assume you already guessed that. The 
certificates are formatted as .der documents, but if Squid only supports a 
particular format, I guess I can overcome that by converting the 
certificate.


** Is this type of proxying possible using Squid?
** How do I configure such a situation in Squid?
** What elements need to be compiled with Squid to get these features 
implemented?


To be honest, I'm a total rookie with Squid, so I might need some specific 
help; on the other hand, I'm not too lazy to get through some docs if you 
point me in the right direction. And last but not least, I have a strong 
wish to run Squid on a Debian server.


Thanks in advance,
Dolf Veenker - Rotterdam - Netherlands



Re: [squid-users] Re: Yahoo mail Display problem

2010-04-20 Thread Amos Jeffries
On Tue, 20 Apr 2010 19:12:54 +0600, abdul sami sami.me...@gmail.com
wrote:
 Thanks for quick reply.
 
 Actually the reason behind the question was my previous experience with
 the 3.0.4 version, which I installed, but it was then shutting down after
 running for some time; and if there is no serious problem with 3.1.1 I
 would love to install the latest to get the benefit of new features.
 
 Best Regards,
 

Okay. Worth a try again then. A lot of the 3.0 bugs are fixed now; most of
the remainder need extra info to track down, or testing to confirm that they
are in fact closed by 3.1 as we suspect.
Bug 2305 is the big outstanding one from 3.0 and I'm working on that now.

Amos



RE: [squid-users] Squid No Longer Compiles on RedHat enterprise 5

2010-04-20 Thread Amos Jeffries
On Tue, 20 Apr 2010 09:55:48 -0400, Bradley, Stephen W. Mr.
bradl...@muohio.edu wrote:
 ./configure  --prefix=/usr --includedir=/usr/include
--datadir=/usr/share
 --bindir=/usr/sbin --libexecdir=/usr/lib/squid --localstatedir=/var
 --sysconfdir=/etc/squid --enable-wccpv2 --enable-linux-netfilter
 --enable-default-err-language=English --enable-err-languages=English
 --enable-async-io --enable-removal-policies=lru,heap --disable-auth
 

Thanks.

Somehow the definitions of %PRId64 and int64_t are getting disconnected.

Squid pulls in all the possible OS type headers, then tries to define a
missing %PRId64 based on the int64_t size (include/squid_types.h), but if
we can find the header where RHEL defines them both and make sure it is
included, that would be better. Or, second best, what code RHEL5 should be
using instead of %lld.

PS: The --enable-default-err-language and --enable-err-languages configure
options are dead. Squid does automatic l10n negotiation with the browser in
3.1.

Amos

 
 It dies the same way on three different systems.
 
 thx
 
 -Original Message-
 From: John Doe [mailto:jd...@yahoo.com] 
 Sent: Tuesday, April 20, 2010 5:57 AM
 To: squid-users@squid-cache.org
 Subject: Re: [squid-users] Squid No Longer Compiles on RedHat enterprise
5
 
 From: Bradley, Stephen W. Mr. bradl...@muohio.edu
 This is my second time posting this with no answers yet.
 I have probably compiled Squid with various options over 
 100 times in the last two months and after a two week 
 break I tried compiling last night to add SNMP support 
 and it fails with this:
 util.c: In function 'xint64toa':
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'
 util.c:929: warning: format '%lld' expects type 'long long int', but argument 4 has type 'int64_t'
 
 I just tested and successfully compiled 3.1.1 on an up-to-date CentOS 5.4
 (equivalent to RHEL 5.4)...
 ...
 gcc -DHAVE_CONFIG_H  -I.. -I../include -I../src -I../include -Wall
 -Wpointer-arith -Wwrite-strings -Wmissing-prototypes
-Wmissing-declarations
 -Wcomments -Werror -D_REENTRANT -Wall -g -O2 -MT util.o -MD -MP -MF
 .deps/util.Tpo -c -o util.o util.c
 mv -f .deps/util.Tpo .deps/util.Po
 ...
 
 What configure options did you use?
 
 JD


Re: [squid-users] Slow tranfert speed over ADSL internet connection

2010-04-20 Thread Amos Jeffries
On Tue, 20 Apr 2010 11:49:05 -0400, francis aubut fugitif...@gmail.com
wrote:
 Hi, I configured Squid, first on Ubuntu Server and then on CentOS 5;
 the problem is the same. I get very slow speed on a network connected
 via an ADSL internet connection, but when I bring the computer home,
 where I have a cable modem connection, it works well. What could be wrong?
 
 Francis.

Your experiments as described pretty conclusively confirm that the
problem is one of:
 a) a difference in network lag (it's conceivable that your ADSL is simply
slower than Cable; I know mine is, by a whole order of magnitude or two).

 b) site-specific configuration somewhere in your setup, resulting in the
box going a long way to get things, i.e. a DNS server from the cable
connection still being used when on ADSL, etc.

Amos


Re: [squid-users] SOAP client with no SSL client-certificate features

2010-04-20 Thread Amos Jeffries
On Tue, 20 Apr 2010 23:25:59 +0200, D.Veenker d...@veenker.tk wrote:
 I am running into the following problem and I think Squid might be just 
 the solution I am looking for. But I'm not sure about it.
 
 We are developing an application consuming a SOAP-webservice. The 
 platform we are developing on (4D) does not support SSL with client 
 certificates. It does support the regular HTTPS features though.
 
 So I was wondering if Squid could help me out, and proxy a regular 
 plain-http (or https) request from this newly made application to the 
 webservice implementing the SSL connection with client certificates.
 
 Let's say the url of the webservice is: 
 https://webservice.domain.com/methods
  From this developed 4D-application I'd like to connect to 
 http://webservice.domain.com/methods and let Squid do all the SSL 
 features using client certificate authorization.
 
 Situation:
 Application not capable of SSL with client certificates -- plain 
 HTTP-request -- Squid (+ client certificate provided by webservice 
 company) -- HTTPS request with client certificate -- SSL Webservice
 
 And of course vice-versa, but I assume you already guessed that. The 
 certificates are formatted as .der documents, but I guess I can overcome
 the problem, if Squid only supports a particular format, by converting
 the certificate.
 
 ** Is this type of proxying possible using Squid?

Yes.

 ** How do I configure such a situation in Squid?

Simply make sure the HTTP requests sent through Squid contain full
absolute URLs starting with https://.

There are some other details such as the difference between Proxy-*
headers and their regular client-server normal versions.
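As an illustrative sketch only (directive names and paths hypothetical; check them against your Squid version's documentation), the client-certificate side of this setup would be configured with the sslproxy_* directives, after converting the .der certificate to PEM:

```
# Hypothetical squid.conf fragment, for illustration only.
# Squid presents this certificate when it opens the SSL/TLS
# connection to the origin webservice.
sslproxy_client_certificate /etc/squid/client-cert.pem
sslproxy_client_key         /etc/squid/client-cert.key
```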


 ** What elements need to be compiled with Squid to get these features 
 implemented?

Nothing special. The defaults are fine.

 
 To be honest I'm a total rookie to Squid, so I might need some specific 
 help; on the other hand I'm not too lazy to get through some docs if you
 point me in the right direction. And last but not least, I have a strong
 wish to run Squid on a Debian server.

http://wiki.squid-cache.org/ has almost everything you need for playing
with Squid.


PS: Just a mention. Check your SOAP underlayer. A lot of SOAP systems use
POST requests, which are not cacheable, when they should be using GET
requests, which are. Tools that use REST-style HTTP seem to behave better,
in my experience, when going through any proxies.

Amos


[squid-users] What's the difference between vhost and vport?

2010-04-20 Thread yjyj
Hi,

I know that 'vhost' and 'vport' are used in reverse proxy mode.
What's the difference between them?

And what about 'accel'? It is said that 'vhost' and 'vport' imply
'accel' in the default squid.conf. Is it necessary in reverse
proxy mode?
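[For readers of the archive, an illustrative reverse-proxy fragment showing these options together; hostnames and addresses are hypothetical, so check the directives against your own squid.conf documentation:]

```
# 'accel' puts the listening port into accelerator (reverse proxy) mode.
# 'vhost' rebuilds the requested URL from the Host: header;
# 'vport' uses the port the request arrived on instead.
http_port 80 accel vhost
cache_peer 192.168.0.10 parent 80 0 no-query originserver name=myweb
acl our_sites dstdomain www.example.com
cache_peer_access myweb allow our_sites
http_access allow our_sites
```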


yjyj


Re: [squid-users] SQUID3: Access denied connecting to one site

2010-04-20 Thread Drunkard Zhang
2010/4/20 Alexandr Dmitriev alexandr.dmitr...@mos.lv:
 Hello,

 I have Ubuntu 9.10 running with squid 3.0.STABLE18-1 and squidGuard.

 Squid is set up as a transparent proxy - everything is working just fine,
 except I can't access one site (www.airbaltic.lv). Squid drops me an error -
 Access denied.

Try this:
echo 0 > /proc/sys/net/ipv4/tcp_ecn

 I tried to disable squidGuard - it did not help, but when I connect without
 squid (disabling transparent access) - I can visit airbaltic.lv

 Here are records from access.log:
 1271761294.299      5 192.168.1.64 TCP_MISS/403 2834 GET
 http://www.airbaltic.lv/ - DIRECT/87.110.220.160 text/html
 1271761305.202      0 192.168.1.64 TCP_NEGATIVE_HIT/403 2842 GET
 http://www.airbaltic.lv/ - NONE/- text/html

 And here is my squid.conf:
 acl manager proto cache_object
 acl localhost src 127.0.0.1/32
 acl to_localhost dst 127.0.0.0/8
 acl localnet src 192.168.1.0/24
 acl Safe_ports port 80        # http
 acl Safe_ports port 21        # ftp
 acl Safe_ports port 443        # https
 acl Safe_ports port 70        # gopher
 acl Safe_ports port 210        # wais
 acl Safe_ports port 1025-65535    # unregistered ports
 acl Safe_ports port 280        # http-mgmt
 acl Safe_ports port 488        # gss-http
 acl Safe_ports port 591        # filemaker
 acl Safe_ports port 777        # multiling http
 acl CONNECT method CONNECT
 http_access allow manager localhost
 http_access deny manager
 http_access deny !Safe_ports
 http_access deny CONNECT !SSL_ports
 http_access allow localhost
 http_access allow localnet
 http_access deny all
 icp_access deny all
 htcp_access deny all
 http_port 3128 transparent
 hierarchy_stoplist cgi-bin ?
 access_log /var/log/squid3/access.log squid
 refresh_pattern ^ftp:        1440    20%    10080
 refresh_pattern ^gopher:    1440    0%    1440
 refresh_pattern (cgi-bin|\?)    0    0%    0
 refresh_pattern .        0    20%    4320
 coredump_dir /var/spool/squid3
 redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

 Any ideas?

 Best regards,

 --
 Alexandr Dmitrijev
 Head of IT Department
 Fashion Retail Ltd.
 Phone:     +371 67560501
 Fax:       +371 67560502
 GSM:       +371 2771
 E-mail:    alexandr.dmitr...@mos.lv




Re: [squid-users] Problem downloading files greater then 2 GB

2010-04-20 Thread Jacques Beaudoin

Hi,

Sorry for my late reply, I was doing more tests.
My OS is SUSE Enterprise 10.2, 32-bit, kernel 2.6.16, with 16 GB of
memory on my server.

I have the message "preventing off_t overflow" in my Squid log.
I found this explanation after a Google search:


There is currently a limit in how large responses Squid can
handle. This limit is around 2GB on 32-bit platforms.

This camera is most likely streaming video over HTTP, and to prevent
internal disaster Squid terminates the connection to the camera when
about 2GB of data has been transferred.

There is no special action required on your part, unless of course
your users are having problems due to this.


Could this be the reason for the interruption of file downloads after
2 GB, and is the solution to install the 64-bit version of SUSE
Enterprise?

Thanks

Jacques


Hi
Please tell me your OS version, kernel version, physical memory, and
sysctl.conf; also your squid.conf with the comment lines removed.
--Original Message--
From: Jacques Beaudoin
To:squid-users@squid-cache.org
Cc:jacques-beaud...@cspi.qc.ca
Subject: [squid-users] Problem downloading files greater then 2 GB
Sent: Apr 16, 2010 8:54 AM

Hi,

I'm using version 3.1.1 of Squid on a SUSE 10.2 server, and my users
cannot download files greater than 2 GB.

I saw some postings via Google but cannot find a solution for my problem.

Greetings

Sent from BlackBerry® on Airtel




Re: [squid-users] Re: Yahoo mail Display problem

2010-04-20 Thread goody goody
Thanks for your help Amos,

Actually, the reason behind 
the question was my previous experience with version 3.0.4, which I 
installed but which then kept shutting down after running for some time.
If there is no such serious problem with 3.1.1, I would definitely love 
to install the latest to benefit from the new features.


Best Regards,



- Original Message 
From: Amos Jeffries squ...@treenet.co.nz
To: squid-users@squid-cache.org
Sent: Tue, April 20, 2010 6:31:58 PM
Subject: Re: [squid-users] Re: Yahoo mail Display problem

goody goody wrote:
 Thanks for reply.
 
 Please let me know which version of Squid, 2.7 or 3.1.1, is most stable, 
 i.e. bug-free, because I am going to deploy it in a production environment.
 
 Best Regards,
 

Both the same by that measure. 126 bugs and enhancement requests each.

2.7 being the oldest version still supported. We do recommend trying 3.1 first.

Coming from 2.5 you will not already be using any of the features that have 
locked people into 2.7 use.

Be careful with the configuration file though, since there are now two full 
versions' worth of changes you have to leap over. If you need any help with 
the conversion, the release notes are there and we are here.


Amos
-- Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1





Re: [squid-users] SQUID3: Access denied connecting to one site

2010-04-20 Thread Alexandr Dmitriev

Hello,

I tried to change tcp_ecn, but this did not help. Maybe some other ideas?

Regards,

On 21.04.2010 4:22, Drunkard Zhang wrote:

2010/4/20 Alexandr Dmitriev alexandr.dmitr...@mos.lv:

Hello,

I have Ubuntu 9.10 running with squid 3.0.STABLE18-1 and squidGuard.

Squid is set up as a transparent proxy - everything is working just fine,
except I can't access one site (www.airbaltic.lv). Squid drops me an error -
Access denied.
 

Try this:
echo 0 > /proc/sys/net/ipv4/tcp_ecn

   

I tried to disable squidGuard - it did not help, but when I connect without
squid (disabling transparent access) - I can visit airbaltic.lv

Here are records from access.log:
1271761294.299  5 192.168.1.64 TCP_MISS/403 2834 GET
http://www.airbaltic.lv/ - DIRECT/87.110.220.160 text/html
1271761305.202  0 192.168.1.64 TCP_NEGATIVE_HIT/403 2842 GET
http://www.airbaltic.lv/ - NONE/- text/html

And here is my squid.conf:
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl localnet src 192.168.1.0/24
acl Safe_ports port 80         # http
acl Safe_ports port 21         # ftp
acl Safe_ports port 443        # https
acl Safe_ports port 70         # gopher
acl Safe_ports port 210        # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280        # http-mgmt
acl Safe_ports port 488        # gss-http
acl Safe_ports port 591        # filemaker
acl Safe_ports port 777        # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow localnet
http_access deny all
icp_access deny all
htcp_access deny all
http_port 3128 transparent
hierarchy_stoplist cgi-bin ?
access_log /var/log/squid3/access.log squid
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern (cgi-bin|\?)    0       0%      0
refresh_pattern .               0       20%     4320
coredump_dir /var/spool/squid3
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

Any ideas?

Best regards,

--
Alexandr Dmitrijev
Head of IT Department
Fashion Retail Ltd.
Phone: +371 67560501
Fax:   +371 67560502
GSM:   +371 2771
E-mail:alexandr.dmitr...@mos.lv


 



--
Alexandr Dmitrijev
Head of IT Department
Fashion Retail Ltd.
Phone: +371 67560501
Fax:   +371 67560502
GSM:   +371 2771
E-mail:alexandr.dmitr...@mos.lv