Hi.
Squid is 3.1.19 on FreeBSD 8.2 with MIT Kerberos. squid_kerb_auth is in use as
the only auth scheme. We have an external ACL to check authorization in a MySQL
db. On machines running XP SP2 with IE8 (Windows Integrated Auth enabled),
an authentication window sometimes pops up. I think this is
hi
I have a problem installing Squid manually; the problem occurs when I
want to support transparent mode in Squid using these configure options:
./configure --prefix=/usr/local/squid3 --enable-ssl --enable-storeio=ufs,aufs
--enable-removal-policies=lru,heap
Hi,
Hi, Thanks for that. I tried your recommendations and now I get this.
2012/03/13 12:11:25| clientNegotiateSSL: Error negotiating SSL connection on
FD 18: error:14094418:SSL routines:SSL3_READ_BYTES:tlsv1 alert unknown ca
(1/0)
2012/03/13 12:11:25| clientNegotiateSSL: Error negotiating SSL
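The "tlsv1 alert unknown ca" alert means the client rejected the certificate chain Squid presented because it does not trust the signing CA. One way to inspect which chain the proxy actually serves is with openssl s_client (host and port below are placeholders, not from the original post):

```shell
# Show the subject and issuer of the certificate the HTTPS port presents
openssl s_client -connect proxy.example.com:3129 -showcerts </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -issuer
```

If the issuer turns out to be a self-signed CA generated for the proxy, that CA certificate has to be imported into each client's trust store before browsers will accept the connection.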
Duration and overlap of those connections matters. If they were all
serviced in less than 100ms and closed it is possible they all took
place one after another sequentially with no more than 1 open at a
time.
maxconn allows up to 3 *simultaneous* connections. Opening three then
closing
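For reference, a minimal squid.conf sketch of the maxconn ACL being discussed (the ACL name is arbitrary):

```
# Deny a client once it holds more than 3 simultaneous connections
acl toomany maxconn 3
http_access deny toomany
```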
On 13.03.2012 20:24, Mustafa Raji wrote:
hi
I have a problem installing Squid manually; the problem occurs when
I want to support transparent mode in Squid using these configure
options:
./configure --prefix=/usr/local/squid3
This: --enable-linux-netfilter --enable-linux-tproxy
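Putting the reply together with the options from the original post, the full configure invocation would look like this (prefix and store options taken from the post; adjust for your system):

```shell
./configure --prefix=/usr/local/squid3 --enable-ssl \
  --enable-storeio=ufs,aufs --enable-removal-policies=lru,heap \
  --enable-linux-netfilter --enable-linux-tproxy
```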
On 13.03.2012 21:21, kadvar wrote:
Hi,
Hi, Thanks for that. I tried your recommendations and now I get this.
2012/03/13 12:11:25| clientNegotiateSSL: Error negotiating SSL
connection on
FD 18: error:14094418:SSL routines:SSL3_READ_BYTES:tlsv1 alert
unknown ca
(1/0)
2012/03/13 12:11:25|
Hi.
I'm using squid 3.1.x on FreeBSD. Squid is built from ports.
Recently I was hit by a weird issue: my users cannot open HTTPS pages.
This is not constant: if they hit the F5 button in the browser,
the pages load, sometimes showing a message like 'Unable to
connect. Firefox
Hi,
Please advise how I can check my Squid configuration from the shell
prompt.
Thanks/regards,
Vishal Agarwal
On 13.03.2012 22:10, Eugene M. Zheganin wrote:
Hi.
I'm using squid 3.1.x on FreeBSD. Squid is built from ports.
Recently I was hit by a weird issue: my users cannot open HTTPS
pages. This is not something constant - if they hit the F5 button in
browser, the pages are loading, sometimes showing
On 13.03.2012 21:38, FredB wrote:
Duration and overlap of those connections matters. If they were all
serviced in less than 100ms and closed it is possible they all took
place one after another sequentially with no more than 1 open at a
time.
maxconn allows up to 3 *simultaneous* connections.
How can I configure Squid to create a new access.log file every 15 minutes, so
in 1 hour I have 4 different log files?
On 13.03.2012 22:25, Vishal Agarwal wrote:
Hi,
Please advise how I can check my Squid configuration from the shell
prompt.
With the squidclient command-line tool, the mgr:config action, and
whatever password you configured in squid.conf for management.
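As a sketch, with placeholder hostname, port, and password:

```shell
# Dump the running configuration via the cache manager interface
squidclient -h localhost -p 3128 mgr:config@yourpassword

# 'squid -k parse' additionally checks squid.conf syntax without reloading
squid -k parse
```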
On 13.03.2012 23:09, Ibrahim Lubis wrote:
How can I configure Squid to create a new access.log file every 15
minutes, so in 1 hour I have 4 different log files?
What are you using to manage the Squid logs? cron? logrotate?
something else?
Amos
Bit suspicious yes.
Tried apachebench (ab) with concurrency level 10? or anything like
that
which can guarantee multiple simultaneous connections for the test?
Amos
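For reference, apachebench can drive guaranteed-concurrent requests through a proxy; a possible invocation (proxy address and target URL are placeholders):

```shell
# 100 requests, 10 concurrent, routed through the proxy with -X
ab -n 100 -c 10 -X 127.0.0.1:3128 http://example.com/
```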
Yes, a little script that makes many recursive wget requests, plus I browse with Firefox;
afterwards I check access.log and see 20 connections by
Hi folks,
We're in a large-number-of-users, high bandwidth/usage situation
(average 80 gigs per hour during business hours) and so have opted for a
couple of new proxies (one for fail-over) which we're about to make
'live'. Currently, our cache_mem and cache_dir look like the following
On 13.03.2012 23:44, Peter Gaughran wrote:
Hi folks,
We're in a large-number-of-users, high bandwidth/usage situation
(average 80 gigs per hour during business hours) and so have opted
for
a couple of new proxies (one for fail-over) which we're about to make
'live'. Currently, our cache_mem
Thanks Amos,
The web servers reply to Squid with these headers:

Cache-Control: max-age=60
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 15139
Content-Type: text/html; charset=UTF-8
Date: Tue, 13 Mar 2012 12:42:26 GMT
Expires: Tue, 13 Mar 2012 12:43:26 GMT
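The Expires value here is just Date plus the Cache-Control max-age, so the two freshness signals agree. A quick sanity check in plain Python, using the values copied from the quoted headers:

```python
from email.utils import parsedate_to_datetime

# Header values copied verbatim from the reply quoted above
date = parsedate_to_datetime("Tue, 13 Mar 2012 12:42:26 GMT")
expires = parsedate_to_datetime("Tue, 13 Mar 2012 12:43:26 GMT")

# When both are present, Cache-Control: max-age takes precedence over
# Expires (RFC 2616 14.9.3); here they encode the same 60s lifetime.
freshness_seconds = (expires - date).total_seconds()
```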
Hi Amos, that worked brilliantly, thanks a lot!
--
View this message in context:
http://squid-web-proxy-cache.1019090.n4.nabble.com/squid-3-1-endless-loop-IIS-webserver-tp4465329p4469087.html
Sent from the Squid - Users mailing list archive at Nabble.com.
Hi,
Sorry, I pressed the send button by mistake ...
We are having strange Squid troubles; first, let me describe our setup:
- 4 HP G6/G7 DL380 servers with 16CPUs and 28GB RAM with RHEL 5.4-5.8
64bit and Squid 3.1.12 (custom compiled)
Squid Cache: Version 3.1.12
configure options:
I limit the maximum file size an employee can download to our network using
reply_body_max_size 100 MB proxy_user1. If this limit is hit, Squid returns a
403.
My problem is that I would like to differentiate between the status code 403
that comes from a target website that does not allow
On 14.03.2012 04:50, squid-list wrote:
I limit the maximum file size an employee can download to our network
using
reply_body_max_size 100 MB proxy_user1. If this limit is hit, Squid
returns a
403.
My problem is that I would like to differentiate between the status
code 403
that comes from a
I use cron...
-Original Message-
From: Amos Jeffries
Sent: 13 Mar 2012 10:15:54 GMT
To: squid-users@squid-cache.org
Subject: Re: [squid-users] About access.log hourly?
On 13.03.2012 23:09, Ibrahim Lubis wrote:
How can I configure Squid to create a new access.log file every 15
On 14.03.2012 03:54, guest01 wrote:
Hi,
Sorry, I pressed the send button by mistake ...
We are having strange Squid troubles; first, let me describe our
setup:
- 4 HP G6/G7 DL380 servers with 16CPUs and 28GB RAM with RHEL 5.4-5.8
64bit and Squid 3.1.12 (custom compiled)
Squid Cache:
On 14.03.2012 14:54, Ibrahim Lubis wrote:
I use cron...
Then the answer is quite simple: set it to run its command every 15
minutes and bump up your logfile_rotate limit to prevent losing logs
early.
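A sketch of that cron approach (the squid binary path is a placeholder; logfile_rotate goes in squid.conf):

```
# crontab entry: rotate Squid's logs every 15 minutes
*/15 * * * * /usr/local/squid/sbin/squid -k rotate

# squid.conf: keep enough rotated files for your retention window,
# e.g. 96 files covers 24 hours at 15-minute intervals
logfile_rotate 96
```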
Amos