You can probably change the ulimit value and then try with the
--with-filedescriptors option. It may work.
Change the ulimit value: root# ulimit -HSn 32768
or try
client_persistent_connections off
server_persistent_connections off
in the squid.conf configuration.
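A quick way to see what limit the next-started process (e.g. Squid) will inherit, before and after applying the advice above (32768 is just the value quoted in this message):

```shell
# Show the soft open-file limit the next process will inherit.
ulimit -Sn
# Raising the hard limit needs root, e.g.:
#   ulimit -HSn 32768
```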
Regards,
ViSolve Squid Team.
Any thoughts on this?
On Mon, Feb 23, 2009 at 4:11 PM, Shekhar Gupta wrote:
> I think this is some bug, as the same machines with the 2.6 squid version
> were not having any of these messages. I still have 3 machines on the
> older squid version and I upgraded 2 machines to the 3.0.13 version and I
> am
Why yes it was.
Thank you!
-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: Monday, February 23, 2009 9:47 PM
To: Jim Lawrence
Cc: Amos Jeffries; squid-users@squid-cache.org
Subject: RE: [squid-users] New Setup help
> cat /etc/squid/allowed_sites.squid
> *.a
> cat /etc/squid/allowed_sites.squid
> *.americas-pet-store.com
> *.petfrenzy.com
> *.google.com
> [r...@virt1 ~]#
There is the problem: the '*' is not a valid part of a domain name.
Just begin the partial domains with a '.' instead.
Amos
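With the wildcards replaced by leading dots, the file and the matching squid.conf rules would look roughly like this (the acl name allowed_sites is illustrative; `#` lines are comments):

```
# /etc/squid/allowed_sites.squid -- a leading dot matches the domain
# itself and every subdomain:
.americas-pet-store.com
.petfrenzy.com
.google.com

# and in squid.conf:
# acl allowed_sites dstdomain "/etc/squid/allowed_sites.squid"
# http_access allow allowed_sites
# http_access deny all
```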
>
> I did a service squid restart
> And for good measure service
> Hi Squids,
>
> I wonder if it is possible to do this in Squid. We need to log the
> HTTP/GET requests to see what users are surfing.
>
> I mean one file per http-session (not per IP, because of the NAT-ed
> firewall), and then inside that file I could see what this user is
> doing. How could you reach that c
cat /etc/squid/allowed_sites.squid
*.americas-pet-store.com
*.petfrenzy.com
*.google.com
[r...@virt1 ~]#
I did a service squid restart
And for good measure service squid reload
-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: Monday, February 23, 2009 8:45 PM
> Current config
>
> http_port 192.168.31.3:3128
> hierarchy_stoplist cgi-bin ?
> acl QUERY urlpath_regex cgi-bin \?
> cache deny QUERY
> acl apache rep_header Server ^Apache
> broken_vary_encoding allow apache
> cache_dir ufs /var/spool/squid 1000 16 256
> access_log /var/log/squid/access.log squ
Hi Squids,
I wonder if it is possible to do this in Squid. We need to log the
HTTP/GET requests to see what users are surfing.
I mean one file per http-session (not per IP, because of the NAT-ed
firewall), and then inside that file I could see what this user is doing.
How could you reach that configuration?
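Squid writes a single access.log, so per-session files need post-processing. One hedged sketch: enable proxy authentication so each request carries a username (which sidesteps the NAT problem), then split the log by that field afterwards:

```shell
# Split a single access.log into one file per authenticated user.
# $8 is the ident/username column of the default native log format;
# the paths are illustrative.
LOG=${LOG:-/var/log/squid/access.log}
if [ -r "$LOG" ]; then
  awk '{ print >> ("/tmp/user-" $8 ".log") }' "$LOG"
fi
```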
> I think url_rewrite_access is not supported by Squid 2.5; it is supported
> on Squid 2.6+.
>
> I was looking and I found this
> http://www.squid-cache.org/mail-archive/squid-users/200502/0150.html but I
> do not want to limit access on port 80.
>
> Any ideas?
Step 1: upgrade to a current Squid whi
Current config
http_port 192.168.31.3:3128
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
cache_dir ufs /var/spool/squid 1000 16 256
access_log /var/log/squid/access.log squid
dns_nameservers
> Amos Jeffries wrote:
>> Joseph Spadavecchia wrote:
>>> Hi all,
>>>
>>> We have a requirement to use different authentication mechanisms
>>> based on the subnet/ip-address of the client.
>>>
>>> For example, a client from one subnet would authenticate against ntlm
>>> while a client from another s
> Hi
> we have a strange problem which I hope someone can give me a pointer
> towards resolving.
> Our setup consists of Squid 2.5 acting as a caching proxy interfacing with
> Websense 6.3.2 to provide access filtering. The issue below has been
> replicated using Squid 2.5 and Squid 2.7 acting sole
> Cisco1720 router --> 4 windows based servers 1 centos virtual server 1
> centos squid server.
> Client computers (8)
>
> Would like to have all web traffic blocked except websites defined in an
> allowed_sites.squid config file.
> My squid.conf file
>
> Should my squid server have 2 network cards
> Hi,
>
> I have some questions about squid as reverse proxy.
>
> The web server I'm accelerating (cache_peer) has dynamic content
> (cgi-bin).
>
> At the beginning I left the default cache refresh values (so for
> cgi-bin \? the value is "0") and the hierarchy list for cgi-bin and
I think url_rewrite_access is not supported by Squid 2.5; it is supported
on Squid 2.6+.
I was looking and I found this
http://www.squid-cache.org/mail-archive/squid-users/200502/0150.html but I do
not want to limit access on port 80.
Any ideas?
Thank you,
Roberto O. Fernández Crisial
I look at the log files
tail -30 /var/log/squid/access.log
1235404880.957 0 192.168.31.75 TCP_DENIED/403 1380 CONNECT
urs.microsoft.com:443 - NONE/- text/html
1235404880.959 0 192.168.31.75 TCP_DENIED/403 1380 CONNECT
urs.microsoft.com:443 - NONE/- text/html
1235404880.977 0 192.
> > > “http://...”, even after being matched by the script, and makes
> > > infinite loop requests (the script redirects to https but Squid takes
> > > it as http and makes the redirection again). What can I do? How can I
> > > make the “http” to “https” rewrite work fine?
> >
> > What is your
Cisco1720 router --> 4 windows based servers 1 centos virtual server 1 centos
squid server.
Client computers (8)
Would like to have all web traffic blocked except websites defined in an
allowed_sites.squid config file.
My squid.conf file
Should my squid server have 2 network cards or can I
JD,
The exits are for testing and should not be in the example I wrote.
The access.log shows (after redirection):
1235404323.937 0 200.127.215.7 TCP_MISS/301 181 GET http://xxx.yyy.com/ -
NONE/- -
1235404324.445 0 200.127.215.7 TCP_MISS/301 181 GET http://xxx.yyy.com /
> I’m using Squid 2.5-STABLE14 with SSL support. I need to rewrite every url
> with “http://...“ request to “https://...” so I use this script at the
> redirect_program line:
Old version... ^_^
> #!/usr/bin/perl
>
> $|=1;
>
> while (<>)
> {
> @X = split;
> $url = $X[0];
>
>
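Since the log above shows Squid itself re-fetching the rewritten URL, one common sketch (assuming the classic redirector protocol, where each stdin line is "URL client/fqdn ident method" and stdout is the replacement) is to return a 301: redirect, so the client, not Squid, follows the https URL and the loop is avoided:

```shell
# rewrite_url: hypothetical helper body. Reads redirector lines on stdin
# and emits "301:https://..." for http URLs, so the client is redirected
# once instead of Squid re-fetching the rewritten URL.
rewrite_url() {
  while read url rest; do
    case "$url" in
      http://*) echo "301:https://${url#http://}" ;;
      *)        echo "" ;;   # empty line = leave URL unchanged
    esac
  done
}
# Example: printf 'http://xxx.yyy.com/ 1.2.3.4/- - GET\n' | rewrite_url
```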
Hi All,
Which IOS version in the 12.4 series is best for a Squid+Tproxy+WCCP setup?
Some versions have bugs in traffic redirection.
Please post the version details.
Thanks,
Vivek N.
Hi
we have a strange problem which I hope someone can give me a pointer towards
resolving.
Our setup consists of Squid 2.5 acting as a caching proxy interfacing with
Websense 6.3.2 to provide access filtering. The issue below has been replicated
using Squid 2.5 and Squid 2.7 acting solely as a ca
Hi,
My name is Roberto and I'm a new user of the list. I'm having a problem
and I want to know if you can help me with it.
I'm using Squid 2.5-STABLE14 with SSL support. I need to rewrite every URL
with an http://... request to https://..., so I use this script at the
redirect_program line:
#!/usr
Amos Jeffries wrote:
Joseph Spadavecchia wrote:
Hi all,
We have a requirement to use different authentication mechanisms
based on the subnet/ip-address of the client.
For example, a client from one subnet would authenticate against ntlm
while a client from another subnet would authenticate
Hi,
I have some questions about squid as reverse proxy.
The web server I'm accelerating (cache_peer) has dynamic content
(cgi-bin).
At the beginning I left the default cache refresh values (so for cgi-bin
\? the value is "0") and the hierarchy list for cgi-bin and "no_cache deny
a
> squid proxy wrote:
> >how to check in the squid logs if a dynamic page (asp, cgi-bin, etc.)
> >was cached or not?
everything that is not excluded from caching by a '(no_)cache deny'
directive is cached by default.
You apparently mean whether the cached content was provided to any clients,
which means tha
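To answer the quoted question concretely, one way (a sketch; the log path and URL patterns are illustrative) is to pull the result code for dynamic URLs out of access.log:

```shell
# TCP_HIT / TCP_MEM_HIT = served from cache; TCP_MISS = fetched from origin.
LOG=${LOG:-/var/log/squid/access.log}
if [ -r "$LOG" ]; then
  grep -E '\.asp|cgi-bin' "$LOG" | awk '{ print $4, $7 }'
fi
```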
> A packet trace on the outbound side of squid.
> The more interesting thing would be a packet trace of the whole squid-server
> communication and see as I suggested, whether that 304 contains a body object
> or
> not.
>
> Run this on the squid box:
> tcpdump -w $SERVERIP.trace -i $IFACE hos
-Original Message-
From: crobert...@gci.net [mailto:crobert...@gci.net]
Sent: 19 February 2009 20:37
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Helper protocol issue with wbinfo_group.pl
Benedict White wrote:
> When I use wbinfo_group.pl it clearly can and does go and che
> On Tue, Feb 03, 2009 at 05:21:50PM +0100, Matus UHLAR - fantomas wrote:
> > On 03.02.09 21:18, Vikram Goyal wrote:
> > > I want to anonymize surfing for that I have squid version 3.0 running in
> > > transparent mode. I have
> > >
> > > request_header_access From deny all
> > > request_header_a
I have 2 of the systems which I upgraded to 3.0.13, and on both of them
I am getting the same.
On Mon, Feb 23, 2009 at 4:14 PM, Amos Jeffries wrote:
> Shekhar Gupta wrote:
>>
>> 2009/02/23 13:09:00| tunnelReadServer: FD 350: read failure: (0) Success
>> 2009/02/23 13:28:36| tunnelReadServer: FD 33
Shekhar Gupta wrote:
2009/02/23 13:09:00| tunnelReadServer: FD 350: read failure: (0) Success
2009/02/23 13:28:36| tunnelReadServer: FD 332: read failure: (0) Success
2009/02/23 13:37:51| tunnelReadServer: FD 401: read failure: (0) Success
2009/02/23 13:38:38| tunnelReadServer: FD 395: read failu
I think this is some bug, as the same machines with the 2.6 squid version
were not having any of these messages. I still have 3 machines on the
older squid version, I upgraded 2 machines to the 3.0.13 version, and I
am finding this problem.
On Mon, Feb 23, 2009 at 3:53 PM, Amos Jeffries wrote:
> Shekhar
Shekhar Gupta wrote:
Amos,
I only configured it with a delay pool, so you are saying that I have
to recompile squid with that option. Do I have to do anything else
apart from it, like something in the OS?
I would hope nothing in the OS is needed. But I don't know RHEL very well.
The option i
Amos,
I only configured it with a delay pool, so you are saying that I have
to recompile squid with that option. Do I have to do anything else
apart from it, like something in the OS?
On Mon, Feb 23, 2009 at 3:12 PM, Amos Jeffries wrote:
> Shekhar Gupta wrote:
>>
>> Guys , i tried fixing t
2009/02/23 13:09:00| tunnelReadServer: FD 350: read failure: (0) Success
2009/02/23 13:28:36| tunnelReadServer: FD 332: read failure: (0) Success
2009/02/23 13:37:51| tunnelReadServer: FD 401: read failure: (0) Success
2009/02/23 13:38:38| tunnelReadServer: FD 395: read failure: (0) Success
2009/02
Shekhar Gupta wrote:
Guys, I tried fixing this, however most of the derivatives are not
working with this version. Can anyone throw some light on how to
make this fix in Version 3.0.STABLE13 running on RHEL 5.3?
Check you are using the configure option: --with-filedescriptors=N
3.0 uses
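Before recompiling, it is worth confirming what limits the box will actually grant; a sketch (65536 is an assumed example for N, not a value from the thread):

```shell
# The hard limit caps what a rebuilt Squid can actually open:
ulimit -Hn
# System-wide ceiling on Linux:
cat /proc/sys/fs/file-max 2>/dev/null || true
# Example rebuild with a larger descriptor table:
#   ./configure --with-filedescriptors=65536
#   make && make install
```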
Guys, I tried fixing this, however most of the derivatives are not
working with this version. Can anyone throw some light on how to
make this fix in Version 3.0.STABLE13 running on RHEL 5.3?