> To: squid-users@squid-cache.org
> Date: Tue, 19 Apr 2011 14:36:31 +1200
> From: squ...@treenet.co.nz
> Subject: RE: [squid-users] Why doesn't REQUEST_HEADER_ACCESS work properly
> with aclnames?
>
> On Mon, 18 Apr 2011 19:15:53 +, Jenny Lee wrote:
> >> > What is the definition of OFFICE
On 18/04/11 08:56, Gerson Barreiros wrote:
Amos,
I'm using your PPA for ubuntu 10.04
Thanks
Debian have picked up the 3.1.12 now. The PPA will be updated in a few
hours.
Amos
--
Please be using
Current Stable Squid 2.7.STABLE9 or 3.1.12
Beta testers wanted for 3.2.0.7
Hi,
I got my squid running, and it's forwarding packets to the server on
behalf of its clients. But it seems that Squid isn't caching any
content, as the access log only shows TCP_MISS.
Is there something wrong with my configuration file? Thank you in advance.
Experiment Setup
===
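A rough starting point for diagnosing constant TCP_MISS (not from the thread; the cache_dir path and sizes below are placeholders) is to make sure a disk cache is actually configured:

# squid.conf: enable a disk cache (path and sizes are examples only)
cache_dir ufs /var/spool/squid3 1024 16 256
cache_mem 256 MB

Then repeat the same request twice and watch the log:

$ tail -f /var/log/squid3/access.log

TCP_HIT / TCP_MEM_HIT mark cached responses; a persistent TCP_MISS usually means the origin's Cache-Control / Expires headers forbid caching, which the response headers will show.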
On Mon, Apr 18, 2011 at 9:08 PM, Amos Jeffries wrote:
> On Mon, 18 Apr 2011 20:07:33 -0700, Yang Zhang wrote:
>>
>> We're using squid (3.0.STABLE19-1ubuntu0.1) as an application cache
>> and we're trying to cache everything, for a long time, but max-age=0
>> is throwing off squid.
>
> max-age=0 is
On Mon, 18 Apr 2011 20:07:33 -0700, Yang Zhang wrote:
We're using squid (3.0.STABLE19-1ubuntu0.1) as an application cache
and we're trying to cache everything, for a long time, but max-age=0
is throwing off squid.
max-age=0 is called "reload" in HTTP terminology.
I have this refresh_pattern
We're using squid (3.0.STABLE19-1ubuntu0.1) as an application cache
and we're trying to cache everything, for a long time, but max-age=0
is throwing off squid.
I have this refresh_pattern in squid.conf:
$ grep ^refresh_pattern /etc/squid3/squid.conf
refresh_pattern ^ftp: 1440 20%
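If the goal is to keep caching despite those reloads, the usual approach is a refresh_pattern with HTTP-violation options; a sketch only, with the site, timings and option set as assumptions (availability of these options differs between 2.7 and the 3.x releases, so check squid.conf.documented for the version in use):

# force caching for one site despite client reloads and short expiry times
refresh_pattern -i ^http://example\.com/ 1440 20% 10080 ignore-reload override-expire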
On Mon, 18 Apr 2011 19:15:53 +, Jenny Lee wrote:
> What is the definition of OFFICE ?
> request_header_access is a fast ACL check which will not wait for
> unavailable details to be fetched.
Ah! proxy_auth :)
Jenny
acl OFFICE src 2.2.2.2
request_header_access User-Agent allow OFFICE
request_header_access User-Agent deny all
>> Has there been a workaround for this issue that I can use in 2.7 and 2.5,
>> to enable it to display my error page when pages not on my whitelist are
>> accessed via port 443/SSL. It works fine for HTTP and it is allowing HTTPS
>> pages through that are on my whitelist. Elmar is the sa
On Mon, 18 Apr 2011 18:30:51 -0700, Linda Walsh wrote:
I was wondering if anyone had written a module for squid to change it
into an 'accelerator', of sorts.
What I mean, specifically -- well there are a couple of levels. S
1) Parsing fetched webpages and looking for statically included
content
Hi Jason,
I tend to use wget, nice and easy, usually installed on linux,
available on windows and mac in a pinch.
wget -S domain.tld/path/to/file.pd5
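For reference, two other ways to look at only the headers without keeping the file (same placeholder URL as above):

$ curl -I http://domain.tld/path/to/file.pd5             # HEAD request, prints the response headers
$ wget -S --spider http://domain.tld/path/to/file.pd5    # -S shows server headers, --spider skips the download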
On Tue, Apr 19, 2011 at 1:04 AM, Jason Greene wrote:
> I could fetch the file from the server but I don't know how to look at
> the header of a .pd5 file
I was wondering if anyone had written a module for squid to change it into
an 'accelerator', of sorts.
What I mean, specifically -- well there are a couple of levels. S
1) Parsing fetched webpages and looking for statically included content
(especially .css, maybe .js, possibly image files) and
On Mon, 18 Apr 2011 12:27:32 -0400, Mohammad Fattahian wrote:
Hi,
I just configured transparent proxy to use in my network.
I assume you actually mean NAT interception...
I found HTTPS is not going through the proxy when it is transparent.
Yes. "transparent proxy" is a man-in-the-middle securi
On Mon, 18 Apr 2011 16:45:01 -0700 (PDT), Supadee718 wrote:
On 29/01/11 02:02, Jason Doran wrote:
In order to get anything useful to happen the deny_info must perform a
URL redirect with a 307 status code. And the browser must support
correct RFC 2616 handling of that status code.
Support f
On Mon, 18 Apr 2011 16:49:28 -0700 (PDT), Supadee718 wrote:
Squid *is* sending the error page response back. However, there were some
security vulnerabilities and at least one virus discovered a while back
involving the way the popular browsers display such pages. So they disabled it.
The 307
On Mon, 18 Apr 2011 18:56:08 -0400, adam dirkmaat wrote:
How can I limit 80 traffic to one vhost and 443 traffic to a second
vhost. I want to be able to hit 1.2.3.4:80 & 5.6.7.8:443, and NOT
access 1.2.3.4:443 & 5.6.7.8:80?
http_port 80 defaultsite=web.somesite.com vhost
https_port 443 ce
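One way to pair each port with a single site (a sketch only; the hostnames, cert path and port pairing are assumptions) is to tie each port to its own dstdomain ACL with the myport ACL type:

http_port 80 defaultsite=web.somesite.com vhost
https_port 443 cert=/etc/squid/somesite.pem defaultsite=secure.somesite.com vhost

acl to_web dstdomain web.somesite.com
acl to_secure dstdomain secure.somesite.com
acl on_80 myport 80
acl on_443 myport 443

# each site is reachable only on its own port
http_access allow to_web on_80
http_access allow to_secure on_443
http_access deny all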
Thanks for the information! Could you go into a little more detail with your
first suggestion (the easiest). By stdin and stderr do you mean System.in
and System.err in Java?
Also, is there anywhere I can find an example of the code for this? For the
helper and the squid.conf?
Thanks, Edward.
-
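Since the question mentions Java: Squid helpers of all kinds speak a line-based protocol on stdin/stdout, so in Java that means System.in and System.out; System.err is only captured into cache.log as debug output. A minimal sketch assuming an external ACL helper, with the class name, the decision logic and the squid.conf lines purely illustrative:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class MyAclHelper {
    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        // Squid sends one request per line, fields per the %FORMAT tokens in squid.conf
        while ((line = in.readLine()) != null) {
            boolean allow = line.startsWith("10.");        // placeholder decision logic
            System.out.println(allow ? "OK" : "ERR");      // one reply line per request
            System.out.flush();
            // System.err.println("checked: " + line);     // would appear in cache.log
        }
    }
}

And the matching squid.conf side might look like this (paths and names are examples):

external_acl_type java_check %SRC /usr/bin/java -cp /usr/local/lib MyAclHelper
acl checked external java_check
http_access allow checked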
> Squid *is* sending the error page response back. However, there were some
> security vulnerabilities and at least one virus discovered a while back
> involving the way the popular browsers display such pages. So they
> disabled
> it.
> The 307 status code was created for non-GET redirects. U
>On 29/01/11 02:02, Jason Doran wrote:
>In order to get anything useful to happen the deny_info must perform a
>URL redirect with a 307 status code. And the browser must support
>correct RFC 2616 handling of that status code.
>Support for 307 has been added to 3.1 since the last formal packag
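For completeness, a sketch of what such a rule can look like on a build that has the 307 support described above; the status-prefix form of deny_info and the exact releases accepting it should be checked against the release notes, and the whitelist file and landing page are placeholders:

acl whitelist dstdomain "/etc/squid/whitelist.txt"
http_access deny !whitelist
# send a 307 redirect instead of the normal error page for requests denied by the rule above
deny_info 307:http://proxy.example.net/blocked.html whitelist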
How can I limit 80 traffic to one vhost and 443 traffic to a second
vhost. I want to be able to hit 1.2.3.4:80 & 5.6.7.8:443, and NOT
access 1.2.3.4:443 & 5.6.7.8:80?
[root@calamari squid]# squid -v
Squid Cache: Version 2.6.STABLE21
[root@calamari squid]# cat /etc/squid/squid.conf
# SQUI
> Squid *is* sending the error page response back. However, there were some
> security vulnerabilities and at least one virus discovered a while back
> involving the way the popular browsers display such pages. So they
> disabled
> it.
> The 307 status code was created for non-GET redirects. U
Well the Wix MSI toolset is GPL so here is the link:
http://squidwindowsmsi.sourceforge.net/
For now it includes the latest 2.7... and let me take a look at the bugs
to see if I may put some effort into it... (kind of overloaded at the
daily job though... :( ).
best regards
sich
This is a
> > What is the definition of OFFICE ?
> > request_header_access is a fast ACL check which will not wait for unavailable
> > details to be fetched.
>
> Ah! proxy_auth :)
>
> Jenny
acl OFFICE src 2.2.2.2
request_header_access User-Agent allow OFFICE
request_header_access User-Agent deny all
header_r
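A common companion to the two access rules above is a replacement value for everyone who is denied the real header; a sketch, with the UA string as a placeholder (depending on the release the directive is spelled header_replace or request_header_replace):

header_replace User-Agent Mozilla/5.0 (Windows NT 6.1; anonymized)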
Hi,
I just configured transparent proxy to use in my network.
I found HTTPS is not going through the proxy when it is transparent.
What should I do if I want to limit access to some HTTPS site?
Thanks for any help.
Mohammad
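When browsers are explicitly configured to use the proxy, HTTPS arrives as CONNECT requests and can be filtered by destination; intercepted port-443 traffic does not reach Squid that way, so it has to be handled at the firewall or with SSL interception instead. A sketch with placeholder domains:

acl SSL_ports port 443
acl CONNECT method CONNECT
acl allowed_ssl_sites dstdomain .bank.example .webmail.example

# only the listed sites may be reached over CONNECT
http_access allow CONNECT SSL_ports allowed_ssl_sites
http_access deny CONNECT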
I could fetch the file from the server but I don't know how to look at
the header of a .pd5 file
Jason
On Mon, Apr 18, 2011 at 2:41 AM, Matus UHLAR - fantomas wrote:
> On 14.04.11 12:08, Jason Greene wrote:
>> Can someone tell me if it is possible to make squid not cache a single
>> domain?
One of my users is having problems accessing http://www.scotusblog.com/
through my Squid 3.1.12 proxy. The site comes up in text mode.
Anyone have any ideas?
Thanks,
Mike Grasso
Data Network Administrator
DC Circuit Court of Appeals
(202) 216-7443
Hi Amos,
First of all, big thanks. By putting "forwarded_for transparent" and "via
off", the host info at www.whatismyip.com was removed and there is also no
email view problem at hotmail or live.com. All this configuration is working
perfectly with Squid as the router.
But the problem is not solved with the router using WCCP2.
At L
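For reference, the two directives mentioned above as they would appear in squid.conf (the transparent value is only accepted by releases that support the extended forwarded_for options):

forwarded_for transparent
via off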
On 18/04/2011 04:14, Amos Jeffries wrote:
On Sun, 17 Apr 2011 19:57:11 +0300, Eliezer Croitoru wrote:
On 17/04/2011 19:44, Jenny Lee wrote:
Sorry for not answering. There was just nothing I could be sure
about until now...
3.2.0.7 will be out early (and very soon) with fixes for the cr
> > When you say earlier, what would be the upper end of the timeframe?
> > (1 week, 1 month?)
>
> By "early" I mean earlier than 1st May which was the next scheduled
> "monthly beta".
> Specifically as soon as I can migrate a half dozen bug fixes around,
> test for build failures and write the Ch
On 18/04/11 19:35, Eugene M. Zheganin wrote:
Hi.
Around 6 months ago I switched from 2.7 to 3.1 for its IPv6.
I may be wrong, but after that I noticed that 'squid -k reconfigure' (I
use my own custom quota manager, whose web interface issues a reconfigure
request when quotas are changed) now break
On 14.04.11 12:08, Jason Greene wrote:
> Can someone tell me if it is possible to make squid not cache a single
> domain?
it is, however...
> We have a service that downloads a file and squid seems to be keeping
> the old file in cache so we are not getting the updates.
the webserver probably p
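The direct way to answer the question on the Squid side is the cache access list, which stops matching responses from being stored (the domain below is a placeholder):

acl nocache_site dstdomain .example.com
cache deny nocache_site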
Hi.
Around 6 months ago I switched from 2.7 to 3.1 for its IPv6.
I may be wrong, but after that I noticed that 'squid -k reconfigure' (I
use my own custom quota manager, whose web interface issues a reconfigure
request when quotas are changed) now breaks existing connections and
reopens listenin