Hello
I have several different URLs. For each URL, Squid saves a separate
object, although all these objects have the same ETag. Is there any
possibility to change this behavior? Maybe Squid could look objects up
by ETag instead of by request headers.
thanks a lot
Enrico
Hello
I have a problem with a regular expression in squidGuard. I'm using the
following rewrite rule:
[EMAIL PROTECTED]://www.main.example.org/(.+)@http://[EMAIL PROTECTED]
The problem is that http://example.org contains no query string
after this rewrite. Could you please help me understand this?
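For reference, an unredacted squidGuard rule of this shape would normally look like the sketch below (hosts taken from the question, rule name is a placeholder). The `\1` backreference is what carries the captured path and query string over to the target, so if the rewrite drops everything after the host, the backreference part of the rule is the first thing to check:

```
rewrite main {
    # s@from-regex@to@r : the trailing 'r' issues an HTTP redirect;
    # \1 re-inserts whatever (.+) captured, including any ?query=string
    s@http://www.main.example.org/(.+)@http://example.org/\1@r
}
```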
On 23.07.07 09:28, [EMAIL PROTECTED] wrote:
after reading the manuals and two days of searching the internet for a
solution, I am trying it this way.
Here is what I want to achieve:
In my clients (Firefox and FileZilla) I configure access via the proxy
GateKeeperOut on port 3128.
Each access should be
please configure your mailer to wrap lines below 80 chars per line.
72 to 75 is OK.
On 24.07.07 02:58, Nadeem Semaan wrote:
While trying to open FTP sites with IE, I usually get an error,
What error? Have you blocked FTP access to the outside? M$IE, when
configured to 'use folder view for FTP
Hi all,
Is there a way to reset/clear the authentication/IP table without
restarting Squid? We use this feature, but from time to time we need
to clear the table (or just a single user) from it; the proxy is
heavily used and we do not want to cause long grace-time outages.
On Thu, 2 Aug 2007 03:31:09 +0300 Nerijus Baliunas [EMAIL PROTECTED] wrote:
After searching a bit, I found
http://www.squid-cache.org/bugs/show_bug.cgi?id=1886.
The patch in it is a bit incorrect. The corrected patch is attached and it
works with
my setup.
BTW, the following patch should
The 'microsoft' ACLs are working fine - 'macintosh' is not:
acl microsoft dstdomain .microsoft.com
acl macintosh dstdomain .apple.com
acl all src 0.0.0.0/0.0.0.0
http_access allow microsoft
http_access allow macintosh
http_access deny all
-Original Message-
From: Tek Bahadur Limbu
Hello - I hope I'm writing to the correct place!
I have Squid running on RHAS4 and it has been running perfectly for some
time. I added some new ACLs and http_access protocols mirroring exactly
what existed. I then reconfigured the squid client and even restarted
the machine itself, and I
Heaton, Tobias wrote:
No log entries are appearing from a network machine on the same subnet. The
only way I can generate an access.log entry is running the squidclient app w/
the URL:
squidclient http://www.apple.com
access.log:
247 127.0.0.1 TCP_MISS/200 10226 GET http://www.apple.com -
DIRECT/17.149.160.10
Post your DENIED log entries in access.log.
Most probably the apple.com site uses other domains besides apple.com.
So although apple.com is allowed, those others are denied and the page
cannot be accessed.
Post your DENIED logs, please.
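To pull those entries out, something like the following works (a sketch against a fabricated two-line sample log; the real file lives wherever `access_log` in squid.conf points, commonly `/var/log/squid/access.log`):

```shell
# Build a small sample access.log (entries are hypothetical)
cat > /tmp/access.log.sample <<'EOF'
247 127.0.0.1 TCP_MISS/200 10226 GET http://www.apple.com - DIRECT/17.149.160.10
12 10.0.0.5 TCP_DENIED/403 1450 GET http://images.apple.com/x.png - NONE/-
EOF
# Keep only the denied requests (the result-code field contains TCP_DENIED)
grep 'TCP_DENIED' /tmp/access.log.sample
```

Run against the live log, this shows at a glance which extra domains a page pulls in that the ACLs are rejecting.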
Heaton, Tobias wrote:
The
I have setup cron under Cygwin to update my malware list on the Squid
that is running on Windows.
I want to setup a cron job which rotates the Squid log files when the
system is started up in the morning for the first time.
Is there a way out?
Regards
Santosh Rani
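Assuming a standard crontab and that the Squid binary sits at `/usr/sbin/squid` (adjust the path for a Cygwin install), a sketch of two options: the `@reboot` alias fires once when the cron daemon starts, which on a machine powered on each morning approximates "first start of the day":

```
# Rotate Squid's logs once when cron starts after boot
@reboot /usr/sbin/squid -k rotate
# ...or at a fixed time each morning instead
0 7 * * * /usr/sbin/squid -k rotate
```

`squid -k rotate` is Squid's standard log-rotation command; `logfile_rotate` in squid.conf controls how many old copies are kept.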
(Replying to list because I think that's what you intended to do.)
On Wed 01.Aug.07 09:53, Benno Blumenthal wrote:
Angel Olivera wrote:
But I don't know about the second part: detecting when it's down. It is
sort of down, since it will reply pings et al, but no HTTP packets will
come back
Hello. I was trying to check whether there is some security hole or
issue with our Squid and/or ICP that I should know about. I looked
around www.squid-cache.org and the web, but didn't find anything
relevant to the case below. I'd appreciate any pointers.
BACKGROUND:
Someone from web site X
It was a DNS zone problem that I've resolved. Thanks for all your help!
-Original Message-
From: Leonardo Rodrigues Magalhães [mailto:[EMAIL PROTECTED]
Sent: Thursday, August 02, 2007 12:12 PM
To: Heaton, Tobias
Cc: Squid Users
Subject: Re: [squid-users] Squid ACL Problem
Heaton,
Ok. So here at the office we have a T1 line and a backup DSL line.
Basically we have NO CONTROL over the policies passed to us over the T1
line, which means we can't have proxies set at login automatically.
What I would like to do is connect two outside interfaces, one for the DSL
and T1 and
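One common way to split traffic between two uplinks at the Squid level is `tcp_outgoing_address`, which picks the source address (and hence the routed interface) per ACL; a sketch with placeholder addresses:

```
# Hypothetical addresses: 203.0.113.2 sits on the DSL side,
# 198.51.100.2 on the T1 side
acl dsl_users src 192.168.1.0/24
tcp_outgoing_address 203.0.113.2 dsl_users
tcp_outgoing_address 198.51.100.2
```

The OS routing table still has to send each source address out the matching interface; Squid only chooses which address to bind.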
On Thu, 2 Aug 2007 21:07:35 +0530
Santosh Rani [EMAIL PROTECTED] wrote:
I have setup cron under Cygwin to update my malware list on the Squid
that is running on Windows.
I want to setup a cron job which rotates the Squid log files when the
system is started up in the morning for the first
Hello. I was trying to check whether there is some security hole or
issue with our Squid and/or ICP that I should know about. I looked
around www.squid-cache.org and the web, but didn't find anything
relevant to the case below. I'd appreciate any pointers.
The major security problems we are
I was thinking of building several boxes with between 10 TB and 20 TB of
SATA drives, for some Squid caches.
Has anyone used Squid to cache that much data?
Any idea what the upper limit is? The practical limit?
How much RAM would be required to index all that? Our cache runs at
about 90 GB
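As a back-of-envelope check, a commonly quoted rule of thumb from the Squid FAQ is roughly 10-14 MB of index RAM per GB of cache_dir (the real figure depends on mean object size and pointer width); taking the high end:

```shell
# ~14 MB of index RAM per GB of cache_dir, so per TB: 1024 GB * 14 MB / 1024 = 14 GB
for tb in 10 20; do
  echo "$tb TB of cache_dir -> ~$(( tb * 14 )) GB RAM for the index"
done
```

By that estimate a 10-20 TB store implies index RAM in the hundreds of GB, before counting `cache_mem` and everything else.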
-Vickers
I have a dozen quad-processor boxes for a CARP Squid farm.
It looks like I have to put 4 Squid processes configured as CARP
frontends and 4 configured as cache servers on each server, 8
processes per box.
The config I'm thinking of is an F5 load-balancing 48 CARP processes,
and each of those
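On the frontend (CARP) instances, the backend cache layer is normally declared with the `carp` option on `cache_peer`; a sketch with placeholder addresses and ports:

```
# Frontend squid.conf fragment: hash the URL space across four
# backend caches (addresses and ports are placeholders)
cache_peer 10.0.1.1 parent 3128 0 carp
cache_peer 10.0.1.2 parent 3128 0 carp
cache_peer 10.0.1.3 parent 3128 0 carp
cache_peer 10.0.1.4 parent 3128 0 carp
# Always go through the parents, never direct
never_direct allow all
```

CARP hashes each URL to one parent, so every object is cached on exactly one backend instead of being duplicated across all of them.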
Can someone tell me if it's possible to block CONNECT attempts that
only specify an IP address (rather than a hostname)?
I can see no legitimate reason to CONNECT to an IP, and I've just caught
students using this method to bypass the filters.
TB
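One approach (a sketch; the regex is deliberately loose and worth tightening) is to match the raw-IP form of the request URL, which for CONNECT is just `host:port`:

```
acl CONNECT method CONNECT
# Matches CONNECT targets that begin with a dotted-quad IP
acl ip_connect url_regex ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+
http_access deny CONNECT ip_connect
```

Place the deny above any broader `http_access allow` lines so it takes effect first.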
Mark Vickers wrote:
I was thinking of building several boxes with between 10 TB and 20 TB of
SATA drives, for some Squid caches.
Has anyone used Squid to cache that much data?
Hi Mark,
I am using up to 1 TB for my caches. However, only about 500 GB are
interconnected to each other currently due