On Thu, 16 Oct 2008 09:01:48 +0530
Tharanga [EMAIL PROTECTED] wrote:
Hi folks,
...
Did anyone successfully block TeamViewer access with a Squid ACL?
I block it by its user-agent string: DynGate .
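For reference, a minimal squid.conf sketch of that approach (the `browser` ACL type matches the User-Agent request header against a regex; "DynGate" is the string TeamViewer reportedly sends):

```
# Deny requests whose User-Agent matches TeamViewer's "DynGate" string
acl teamviewer browser DynGate
http_access deny teamviewer
```

Keep in mind a client can trivially change its User-Agent, so this is best-effort blocking only.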
On Thu, 2008-10-16 at 09:42 +0530, Paras Fadte wrote:
Hi Henrik,
In a CARP setup, if one uses the same weight for all the parent caches,
how would the requests be handled? Will the requests be forwarded
equally to all the parent caches? If the weights differ, then
won't all the requests be
I am using Squid on Ubuntu 8.04 and it's already blocked (by default).
I am trying to connect to my linux server from home through broadband,
but the connection is not going through.
First it said the screen is locked; then I got the screen unlocked;
then it said the password was accepted by
I've tried searching through the archives for data transfer limits but
all I can find is stuff on bandwidth limiting through the use of delay
pools to restrict users to a specific transfer rate.
Here is my situation. I have a Squid server on the internet that users
around the world can connect to
On Thu, Oct 16, 2008 at 01:56:59AM +0800, howard chen wrote:
Hello,
On Wed, Oct 15, 2008 at 10:14 PM, Henrik K [EMAIL PROTECTED] wrote:
On Wed, Oct 15, 2008 at 03:42:20PM +0200, Henrik Nordstrom wrote:
Any suggestion for having large ACL in a high traffic server?
Avoid using regex
On Wed, 2008-10-15 at 17:14 +0300, Henrik K wrote:
Avoid using regex based acls.
It's fine if you use Perl + Regexp::Assemble to optimize them. And link
Squid with PCRE. Sometimes you just need to block more specific URLs.
No it's not. Even optimized regexes are several orders of magnitude
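For what it's worth, the merging idea behind Regexp::Assemble can be sketched in Python (the pattern list here is hypothetical, not from the thread): collapsing many literal patterns into one alternation lets the engine scan each URL once instead of once per pattern.

```python
import re

# Hypothetical list of blocked URL literals (stand-ins for a real blocklist).
patterns = [
    "ads.example.com/banner",
    "ads.example.com/popup",
    "tracker.example.net",
]

# Join the escaped literals into a single alternation, so the regex engine
# makes one pass over each URL instead of one pass per pattern.
combined = re.compile("|".join(re.escape(p) for p in patterns))

def blocked(url):
    """Return True if any of the merged patterns occurs in the URL."""
    return combined.search(url) is not None
```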
Hi!
On Thursday 16 October 2008, [EMAIL PROTECTED] wrote:
Dear All Squid Users,
I have a proxy server, where df shows the following:
[EMAIL PROTECTED] /]# df
Filesystem
http://www.ledge.co.za/software/squint/squish/
Squish has been out for some time; I use it within an education
environment. It's a bit of a mess to set up, but once running it does
the job perfectly.
-Original Message-
From: RM [mailto:[EMAIL PROTECTED]
Sent: 16 October 2008 07:57
To:
Hello!
I was trying for a few hours to make a certain site
(http://www.nix.ru) not cacheable, but Squid always
gives me an object from the cache!
My steps:
acl DIRECTNIX url_regex ^http://www.nix.ru/$
no_cache deny DIRECTNIX
always_direct allow DIRECTNIX
- But anyway - until I PURGED
Morning
To do this you need to implement some kind of quota system. In essence
you need to rotate your logs ( hourly, daily etc) and then put some
little script together that adds up the traffic associated with each
user account. This can then be used to feed an ACL denying access.
The
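A minimal sketch of the accounting step, assuming Squid's native access.log format (whitespace-separated, with the transfer size in the fifth field and the authenticated user in the eighth):

```python
from collections import defaultdict

def per_user_bytes(lines):
    """Sum transferred bytes per user from squid's native access.log lines
    (assumption: field 5 is the size, field 8 the auth/ident user)."""
    totals = defaultdict(int)
    for line in lines:
        fields = line.split()
        if len(fields) < 8:
            continue  # skip malformed lines
        try:
            size = int(fields[4])
        except ValueError:
            continue
        totals[fields[7]] += size
    return totals

def over_quota(lines, limit):
    """Users whose accumulated traffic exceeds the quota. These names could
    be written to a file referenced by a deny ACL (file name hypothetical)."""
    return sorted(u for u, b in per_user_bytes(lines).items() if b > limit)
```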
I just realized that I have
reload_into_ims on
This was preventing me from refreshing the given page or site,
since the refresh request was changed. But anyway, it should
not affect no_cache?
On Thursday 16 October 2008 14:28, Anton wrote:
BTW Squid 2.6STABLE20 - TPROXY2
On Thursday 16
Chris Natter wrote:
We were having issues with spell-check in 3.0; I haven't tried any of
the development builds to see whether it was resolved in a later
release.
OWA spell-check just seems to hang when you attempt to spell-check an
email, or gives the try again later prompt. I saw some
Thanks Hendrik.
I tried with both types for blocking https://gmail.com.
My conf is
acl gmail1 url_regex gmail.com mail.google.com
and
acl gmail dstdomain gmail.com mail.google.com
http_access deny gmail gmail1
Now https://gmail.com is blocked,
but all other https sites are not working.
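For comparison, a sketch using a single dstdomain ACL. Note that listing two ACL names on one http_access line ANDs them, so "deny gmail gmail1" only denies requests matching both; and HTTPS requests reach the proxy as CONNECT gmail.com:443, where dstdomain still sees the hostname but a path-matching regex has nothing to match:

```
# One dstdomain ACL is enough; the leading dot also matches subdomains.
# CONNECT gmail.com:443 carries only the hostname, which dstdomain matches.
acl gmail dstdomain .gmail.com .mail.google.com
http_access deny gmail
```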
On Wed, 2008-10-15 at 16:42 -0400, Todd Lainhart wrote:
I've looked in the archives, site, and Squid book, but I can't find
the answer to what I'm looking to do. I suspect that it's not
supported.
It is.
My origin server accepts Basic auth over SSL (non-negotiable). I'd
like to stick a
On Thu, 2008-10-16 at 09:01 +0530, Tharanga wrote:
I need to block TeamViewer (remote access software) on Squid. I analysed
the connection establishment; it goes through port 80 to the TeamViewer
server (the IP is dynamic).
TeamViewer client port 80 -- TeamViewer main server (dynamic
On Thu, 2008-10-16 at 13:49 +0500, Anton wrote:
Hello!
I was trying for a few hours to make a certain site
(http://www.nix.ru) not cacheable, but Squid always
gives me an object from the cache!
My steps:
acl DIRECTNIX url_regex ^http://www.nix.ru/$
no_cache deny DIRECTNIX
On Thu, 2008-10-16 at 14:34 +0500, Anton wrote:
I just realized that I have
reload_into_ims on
This was preventing me from refreshing the given page or site,
since the refresh request was changed. But anyway, it should
not affect no_cache?
It doesn't.
Regards
Henrik
Anton escreveu:
Hello!
I was trying for a few hours to make a certain site
(http://www.nix.ru) not cacheable, but Squid always
gives me an object from the cache!
My steps:
acl DIRECTNIX url_regex ^http://www.nix.ru/$
no_cache deny DIRECTNIX
always_direct allow DIRECTNIX
Hi,
I've found lots of references online (in this list's archives, other sites and
the FAQ) to customising error pages in squid, but haven't yet found reference
to removing error pages completely.
My squid box is running transparently. In the case of any errors I'd like it to
simply return
Thanks so much Henrik and Leonardo!
Looks like I should learn regexes, since I took $ as
meaning "anything after" rather than end of string :)
Now it logs as TCP_MISS.
Thanks so much again!
On Thursday 16 October 2008 15:45, Leonardo Rodrigues
Magalhães wrote:
Anton escreveu:
Hello!
was trying
On Thu, Oct 16, 2008 at 10:10:23AM +0200, Henrik Nordstrom wrote:
On Wed, 2008-10-15 at 17:14 +0300, Henrik K wrote:
Avoid using regex based acls.
It's fine if you use Perl + Regexp::Assemble to optimize them. And link
Squid with PCRE. Sometimes you just need to block more specific
Hi,
I have two reverse proxy servers using each other as neighbours. The
proxy servers are load balanced (using a least connections
algorithm) by a Netscaler upstream of them.
A small number of URLs account for around 50% or so of the requests.
At the moment there's some imbalance in the hit
BTW Squid 2.6STABLE20 - TPROXY2
On Thursday 16 October 2008 13:49, Anton wrote:
Hello!
I was trying for a few hours to make a certain site
(http://www.nix.ru) not cacheable, but Squid
always gives me an object from the cache!
My steps:
acl DIRECTNIX url_regex ^http://www.nix.ru/$
Henrik Nordstrom wrote:
On Tue, 2008-10-14 at 09:04 -0700, Tom Williams wrote:
Is authentication required to access the server? If so then the server
needs to return Cache-Control: public on the content which is
non-private and should be cached.
Keep in mind that such content will be
Thank you, Amos and Henrik. I'll be testing this in 2.7/Stable 4 - I
assume that's OK (no significant fixes in 3.0 in this area that I
should take advantage of)?
Could I do the same thing with SSL to the reverse proxy? That is, the
reverse proxy is the endpoint for the client, gets the creds,
Amos Jeffries wrote:
Tom Williams wrote:
Amos Jeffries wrote:
So, I setup my first Squid 3.0STABLE9 proxy in HTTP accelerator mode
over the weekend. Squid 3 is running on the same machine as the web
server and here are my HTTP acceleration related config options:
http_port 80 accel vhost
It seems that the site http://squidnt.com/ is trying to masquerade as an
official website for Mr Serassio's Windows port of Squid. It doesn't
explicitly state this, but the wording of the site contents strongly implies
such a thing.
Also it was entered into a new Wikipedia article on SquidNT as
hi all,
i've googled, but have been unable to find a simple sed command, or
otherwise to recover an object sitting in the web cache.
i know the filename(s) in the cache, however, there's a squid header
on top of a binary file(s), and I don't know how to recover just the
binary portion.
original
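One possible approach in Python rather than sed, under the assumption that the stored HTTP reply inside the cache file begins with an "HTTP/1." status line and that its header block ends at the first blank line (CRLF CRLF); everything after that is the original object body:

```python
def extract_body(cache_file_bytes):
    """Strip Squid's binary swap metadata and the stored HTTP headers from
    a cache file, returning only the object body.

    Assumption: the stored reply starts at the first "HTTP/1." marker and
    its header block ends at the first blank line after it.
    """
    start = cache_file_bytes.find(b"HTTP/1.")
    if start < 0:
        raise ValueError("no stored HTTP reply found")
    end = cache_file_bytes.find(b"\r\n\r\n", start)
    if end < 0:
        raise ValueError("stored HTTP headers not terminated")
    return cache_file_bytes[end + 4:]
```

Usage would be reading the cache file in binary mode and writing the returned bytes to the recovered file.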
I'm working on getting this working but I'm unclear on the hardware placement
for each of the devices.
Is it:
A)
Workstation-Cisco-Squid--internet
(WCCP) (NAT)
B)
Workstation-Cisco (WCCP)
|
Hmmm, strange. I tested 2.7STABLE4, but it doesn't seem to be stripping
the DOMAIN, it will still accept only DOMAIN\USERNAME. Perhaps I'm
missing something?
I also tested squid-3.1-20081016, built with a spec file adopted from a
squid3.0STABLE7 Redhat package:
configure \
--exec_prefix=/usr
Hi,
no reason (unless there's something I don't get) to use NAT or WCCP at
the workstation level. WCCP should be configured at the Cisco box (level C
only) so that it forwards requests to the web through the Squid box
cheers
charles
On Thu, 2008-10-16 at 12:56 -0500, Johnson, S wrote:
I'm
On Thu, 2008-10-16 at 13:02 +0100, Robert Morrison wrote:
I've found lots of references online (in this list's archives, other
sites and the FAQ) to customising error pages in squid, but haven't
yet found reference to removing error pages completely.
You can't. Once the request has reached the
On Thu, 2008-10-16 at 14:39 +0100, James Cohen wrote:
I have two reverse proxy servers using each other as neighbours. The
proxy servers are load balanced (using a least connections
algorithm) by a Netscaler upstream of them.
Ok.
A small number of URLs account for around 50% or so of the
B.
cheers
-Ryan
Johnson, S wrote:
I'm working on getting this working but I'm unclear on the hardware placement
for each of the devices.
Is it:
A)
Workstation-Cisco-Squid--internet
(WCCP)(NAT)
B)
Workstation-Cisco (WCCP)
On Thu, 2008-10-16 at 19:06 +0200, lartc wrote:
hi all,
i've googled, but have been unable to find a simple sed command, or
otherwise to recover an object sitting in the web cache.
i know the filename(s) in the cache, however, there's a squid header
on top of a binary file(s), and I don't
Hi,
At 18.01 16/10/2008, Mr Lyphifco wrote:
It seems that the site http://squidnt.com/ is trying to masquerade as an
official website for Mr Serassio's Windows port of Squid. It doesn't
explicitly state this, but the wording of the site contents strongly implies
such a thing.
Also it was
On Thu, 2008-10-16 at 17:01 +0100, Mr Lyphifco wrote:
It seems that the site http://squidnt.com/ is trying to masquerade as
an
official website for Mr Serassio's Windows port of Squid. It doesn't
explicitly state this, but the wording of the site contents strongly
implies
such a thing.
Hi
We have a problem with our new Squid server:
when we want to add wbinfo_group.pl, it can't start it:
2008/10/14 06:07:39| Starting Squid Cache version 3.0.STABLE7 for
i386-redhat-linux-gnu...
2008/10/14 06:07:39| Process ID 26104
2008/10/14 06:07:39| With 1024 file descriptors available
Hi,
I am a squid (and http 1.1 headers to be honest) newbie who'd really
appreciate help w/squid config and header attributes on the following:
I have a server serving images that change dynamically (same URL invoked at
different times may return different images). I would like the following
On Thu, 2008-10-16 at 22:26 +0200, Phibee Network Operation Center
wrote:
Hi
We have a problem with our new Squid server:
when we want to add wbinfo_group.pl, it can't start it:
2008/10/14 06:07:39| WARNING: Cannot run
'/usr/lib/squid/wbinfo_group.pl' process.
Is wbinfo_group.pl
On Thu, 2008-10-16 at 16:12 -0700, dukehoops wrote:
1. With what headers should the origin server respond in 3a) and 3b)? In
latter case, it seems like something like Cache-Control: must-revalidate,
not sure whether to use s-maxage=0 and/or maxage=0
You probably do not need or want
Phibee Network Operation Center wrote:
Hi
We have a problem with our new Squid server:
when we want to add wbinfo_group.pl, it can't start it:
2008/10/14 06:07:39| Starting Squid Cache version 3.0.STABLE7 for
i386-redhat-linux-gnu...
2008/10/14 06:07:39| Process ID 26104
2008/10/14 06:07:39|
See Thread at: http://www.techienuggets.com/Detail?tx=56772 Posted on behalf of
a User
All,
I really need help here, and this has got to be a real simple problem, just not
easy to lay out for you all.
I am using Squid 2.6 as a reverse proxy for our webservers.
Our webservers get rebooted
Thanks so much Henrik and Leonardo!
Looks like I should learn regexes, since I took $ as
meaning "anything after" rather than end of string :)
Now it logs as TCP_MISS.
Thanks so much again!
If you only need to match the domain, it's better to use the 'dstdomain'
ACL type instead of regex. Squid
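Applied to this thread's example, a dstdomain-based sketch would be ("cache deny" is the 2.6+ spelling of the older "no_cache deny"):

```
# Refuse to cache, and always go direct for, this host
acl nixsite dstdomain www.nix.ru
cache deny nixsite
always_direct allow nixsite
```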
On Thu, 2008-10-16 at 21:16 +0200, Guido Serassio wrote:
Please, could you update the Wikipedia page again?
Done.
Regards
Henrik
Sorry it took a while to get back. Not sure how to interpret X-Cache and
X-Cache-Lookup.
Here's the header info from Fiddler:
Request Header
GET /server1/websites/data/folder/myvideofile.vid HTTP/1.1
Client
Accept: */*
Transport
Host: ftp.mydomain.com
Proxy-Connection: Keep-Alive
Can anyone give me a squid.conf for delay pools?
i want to create :
user 192.168.1.1 - 192.168.1.100 up/down 12k/24k allow all files/website
user 192.168.1.50 - 192.168.1.80 up/down 12k/24k allow only to open
yahoo.com and google.com
user 192.168.1.100 - 192.168.1.200 up/down 12k/12k only allow to
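A minimal class-1 delay pool sketch for the first group only, under two caveats: delay pools limit download (response) bandwidth, not upload, and the rate is in bytes per second. The per-group site restrictions would need separate http_access rules with dstdomain ACLs.

```
# Group 1: 192.168.1.1-192.168.1.100, capped at roughly 24 KB/s down
acl group1 src 192.168.1.1-192.168.1.100
delay_pools 1
delay_class 1 1
delay_access 1 allow group1
delay_access 1 deny all
# 24000 bytes/sec sustained rate, 24000-byte burst bucket
delay_parameters 1 24000/24000
```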
On Thu, 2008-10-16 at 17:01 +0100, Mr Lyphifco wrote:
It seems that the site http://squidnt.com/ is trying to masquerade as
an
official website for Mr Serassio's Windows port of Squid. It doesn't
explicitly state this, but the wording of the site contents strongly
implies
such a thing.
On Thu, 2008-10-16 at 13:02 +0100, Robert Morrison wrote:
I've found lots of references online (in this list's archives, other
sites and the FAQ) to customising error pages in squid, but haven't
yet found reference to removing error pages completely.
You can't. Once the request has reached
Hi,
I have two reverse proxy servers using each other as neighbours. The
proxy servers are load balanced (using a least connections
algorithm) by a Netscaler upstream of them.
A small number of URLs account for around 50% or so of the requests.
At the moment there's some imbalance in the
Hi Folks,
I have had a look at the wiki and the docs and need a bit more help.
I am trying to look for and strip a request header X-MSISDN:
I could use ACL with request_header_access other deny, but this will
strip some other headers too which is not possible.
Is there a way to create custom
On Fri, Oct 10, 2008 at 12:30 PM, Amos Jeffries [EMAIL PROTECTED] wrote:
Richard Wall wrote:
Hi,
I've been reading through the archive looking for information about
squid 2.6 and windows update caching. The FAQ mentions problems with
range offsets but it's not really clear which versions of