Re: [squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9

2012-06-20 Thread Eliezer Croitoru

You can try purging any cached objects for this domain/page,
then deny caching and try to load the page again.
It may be caused by an object that was cached for Firefox
but is being served to IE.


Eliezer
On 6/21/2012 8:15 AM, Noc Phibee Telecom wrote:

Hi

thanks for your answer, but I don't think it's a problem with the web
designer.
This site works with IE8/IE9 when we don't use the Squid proxy.

It only fails when IE goes through Squid.

Best regards
Jerome


On 21/06/2012 05:23, Helmut Hullen wrote:

Hello, Noc,

You wrote on 21.06.12:



We have a big issue with our squid proxy. We browse this website
(http://www.laroutedulait.fr) through squid 3.0. We get a blue
background and nothing else ( using IE8 & 9 ).

This seems not to be a Squid problem but one made by the web designer.

The site tries to use a 'class="ie8"' or 'class="ie9"' when it finds
such a browser.

On my system it shows less information under "Internet Explorer" than
under "Firefox".


No errors in the log. Any idea of the problem?

Just ask the maker of the site (but "contact" doesn't work ...).

Best regards!
Helmut







--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer  ngtech.co.il




Re: [squid-users] Time based Video Streaming Access

2012-06-20 Thread Anonymous
Thank you very much for detailed information with examples.

I have setup ACL as given below:

# -Start Here 
acl OpenIPs src "/etc/squid3/AlwaysOpenIPs.txt"
acl TimedTubed src "/etc/squid3/TimeBasedIPs.txt"
acl NoTubeTime time SMTWHFA 09:00-14:59
acl deny_rep_mime_flashvideo rep_mime_type video/x-flv
http_reply_access allow OpenIPs
http_reply_access allow TimedTubed NoTubeTime
http_reply_access deny deny_rep_mime_flashvideo
http_reply_access allow all
# -End Here 

Now the "TimedTubed" IPs (time-based YouTube/video streaming access) can access 
all other web sites, BUT after the restricted time (09:00-14:59), at 15:00, they 
cannot access the YouTube website.
I want to allow the "TimedTubed" IPs to access YouTube only from 15:00 till 
08:59.
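Based on that description, one way to express the policy is to invert the time test with `!` (ACL negation) so the TimedTubed IPs get video only outside 09:00-14:59. This is a sketch only, reusing the ACL names defined above, and is untested:

# Sketch: OpenIPs get video any time.
http_reply_access allow OpenIPs
# TimedTubed: allow everything (including flash video) outside
# NoTubeTime, i.e. from 15:00 till 08:59.
http_reply_access allow TimedTubed !NoTubeTime
# Everyone else (and TimedTubed during NoTubeTime): no flash video.
http_reply_access deny deny_rep_mime_flashvideo
http_reply_access allow all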

Thank you very much for your time and kind help.

Regards.
-
--- On Thu, 6/21/12, Amos Jeffries  wrote:

> From: Amos Jeffries 
> Subject: Re: [squid-users] Time based Video Streaming Access
> To: "Anonymous" 
> Cc: squid-users@squid-cache.org
> Date: Thursday, June 21, 2012, 4:27 AM
> On 20.06.2012 20:31, Anonymous
> wrote:
> > Dear Amos Jeffries and All,
> > 
> > Thank you very much for great help. I am trying to
> understand the
> > actual working of "http_reply_access [allow|deny]" and
> "http_access
> > [allow|deny]". Can you please tell me the format,
> especially the
> > "ORDER" of ACL Statements, as "http_reply_access
> [allow|deny]" and
> "http_access [allow|deny]" are a bit tricky and I am
> confused how to set
> > the order of acl statements.
> 
> 
> http_access lines are tested as soon as the HTTP request is
> received, using only the TCP connection and HTTP request
> details (no HTTP reply details), to decide whether Squid is
> going to reject the request or try to handle it.
> 
> http_reply_access is tested as soon as the HTTP reply is
> received, using TCP connection details plus HTTP request and
> reply details, to decide whether Squid is going to deliver
> the response or send an error instead.
> 
> 
> There is no configuration relevance in the ordering between
> http_access and http_reply_access lines. Each type is
> processed as a separate sequence of its own lines.
>   eg
>     http_access allow A
>     http_reply_access deny B
>     http_access allow C
> 
> is the same as:
> 
>     http_access allow A
>     http_access allow C
> 
>     http_reply_access deny B
> 
> 
> 
> "acl" directive lines are just definitions of how to run a
> particular test. The only ordering they have is to be listed
> in the config before they are used on any other directive
> lines.
> 
> 
> Lines for each access directive type (eg, http_access) are
> processed top-to-bottom; the first whole line that matches
> performs its action. Individual ACLs on each line are tested
> left-to-right, with the first mismatching ACL stopping that
> line's test.
> 
> For example:
>   http_access allow A B C
>   http_access deny D E
> 
> means:
>   if A *and* B *and* C tests all match, ALLOW the
> request
>   OR,
>   if D *and* E tests all match, DENY the request
>   OR
>   do the opposite of DENY
> 
> 
> With some logic performance tricks like:
>   If B does not match, the whole first line will not
> match, so C will not be tested (one less test == faster
> handling time).
> 
> 
> More details can be found at http://wiki.squid-cache.org/SquidFaq/SquidAcl
> 
> 
> HTH
> Amos
> 
> 
> > 
> > Thank you very much for your time and help.
> > 
> > 
> > --- On Wed, 6/20/12, Amos Jeffries 
> wrote:
> > 
> >> From: Amos Jeffries 
> >> Subject: Re: [squid-users] Time based Video
> Streaming Access
> >> To: squid-users@squid-cache.org
> >> Date: Wednesday, June 20, 2012, 7:23 AM
> >> On 19.06.2012 23:57, Anonymous
> >> wrote:
> >> > Hello Respected All,
> >> >
> >> > I want to setup Time based Video Streaming
> Access for
> >> different IPs
> >> > (same subnet), few IPs are allowed every time
> video/you
> >> tube streaming
> >> > access, while other IPs (IPs list in file as
> SRC) are
> >> only allowed in
> >> > set time duration any other IPs are not
> allowed to
> >> access Video/You
> >> > tube access. Here's setup:
> >> > ---
> >> > Ubuntu 12.04
> >> > Squid 3.1.x
> >> > Two Groups of IPs
> >> > G-1 = Allowed Everytime
> >> > G-2 = Time Restriction (09:00-14:59)
> >> > G-3 = Everybody, Deny Access to Video/You
> tube
> >> streaming every time.
> >> > --
> >> > acl OpenIPs src "/etc/squid3/AlwaysOpenIPs.
> txt" # G-1=
> >> List of IPs
> >> > allowed for Video Streaming Everytime.
> >> > acl TimedTubed src
> "/etc/squid3/TimeBasedIPs.txt" # G-2
> >> = List of IPs
> >> > allowed for set time duration.
> >> > acl NoTubeTime time SMTWHFA 08:30-14:59 # Time
> duration
> >> when you
> >> > access to Time based IPs.
> >> > acl deny_rep_mime_flashvideo rep_mime_type
> video/x-flv
> >> # ACL to Deny
> >> > Video Streaming for everyone else.
> >> > http_reply_access allow OpenI

Re: [squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9

2012-06-20 Thread Noc Phibee Telecom

Hi

thanks for your answer, but I don't think it's a problem with the web 
designer.

This site works with IE8/IE9 when we don't use the Squid proxy.

It only fails when IE goes through Squid.

Best regards
Jerome


On 21/06/2012 05:23, Helmut Hullen wrote:

Hello, Noc,

You wrote on 21.06.12:



We have a big issue with our squid proxy. We browse this website
(http://www.laroutedulait.fr) through squid 3.0. We get a blue
background and nothing else ( using IE8 & 9 ).

This seems not to be a Squid problem but one made by the web designer.

The site tries to use a 'class="ie8"' or 'class="ie9"' when it finds
such a browser.

On my system it shows less information under "Internet Explorer" than
under "Firefox".


No errors in the log. Any idea of the problem?

Just ask the maker of the site (but "contact" doesn't work ...).

Best regards!
Helmut






Re: [squid-users] Re: squid3.1, squid_kerb_auth and Negotiate GSSAPI errors

2012-06-20 Thread Mark Davies
On Thu, 21 Jun 2012, Markus Moeller wrote:
>   Do you have the token you received as base64-encoded in the log,
> or better in a wireshark capture? This could help identify whether
> the un-encrypted elements in the token are correct.

Well, I've had the token before but hadn't worked out a way to pull it 
apart, and up until today hadn't managed to get a matching wireshark 
capture. Now I have, so I think I understand things a bit better 
(though I don't have a fix as yet).

Here is the token from a fail:

2012/06/21 14:06:41| squid_kerb_auth: DEBUG: Got 'YR 
YIICyAYGKwYBBQUCoIICvDCCArigGDAWBgkqhkiC9xIBAgIGCSqGSIb3EgECAqKCApoEggKWYIICkgYJKoZIhvcSAQICAQBuggKBMIICfaADAgEFoQMCAQ6iBwMFAACjggFmYYIBYjCCAV6gAwIBBaEPGw1FQ1MuVlVXLkFDLk5aoiswKaADAgEBoSIwIBsESFRUUBsYd3d3LWNhY2hlMi5lY3MudnV3LmFjLm56o4IBFzCCAROgAwIBEqEDAgEBooIBBQSCAQHPjG8lCnGsCUUzvzF5R0WMoOI1cQXyZE7jcXVGTptHCww18sHxjFlR5uCBMubTqbqy8OrhOwOZkNlJ6vkQesG199bttwDhViLgr62AGqS7bXww336Bye1UjgGOrbtgxkIRlckgZkhwO8aOYGzbbVMiwwUl9XFvI8hpCPD1LlG/TiD47tSUdut7AvQ8GsCfb/wNgcm2BOt5Li9CjforhaElvCJqwpbPV6ht54yuNXeLjjL5TIKd2Lz36RL7w30cwkXoLgSw2dcdtjSu15nHHDyqlIaVe4F4vyeCUJYePveK8I8zTBT9vZjbu/Lv22qVEe/BilBg6ZY6++bAf5E/MO440qSB/TCB+qADAgESooHyBIHvRa9wiAOgMvzrkJi9QbirH51Gc6K9mdOVxrOB5R0O4JFsPGyixyYZzdyLUxu297Gp6lN+yiGw14v2vDqx3oiNAw+KjKsoPYEkj6P+i3CR9X+wtnlftgLmYAIOxYP285GWmXktnyEjFrDvhWuLibspFlY7US3lvtIJvzxLyqUxBsXmuAPlRInNgmbH7VGbrTwg58JzOVwnmXYP+IoAoDHsXm6p0RWOLxosDhHC//lGzhMPYUjtNpAy344EPCmltJCWPazMP11rMeEGeyP4S1CurQSOBnqPtIiFMDnhqonhJMtYJJeAB16RSFDB9pb6r2k='
 
from squid (length: 959).
2012/06/21 14:06:41| squid_kerb_auth: INFO: continuation needed
2012/06/21 14:06:41| squid_kerb_auth: DEBUG: Got 'KK 
YIICyAYGKwYBBQUCoIICvDCCArigGDAWBgkqhkiC9xIBAgIGCSqGSIb3EgECAqKCApoEggKWYIICkgYJKoZIhvcSAQICAQBuggKBMIICfaADAgEFoQMCAQ6iBwMFAACjggFmYYIBYjCCAV6gAwIBBaEPGw1FQ1MuVlVXLkFDLk5aoiswKaADAgEBoSIwIBsESFRUUBsYd3d3LWNhY2hlMi5lY3MudnV3LmFjLm56o4IBFzCCAROgAwIBEqEDAgEBooIBBQSCAQHPjG8lCnGsCUUzvzF5R0WMoOI1cQXyZE7jcXVGTptHCww18sHxjFlR5uCBMubTqbqy8OrhOwOZkNlJ6vkQesG199bttwDhViLgr62AGqS7bXww336Bye1UjgGOrbtgxkIRlckgZkhwO8aOYGzbbVMiwwUl9XFvI8hpCPD1LlG/TiD47tSUdut7AvQ8GsCfb/wNgcm2BOt5Li9CjforhaElvCJqwpbPV6ht54yuNXeLjjL5TIKd2Lz36RL7w30cwkXoLgSw2dcdtjSu15nHHDyqlIaVe4F4vyeCUJYePveK8I8zTBT9vZjbu/Lv22qVEe/BilBg6ZY6++bAf5E/MO440qSB/TCB+qADAgESooHyBIHvHPaawYzgkC67x/LP8OM72fiDvqj5fq/qXSHOWZjfZIRIR+H2FsbjrC5qcymo+Qh7u/pwHur3ZlY/SiPUC+tQlY41NEFmcmrLpNfQW21gsABa2podl1P/lSaQE4KgXYtp8sxZwKUX5/4a44XzWOo2PETgF7C+qKDLnyjripE5gWRYKt/WVH2dXYBJ3Lf/8tIqbTzOp0DvJ3XvjpROlLRpaBF9oUVXCcrqVjPKlQfrsN8kNZUWDubaUuSme1ZJvZek2QdH/gAe7ziXmHuLRiqe4b9aIolIJ6qCMa7Nu6XzPoIvAU8gFb3gjhKr1dKPGT0='
 
from squid (length: 959).
2012/06/21 14:06:41| squid_kerb_auth: ERROR: gss_accept_sec_context() 
failed:  A token was invalid. unknown mech-code 1859794441 for mech 
unknown


The failures always have this additional step of requesting 
continuation before sending the error response.

This token equates to a packet from a Safari on a Mac trying to 
connect to facebook and wireshark tells me the token breaks down to

GSS-API Generic Security Service Application Program Interface
  OID: 1.3.6.1.5.5.2 (SPNEGO - Simple Protected Negotiation)
  Simple Protected Negotiation
negTokenInit
  mechTypes: 2 items
MechType: 1.2.840.48018.1.2.2 (MS KRB5 - Microsoft Kerberos 5)
MechType: 1.2.840.113554.1.2.2 (KRB5 - Kerberos 5)
  mechToken: 6082029206092a864886f71201020201006e820281308202...
  krb5_blob: 6082029206092a864886f71201020201006e820281308202...
KRB5 OID: 1.2.840.113554.1.2.2 (KRB5 - Kerberos 5)
krb5_tok_id: KRB5_AP_REQ (0x0001)
Kerberos AP-REQ
  Pvno: 5
  MSG Type: AP-REQ (14)
  Padding: 0
  APOptions: 
  Ticket
Tkt-vno: 5
Realm: ECS.VUW.AC.NZ
Server Name (Principal): HTTP/www-cache2.ecs.vuw.ac.nz
enc-part aes256-cts-hmac-sha1-96
  [...]
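Without wireshark, the GSS-API framing of such a logged token can be sanity-checked from the base64 string alone. A sketch in Python (the byte sequence below is the standard DER encoding of the SPNEGO OID 1.3.6.1.5.5.2; the truncated sample token is from the log above):

```python
import base64

# DER encoding of OID 1.3.6.1.5.5.2 (SPNEGO): tag 0x06, length 6,
# then the OID body bytes.
SPNEGO_OID = bytes([0x06, 0x06, 0x2B, 0x06, 0x01, 0x05, 0x05, 0x02])

def looks_like_spnego(b64_token: str) -> bool:
    """Rough check that a base64 blob is a GSS-API initial SPNEGO token."""
    raw = base64.b64decode(b64_token)
    # A GSS-API initial context token starts with tag 0x60
    # ([APPLICATION 0] constructed), a DER length, then the mech OID.
    return raw[:1] == b"\x60" and SPNEGO_OID in raw[:16]

# The tokens in the log begin "YIICyAYGKwYBBQUC..." -- that prefix alone
# already carries the framing tag and the SPNEGO OID.
print(looks_like_spnego("YIICyAYGKwYBBQUCoIICvDCCArig"))  # True
```

This only inspects the un-encrypted framing; the Kerberos ticket inside still needs wireshark (or a krb5 tool) to decode.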


The interesting thing about this is that we have our browsers and 
caches set up so that a proxy.pac controls which cache our machines 
would normally talk to, with failover to the other cache. This 
particular Mac would normally talk to the other cache, and that ticket 
is for the _other cache_. So for some reason it has decided to fail over 
to this cache but present a ticket for the other.

Having got the "unknown mech-code" error back from squid_kerb_auth, 
squid sends back another Proxy Auth Required error to the Mac which 
then retries with another token that this time has a ticket for the 
correct cache and everything succeeds.

Why the Mac has decided to fail over, I don't know.
Why it sends the wrong ticket, I don't know.

Probably wouldn't be an issue at all except for the error messages in 
the logs AND for the fact that there seems to be a file descriptor 
leak in this particular error case in the heimdal libraries.

cheers
mark





Re: [squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9

2012-06-20 Thread Helmut Hullen
Hello, Noc,

You wrote on 21.06.12:


> We have a big issue with our squid proxy. We browse this website
> (http://www.laroutedulait.fr) through squid 3.0. We get a blue
> background and nothing else ( using IE8 & 9 ).

This seems not to be a Squid problem but one made by the web designer.

The site tries to use a 'class="ie8"' or 'class="ie9"' when it finds  
such a browser.

On my system it shows less information under "Internet Explorer" than  
under "Firefox".

> No errors in the log. Any idea of the problem?

Just ask the maker of the site (but "contact" doesn't work ...).

Best regards!
Helmut


Re: [squid-users] squid 3.2.0.16 built --with-filedescriptors=16384 only has 256 file descriptors

2012-06-20 Thread YJZ
Thanks. It turns out the bash shell ulimit is 256, and Apple limits 
kern.maxfilesperproc to 10240, such that even "max_filedescriptors 16384" in 
squid.conf could not override that: 

# sysctl kern.maxfiles kern.maxfilesperproc
kern.maxfiles: 12288
kern.maxfilesperproc: 10240

I can, however, use sysctl to overcome that.
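For illustration, the limit a process actually inherits can be read with Python's `resource` module (a sketch; `RLIMIT_NOFILE` is the same limit `ulimit -n` controls, and it is what bounds Squid regardless of `--with-filedescriptors`):

```python
import resource

# The descriptor ceiling Squid gets at runtime is the rlimit it inherits
# from the launching shell, which ulimit -n and (on macOS) the
# kern.maxfilesperproc sysctl cap -- regardless of what
# --with-filedescriptors or max_filedescriptors ask for.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"inherited fd limit: soft={soft} hard={hard}")

# max_filedescriptors can only raise the soft limit toward the hard
# limit from inside the process; anything beyond that must be raised in
# the shell (ulimit -n) or via sysctl before Squid starts.
```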

-------- Original Message --------
> On Wed, Jun 20, 2012 at 7:30 AM,  I wrote:
> > I'm running one of the nightly 3.2.0.16. I've always built squid 3.x
> "--with-filedescriptors=16384".
> >
> > Squid Cache: Version 3.2.0.16
> > configure options:  '--prefix=/usr/local/squid'
> '--build=i686-apple-darwin' '--mandir=/usr/local/share/man' 
> '--with-large-files'
> '--disable-ident-lookups' '--disable-dependency-tracking' '--enable-filters'
> '--enable-removal-policies=heap,lru' '--enable-delay-pools' 
> '--enable-multicast-miss'
> '--enable-default-err-language=templates' '--enable-fd-config'
> '--with-filedescriptors=16384' '--with-dl' '--enable-ltdl-convenience'
> '--enable-http-violations' '--enable-build-info' '--enable-log-daemon-helpers'
> '--enable-auth-basic=PAM,NCSA,LDAP,NCSA' '--enable-auth-digest=password'
> '--enable-external-acl-helpers=ip_user,ldap_group' '--enable-ssl' 
> '--enable-internal-dns'
> '--disable-eui' 'build_alias=i686-apple-darwin'
> >
> > So it's very surprising for me to see "WARNING! Your cache is running
> out of filedescriptors" in cache.log after I pumped a few hundred
> connections through recently.
> >
> > According to squidclient -r -l localhost -U manager -W passwd mgr:info
> |grep 'file descr', my instance of squid 3.2.0.16 only has 256 file
> descriptors:
> >
> >        Maximum number of file descriptors:    256
> >        Available number of file descriptors:  221
> >        Reserved number of file descriptors:    64
> >
> > I can resolve this easily by setting "max_filedescriptors " in
> squid.conf. However, this is supposedly a "Squid 2.7+" feature, according to
> "http://wiki.squid-cache.org/SquidFaq/TroubleShooting#Running_out_of_filedescriptors".
> It's not clear whether that's inclusive of Squid 3.x, which is a separate
> branch from squid 2.6 in my understanding. The other question is why the
> configure option "--with-filedescriptors=16384" has no effect or stopped
> taking effect in Squid 3.2.0.16. Is this just an isolated one-off incident?
> 
> Watch out for the ulimit in the shell launching squid.
> 
> 
> -- 
>     /kinkie



Re: [squid-users] Squid 3.1.x and Kemp loadbalancer.

2012-06-20 Thread Amos Jeffries

On 20.06.2012 22:40, Josef Karliak wrote:

Hi there,
  we use a Kemp load balancer for balancing proxy traffic (active-backup). All
users have the IP of the Kemp load balancer set as their proxy. But in the
squid access_log the IP of the load balancer appears; I want the IP of the
user that is accessing the web pages there (we use webalizer for analyzing
top browsing users).
  My logformat defined in squid.conf:
logformat combined %>a %ui %un [%{%d/%b/%Y:%H:%M:%S +}tl] \
  "%rm %ru HTTP/%rv" %>Hs %<st "%{Referer}>h" "%{User-Agent}>h" %Ss:%Sh


  Do I have a bad variable in the logformat?



Your format is accurate.

The kemp load balancer apparently operates in one of two ways:

 layer 4, using NAT alteration of packets before delivery to the Squid 
box. The real clients' addresses are gone. There is no recovery possible.


 layer 7, using a proxy which itself makes HTTP requests through Squid. 
So it is the one and only *client* to Squid. It *might* be able to set 
X-Forwarded-For headers and inform Squid about the clients original IP 
address. If so configure:


  acl kemp src ... IP of kemp load balancer(s)
  follow_x_forwarded_for allow kemp
  follow_x_forwarded_for deny all



NOTE: You have the alternative option of active-passive load balancing 
in a PAC file, which is performed directly in the client browser.



Amos



Re: [squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9

2012-06-20 Thread Amos Jeffries

On 21.06.2012 12:29, Noc Phibee Telecom wrote:

Hi,



We have a big issue with our squid proxy. We browse this website
(http://www.laroutedulait.fr) through squid 3.0. We get a blue
background and nothing else ( using IE8 & 9 ).
Using Firefox it works well – problem is that firefox is not our
validated browser …


No errors in the log. Any idea of the problem?


Do you get anything to indicate it is a Squid problem, as opposed to a 
problem with the browser badly trying to cope with the "-//W3C//DTD HTML 
4.01//FR" document type?
 For example: what XML format is "FR"? The strict DTD pointed at 
right afterwards defines English HTML tag names. Someone came up with a 
French version of ASCII?


Amos



Re: [squid-users] Full https in transparent mode

2012-06-20 Thread Amos Jeffries

On 21.06.2012 11:14, Romain wrote:

Hi,

I'm using squid-3.1.19 and I would like to set up an HTTPS L7 split in
transparent mode. The configuration seems relatively easy and there
is no problem catching the HTTPS requests with iptables and forwarding
them to the squid. (https_port 3130 intercept cert=... key=...)

But after that squid tries to retrieve the page over http, not https...
Is it possible to keep the protocol throughout the request?


It would seem so... but that forces a single certificate to be shared 
by every domain in existence. Your clients will pop up invalid 
certificate warnings on almost every single HTTPS request.


You require the dynamic certificate generation feature of Squid-3.2 to 
avoid those popups.


This patch also needs to be applied to the current 3.2 snapshot; it 
should be in tomorrow's one.

http://www.squid-cache.org/Versions/v3/3.2/changesets/squid-3.2-11599.patch
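As a rough sketch of what the 3.2 feature looks like in squid.conf (option names from memory and paths are placeholders; check squid.conf.documented for your exact build):

# Sketch only (Squid 3.2+): mint per-host certificates signed by a
# local CA that clients trust, instead of one shared certificate.
https_port 3130 intercept ssl-bump \
    generate-host-certificates=on \
    cert=/etc/squid/ca.pem key=/etc/squid/ca.pem
ssl_bump allow all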


Amos


[squid-users] Big issue on all Squid that we have (2.5 and 3.0) on a web site with IE8/IE9

2012-06-20 Thread Noc Phibee Telecom

Hi,



We have a big issue with our squid proxy. We browse this website 
(http://www.laroutedulait.fr) through squid 3.0. We get a blue 
background and nothing else ( using IE8 & 9 ).
Using Firefox it works well – problem is that firefox is not our 
validated browser …



No errors in the log. Any idea of the problem?

Thanks in advance for your help.
Jerome



Re: [squid-users] cache windows update

2012-06-20 Thread Amos Jeffries

On 21.06.2012 00:15, Muhammad Yousuf Khan wrote:
I am using this; do I have to worry about the version, or is it fine 
to go?


The wiki content is valid for Squid versions back to 2.5 unless 
specially noted otherwise.
Most of the content is also usable for older Squid, but we did an 
upgrade purge recently removing text explaining special cases for 
Squid-2.4 and older.


Amos



Re: [squid-users] stupid problem with squid and and local adresses.

2012-06-20 Thread Amos Jeffries

On 21.06.2012 00:48, Ton Muller wrote:

Hello.
This is my setup:
routerbox: OpenBSD 5.0, squid (and some other fancy stuff installed)
webserver is on the routerbox on port 80
Mailserver is a different machine, incl. webmail access.
swat for SAMBA is also installed.
added port 901 to the acl list in squid.


Relevance?


named installed and correctly running (serving local names)
our OS: Windows 7 / Ubuntu; browser: Firefox.



Right that is your hardware inventory, with a few small hints about 
connectivity topology.


How are they connected together?
ie...
  is all traffic going through the router box on its way to Squid?
  are Squid and client machines on the same sub-net?
  are Squid and the servers on the same subnet?


Are they all on the same company network, OR with routing between them 
such that you can resolve host domain names in DNS and ping any one of 
them from the Squid box?
 The OR is important here. I've recently had these same problems seen 
by someone who tried to place Squid on the cloud and still reach RFC 
1918 private IP ranges from the global Internet. The actual destination 
IP does not matter to Squid, connectivity configurations to and from it 
are paramount.




I have here a stupid problem.
I installed squid a while back; it went well, and I configured it as it 
should be.


Hmm. Obviously wrong. Since you are having problems.



works as it should; all sites come in over squid.

however, when it comes to my LAN servers,
I get a timeout each time I access my routerbox over squid when I
want to check my webserver, or when trying to access swat.


What is the config?



accessing webmail is not possible when I use a name lookup; I must use
the IP address for it.


Aha. This hints that connectivity is working but your DNS system or 
squid configuration is screwed up.




so, my question:
where did I make a mistake? I used the basic squid config and added only
some ports for access.



What is this squid.conf file?


So far the only clear thing in the above is that you have a few server 
machines and Squid seems to be working for clients accessing the public 
Internet, but not the LAN.


Amos


Re: [squid-users] Time based Video Streaming Access

2012-06-20 Thread Amos Jeffries

On 20.06.2012 20:31, Anonymous wrote:

Dear Amos Jeffries and All,

Thank you very much for great help. I am trying to understand the
actual working of "http_reply_access [allow|deny]" and "http_access
[allow|deny]". Can you please tell me the format, especailly the
"ORDER" of ACL Statements, as "http_reply_access [allow|deny]" and
"http_access [allow|deny]" are bit tricky and I am confused howto set
the order of acl statements.



http_access lines are tested as soon as the HTTP request is received, 
using only the TCP connection and HTTP request details (no HTTP reply 
details), to decide whether Squid is going to reject the request or try 
to handle it.


http_reply_access is tested as soon as the HTTP reply is received, using 
TCP connection details plus HTTP request and reply details, to decide 
whether Squid is going to deliver the response or send an error instead.



There is no configuration relevance in the ordering between http_access 
and http_reply_access lines. Each type is processed as a separate 
sequence of its own lines.

  eg
http_access allow A
http_reply_access deny B
http_access allow C

is the same as:

http_access allow A
http_access allow C

http_reply_access deny B



"acl" directive lines are just definitions of how to run a particular 
test. The only ordering they have is to be listed in the config before 
they are used on any other directive lines.



Lines for each access directive type (eg, http_access) are processed 
top-to-bottom; the first whole line that matches performs its action. 
Individual ACLs on each line are tested left-to-right, with the first 
mismatching ACL stopping that line's test.


For example:
  http_access allow A B C
  http_access deny D E

means:
  if A *and* B *and* C tests all match, ALLOW the request
  OR,
  if D *and* E tests all match, DENY the request
  OR
  do the opposite of DENY


With some logic performance tricks like:
  If B does not match, the whole first line will not match, so C will 
not be tested (one less test == faster handling time).
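The evaluation rules described above can be sketched in a few lines of Python (the predicates A-E and the request fields are purely hypothetical, just to mirror the A B C / D E illustration; when no line matches, Squid applies the opposite of the last line's action):

```python
def evaluate(access_lines, request):
    """access_lines: list of (action, [predicate, ...]); action is 'allow' or 'deny'."""
    last_action = None
    for action, predicates in access_lines:
        last_action = action
        # ACLs on one line are ANDed left-to-right; all() stops at the
        # first mismatch, which is the "one less test" trick above.
        if all(p(request) for p in predicates):
            return action  # first matching whole line wins
    # No line matched: do the opposite of the last line's action.
    return "deny" if last_action == "allow" else "allow"

# Hypothetical predicates standing in for ACLs A..E:
A = lambda r: r["src"] == "10.0.0.1"
B = lambda r: r["port"] == 80
C = lambda r: r["method"] == "GET"
D = lambda r: r["src"] == "10.0.0.2"
E = lambda r: r["port"] == 443

lines = [("allow", [A, B, C]), ("deny", [D, E])]
print(evaluate(lines, {"src": "10.0.0.1", "port": 80, "method": "GET"}))   # allow
print(evaluate(lines, {"src": "10.0.0.3", "port": 80, "method": "GET"}))   # allow (opposite of last "deny")
```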



More details can be found at 
http://wiki.squid-cache.org/SquidFaq/SquidAcl



HTH
Amos




Thank you very much for your time and help.


--- On Wed, 6/20/12, Amos Jeffries  wrote:


From: Amos Jeffries 
Subject: Re: [squid-users] Time based Video Streaming Access
To: squid-users@squid-cache.org
Date: Wednesday, June 20, 2012, 7:23 AM
On 19.06.2012 23:57, Anonymous
wrote:
> Hello Respected All,
>
> I want to setup Time based Video Streaming Access for
different IPs
> (same subnet), few IPs are allowed every time video/you
tube streaming
> access, while other IPs (IPs list in file as SRC) are
only allowed in
> set time duration any other IPs are not allowed to
access Video/You
> tube access. Here's setup:
> ---
> Ubuntu 12.04
> Squid 3.1.x
> Two Groups of IPs
> G-1 = Allowed Everytime
> G-2 = Time Restriction (09:00-14:59)
> G-3 = Everybody, Deny Access to Video/You tube
streaming every time.
> --
> acl OpenIPs src "/etc/squid3/AlwaysOpenIPs. txt" # G-1=
List of IPs
> allowed for Video Streaming Everytime.
> acl TimedTubed src "/etc/squid3/TimeBasedIPs.txt" # G-2
= List of IPs
> allowed for set time duration.
> acl NoTubeTime time SMTWHFA 08:30-14:59 # Time duration
when you
> access to Time based IPs.
> acl deny_rep_mime_flashvideo rep_mime_type video/x-flv
# ACL to Deny
> Video Streaming for everyone else.
> http_reply_access allow OpenIPs TimedTubed NoTubeTime

This above line can only allow the IPs which are listed in
*both* OpenIPs and TimedTubed.
It will allow them only during NoTubeTime.


If I'm reading your policy description above correctly you
actually want:

  # G-1 policy = Allowed Everytime
  http_reply_access allow OpenIPs

  # G-2 policy = Time Restriction (09:00-14:59)
  http_reply_access allow TimedTubed NoTubeTime


> http_reply_access deny TimedTubed

That above line seems wrong according to your stated
policies. It will block TimedTubed IPs from going to
non-YouTube content.


  # G-3 policy = Deny Access to Video/You tube
streaming every time.
> http_reply_access deny deny_rep_mime_flashvideo

  http_reply_access allow all

> -- ---
> Above mentioned ACLs are not working properly, General
Internet
> Access (http_access) is also denied when used with
"http_reply_access
> deny" I want to only deny video streaming/you tube in
set time
> duration and allow internet access.
>
> Thank you in advance.


One thing to note here. Blocking in http_reply_access means
the video is already arriving when you decide not to deliver
it. Squid is forced to do one of two things:

 a) close the server connection and wait out the TCP reset
timeouts (15 minutes) before re-using the socket. Not a major
issue on networks with low web traffic, but can be a major
problem if you need to use those sockets again fast.

 b) read in the entire video from the server and discard it
before re-using the socket. Avoids T

[squid-users] Full https in transparent mode

2012-06-20 Thread Romain
Hi,

I'm using squid-3.1.19 and I would like to set up an HTTPS L7 split in
transparent mode. The configuration seems relatively easy and there
is no problem catching the HTTPS requests with iptables and forwarding
them to the squid. (https_port 3130 intercept cert=... key=...)

But after that squid tries to retrieve the page over http, not https...
Is it possible to keep the protocol throughout the request?

Regards
-- 
Romain 




RE: [squid-users] Cant login to certain flash page via squid?

2012-06-20 Thread Amos Jeffries

On 21.06.2012 06:11, Terry Dobbs wrote:

Thanks for the reply.

In case this becomes an issue with a site many users need to access,
what is the best way to bypass squid entirely for specific sites? Is
there a clean, easy way to do it? I am running Ubuntu as my squid server.


* Using a PAC file to configure the clients not to send that traffic 
through the proxy.


* For interception proxies using a bypass rule to skip interception for 
that traffic.



Later research indicates that some Flash players are at least pulling 
system proxy settings from somewhere and using them silently without any 
kind of editable control. Although no mention was made of how or where 
those were set up, or on which systems.


Amos



[squid-users] Re: squid3.1, squid_kerb_auth and Negotiate GSSAPI errors

2012-06-20 Thread Markus Moeller

Hi Mark,

 Do you have the token you received as base64-encoded in the log or, 
better, in a wireshark capture? This could help identify whether the 
un-encrypted elements in the token are correct.


Markus

"Mark Davies"  wrote in message 
news:201206201520.52498.m...@ecs.vuw.ac.nz...

Hi,
  we run a couple of squid caches using the squid_kerb_auth helper to
do Negotiate GSSAPI authentication and generally it all works rather
nicely but we will get little bursts of the following error

2012/06/20 14:54:02| authenticateNegotiateHandleReply: Error
validating user via Negotiate. Error returned 'BH
gss_accept_sec_context() failed:  A token was invalid. unknown
mech-code 1859794441 for mech unknown'


Always with that particular mech-code.

Given the number of successful hits on the cache (a couple of million a
day) I'm struggling to identify what's causing these errors and how to
rectify them, so suggestions are welcome.

As well as wanting to identify the root cause, this problem has the
effect that every time squid_kerb_auth deals with one of these
requests the kerberos libraries (heimdal 1.5pre1 from NetBSD 5.99.59)
keeps a file descriptor open to the keytab file (actually two) so
eventually the squid_kerb_auth hits the max filedescriptors per
process limit and other things start to fail (if it hasn't been
restarted before then).


cheers
mark






RE: [squid-users] Cant login to certain flash page via squid?

2012-06-20 Thread Terry Dobbs
Thanks for the reply.

In case this becomes an issue with a site many users need to access,
what is the best way to bypass squid entirely for specific sites? Is
there a clean, easy way to do it? I am running Ubuntu as my squid server.

Thanks again.

-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz] 
Sent: Tuesday, June 19, 2012 9:28 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Cant login to certain flash page via squid?

On 20.06.2012 09:13, Terry Dobbs wrote:
> When users are going through squid there are certain pages, like the 
> one
> I mentioned where you just can't click a specific button. It always
> seems flash related. If I reconfigure this user to not use squid I 
> can
> use the page just fine. This leads me to believe its not solely a
> browser issue.
>
> When I say I told it to ignore I meant in the squid.conf file, where 
> I
> allowed access to that specific domain without any kind of
> authentication. Thinking about it, I understand this step is pretty
> pointless as squid still processes the site. However I have had 
> success
> in the past by allowing access to sites before the proxy_auth 
> required
> command.
>
> Not really sure what the issue is, but it seems to happen with just a
> handful of random sites.


Flash player is separate software not permitted access to the browser's 
internal password manager information.
  * Flash player does not provide any means for users to enter passwords 
unless the HTTP request is a GET.
  * Flash script frameworks do not provide easily available support 
unless the HTTP request is a POST.
  * recent Flash versions prevent HTTP authentication unless the visited 
*website* provides explicit file-based (ONLY file based) CORS support 
for the relevant headers. NP: as documented this would prohibit 
Proxy-Authentication.


Website authentication only works if the author who wrote the script 
knows how to write a) the user I/O interface and b) the relevant 
encryption algorithms (rare for anything better than Basic auth), and c) 
adds explicit CORS support to their site. AND decided it was worth the 
trouble.


As a result HTTP authentication of any type rarely works in Flash 
applications. Proxy authentication has never been reported working; not 
to say it can't, just that in my experience nobody has ever mentioned 
seeing it happen despite common complaints here and in many other places 
online.


Personally I rate Flash as a worse problem than Java in this regard. At 
least Java provides libraries and API making it easy for developers who 
know where to look (most seem not to use it, but that is a 
knowledge/time issue not a technical barrier).

Amos



Re: [squid-users] acl forbidden_domains dstdom_regex "file.txt" with huge file fails

2012-06-20 Thread Marcus Kool



On 06/20/2012 06:43 AM, Matus UHLAR - fantomas wrote:

On 19.06.12 18:52, Stefan Bauer wrote:

with a 30 MB file. Squid is instantly terminating if this acl stanza is set 
active. Where and how do we have to tune squid settings to achieve this?


Terminating with what reason? I would not wonder if all the regexes did not 
fit into memory. Note that using a bunch of regexes slows squid down very 
much - it has to compare tons of regexps for every request.


We're aware of third-party software like squidguard for this task - we only 
want to use additional software if everything else fails :)


well, now squid failed...


ufdbGuard is much faster than squidGuard with regular expressions since it 
optimises a set of regular expressions into one regular expression.
The regex man page says there is no limit to a regular expression other than 
limits imposed by hardware or the kernel.

Marcus
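[Editor's note: the optimisation Marcus describes can be illustrated outside Squid. Combining N alternatives into one compiled alternation lets the regex engine scan each input once instead of N times. A small Python sketch; the pattern names are made up for illustration, not taken from the thread.]

```python
import re

# Hypothetical blocked-domain patterns, as might appear in a
# dstdom_regex file (illustrative names only).
patterns = [r"ads\.example\.com$", r"tracker\d+\.example\.net$", r"\.badsite\.org$"]

# Naive approach: one compiled regex per pattern, tried in turn,
# so each lookup costs len(patterns) scans of the hostname.
compiled = [re.compile(p) for p in patterns]

def match_naive(host):
    return any(r.search(host) for r in compiled)

# Optimised approach: merge all patterns into a single alternation,
# so the engine scans the hostname once per lookup.
merged = re.compile("|".join("(?:%s)" % p for p in patterns))

def match_merged(host):
    return merged.search(host) is not None

# Both approaches agree; the merged form just does less work per lookup.
for host in ("ads.example.com", "tracker42.example.net", "www.goodsite.com"):
    assert match_naive(host) == match_merged(host)
```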


[squid-users] stupid problem with squid and local addresses.

2012-06-20 Thread Ton Muller
Hello.
This is my setup:
routerbox: OpenBSD 5.0, Squid (and some other fancy stuff installed)
webserver is on the routerbox on port 80
mailserver is a different machine, including webmail access
SWAT for Samba is also installed; added port 901 to the ACL list in squid
named is installed and running correctly (serving local names)
own OS: Windows 7 / Ubuntu; browser: Firefox

I have a stupid problem here.
I installed Squid a while back; it went well, and I configured it as it
should be. It works as it should: all sites come in through Squid.

However, when it comes to my LAN servers, I get a timeout every time I
access my routerbox through Squid, when I want to check my webserver or
to access SWAT.

Accessing webmail by name lookup is not possible; I must use the IP
address for it.

So, my question:
where did I make a mistake? I used a basic Squid config and only added
some ports for access.

thanks.
Tony.
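[Editor's note: one common first check in setups like this - a guess, not a diagnosis from the thread - is whether Squid is permitted to reach the LAN hosts at all, and whether internal requests bypass any parent cache. A hedged squid.conf sketch with hypothetical addresses and domain names:]

```
# Hypothetical LAN range and internal domain; adjust to the real setup.
acl localservers dst 192.168.1.0/24
acl localdomain dstdomain .lan.home

# Permit requests *to* the LAN, and fetch them directly rather than
# forwarding them to any configured cache_peer/parent.
http_access allow localservers
http_access allow localdomain
always_direct allow localservers
always_direct allow localdomain
```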



Re: [squid-users] cache windows update

2012-06-20 Thread Muhammad Yousuf Khan
I am using this; do I have to worry about the version, or is it fine to go?


Squid Cache: Version 2.7.STABLE9
configure options:  '--prefix=/usr' '--exec_prefix=/usr'
'--bindir=/usr/sbin' '--sbindir=/usr/sbin'
'--libexecdir=/usr/lib/squid' '--sysconfdir=/etc/squid'
'--localstatedir=/var/spool/squid' '--datadir=/usr/share/squid'
'--enable-async-io' '--with-pthreads'
'--enable-storeio=ufs,aufs,coss,diskd,null' '--enable-linux-netfilter'
'--enable-arp-acl' '--enable-epoll'
'--enable-removal-policies=lru,heap' '--enable-snmp'
'--enable-delay-pools' '--enable-htcp' '--enable-cache-digests'
'--enable-underscores' '--enable-referer-log' '--enable-useragent-log'
'--enable-auth=basic,digest,ntlm,negotiate'
'--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-carp'
'--enable-follow-x-forwarded-for' '--with-large-files'
'--with-maxfd=65536' 'amd64-debian-linux'
'build_alias=amd64-debian-linux' 'host_alias=amd64-debian-linux'
'target_alias=amd64-debian-linux' 'CFLAGS=-Wall -g -O2' 'LDFLAGS='
'CPPFLAGS='

Thanks


On Wed, Jun 20, 2012 at 4:59 PM,   wrote:
>
> My bad, I typed it wrong on my BlackBerry.
>
> Sorry.
>
>
> --Original Message--
> From: Markus Sonnenberg
> To: squid-users@squid-cache.org
> Subject: Re: [squid-users] cache windows update
> Sent: Jun 20, 2012 12:57
>
>  here you go.
>  http://wiki.squid-cache.org/SquidFaq/WindowsUpdate
>
> ct,
>
> On 6/20/2012 1:53 PM, Muhammad Yousuf Khan wrote:
>> when I click on the mentioned link it gives me this:
>>
>> This page does not exist yet. You can create a new empty page, or use
>> one of the page templates.
>>
>> Can you please check the URL?
>>
>> Thanks,
>>
>>
>> On Wed, Jun 20, 2012 at 4:12 PM,  wrote:
>>> I was able to do it by following wiki.squid-cache.org/squidfaq/windowsUpdate
>>>
>>> So far it's been working great, and it also solved the problem with proxy auth.
>>>
>>>
>>> --Original Message--
>>> From: Muhammad Yousuf Khan
>>> To: squid-users
>>> Subject: [squid-users] cache windows update
>>> Sent: Jun 20, 2012 12:04
>>>
>>> I would like to cache all Windows Update data. I am not using a WU
>>> server; I just want to cache every update downloaded by Windows 7 or XP.
>>> It should be cached in my specified location, and the next query for
>>> the same should be fulfilled with the existing cached file rather than
>>> downloading things directly from Microsoft websites.
>>> Any help would be appreciated in this regard.
>>>
>>> Thanks,
>>>
>>>
>>> Sent from my BlackBerry® smartphone
>>> www.blackberry.com
>
>
>
> Sent from my BlackBerry® smartphone
> www.blackberry.com
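[Editor's note: the SquidFaq/WindowsUpdate page linked in this thread describes settings along the following lines. This is a sketch of the wiki's approach at the time for Squid 2.7; consult the current page for the exact patterns and sizes.]

```
# Allow full-file fetches for ranged update requests and large objects.
range_offset_limit -1
maximum_object_size 200 MB
quick_abort_min -1

# Keep update payloads cached long-term; revalidate instead of refetching.
refresh_pattern -i microsoft.com/.*\.(cab|exe|msi|msu|msf|dat|zip) 4320 80% 43200 reload-into-ims
refresh_pattern -i windowsupdate.com/.*\.(cab|exe|msi|msu|msf|dat|zip) 4320 80% 43200 reload-into-ims
refresh_pattern -i windows.com/.*\.(cab|exe|msi|msu|msf|dat|zip) 4320 80% 43200 reload-into-ims
```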


Re: [squid-users] cache windows update

2012-06-20 Thread miguelmclara

My bad, I typed it wrong on my BlackBerry.

Sorry.


--Original Message--
From: Markus Sonnenberg
To: squid-users@squid-cache.org
Subject: Re: [squid-users] cache windows update
Sent: Jun 20, 2012 12:57

  here you go.
  http://wiki.squid-cache.org/SquidFaq/WindowsUpdate

ct,

On 6/20/2012 1:53 PM, Muhammad Yousuf Khan wrote:
> when I click on the mentioned link it gives me this:
>
> This page does not exist yet. You can create a new empty page, or use
> one of the page templates.
>
> Can you please check the URL?
>
> Thanks,
>
>
> On Wed, Jun 20, 2012 at 4:12 PM,  wrote:
>> I was able to do it by following wiki.squid-cache.org/squidfaq/windowsUpdate
>>
>> So far it's been working great, and it also solved the problem with proxy auth.
>>
>>
>> --Original Message--
>> From: Muhammad Yousuf Khan
>> To: squid-users
>> Subject: [squid-users] cache windows update
>> Sent: Jun 20, 2012 12:04
>>
>> I would like to cache all Windows Update data. I am not using a WU
>> server; I just want to cache every update downloaded by Windows 7 or XP.
>> It should be cached in my specified location, and the next query for
>> the same should be fulfilled with the existing cached file rather than
>> downloading things directly from Microsoft websites.
>> Any help would be appreciated in this regard.
>>
>> Thanks,
>>
>>
>> Sent from my BlackBerry® smartphone
>> www.blackberry.com



Sent from my BlackBerry® smartphone 
www.blackberry.com

Re: [squid-users] cache windows update

2012-06-20 Thread Markus Sonnenberg

 here you go.
 http://wiki.squid-cache.org/SquidFaq/WindowsUpdate

ct,

On 6/20/2012 1:53 PM, Muhammad Yousuf Khan wrote:

when I click on the mentioned link it gives me this:

This page does not exist yet. You can create a new empty page, or use
one of the page templates.

Can you please check the URL?

Thanks,


On Wed, Jun 20, 2012 at 4:12 PM,  wrote:

I was able to do it by following wiki.squid-cache.org/squidfaq/windowsUpdate

So far it's been working great, and it also solved the problem with proxy auth.


--Original Message--
From: Muhammad Yousuf Khan
To: squid-users
Subject: [squid-users] cache windows update
Sent: Jun 20, 2012 12:04

I would like to cache all Windows Update data. I am not using a WU
server; I just want to cache every update downloaded by Windows 7 or XP.
It should be cached in my specified location, and the next query for
the same should be fulfilled with the existing cached file rather than
downloading things directly from Microsoft websites.
Any help would be appreciated in this regard.

Thanks,


Sent from my BlackBerry® smartphone
www.blackberry.com




Re: [squid-users] cache windows update

2012-06-20 Thread Muhammad Yousuf Khan
when I click on the mentioned link it gives me this:

This page does not exist yet. You can create a new empty page, or use
one of the page templates.

Can you please check the URL?

Thanks,


On Wed, Jun 20, 2012 at 4:12 PM,   wrote:
>
> I was able to do it by following wiki.squid-cache.org/squidfaq/windowsUpdate
>
> So far it's been working great, and it also solved the problem with proxy auth.
>
>
> --Original Message--
> From: Muhammad Yousuf Khan
> To: squid-users
> Subject: [squid-users] cache windows update
> Sent: Jun 20, 2012 12:04
>
> I would like to cache all Windows Update data. I am not using a WU
> server; I just want to cache every update downloaded by Windows 7 or XP.
> It should be cached in my specified location, and the next query for
> the same should be fulfilled with the existing cached file rather than
> downloading things directly from Microsoft websites.
> Any help would be appreciated in this regard.
>
> Thanks,
>
>
> Sent from my BlackBerry® smartphone
> www.blackberry.com


Re: [squid-users] cache windows update

2012-06-20 Thread miguelmclara

I was able to do it by following wiki.squid-cache.org/squidfaq/windowsUpdate

So far it's been working great, and it also solved the problem with proxy auth.


--Original Message--
From: Muhammad Yousuf Khan
To: squid-users
Subject: [squid-users] cache windows update
Sent: Jun 20, 2012 12:04

I would like to cache all Windows Update data. I am not using a WU
server; I just want to cache every update downloaded by Windows 7 or XP.
It should be cached in my specified location, and the next query for
the same should be fulfilled with the existing cached file rather than
downloading things directly from Microsoft websites.
Any help would be appreciated in this regard.

Thanks,


Sent from my BlackBerry® smartphone 
www.blackberry.com

[squid-users] cache windows update

2012-06-20 Thread Muhammad Yousuf Khan
I would like to cache all Windows Update data. I am not using a WU
server; I just want to cache every update downloaded by Windows 7 or XP.
It should be cached in my specified location, and the next query for
the same should be fulfilled with the existing cached file rather than
downloading things directly from Microsoft websites.
Any help would be appreciated in this regard.

Thanks,


[squid-users] Squid 3.1.x and Kemp loadbalancer.

2012-06-20 Thread Josef Karliak

  Hi there,
  we use a Kemp load balancer for balancing the proxies (active-backup).
All users have the IP of the Kemp load balancer set. But the Squid
access_log contains the IP of the load balancer; I want the IP of the
user that is accessing the web pages there (we use Webalizer for
analyzing top browsing users).

  My logformat defined in squid.conf:
logformat combined %>a %ui %un [%{%d/%b/%Y:%H:%M:%S +0000}tl] \
  "%rm %ru HTTP/%rv" %>Hs %<st "%{Referer}>h" "%{User-Agent}>h" %Ss:%Sh

  Do I have some bad variable in the logformat?
  Thank you very much and best regards
  J.Karliak
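[Editor's note: when a load balancer sits between clients and Squid, Squid only sees the balancer as the TCP client, so %>a logs the balancer's address. A hedged squid.conf sketch of the usual remedy; it assumes Squid 3.1 built with --enable-follow-x-forwarded-for and a balancer configured to inject X-Forwarded-For, and the balancer address below is hypothetical.]

```
# Trust X-Forwarded-For only from the load balancer, so the logged
# client address (%>a) becomes the original user's IP.
acl kemp_lb src 192.0.2.10          # hypothetical balancer address
follow_x_forwarded_for allow kemp_lb
follow_x_forwarded_for deny all
log_uses_indirect_client on
```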

--
My domain uses SPF (www.openspf.org) and DomainKeys/DKIM (with ADSP)
policy and checks. If you have problems sending email to me, start
using the email-origin verification methods mentioned above. Thank you.


This message was sent using IMP, the Internet Messaging Program.





Re: [squid-users] acl forbidden_domains dstdom_regex "file.txt" with huge file fails

2012-06-20 Thread Matus UHLAR - fantomas

On 19.06.12 18:52, Stefan Bauer wrote:

with a 30 MB file. Squid is instantly terminating if this acl stanza is set 
active. Where and how do we have to tune squid settings to achieve this?


Terminating with what reason? I would not wonder if all the regexes did 
not fit into memory. Note that using a bunch of regexes slows squid down 
very much - it has to compare tons of regexps for every request.


We're aware of third-party software like squidguard for this task - we 
only want to use additional software if everything else fails :)


well, now squid failed...
--
Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
My mind is like a steel trap - rusty and illegal in 37 states.