[squid-users] squid 2.7 - problems with kerberos authentication

2009-09-01 Thread Дмитрий Нестеркин
I'm trying to configure Kerberos authentication for Squid 2.7 (Debian
Lenny, MIT kerberos; Windows Server 2003 no service packs), but no
luck :(

This is what my configuration files look like:

squid.conf:

auth_param negotiate program /usr/lib/squid/squid_kerb_auth -d
auth_param negotiate children 10
auth_param negotiate keep_alive on

acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
#acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.10.0/24 # RFC1918 possible internal network
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 3128
acl CONNECT method GET
#
#CHECKING USERS BY AD
#
external_acl_type ldap_check ttl=1200 %LOGIN
/usr/lib/squid/squid_ldap_group -R -b dc=mydomain,dc=local -f
"(&(objectclass=user)(sAMAccountName=%v)(memberof=cn=%a,ou=internet,dc=mydomain,dc=local))" -D
proxyu...@mydomain.local -w password -K -d 192.168.100.42
#
acl auth proxy_auth REQUIRED
acl inet_access external ldap_check inet_allow
#
http_access allow inet_access
http_access allow manager localhost
http_access deny manager
# Deny requests to unknown ports
http_access deny !Safe_ports
# Deny CONNECT to other than SSL ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
#
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny !auth
http_access allow auth
http_access deny all


/etc/init.d/squid contains:
KRB5_KTNAME=/etc/squid/krbldap.mydomain.local.keytab
export KRB5_KTNAME
KRB5RCACHETYPE=none
export KRB5RCACHETYPE


/etc/krb5.conf:
[libdefaults]
default_realm = MYDOMAIN.LOCAL
dns_lookup_realm = no
dns_lookup_kdc = no
default_keytab_name = /etc/squid/krbldap.mydomain.local.keytab
default_tgs_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
default_tkt_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
permitted_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
ticket_lifetime = 24h
# The following krb5.conf variables are only for MIT Kerberos.
# krb4_config = /etc/krb.conf
# krb4_realms = /etc/krb.realms
kdc_timesync = 1
ccache_type = 4
forwardable = true
proxiable = true
# The following libdefaults parameters are only for Heimdal Kerberos.
v4_instance_resolve = false
v4_name_convert = {
host = {
rcmd = host
ftp = ftp
}
plain = {
something = something-else
}
}
fcc-mit-ticketflags = true

[realms]
MYDOMAIN = {
kdc = dc.mydomain:88
admin_server = dc.mydomain:749
default_domain = mydomain
}

MYDOMAIN.LOCAL = {
kdc = dc.mydomain.local:88
admin_server = dc.mydomain.local:749
default_domain = mydomain.local
}

[domain_realm]
.linux.local = MYDOMAIN.LOCAL
.mydomain.local = MYDOMAIN.LOCAL
mydomain.local = MYDOMAIN.LOCAL
.mydomain = MYDOMAIN
mydomain = MYDOMAIN
#[appdefaults]
#pam = {
#debug = false
#ticket_lifetime = 36000
#renew_lifetime = 36000
#forwardable = true
#krb4_convert = false
#}

#[kdc]
#profile = /usr/share/krb5-kdc/kdc.conf

#[login]
# krb4_convert = false
# krb4_get_tickets = false

[logging]
default = FILE:/var/log/krb5lib.log
kdc = FILE:/var/log/kdc.log
kdc = SYSLOG:INFO:DAEMON
admin_server = FILE:/var/log/kadmin.log


When I check authentication from the terminal, it's OK:
$ sudo kinit -V -k -t /etc/squid/krbldap.mydomain.local.keytab
HTTP/Most2.mydomain.local
Authenticated to Kerberos v5

When I allow users by IP address instead, everything is OK.

access.log:
1251706346.035 0 192.168.10.133 TCP_DENIED/407 1750 GET
http://www.debian.org/ - NONE/- text/html

Internet Explorer 7 shows the error message "Internet Explorer cannot
display this page".
Opera 9.6 prompts for a login and password, but they are not accepted.

What am I doing wrong?
--
Best regards,
Dmitry


[squid-users] StoreUrlRewrite + url_rewrite_program

2009-09-01 Thread pokeman

Hello 
can i use StoreUrlRewrite + url_rewrite_program at the same time ?


-- 
View this message in context: 
http://www.nabble.com/StoreUrlRewrite-%2B-url_rewrite_program-tp25236859p25236859.html
Sent from the Squid - Users mailing list archive at Nabble.com.



Re: [squid-users] Reverse Proxy Question

2009-09-01 Thread John Doe
From: Jones, Keven keven.jo...@ncr.com
 Example.com --server1 or server2
 Is this possible? If so anyone have the documentation on how to accomplish 
 this.

Something like this should work I think (I use squid 2.7):

  http_port IP:PORT accel defaultsite=example.com act-as-origin vhost
  cache_peer IP1 parent PORT 0 no-query originserver round-robin 
name=server1
  cache_peer IP2 parent PORT 0 no-query originserver round-robin 
name=server2
  acl mydomain dstdomain example.com
  cache_peer_access server1 allow mydomain
  cache_peer_access server2 allow mydomain
  cache_peer_access server1 deny all
  cache_peer_access server2 deny all
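The round-robin parent selection those cache_peer lines ask for can be pictured like this (a toy sketch of the idea, not Squid's code; the peer names match the config above):

```python
# Toy sketch of round-robin selection over the allowed parent peers.
import itertools

peers = ["server1", "server2"]  # the two cache_peer names above
rr = itertools.cycle(peers)

# Successive requests for example.com alternate between the two origins.
picks = [next(rr) for _ in range(4)]
print(picks)  # ['server1', 'server2', 'server1', 'server2']
```

The cache_peer_access lines then simply remove peers from the candidate set before this rotation runs.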

JD


  



Re: [squid-users] Java not working behind squid

2009-09-01 Thread Truth Seeker

Really, thanks for your effort... I was not able to get back to you because
there were so many unexpected issues on the proxy...

Your resolution didn't work for me...

I didn't even get
http://balancer.netdania.com/StreamingServer/StreamingServer? in my access.log;

rather, I always see DENIED for balancer, like the following:

TCP_DENIED/407 2912 CONNECT balancer.netdania.com:443 - NONE/- text/html


Any HELP please...



 We have a similar setup on one VLAN, with squid on linux
 authenticating
 users using active directory.  We've seen lots of
 issues with Java not
 being able to authenticate.
 
 Testing the page you're talking about (albeit with a linux
 desktop), I get
 a java popup window asking me for my AD
 username/password/domain, I type it
 in but repeatedly it fails.
 
 The squid access.log says:
 
 1251204847.837      0 172.16.1.3
 TCP_DENIED/407 1846 CONNECT balancer.netdania.com:443 -
 NONE/- text/html
 1251204847.842      0 172.16.1.3
 TCP_DENIED/407 1846 CONNECT balancer.netdania.com:443 -
 NONE/- text/html
 
 I'm not sure if these lines in cache.log are relevant or
 not.
 
 [2009/08/25 13:42:00, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:00, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:01, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:01, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:47:02, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 
 My usual workaround is to add an ACL for that site which is
 far from ideal.
 I've added the following ACL:
 
     acl dailyfx dstdomain
 balancer.netdania.com
     http_access allow dailyfx CONNECT
 
 That works around the issue for me.  I still get
 prompted for the username
 and password and the logs suggest some traffic isn't
 getting through.
 
 1251205769.600  14385 172.16.1.3 TCP_MISS/000 7263
 CONNECT balancer.netdania.com:443 -
 FIRST_UP_PARENT/172.20.2.3 - 1251205771.233   
   1 172.16.1.3 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205771.239      3 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205771.516    277 172.16.1.3 TCP_MISS/200
 1443 GET http://balancer.netdania.com/StreamingServer/StreamingServer?
 gavinmc FIRST_UP_PARENT/172.20.2.3 application/zip
 1251205774.813     55 172.16.1.3
 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205774.816      0 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205776.537   1721 172.16.1.3
 TCP_MISS/200 1125 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 gavinmc FIRST_UP_PARENT/172.20.2.3 application/zip
 1251205779.681      1 172.16.1.3
 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205779.685      1 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 
 If I drop the word CONNECT I get no errors at all, but that
 disables
 authentication entirely for that site.
 
 There is definitely some issue with authentication and
 Java.  I'm not sure
 if it might actually be Authentication+Java+SSL.  Our
 problems are
 generally with java-driven online banking applications.
 
 Gavin 
 
 
 





Re: [squid-users] Restricting access to users logging onto windows domain

2009-09-01 Thread Amos Jeffries

Tejpal Amin wrote:

HI,

I have a squid proxy which uses NTLM authentication for authenticating users.

I would like to restrict access only to users logging onto domain for
the other users it should deny access.
The problem I am facing is that for machines that are not joined to
windows domain, the squid throws up an authentication dialog box.


So you require authentication to use the proxy, but do not want Squid to 
notify the browsers about this critical requirement?



http_access deny !auth all

Where auth is whatever ACL name you have in your squid.conf to test 
authentication.



Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: [squid-users] StoreUrlRewrite + url_rewrite_program

2009-09-01 Thread Amos Jeffries

pokeman wrote:
Hello 
can i use StoreUrlRewrite + url_rewrite_program at the same time ?




Yes. You can use any two directives in squid.conf at the same time. How 
they interact is another matter.


I'm not sure what the effect on reliability of your cache would be when 
you actually fetch an object from URL-X and then store it with URL-Z 
then pass it to the client claiming it's URL-Y.


Consider VERY carefully and beware of dragons.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: [squid-users] Java not working behind squid

2009-09-01 Thread Gavin McCullagh
On Tue, 01 Sep 2009, Truth Seeker wrote:

 Really, thanks for your effort... I was not able to get back to you because
 there were so many unexpected issues on the proxy...
 
 Your resolution didn't work for me...
 
 I didn't even get
 http://balancer.netdania.com/StreamingServer/StreamingServer? in my access.log;
 
 rather, I always see DENIED for balancer, like the following:
 
 TCP_DENIED/407 2912 CONNECT balancer.netdania.com:443 - NONE/- text/html

Perhaps you might tell us (ie copy and paste config) exactly what you did.

Gavin



Re: [squid-users] StoreUrlRewrite + url_rewrite_program

2009-09-01 Thread pokeman

Thanks for your prompt response. I am using a custom url-rewrite script to catch
and save objects, much like videocache; the only issue is that it cannot catch
YouTube. The idea is to use store-url-rewrite to save YouTube objects. One more
thing I am thinking about: currently I have over 500 GB of content saved on my
httpd server, and Squid has only 200 GB of cache drives. After implementing
store_rewrite we would need to add more storage. Is there any other way to save
these objects, i.e. have Squid save them on my httpd server instead?







Amos Jeffries-2 wrote:
 
 pokeman wrote:
 Hello 
 can i use StoreUrlRewrite + url_rewrite_program at the same time ?
 
 
 Yes. You can use any two directives in squid.conf at the same time. How 
 they interact is another matter.
 
 I'm not sure what the effect on reliability of your cache would be when 
 you actually fetch an object from URL-X and then store it with URL-Z 
 then pass it to the client claiming it's URL-Y.
 
 Consider VERY carefully and beware of dragons.
 
 Amos
 -- 
 Please be using
Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
Current Beta Squid 3.1.0.13
 
 

-- 
View this message in context: 
http://www.nabble.com/StoreUrlRewrite-%2B-url_rewrite_program-tp25236859p25237905.html
Sent from the Squid - Users mailing list archive at Nabble.com.



Re: AW: [squid-users] Mixing cached and non-cached access of same URLs by session-id

2009-09-01 Thread Amos Jeffries

Schermuly-Koch, Achim wrote:

Hi Amos,

thanks for your advice so far. I am still not sure which path to follow...



We are using squid as a reverse-proxy cache to speed up our website.
A large area of the website is public. But there is also a
personalized area. If a user logs into his personal site, we maintain
a session for the user (using standard Tomcat features: a jsessionid
cookie with optional URL rewriting).



[...] the pages in the public area have a small caveat: if the user
was logged in to the private area, we maintain the logged-in state and
reflect that state on public pages also (outputting "Welcome John
Doe" in a small box). Of course we must not cache these pages.



# Recognizes mysite
acl MYSITE url_regex ^http://[^.]*\.mysite\.de

# Don't cache pages, if user sends or gets a cookie
acl JSESSIONID1 req_header Cookie -i  jsessionid
cache deny MYSITE JSESSIONID1

acl JSESSIONID2 rep_header Set-Cookie -i jsessionid
cache deny MYSITE JSESSIONID2



This seemed to work fine, until I did a jmeter test mixing requests
with and without session-id cookies. It seems that if I request an
already cached URL with a session cookie, the cached document is
flushed.




[...]


Of course if Squid finds that it has a cached copy it will erase it, because 
the _URL_ is not to be cached. Content is not considered.


This is NOT the right way to do privacy caching. See below for why and 
how to do it.


[...]


The biggest surprise of all is still hiding unseen by you:


Every other cache around the Internet that visitors use may be storing the 
private area pages!!


This is because you use a local configuration completely internal to 
your Squid to determine what is cacheable and what is not.



The correct way to do this is to:


 * have the web server which generates the pages add a header 
(Cache-Control: private) to all pages which are in the private area of 
the website. This tells every shared cache (your Squid included) not to 
store the private info.


I agree with that. Do I have to configure the reverse-proxy *explicitly*
to avoid caching Cache-Control: private marked pages?


No, the proxy will avoid caching them by default.



A problem I foresee with that solution: if I set Cache-Control: 
private for pages containing personalized content, they will bounce 
cached pages with the same URL but without personalized content 
(remember: the page is rendered differently depending on whether the 
user is in a session).


Yes, this is a problem in some versions of Squid. Versions with proper ETag 
support do not have this problem, though the Squid-2 series currently handles 
ETag better than Squid-3.





 * have the personal adjustments to the public pages done as small 
includes so that the main body and content of the page can be cached 
normally, but the small modifications are not.
For example I like including a small CSS/AJAX script which changes a 
generic HTML div [..]


I have thought of that, too. But I would prefer not to touch 
the application.


Okay, then you are stuck with CC: private and ETag to work with.



The HTTP way to achieve something similar is to add an ETag: header containing 
some hash of the page content, so each unique copy of the page is stored 
separately. The personalized pages get Cache-Control: private added as 
well, so that the whole response gets discarded.


That sounds interesting... Are the following assumptions correct:

The ETag would be generated by the webserver. A public page (/index.jsp) 
would have _one_ ETag if rendered without a session, and a different unique 
ETag for each request (to the same /index.jsp) with a session cookie. The 
cache entry for the publicly cached page would be left untouched if the 
response bears a Cache-Control: private header but with a different ETag. 
That implies the cache is flushed when the webserver responds, not when the 
client requests.


Does the ETag have to be unique resource-wide, or is it also possible
to use the same ETag for different resources (since they have
different URLs)?

Is it another very bad idea (tm) to reuse the same ETag for each 
personalized page? I would assume it doesn't matter, since they are
marked private anyway?


Theoretically you are right, it _should_ not matter. However, in practice 
proxies, on seeing 'private', may discard all copies of objects at 
the URL. Squid uses its limited ETag support to get around that issue: 
the ETags which are marked private always get discarded, even if 
previously marked public, but the others are not discarded.


ETag is meant to identify each unique copy of an object at a URL: the 
compressed vs non-compressed versions, the personalized vs 
non-personalized versions, and so on.
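As a toy illustration of that per-variant behaviour (my own sketch, not Squid's implementation): a cache keyed on (URL, ETag) can drop only the private variant while leaving the public copy untouched.

```python
# Toy model of per-variant caching keyed on (URL, ETag).
# Not Squid's code -- just the idea that Cache-Control: private need only
# evict the one variant it arrives on, not every copy stored at the URL.
class VariantCache:
    def __init__(self):
        self.store = {}  # (url, etag) -> body

    def put(self, url, etag, body, private=False):
        if private:
            # Discard only the private variant; public variants survive.
            self.store.pop((url, etag), None)
        else:
            self.store[(url, etag)] = body

    def get(self, url, etag):
        return self.store.get((url, etag))

cache = VariantCache()
cache.put("/index.jsp", '"public-v1"', "<html>generic</html>")
cache.put("/index.jsp", '"user-42"', "<html>Welcome John</html>", private=True)
print(cache.get("/index.jsp", '"public-v1"'))  # generic copy still cached
print(cache.get("/index.jsp", '"user-42"'))    # None: private variant not stored
```

A cache without ETag support has only the URL as a key, which is why it must discard everything when 'private' arrives.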


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: [squid-users] sometimes the users can't visit any webpage

2009-09-01 Thread Amos Jeffries

Jesus Angeles wrote:

Hi all, I have a problem. Three weeks ago I installed Squid 2.7.STABLE3 +
Dansguardian 2.10.1.1 on GNU/Linux Ubuntu Server 9.04. The first week was OK,
but then the service started to fail: sometimes (once or twice a day) the
users can't visit any webpage, and the web browser shows a blank page (delay
on load). In those moments I check:
-   The squid service is running.
-   Dansguardian is OK, because if the users try to visit a prohibited
web page, it shows the access denied page.
-   The logfile (access.log) is generating logs (I checked with tail -f).
-   The memory and HD space are OK (I have configured 256 MB in cache_mem
and 4096 MB in cache_dir).
Then, in those moments, I have to execute “/etc/init.d/squid reload” to
solve the problem.

What could be happening?


Anything could be happening.

init.d 'reload' is also known as 'squid -k reconfigure', which closes 
all network connections, reads the config file again, and restarts all of 
Squid's internal processes.


Look in your cache.log for any useful information. Change the 
debug_options setting to a higher logging level if there is nothing there.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: [squid-users] StoreUrlRewrite + url_rewrite_program

2009-09-01 Thread Amos Jeffries

pokeman wrote:

Thanks for your prompt response. I am using a custom url-rewrite script to catch
and save objects, much like videocache; the only issue is that it cannot catch
YouTube. The idea is to use store-url-rewrite to save YouTube objects. One more
thing I am thinking about: currently I have over 500 GB of content saved on my
httpd server, and Squid has only 200 GB of cache drives. After implementing
store_rewrite we would need to add more storage. Is there any other way to save
these objects, i.e. have Squid save them on my httpd server instead?



Youtube requires special handling. The details are written here:

http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube/Discussion

Also note that the fine details change regularly, so even those pages 
change almost monthly as the interested people find additions and 
alterations.


Amos




Amos Jeffries-2 wrote:

pokeman wrote:
Hello 
can i use StoreUrlRewrite + url_rewrite_program at the same time ?


Yes. You can use any two directives in squid.conf at the same time. How 
they interact is another matter.


I'm not sure what the effect on reliability of your cache would be when 
you actually fetch an object from URL-X and then store it with URL-Z 
then pass it to the client claiming it's URL-Y.


Consider VERY carefully and beware of dragons.

Amos
--
Please be using
   Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
   Current Beta Squid 3.1.0.13







--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: Fwd: [squid-users] Need help in integrating squid and samba

2009-09-01 Thread Amos Jeffries

Avinash Rao wrote:

On 8/31/09, Amos Jeffries squ...@treenet.co.nz wrote:

Avinash Rao wrote:



On Mon, Aug 24, 2009 at 1:00 AM, Henrik Nordstrom

hen...@henriknordstrom.net
mailto:hen...@henriknordstrom.net wrote:

   On Sun 2009-08-23 at 15:08 +0530, Avinash Rao wrote:
 I couldn't find any document that shows me how to enable wb_info
   for squid.
 Can anybody help me?

   external_acl_type NT_Group %LOGIN
   /usr/local/squid/libexec/wbinfo_group.pl

   acl group1 external NT_Group group1


   then use group1 whenever you want to match users belonging to that
   Windows group.

   Regards
   Henrik


Hi Henrik,

I have used the following in my squid.conf

external_acl_type NT_Group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl group1 external NT_Group staff

acl net time M T W T F S S 9:00-18:00
http_access allow net

On my linux server, I have created a group called staff and made a couple

of users a member of this group called staff. My intention is to provide
access to users belonging to group staff on all days from morning 9am - 7PM.
The rest should be denied.

But this didn't work: when the Samba users log in from a WinXP client, they
don't get access to the internet at all.
There is no http_access line making any use of ACL group1

And _everybody_ (me included, on this side of the Internet) is allowed to use
your proxy between 9am and 6pm.


Amos
--
Please be using
 Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
 Current Beta Squid 3.1.0.13




Thanks for the reply. Yes, I missed http_access allow group1.
I didn't understand your second statement; are you telling me that I
should deny access to net?


You should combine the ACL with others on an http_access line, so that 
it's limited in who it allows.


This:
 acl net time M T W T F S S 9:00-18:00
 http_access allow net

simply says all requests are allowed between time X and Y.
Without additional controls, i.e. on the IP address making the request, you 
end up with an open proxy.
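Squid evaluates http_access lines top-down and stops at the first line whose ACLs all match, which is why a lone time-based allow opens the proxy. A minimal sketch of that first-match semantics (my own toy model with hypothetical rule names, not Squid's code):

```python
# First-match semantics of Squid's http_access list (simplified sketch).
# Each rule is (action, [predicates]); all predicates on a line must match.
def http_access(rules, request):
    for action, predicates in rules:
        if all(p(request) for p in predicates):
            return action
    # Real Squid defaults to the opposite of the last rule's action;
    # here we simply deny when nothing matched.
    return "deny"

in_time = lambda r: 9 <= r["hour"] < 18
in_group = lambda r: r["user_group"] == "staff"

# 'http_access allow net' alone: anyone inside the time window is allowed.
open_rules = [("allow", [in_time])]
# Combining ACLs on one line restricts the allow to the staff group.
safe_rules = [("allow", [in_time, in_group]), ("deny", [])]

print(http_access(open_rules, {"hour": 10, "user_group": "anonymous"}))  # allow
print(http_access(safe_rules, {"hour": 10, "user_group": "anonymous"}))  # deny
print(http_access(safe_rules, {"hour": 10, "user_group": "staff"}))      # allow
```

ACLs listed on one http_access line are ANDed; separate lines are alternatives tried in order.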


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13


Re: [squid-users] Java not working behind squid

2009-09-01 Thread Tejpal Amin
Gavin ,

Try putting this acl

acl Java browser Java/1.4 Java/1.5 Java/1.6
http_access allow Java

This worked for me when using NTLM auth.

Regards
Tej


On Tue, Sep 1, 2009 at 2:45 PM, Truth Seekertruth_seeker_3...@yahoo.com wrote:

 Really, thanks for your effort... I was not able to get back to you because
 there were so many unexpected issues on the proxy...

 Your resolution didn't work for me...

 I didn't even get
 http://balancer.netdania.com/StreamingServer/StreamingServer? in my access.log;

 rather, I always see DENIED for balancer, like the following:

 TCP_DENIED/407 2912 CONNECT balancer.netdania.com:443 - NONE/- text/html


 Any HELP please...



 We have a similar setup on one VLAN, with squid on linux
 authenticating
 users using active directory.  We've seen lots of
 issues with Java not
 being able to authenticate.

 Testing the page you're talking about (albeit with a linux
 desktop), I get
 a java popup window asking me for my AD
 username/password/domain, I type it
 in but repeatedly it fails.

 The squid access.log says:

 1251204847.837      0 172.16.1.3
 TCP_DENIED/407 1846 CONNECT balancer.netdania.com:443 -
 NONE/- text/html
 1251204847.842      0 172.16.1.3
 TCP_DENIED/407 1846 CONNECT balancer.netdania.com:443 -
 NONE/- text/html

 I'm not sure if these lines in cache.log are relevant or
 not.

 [2009/08/25 13:42:00, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:00, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:01, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:42:01, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1
 [2009/08/25 13:47:02, 1]
 libsmb/ntlmssp.c:ntlmssp_update(267)
   got NTLMSSP command 3, expected 1

 My usual workaround is to add an ACL for that site which is
 far from ideal.
 I've added the following ACL:

     acl dailyfx dstdomain
 balancer.netdania.com
     http_access allow dailyfx CONNECT

 That works around the issue for me.  I still get
 prompted for the username
 and password and the logs suggest some traffic isn't
 getting through.

 1251205769.600  14385 172.16.1.3 TCP_MISS/000 7263
 CONNECT balancer.netdania.com:443 -
 FIRST_UP_PARENT/172.20.2.3 - 1251205771.233
   1 172.16.1.3 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205771.239      3 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205771.516    277 172.16.1.3 TCP_MISS/200
 1443 GET http://balancer.netdania.com/StreamingServer/StreamingServer?
 gavinmc FIRST_UP_PARENT/172.20.2.3 application/zip
 1251205774.813     55 172.16.1.3
 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205774.816      0 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205776.537   1721 172.16.1.3
 TCP_MISS/200 1125 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 gavinmc FIRST_UP_PARENT/172.20.2.3 application/zip
 1251205779.681      1 172.16.1.3
 TCP_DENIED/407 1954 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html
 1251205779.685      1 172.16.1.3
 TCP_DENIED/407 1969 GET 
 http://balancer.netdania.com/StreamingServer/StreamingServer?
 - NONE/- text/html

 If I drop the word CONNECT I get no errors at all, but that
 disables
 authentication entirely for that site.

 There is definitely some issue with authentication and
 Java.  I'm not sure
 if it might actually be Authentication+Java+SSL.  Our
 problems are
 generally with java-driven online banking applications.

 Gavin









Re: [squid-users] Restricting access to users logging onto windows domain

2009-09-01 Thread Tejpal Amin
Amos,

I tried putting this line in the conf file but it did not work.

My aim is to stop users not logging onto my AD domain from accessing
the internet.
I have configured NTLM authentication for my squid, but the issue is
the users not logging onto the domain get a prompt for authentication.

There should be no way of accessing the internet for non-domain users.

Regards
Tej

On Tue, Sep 1, 2009 at 2:54 PM, Amos Jeffriessqu...@treenet.co.nz wrote:
 Tejpal Amin wrote:

 HI,

 I have a squid proxy which uses NTLM authentication for authenticating
 users.

 I would like to restrict access only to users logging onto domain for
 the other users it should deny access.
 The problem I am facing is that for machines that are not joined to
 windows domain, the squid throws up an authentication dialog box.

 So you require authentication to use the proxy, but do not want Squid to
 notify the browsers about this critical requirement?


 http_access deny !auth all

 Where auth is whatever ACL name you have in your squid.conf to test
 authentication.


 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13



Re: [squid-users] Java not working behind squid

2009-09-01 Thread Gavin McCullagh
Hi,

On Tue, 01 Sep 2009, Tejpal Amin wrote:

 Try putting this acl
 
 acl Java browser Java/1.4 Java/1.5 Java/1.6
 http_access allow Java
 
 This worked for me when using NTLM auth.

Thanks, though I'm not the one in need of a solution and I'm not that keen
to give Java full unauthenticated browsing rights.  

Perhaps Truth Seeker(?) might try that though.

Am I to understand that Java is just really bad at NTLM auth, so much so
that people just whitelist it for unauthenticated access?

Gavin



[squid-users] cache.log errors

2009-09-01 Thread Mark Lodge

What can be the cause of these errors in my cache.log?

2009/08/31 07:35:02| storeDirWriteCleanLogs: Starting...
2009/08/31 07:35:02|   Finished.  Wrote 6991 entries.
2009/08/31 07:35:02|   Took 0.0 seconds (2124924.0 entries/sec).
2009/08/31 07:35:02| logfileRotate: /var/log/squid/store.log
2009/08/31 07:35:02| logfileRotate (stdio): /var/log/squid/store.log
2009/08/31 07:35:02| logfileRotate: /var/log/squid/access.log
2009/08/31 07:35:02| logfileRotate (stdio): /var/log/squid/access.log
2009/08/31 07:35:02| helperOpenServers: Starting 7 'python' processes
2009/08/31 07:46:38| sslReadClient: FD 36: read failure: (110) 
Connection timed out
2009/08/31 08:17:16| clientTryParseRequest: FD 27 (10.0.0.50:37271) 
Invalid Request

2009/08/31 08:19:57| parseHttpRequest: Unsupported method ''
2009/08/31 08:19:57| clientTryParseRequest: FD 27 (10.0.0.50:49041) 
Invalid Request

2009/08/31 08:20:37| parseHttpRequest: Unsupported method ''
2009/08/31 08:20:37| clientTryParseRequest: FD 19 (10.0.0.50:49043) 
Invalid Request

2009/08/31 08:20:55| parseHttpRequest: Unsupported method ''
2009/08/31 08:20:55| clientTryParseRequest: FD 27 (10.0.0.50:49044) 
Invalid Request

2009/08/31 08:21:33| parseHttpRequest: Unsupported method ''
2009/08/31 08:21:33| clientTryParseRequest: FD 29 (10.0.0.50:49046) 
Invalid Request

2009/08/31 08:21:49| parseHttpRequest: Unsupported method ''
2009/08/31 08:21:49| clientTryParseRequest: FD 29 (10.0.0.50:49047) 
Invalid Request
2009/08/31 08:26:17| sslReadClient: FD 32: read failure: (110) 
Connection timed out
2009/08/31 08:31:16| clientTryParseRequest: FD 26 (10.0.0.50:56244) 
Invalid Request

2009/08/31 08:31:37| parseHttpRequest: Unsupported method ''
2009/08/31 08:31:37| clientTryParseRequest: FD 26 (10.0.0.50:56246) 
Invalid Request
2009/08/31 08:32:26| clientTryParseRequest: FD 26 (10.0.0.50:56249) 
Invalid Request
2009/08/31 08:32:44| clientTryParseRequest: FD 26 (10.0.0.50:56250) 
Invalid Request

2009/08/31 08:47:47| parseHttpRequest: Unsupported method ''
2009/08/31 08:47:47| clientTryParseRequest: FD 29 (10.0.0.50:59602) 
Invalid Request
2009/08/31 08:48:07| clientTryParseRequest: FD 27 (10.0.0.50:48984) 
Invalid Request
2009/08/31 09:28:13| sslReadClient: FD 25: read failure: (110) 
Connection timed out

2009/08/31 14:53:25| parseHttpRequest: Unsupported method 'CONNECT'
2009/08/31 14:53:25| clientTryParseRequest: FD 26 (10.0.0.50:2653) 
Invalid Request
2009/08/31 17:15:30| sslReadClient: FD 28: read failure: (110) 
Connection timed out
2009/08/31 17:51:08| sslReadClient: FD 25: read failure: (110) 
Connection timed out

2009/08/31 18:53:01| WARNING: Disk space over limit: 102724 KB > 102400 KB
2009/08/31 22:48:29| parseHttpRequest: Unsupported method ''
2009/08/31 22:48:29| clientTryParseRequest: FD 31 (10.0.0.50:45807) 
Invalid Request
2009/08/31 22:53:31| clientTryParseRequest: FD 32 (10.0.0.50:47779) 
Invalid Request
2009/08/31 22:57:56| clientTryParseRequest: FD 44 (10.0.0.50:47808) 
Invalid Request
2009/08/31 23:00:40| clientTryParseRequest: FD 30 (10.0.0.50:55705) 
Invalid Request
2009/08/31 23:10:32| clientTryParseRequest: FD 19 (10.0.0.50:50205) 
Invalid Request
2009/08/31 23:14:45| clientTryParseRequest: FD 28 (10.0.0.50:60430) 
Invalid Request
2009/08/31 23:18:35| clientTryParseRequest: FD 31 (10.0.0.50:33908) 
Invalid Request




Re: [squid-users] Bdigest_pw_auth???

2009-09-01 Thread Henrik Nordstrom
On Mon 2009-08-31 at 21:04 -0500, Luis Daniel Lucio Quiroz wrote:
 2009/08/31 20:45:40| AuthConfig::CreateAuthUser: Unsupported or 
 unconfigured/inactive proxy-auth scheme, 'Bdigest_pw_auth(LDAP_backend) 
 WARNING, LDAP error 'No such object'

Looks like a mix between an error from Squid and digest_pw_auth. Both
are writing to the same log file.

What does the previous and next lines look like?

Regards
Henrik





Re: [squid-users] squid 2.7 - problems with kerberos authentication

2009-09-01 Thread Henrik Nordstrom
tis 2009-09-01 klockan 11:41 +0400 skrev Дмитрий Нестеркин:
 I'm trying to configure Kerberos authentication for Squid 2.7 (Debian
 Lenny, MIT kerberos; Windows Server 2003 no service packs), but no
 luck :(


Have you set the environment variable telling squid_kerb_auth which keytab to
use?

Does the user Squid runs as have read access to this keytab file?

Does the principal in that keytab match the proxy name your clients have
configured?

Are there any errors in cache.log?
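
For reference, those checks can be run from a shell on the proxy host. The
keytab path below is the one from the original post, and
HTTP/Most2.mydomain.local is the principal the poster tests with later in the
thread:

$ export KRB5_KTNAME=/etc/squid/krbldap.mydomain.local.keytab
$ ls -l "$KRB5_KTNAME"        # must be readable by the user Squid runs as
$ klist -kt "$KRB5_KTNAME"    # list the principals stored in the keytab
$ kinit -V -k -t "$KRB5_KTNAME" HTTP/Most2.mydomain.local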

Regards
Henrik




Re: [squid-users] StoreUrlRewrite + url_rewrite_program

2009-09-01 Thread Henrik Nordstrom
tis 2009-09-01 klockan 02:07 -0700 skrev pokeman:
 Hello 
 can i use StoreUrlRewrite + url_rewrite_program at the same time ?

Yes.

url_rewrite_program takes place before store url rewrites.
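
A minimal Squid 2.7 sketch with both configured (the helper paths and the
regex are placeholders, not from the thread):

url_rewrite_program /usr/local/squid/libexec/my_rewriter.pl
url_rewrite_children 5

storeurl_rewrite_program /usr/local/squid/libexec/my_store_rewriter.pl
storeurl_rewrite_children 5
acl store_rewrite_urls urlpath_regex -i \.(flv|mp4)$
storeurl_access allow store_rewrite_urls
storeurl_access deny all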

Regards
Henrik



Re: [squid-users] Restricting access to users logging onto windows domain

2009-09-01 Thread Henrik Nordstrom
tis 2009-09-01 klockan 17:07 +0530 skrev Tejpal Amin:

 My aim is to stop users not logging onto my AD domain from accessing
 the internet.

I am afraid that is not possible. At the HTTP level (what Squid sees)
there is no difference between clients logging on automatically due to
having cached credentials from a domain logon or manually by entering
the credentials in a browser login box. Both are domain logons, differing
only in how the client got the logon information.

MAYBE it's possible with some domain policy settings, but I would not
think so.

Regards
Henrik



Re: [squid-users] Java not working behind squid

2009-09-01 Thread Henrik Nordstrom
tis 2009-09-01 klockan 02:15 -0700 skrev Truth Seeker:
 Really, thanks for your effort... I was not able to get back to you because
 there were so many unexpected issues on the proxy...
 
 Your resolution didn't work for me...
 
 I didn't even get
 http://balancer.netdania.com/StreamingServer/StreamingServer? in my access.log;
 
 rather, I always see DENIED for balancer, like the following:
 
 TCP_DENIED/407 2912 CONNECT balancer.netdania.com:443 - NONE/- text/html

That looks like a request for https://balancer.netdania.com/...

Regards
Henrik



Re: Fwd: [squid-users] Need help in integrating squid and samba

2009-09-01 Thread Avinash Rao
Thank you for your explanation...
I understood what you said. I will check the squid configuration and get back.

On 9/1/09, Amos Jeffries squ...@treenet.co.nz wrote:
 Avinash Rao wrote:
  On 8/31/09, Amos Jeffries squ...@treenet.co.nz wrote:
 
   Avinash Rao wrote:
  
  
   
On Mon, Aug 24, 2009 at 1:00 AM, Henrik Nordstrom
   
   hen...@henriknordstrom.net
   mailto:hen...@henriknordstrom.net wrote:
  
  sön 2009-08-23 klockan 15:08 +0530 skrev Avinash Rao:
I couldn't find any document that shows me how to enable wb_info
  for squid.
Can anybody help me?
   
  external_acl_type NT_Group %LOGIN
  /usr/local/squid/libexec/wbinfo_group.pl
   
  acl group1 external NT_Group group1
   
   
  then use group1 whenever you want to match users belonging to that
  Windows group.
   
  Regards
  Henrik
   
   
Hi Henrik,
   
I have used the following in my squid.conf
   
external_acl_type NT_Group %LOGIN /usr/lib/squid/wbinfo_group.pl

acl group1 external NT_Group staff
  
acl net time M T W T F S S 9:00-18:00
http_access allow net
   
On my linux server, I have created a group called staff and made a
 couple
   
   of users a member of this group called staff. My intention is to provide
   access to users belonging to group staff on all days from morning 9am -
 7PM.
   The rest should be denied.
  
But this didn't work, when the Samba users login from a winxp client,
 it
   
   doesn't get access to internet at all.
    There is no http_access line making any use of ACL group1
  
   And _everybody_ (me included, on this side of the Internet) is allowed to use
   your proxy between 9am and 6pm.
  
  
   Amos
   --
   Please be using
Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
Current Beta Squid 3.1.0.13
  
  
 
 
  Thanks for the reply. Yeah, I missed http_access allow group1.
  I didn't understand your second statement: are you telling me that I
  should deny access to the net?
 

 You should combine the ACL with others on an http_access line so that it is
 limited in who it allows.

 This:
  acl net time M T W T F S S 9:00-18:00
  http_access allow net

 simply says all requests are allowed between time X and Y.
 Without additional controls, e.g. on the IP address making the request, you end
 up with an open proxy.
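
A sketch of the corrected fragment, combining the group ACL from earlier in
the thread with the time ACL (note Squid's day codes are the single letters
S M T W H F A, with H for Thursday and A for Saturday):

external_acl_type NT_Group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl group1 external NT_Group staff
acl work_hours time MTWHF 9:00-18:00
http_access allow group1 work_hours
http_access deny all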


 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13



[squid-users] Defining exceptions for URL acls

2009-09-01 Thread pent 5971
Hi,

In a squid.conf where more than one ACL group is configured for URL
destination filtering, how can we

let some chosen clients have access to the sites matched by one ACL group
while still being blocked by the others?


the acls are similar to

acl myblocks  url_regex blockmeifyoucan.com

http_access deny myblocks

acl blockedsites dstdomain .catchmeifyoucan.com
http_access deny blockedsites

etc.


[squid-users] NTLM or fakeauth_auth

2009-09-01 Thread apmailist

Hello,


We are switching from an LDAP authentication to an AD one.
It works GREAT with either basic [password in the clear :-(] or NTLM
authentication schemes. SSO was also requested, and works great.

We have one problem though:
- during the tests, some user accounts get locked very often (after 5
attempts).
We know it comes from software trying to connect to the internet with old
passwords. But as we cannot guarantee it will not happen on a large scale when
we migrate,
- I am looking for a way to prevent these accounts from getting locked.

I thought of two solutions :

1.
I searched for a way to make Squid ask only 3 times in a row for valid
credentials, but couldn't find it. Any clue?
(After three bad attempts, Squid would not send a 407, but a 200 with the
error page, maybe?)

2.
The other solution I went for was a more relaxed authentication scheme: using
fakeauth_auth (NTLM), and basic as a fallback for non-SSO browsers.
The idea is the following:
IE (the in-house main browser) would send the Windows credentials via SSO
(thus the user is logged in) automatically (meaning the user doesn't see it
and cannot tamper with the authentication). We rely on IE to send us the
username (Windows logon credential).
Other browsers (FF) would use the basic scheme to send their credentials.

The problem is that at least one NTLM-compatible browser (Opera) is able
to present the user with a prompt during authentication, and the user may
give any valid account, along with any password.
Here are the two lines :
auth_param ntlm program /proxy3/libexec/fakeauth_auth
auth_param basic program /proxy3/libexec/squid_ldap_auth  -P -ZZ -v 3 -c 5 -t 5
-b ou=BLABLA -f(sAMAccountName=%s) -D cn=reqaccount-BLABLA -W
/proxy3/etc/ldapauth_prd_secretfile -h dc002.fgn.com dc003.global.fgn.com
Inverting the two lines forces all browsers to use basic authentication.
Is there a way to do NTLM only with SSO-capable browsers, and then fall back
to basic for all the others?
I figure playing with user-agent strings wouldn't be enough, because Opera can
easily masquerade as IE (or used to).



Thank you for your ideas.



Andrew




Re: [squid-users] Defining exceptions for URL acls

2009-09-01 Thread apmailist
Quoting pent 5971 pent5...@gmail.com:

 Hi,

 In a squid.conf where more than one ACL group is configured for URL
 destination filtering, how can we

 let some chosen clients have access to the sites matched by one ACL group
 while still being blocked by the others?

You must define those chosen clients
- by their IP address
- by their username
- other

define some ACLs like:

acl users_allowed proxy_auth suzan romeo juliet
acl users_notallowed proxy_auth karen alan
These define clients by their authenticated username.

with
acl AUTENT proxy_auth REQUIRED


 the acls are similiar like

acl myblocks  url_regex blockmeifyoucan.com
http_access allow AUTENT myblocks users_allowed
http_access deny AUTENT myblocks users_notallowed

this allows suzan romeo juliet to access the myblocks sites.


acl blockedsites dstdomain .catchmeifyoucan.com
http_access deny blockedsites

this denies everyone from accessing the blockedsites.

 etc.
BTW, finish off with :
http_access deny all


HTH

Andrew


[squid-users] url blocking using url_regex not working on squid2.5

2009-09-01 Thread g f
Hello all,
I am running squid 2.5.STABLE14 on RHEL4.
I am close to rolling out squid3 on Debian, but unfortunately I still
need to support the above RHEL build.
Red Hat doesn't seem to have a 2.6 RPM for RHEL4, so I cannot go to 2.6.

All is working fine but I need to implement url blocking.
I followed docs and numerous posts to attempt to implement url
blocking but squid just seems to ignore these acls.

Here is a snippet of my config:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

acl our_networks src 10.150.15.0/24
http_access allow our_networks
acl our_servers src 10.150.7.0/24
http_access allow our_servers
acl msn url_regex toyota
http_access deny msn

http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all


Now I also tried the following:
acl msn dstdomain .toyota.com
http_access deny msn

acl msn_file url_regex /etc/squid/blocker.txt
http_access deny msn_file

I started squid in debug mode (/usr/sbin/squid -NCd10) and get no errors.
It seems to just ignore these ACLs.

Any ideas?
Thanks in advance.
Graham


Re: [squid-users] url blocking using url_regex not working on squid2.5

2009-09-01 Thread Chris Robertson

g f wrote:

Hello all,
I am running squid 2.5.STABLE14 on RHEL4.
I am close to rolling out squid3 on Debian, but unfortunately I still
need to support the above RHEL build.
Red Hat doesn't seem to have a 2.6 RPM for RHEL4, so I cannot go to 2.6.

All is working fine but I need to implement url blocking.
I followed docs and numerous posts to attempt to implement url
blocking but squid just seems to ignore these acls.

Here is a snippet of my config:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

acl our_networks src 10.150.15.0/24
http_access allow our_networks
  


With this, you allow all traffic (that hasn't already been denied) from 
10.150.15.0/24.  For clients in this IP range, no more access rules will 
be checked.  Have a look at the FAQ 
(http://wiki.squid-cache.org/SquidFaq/SquidAcl) for more.
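
Concretely, moving the deny rules above the broad allows in Graham's snippet
would let them take effect (a sketch reusing his ACL names):

acl msn url_regex toyota
http_access deny msn

acl our_networks src 10.150.15.0/24
http_access allow our_networks
acl our_servers src 10.150.7.0/24
http_access allow our_servers
http_access deny all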



acl our_servers src 10.150.7.0/24
http_access allow our_servers
acl msn url_regex toyota
http_access deny msn

http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all


Now I also tried the following:
acl msn dstdomain .toyota.com
http_access deny msn

acl msn_file url_regex /etc/squid/blocker.txt
http_access deny msn_file

I started squid in debug mode (/usr/sbin/squid -NCd10) and get no errors.
It seems to just ignore these ACLs.

Any ideas?
Thanks in advance.
Graham
  


Chris



[squid-users] Re: squid 2.7 - problems with kerberos authentication

2009-09-01 Thread Markus Moeller
Please post extracts of the cache.log file.  Both squid_kerb_auth and 
squid_kerb_ldap produce lots of debug output with -d.


Regards
Markus

Дмитрий Нестеркин undelb...@gmail.com wrote in message 
news:cf132a050909010041x59898e38naa49ca3eab974...@mail.gmail.com...

I'm trying to configure Kerberos authentication for Squid 2.7 (Debian
Lenny, MIT kerberos; Windows Server 2003 no service packs), but no
luck :(

This is what my configuration files look like:

squid.conf:

auth_param negotiate program /usr/lib/squid/squid_kerb_auth -d
auth_param negotiate children 10
auth_param negotiate keep_alive on

acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
#acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.10.0/24 # RFC1918 possible internal network
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 3128
acl CONNECT method GET
#
#CHECKING USERS BY AD
#
external_acl_type ldap_check ttl=1200 %LOGIN
/usr/lib/squid/squid_ldap_group -R -b dc=mydomain,dc=local -f
((objectclass=user)(sAMAccountName=%v
(memberof=cn=%a,ou=internet,dc=mydomain,dc=local)) -D
proxyu...@mydomain.local -w password -K -d 192.168.100.42
#
acl auth proxy_auth REQUIRED
acl inet_access external ldap_check inet_allow
#
http_access allow inet_access
http_access allow manager localhost
http_access deny manager
# Deny requests to unknown ports
http_access deny !Safe_ports
# Deny CONNECT to other than SSL ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
#
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny !auth
http_access allow auth
http_access deny all


/etc/init.d/squid contains:
KRB5_KTNAME=/etc/squid/krbldap.mydomain.local.keytab
export KRB5_KTNAME
KRB5RCACHETYPE=none
export KRB5RCACHETYPE


/etc/krb5.conf:
[libdefaults]
default_realm = MYDOMAIN.LOCAL
dns_lookup_realm = no
dns_lookup_kdc = no
default_keytab_name = /etc/squid/krbldap.mydomain.local.keytab
default_tgs_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
default_tkt_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
permitted_enctypes = des-cbc-crc rc4-hmac des-cbc-md5
ticket_lifetieme= 24h
# The following krb5.conf variables are only for MIT Kerberos.
# krb4_config = /etc/krb.conf
# krb4_realms = /etc/krb.realms
kdc_timesync = 1
ccache_type = 4
forwardable = true
proxiable = true
# The following libdefaults parameters are only for Heimdal Kerberos.
v4_instance_resolve = false
v4_name_convert = {
host = {
rcmd = host
ftp = ftp
}
plain = {
something = something-else
}
}
fcc-mit-ticketflags = true

[realms]
MYDOMAIN = {
kdc = dc.mydomain:88
admin_server = dc.mydomain:749
default_domain = mydomain
}

MYDOMAIN.LOCAL = {
kdc = dc.mydomain.local:88
admin_server = dc.mydomain.local:749
default_domain = mydomain.local
}

[domain_realm]
.linux.local = MYDOMAIN.LOCAL
.mydomain.local = MYDOMAIN.LOCAL
mydomain.local = MYDOMAIN.LOCAL
.mydomain = MYDOMAIN
mydomain = MYDOMAIN
#[appdefaults]
#pam = {
#debug = false
#ticket_lifetime = 36000
#renew_lifetime = 36000
#forwardable = true
#krb4_convert = false
#}

#[kdc]
#profile = /usr/share/krb5-kdc/kdc.conf

#[login]
# krb4_convert = false
# krb4_get_tickets = false

[logging]
default = FILE:/var/log/krb5lib.log
kdc = FILE:/var/log/kdc.log
kdc = SYSLOG:INFO:DAEMON
admin_server = FILE:/var/log/kadmin.log


When I check authentication from the terminal, it's OK:
$ sudo kinit -V -k -t /etc/squid/krbldap.mydomain.local.keytab
HTTP/Most2.mydomain.local
Authenticated to Kerberos v5

When I allow users by IP address instead, everything is OK.

access.log:
1251706346.035 0 192.168.10.133 TCP_DENIED/407 1750 GET
http://www.debian.org/ - NONE/- text/html

Internet Explorer 7 shows the error message Internet Explorer cannot
display this page.
Opera 9.6 asks for a login and password, but they are not accepted.

What am I doing wrong?
--
Best regards,
Dmitry




Re: [squid-users] Restricting access to users logging onto windows domain

2009-09-01 Thread Amos Jeffries
On Tue, 1 Sep 2009 17:07:52 +0530, Tejpal Amin tejpal.a...@gmail.com
wrote:
 Amos,
 
 I tried putting this line in the conf file but it did not work.
 
 My aim is to stop users not logging onto my AD domain from accessing
 the internet.
 I have configured NTLM authentication for my squid but the issue is
 that users not logging onto the domain get a prompt for authentication.
 
 There should be no way of accessing the internet for non-domain users.
 

Which is exactly what that line I gave you does.

I assume when you said squid throws up an authentication dialog box that
you already had authentication working. This line replaces whatever you
currently have doing deny !auth in your config and causing the dialog box
to appear. The 'all' at the end of the line prevents the dialog being
requested by Squid.
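
As a sketch, the relevant fragment (with auth standing in for whatever
proxy_auth ACL name is already in the config) would be:

acl auth proxy_auth REQUIRED
# the trailing 'all' keeps Squid from answering with a 407 challenge
http_access deny !auth all
http_access allow auth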

Amos


 Regards
 Tej
 
 On Tue, Sep 1, 2009 at 2:54 PM, Amos Jeffriessqu...@treenet.co.nz
wrote:
 Tejpal Amin wrote:

 HI,

 I have a squid proxy which uses NTLM authentication for authenticating
 users.

 I would like to restrict access only to users logging onto domain for
 the other users it should deny access.
 The problem I am facing is that for machines that are not joined to
 windows domain, the squid throws up an authentication dialog box.

 So you require authentication to use the proxy, but do not want Squid to
 notify the browsers about this critical requirement?


 http_access deny !auth all

 Where auth is whatever ACL name you have in your squid.conf to test
 authentication.


 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13



Re: [squid-users] Java not working behind squid

2009-09-01 Thread Amos Jeffries
On Tue, 1 Sep 2009 12:43:13 +0100, Gavin McCullagh gavin.mccull...@gcd.ie
wrote:
 Hi,
 
 On Tue, 01 Sep 2009, Tejpal Amin wrote:
 
 Try putting this acl
 
 acl Java browser Java/1.4 Java/1.5 Java/1.6
 http_access allow Java
 
 This worked for me when using NTLauth.
 
 Thanks, though I'm not the one in need of a solution and I'm not that
keen
 to give Java full unauthenticated browsing rights.  
 
 Perhaps Truth Seeker(?) might try that though.
 
 Am I to understand that Java is just really bad at NTLM auth, so much so
 that people just whitelist it for unauthenticated access?

Yes.
Personally I recommend adding other ACLs, such as the source addresses that
are allowed to use Java in this way, to reduce the impact and the security
holes this method opens.
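
For example (the source subnet here is a placeholder, not from the thread):

acl Java browser Java/1.4 Java/1.5 Java/1.6
acl java_sources src 192.168.0.0/24
http_access allow Java java_sources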

Amos


Re: [squid-users] cache.log errors

2009-09-01 Thread Amos Jeffries
On Tue, 01 Sep 2009 14:38:57 +0200, Mark Lodge mlodg...@gmail.com wrote:
 What can be the cause of these errors in my cache.log?
 
 2009/08/31 07:35:02| storeDirWriteCleanLogs: Starting...
 2009/08/31 07:35:02|   Finished.  Wrote 6991 entries.
 2009/08/31 07:35:02|   Took 0.0 seconds (2124924.0 entries/sec).
 2009/08/31 07:35:02| logfileRotate: /var/log/squid/store.log
 2009/08/31 07:35:02| logfileRotate (stdio): /var/log/squid/store.log
 2009/08/31 07:35:02| logfileRotate: /var/log/squid/access.log
 2009/08/31 07:35:02| logfileRotate (stdio): /var/log/squid/access.log
 2009/08/31 07:35:02| helperOpenServers: Starting 7 'python' processes
 2009/08/31 07:46:38| sslReadClient: FD 36: read failure: (110) 
 Connection timed out
 2009/08/31 08:17:16| clientTryParseRequest: FD 27 (10.0.0.50:37271) 
 Invalid Request
 2009/08/31 08:19:57| parseHttpRequest: Unsupported method ''
 2009/08/31 08:19:57| clientTryParseRequest: FD 27 (10.0.0.50:49041) 
 Invalid Request

Somebody at 10.0.0.50 is pushing a binary protocol through port 80.
Your Squid is detecting it and dropping the connection.

Amos


Re: [squid-users] NTLM or fakeauth_auth

2009-09-01 Thread Amos Jeffries
On Tue, 01 Sep 2009 15:38:24 +0200, apmail...@free.fr wrote:
 Hello,
 
 
 We are switching from an LDAP authentication to an AD one.
 It works GREAT either with basic [password in clear :-(  ] or ntlm
 authentication schemes. SSO was also requested, and works great.
 
 We have one problem though :
 - during the tests, some user accounts get locked very often. ( after 5
 attempts).
 We know it comes from software trying to connect to internet with older
 passwords. But as we cannot guarantee it will not happen on a large scale
 when
 we migrate,
 - I am looking for a way to prevent these accounts getting locked.
 
 I thought of two solutions :
 
 1.
 I searched for a way to make Squid only ask 3 times in a row for a valid
 credential. But couldn't find it : Any clue ?

Not possible.  There is no such thing as a 'repeat' in HTTP.  Every request
is 'new'.

 (After three bad attempts, Squid would not send a 407, but a 200 with the
 error
 page , maybe ?)
 
 2.
 The other solution I went for was a more relaxed authentication scheme :
 using
 fakeauth_auth (NTLM), and basic as a failback for non-sso browsers.
 The idea is the following :
 IE ( the in-house main browser ) would send the windows credential in a
sso
 way
 (thus the user is logged) in an automatic way (meaning the user doesn't
see
 it,
 and cannot tamper the authentication). We rely on IE to send us the
 username
 (windows logon credential)
 Other browsers (FF) would use the basic scheme to send it's credentials.

IE is the most limited of all browsers security-wise. Other web browsers
are mostly capable of NTLM and more advanced authentication schemes without
the bugs IE has.

 
 The problem is that at least one browser that is NTLM-compatible (Opera)
is
 able
 to provide the user with a prompt during the authentication : And the
user
 may
 give any valid account, along with any password.

This is true of _all_ web browsers.

 Here are the two lines :
 auth_param ntlm program /proxy3/libexec/fakeauth_auth
 auth_param basic program /proxy3/libexec/squid_ldap_auth  -P -ZZ -v 3 -c
5
 -t 5
 -b ou=BLABLA -f(sAMAccountName=%s) -D cn=reqaccount-BLABLA -W
 /proxy3/etc/ldapauth_prd_secretfile -h dc002.fgn.com dc003.global.fgn.com
 Inverting the two lines forces all browsers to use the basic
 authentication.
 Is there a way to do NTLM only with SSO able browsers, and then revert to
 BASIC
 for all the others ?

Yes. By using what you have configured above.
The problem you face is that Squid sends out a list of available methods.
Then the browser chooses the authentication method it is going to use and
sends credentials. If those credentials fail, Squid responds with a
407 'failed, try again' and the browser does whatever it can to get new
credentials. Usually they start with a popup window asking the user.


 I figure playing with useragent strings wouldn't be enough, because Opera
 can
 easily masquerade as IE (or used to).

Agent strings are not relevant; only the credentials the browser passes to
Squid and the method chosen to send them matter.


What I would do in your place is set up an external ACL which accepts the
Proxy-Authorization header and processes it.
Detect old-style logins and redirect to a special error page telling users to
change their settings.
If the type is 'Basic' it returns OK, otherwise ERR.

external_acl_type oldAuthTest %{Proxy-Authorization} /bla.sh
acl oldAuth external oldAuthTest
deny_info http://blah.example.com/fix-your-proxy-login.html oldAuth
http_access deny oldAuth

... http_access bits to do the new login stuff go below ...

Amos


[squid-users] persistent connection

2009-09-01 Thread xetorthio

Hi everyone!
I ran into some really strange Squid behavior today.
My application (A) opened, by mistake, a lot of persistent connections to
another application (C), going through a Squid instance (B) used for caching.
When I saw the connections I changed that behavior and restarted my
application with a new configuration that doesn't open persistent
connections at all. That closed all the existing persistent connections from
A to B. But for some reason Squid kept the persistent connections
to C alive for a REALLY long time.
This is my squid configuration... I couldn't find anything in it that
would change the default 2-minute timeout for idle
persistent connections. Am I doing something wrong?

http_port 10.0.0.10:8983 accel vhost
http_port 3128
hierarchy_stoplist cgi-bin
acl QUERY urlpath_regex cgi-bin
no_cache deny QUERY
cache_mem 7168 MB
cache_swap_low 90
cache_swap_high 95
maximum_object_size 4096 KB
minimum_object_size 0 KB
cache_replacement_policy lru
memory_replacement_policy lru
cache_dir diskd /var/cache/squid/1/vol1 2 100 10
cache_dir diskd /var/cache/squid/1/vol2 2 100 10
access_log   /var/log/squid/1/access.log 
cache_log /var/log/squid/1/cache.log
cache_store_log none
cache_swap_log /var/cache/squid/1/
emulate_httpd_log on
log_ip_on_direct off
mime_table /etc/squid/mime.conf
log_mime_hdrs off
pid_filename /var/run/squid/1.pid
log_fqdn off
client_netmask 255.255.255.255
redirect_rewrites_host_header off
auth_param basic children 5
auth_param basic realm OLX Cache Manager
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off
refresh_pattern .   0   20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl monitor src 10.0.0.100/255.255.255.255
acl web_ports port 8983 8080 80
http_access allow web_ports
http_access allow manager localhost
http_access allow manager monitor
http_access deny manager
acl purge method PURGE
http_access allow purge localhost
http_access allow purge monitor
http_access deny purge
http_access deny all
http_reply_access allow all
icp_access allow all
cache_peer 172.28.1.44 parent 8983 0 no-query originserver login=PASS
cache_mgr infrastruct...@olx.com
cache_effective_user squid
cache_effective_group squid
visible_hostname solr1.proxy1.olx.com
httpd_suppress_version_string   on
icon_directory /usr/local/squid/share/icons
coredump_dir /var/cache/squid/1/
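
For reference, none of the directives that govern idle persistent connections
appear in the configuration above, so the defaults apply. In Squid 2.x they
can be set explicitly; the sketch below uses the 2-minute value the poster
mentions as the default:

client_persistent_connections on
server_persistent_connections on
pconn_timeout 120 seconds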

Thank you very much!

Jonathan
-- 
View this message in context: 
http://www.nabble.com/persistent-connection-tp25251065p25251065.html
Sent from the Squid - Users mailing list archive at Nabble.com.