RE: [squid-users] Cant login to certain flash page via squid?

2012-06-20 Thread Terry Dobbs
Thanks for the reply.

In case this becomes an issue with a site many users need to access, what
is the best way to bypass squid entirely for specific sites? Is there a
clean, easy way to do it? I am running Ubuntu as my squid server.

Thanks again.
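On the question of bypassing squid entirely for specific sites: if the clients already pick up a wpad.dat/proxy.pac file, as described in later threads in this archive, one option is to return DIRECT for those domains so the requests never reach squid. A minimal sketch (the domain and proxy address are placeholders, not values from this thread):

  function FindProxyForURL(url, host)
  {
      // Hypothetical bypass list: send these domains straight out, not via squid.
      if (dnsDomainIs(host, ".example.com"))
          return "DIRECT";
      // Everything else keeps using the proxy.
      return "PROXY proxy.example.com:3128";
  }

The clients do, of course, need to be allowed to reach those sites directly through the firewall for this to work.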

-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz] 
Sent: Tuesday, June 19, 2012 9:28 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Cant login to certain flash page via squid?

On 20.06.2012 09:13, Terry Dobbs wrote:
 When users are going through squid there are certain pages, like the one
 I mentioned, where you just can't click a specific button. It always
 seems Flash related. If I reconfigure this user to not use squid I can
 use the page just fine. This leads me to believe it's not solely a
 browser issue.

 When I say I told it to ignore, I meant in the squid.conf file, where I
 allowed access to that specific domain without any kind of
 authentication. Thinking about it, I understand this step is pretty
 pointless as squid still processes the site. However, I have had success
 in the past by allowing access to sites before the proxy_auth required
 command.

 Not really sure what the issue is, but it seems to happen with just a
 handful of random sites.


Flash player is separate software, not permitted access to the browser's
internal password manager information.
  * Flash player does not provide any means for users to enter passwords
unless the HTTP request is a GET.
  * Flash script frameworks do not provide easily available support
unless the HTTP request is a POST.
  * recent Flash versions prevent HTTP authentication unless the visited
*website* provides explicit file-based (ONLY file based) CORS support
for the relevant headers. NP: as documented this would prohibit
Proxy-Authentication.


Website authentication only works if the author who wrote the script
knows how to write a) the user I/O interface and b) the relevant
encryption algorithms (rare for anything better than Basic auth), and
c) adds explicit CORS support to their site. AND decided it was worth
the trouble.


As a result HTTP authentication of any type rarely works in Flash
applications. Proxy authentication has never been reported working; not
to say it can't, just that in my experience nobody has ever mentioned
seeing it happen, despite common complaints here and in many other
places online.


Personally I rate Flash as a worse problem than Java in this regard. At 
least Java provides libraries and API making it easy for developers who 
know where to look (most seem not to use it, but that is a 
knowledge/time issue not a technical barrier).

Amos



Re: [squid-users] Cant login to certain flash page via squid?

2012-06-19 Thread Terry Dobbs
When users are going through squid there are certain pages, like the one
I mentioned, where you just can't click a specific button. It always
seems Flash related. If I reconfigure this user to not use squid I can
use the page just fine. This leads me to believe it's not solely a
browser issue.

When I say I told it to ignore, I meant in the squid.conf file, where I
allowed access to that specific domain without any kind of
authentication. Thinking about it, I understand this step is pretty
pointless as squid still processes the site. However, I have had success
in the past by allowing access to sites before the proxy_auth required
command.

Not really sure what the issue is, but it seems to happen with just a
handful of random sites.
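
For reference, the allow-before-auth ordering described above usually looks something like the sketch below in squid.conf; the ACL and domain names are illustrative placeholders, not taken from the poster's configuration:

  # Hypothetical names; replace the dstdomain value with the real site.
  acl flashsite dstdomain .example.com
  acl AuthUsers proxy_auth REQUIRED

  # Allow the problem site before any rule that requires authentication,
  # so requests to it are never challenged for credentials.
  http_access allow flashsite
  http_access allow AuthUsers
  http_access deny all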


[squid-users] Cant login to certain flash page via squid?

2012-06-13 Thread Terry Dobbs
I have had this issue with one or two pages and can't figure it out. For
example, if you go to
http://www.complianceonline.com/ecommerce/control/trainingFocus?product_
id=702317&channel=M-New_JU13_Mark_JN04_DM and then click the Buy Now
button, it takes you to a screen with your shopping cart. Users accessing
the site via the proxy are unable to click continue on this shopping
cart screen. I can access it fine directly. I have told squid to ignore
these sites but it doesn't seem to matter. Below is the only thing I can
find in the log in relation to this site. Any ideas? I am running squid
on an Ubuntu box.



1339187191.071102 192.168.70.125 TCP_MISS/200 1448 GET
http://static.complianceonline.com/images/cart/cart_delete.gif -
DIRECT/209.128.85.3 image/gif [Accept: */*\r\nReferer:
http://www.complianceonline.com/ecommerce/control/showcart\r\nAccept-Lan
guage: en-us\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows
NT 5.1; Trident/4.0; .NET CLR 2.0.50727; InfoPath.1; .NET4.0C; .NET4.0E;
.NET CLR 3.5.30729; .NET CLR 3.0.4506.2152)\r\nAccept-Encoding: gzip,
deflate\r\nHost: static.complianceonline.com\r\nProxy-Connection:
Keep-Alive\r\nCookie:
__utma=211283463.1311775861.1339095920.1339099387.1339166536.3;
__utmz=211283463.1339166536.3.3.utmcsr=M-New_JU13_Mark_JN04_DM|utmccn=(n
ot%2520set)|utmcmd=(not%2520set)\r\n] [HTTP/1.1 200 OK\r\nServer:
nginx\r\nDate: Fri, 08 Jun 2012 19:50:04 GMT\r\nContent-Type:
image/gif\r\nConnection: keep-alive\r\nKeep-Alive: timeout=300\r\nETag:
W/1157-133656597\r\nContent-Length: 1157\r\n\r]
1339187191.080100 192.168.70.125 TCP_MISS/200 2345 GET
http://static.complianceonline.com/images/cart/continue.gif -
DIRECT/209.128.85.3 image/gif [Accept: */*\r\nReferer:
http://www.complianceonline.com/ecommerce/control/showcart\r\nAccept-Lan
guage: en-us\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows
NT 5.1; Trident/4.0; .NET CLR 2.0.50727; InfoPath.1; .NET4.0C; .NET4.0E;
.NET CLR 3.5.30729; .NET CLR 3.0.4506.2152)\r\nAccept-Encoding: gzip,
deflate\r\nHost: static.complianceonline.com\r\nProxy-Connection:
Keep-Alive\r\nCookie:
__utma=211283463.1311775861.1339095920.1339099387.1339166536.3;
__utmz=211283463.1339166536.3.3.utmcsr=M-New_JU13_Mark_JN04_DM|utmccn=(n
ot%2520set)|utmcmd=(not%2520set)\r\n] [HTTP/1.1 200 OK\r\nServer:
nginx\r\nDate: Fri, 08 Jun 2012 19:50:04 GMT\r\nContent-Type:
image/gif\r\nConnection: keep-alive\r\nKeep-Alive: timeout=300\r\nETag:
W/2054-1336565804000\r\nContent-Length: 2054\r\n\r]
1339187191.120 90 192.168.70.125 TCP_MISS/200 1039 GET
http://static.complianceonline.com/images/main/foo_go.jpg -
DIRECT/209.128.85.3 image/jpeg [Accept: */*\r\nReferer:
http://www.complianceonline.com/ecommerce/control/showcart\r\nAccept-Lan
guage: en-us\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows
NT 5.1; Trident/4.0; .NET CLR 2.0.50727; InfoPath.1; .NET4.0C; .NET4.0E;
.NET CLR 3.5.30729; .NET CLR 3.0.4506.2152)\r\nAccept-Encoding: gzip,
deflate\r\nHost: static.complianceonline.com\r\nProxy-Connection:
Keep-Alive\r\nCookie:
__utma=211283463.1311775861.1339095920.1339099387.1339166536.3;
__utmz=211283463.1339166536.3.3.utmcsr=M-New_JU13_Mark_JN04_DM|utmccn=(n
ot%2520set)|utmcmd=(not%2520set)\r\n] [HTTP/1.1 200 OK\r\nServer:
nginx\r\nDate: Fri, 08 Jun 2012 19:50:04 GMT\r\nContent-Type:
image/jpeg\r\nConnection: keep-alive\r\nKeep-Alive: timeout=300\r\nETag:
W/749-133656589\r\nContent-Length: 749\r\n\r]
1339187191.157 92 192.168.70.125 TCP_MISS/200 1711 GET
http://static.complianceonline.com/images/main/foot_MS.jpg -
DIRECT/209.128.85.3 image/jpeg [Accept: */*\r\nReferer:
http://www.complianceonline.com/ecommerce/control/showcart\r\nAccept-Lan
guage: en-us\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows
NT 5.1; Trident/4.0; .NET CLR 2.0.50727; InfoPath.1; .NET4.0C; .NET4.0E;
.NET CLR 3.5.30729; .NET CLR 3.0.4506.2152)\r\nAccept-Encoding: gzip,
deflate\r\nHost: static.complianceonline.com\r\nProxy-Connection:
Keep-Alive\r\nCookie:
__utma=211283463.1311775861.1339095920.1339099387.1339166536.3;
__utmz=211283463.1339166536.3.3.utmcsr=M-New_JU13_Mark_JN04_DM|utmccn=(n
ot%2520set)|utmcmd=(not%2520set)\r\n] [HTTP/1.1 200 OK\r\nServer:
nginx\r\nDate: Fri, 08 Jun 2012 19:50:04 GMT\r\nContent-Type:
image/jpeg\r\nConnection: keep-alive\r\nKeep-Alive: timeout=300\r\nETag:
W/1419-1336565934000\r\nContent-Length: 1419\r\n\r]
1339187191.165 92 192.168.70.125 TCP_MISS/200 597 GET
http://static.complianceonline.com/images/main/ho_dotline3.jpg -
DIRECT/209.128.85.3 image/jpeg [Accept: */*\r\nReferer:
http://www.complianceonline.com/ecommerce/control/showcart\r\nAccept-Lan
guage: en-us\r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows
NT 5.1; Trident/4.0; .NET CLR 2.0.50727; InfoPath.1; .NET4.0C; .NET4.0E;
.NET CLR 3.5.30729; .NET CLR 3.0.4506.2152)\r\nAccept-Encoding: gzip,
deflate\r\nHost: static.complianceonline.com\r\nProxy-Connection:
Keep-Alive\r\nCookie:
__utma=211283463.1311775861.1339095920.1339099387.1339166536.3;

RE: [squid-users] After reloading squid3, takes about 2 minutes to serve pages?

2011-12-20 Thread Terry Dobbs
Thanks.

After looking into it more, it appears squidGuard seems to be taking a
while to initialize the blacklists. The only reason I have to reload
squid3 is for squidGuard to recognize the new blacklist entries.

I am using Berkeley DB for the first time; perhaps that's why it takes
longer? Although, I don't really see what Berkeley DB is doing for me, as
I am still using flat files for my domains/urls? Guess I should take
this to the squidGuard list!
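
If the slow part really is squidGuard rebuilding its lists on each reload, one thing worth trying (a sketch, assuming this squidGuard build supports the -C option, and noting that the path and user below are typical Debian/Ubuntu defaults rather than the poster's actual setup) is to pre-compile the flat files into Berkeley DB files before reloading squid, so the redirector only has to open them:

  # Compile all domain/url lists referenced in squidGuard.conf into .db files
  squidGuard -C all
  # Ensure the squid/squidGuard user can read the generated files (path is an assumption)
  chown -R proxy:proxy /var/lib/squidguard/db
  # Then reload squid so the redirector children restart
  /etc/init.d/squid3 reload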

-Original Message-
From: Eliezer Croitoru [mailto:elie...@ec.hadorhabaac.com] 
Sent: Monday, December 19, 2011 1:04 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] After reloading squid3, takes about 2 minutes
to serve pages?

On 19/12/2011 19:12, Terry Dobbs wrote:
It's an old issue from squid 3.1 to 3.2; there is nothing yet, as far as I
know, that solves this issue.

Regards
Eliezer
 Hi All.

 I just installed squid3 after running squid 2.5 for a number of years. I
 find that after reloading squid3 and trying to access the internet on a
 proxy client it takes about 2 minutes until pages load. For example, if I
 reload squid3 and try to access a page such as www.tsn.ca, it will try
 to load for a minute or 2 until it finally displays. I understand I
 shouldn't need to reload squid3 too much, but is there something I am
 missing that makes this happen? I am not using it for caching, just for
 monitoring/website control. Here is the log from when I was trying to
 access the mentioned site:

 1324310991.377  2 192.168.70.97 TCP_DENIED/407 2868 GET
 http://www.tsn.ca/ - NONE/- text/html [Accept: image/gif, image/jpeg,
 image/pjpeg, image/pjpeg, application/x-shockwave-flash,
 application/xaml+xml, application/vnd.ms-xpsdocument,
 application/x-ms-xbap, application/x-ms-application,
 application/vnd.ms-excel, application/vnd.ms-powerpoint,
 application/msword, */*\r\nAccept-Language: en-us\r\nUser-Agent:
 Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET
CLR
 2.0.50727; InfoPath.1)\r\nAccept-Encoding: gzip,
 deflate\r\nProxy-Connection: Keep-Alive\r\nHost: www.tsn.ca\r\nCookie:
 TSN=NameKey={ffc1186b-54bb-47ef-b072-097f5fafc5f2};
 __utma=54771374.1383136889.1323806167.1324305925.1324309890.7;

__utmz=54771374.1323806167.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(n
 one); __utmb=54771374.1.10.1324309890\r\n] [HTTP/1.0 407 Proxy
 Authentication Required\r\nServer: squid/3.0.STABLE19\r\nMime-Version:
 1.0\r\nDate: Mon, 19 Dec 2011 16:09:51 GMT\r\nContent-Type:
 text/html\r\nContent-Length: 2485\r\nX-Squid-Error:
 ERR_CACHE_ACCESS_DENIED 0\r\nProxy-Authenticate: NTLM\r\n\r]
 1324310991.447  5 192.168.70.97 TCP_DENIED/407 3244 GET
 http://www.tsn.ca/ - NONE/- text/html [Accept: image/gif, image/jpeg,
 image/pjpeg, image/pjpeg, application/x-shockwave-flash,
 application/xaml+xml, application/vnd.ms-xpsdocument,
 application/x-ms-xbap, application/x-ms-application,
 application/vnd.ms-excel, application/vnd.ms-powerpoint,
 application/msword, */*\r\nAccept-Language: en-us\r\nUser-Agent:
 Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET
CLR
 2.0.50727; InfoPath.1)\r\nAccept-Encoding: gzip,
 deflate\r\nProxy-Connection: Keep-Alive\r\nCookie:
 TSN=NameKey={ffc1186b-54bb-47ef-b072-097f5fafc5f2};
 __utma=54771374.1383136889.1323806167.1324305925.1324309890.7;

__utmz=54771374.1323806167.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(n
 one); __utmb=54771374.1.10.1324309890\r\nProxy-Authorization: NTLM
 TlRMTVNTUAABB4IIogAFASgKDw==\r\nHost:
 www.tsn.ca\r\n] [HTTP/1.0 407 Proxy Authentication Required\r\nServer:
 squid/3.0.STABLE19\r\nMime-Version: 1.0\r\nDate: Mon, 19 Dec 2011
 16:09:51 GMT\r\nContent-Type: text/html\r\nContent-Length:
 2583\r\nX-Squid-Error: ERR_CACHE_ACCESS_DENIED
0\r\nProxy-Authenticate:
 NTLM

TlRMTVNTUAACEgASADAFgomid3FHZLqI7WsAAIoAigBCQwBPAE4A

VgBFAEMAVABPAFIAAgASAEMATwBOAFYARQBDAFQATwBSAAEACgBTAFEAVQBJAEQABAAmAGEA

cwBzAG8AYwBpAGEAdABlAGQAYgByAGEAbgBkAHMALgBjAGEAAwA0AHUAYgB1AG4AdAB1AC4A
 YQBzAHMAbwBjAGkAYQB0AGUAZABiAHIAYQBuAGQAcwAuAGMAYQAA\r\n\r]



[squid-users] After reloading squid3, takes about 2 minutes to serve pages?

2011-12-19 Thread Terry Dobbs
Hi All.

I just installed squid3 after running squid 2.5 for a number of years. I
find that after reloading squid3 and trying to access the internet on a proxy
client it takes about 2 minutes until pages load. For example, if I
reload squid3 and try to access a page such as www.tsn.ca, it will try
to load for a minute or 2 until it finally displays. I understand I
shouldn't need to reload squid3 too much, but is there something I am
missing that makes this happen? I am not using it for caching, just for
monitoring/website control. Here is the log from when I was trying to
access the mentioned site:

1324310991.377  2 192.168.70.97 TCP_DENIED/407 2868 GET
http://www.tsn.ca/ - NONE/- text/html [Accept: image/gif, image/jpeg,
image/pjpeg, image/pjpeg, application/x-shockwave-flash,
application/xaml+xml, application/vnd.ms-xpsdocument,
application/x-ms-xbap, application/x-ms-application,
application/vnd.ms-excel, application/vnd.ms-powerpoint,
application/msword, */*\r\nAccept-Language: en-us\r\nUser-Agent:
Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR
2.0.50727; InfoPath.1)\r\nAccept-Encoding: gzip,
deflate\r\nProxy-Connection: Keep-Alive\r\nHost: www.tsn.ca\r\nCookie:
TSN=NameKey={ffc1186b-54bb-47ef-b072-097f5fafc5f2};
__utma=54771374.1383136889.1323806167.1324305925.1324309890.7;
__utmz=54771374.1323806167.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(n
one); __utmb=54771374.1.10.1324309890\r\n] [HTTP/1.0 407 Proxy
Authentication Required\r\nServer: squid/3.0.STABLE19\r\nMime-Version:
1.0\r\nDate: Mon, 19 Dec 2011 16:09:51 GMT\r\nContent-Type:
text/html\r\nContent-Length: 2485\r\nX-Squid-Error:
ERR_CACHE_ACCESS_DENIED 0\r\nProxy-Authenticate: NTLM\r\n\r]
1324310991.447  5 192.168.70.97 TCP_DENIED/407 3244 GET
http://www.tsn.ca/ - NONE/- text/html [Accept: image/gif, image/jpeg,
image/pjpeg, image/pjpeg, application/x-shockwave-flash,
application/xaml+xml, application/vnd.ms-xpsdocument,
application/x-ms-xbap, application/x-ms-application,
application/vnd.ms-excel, application/vnd.ms-powerpoint,
application/msword, */*\r\nAccept-Language: en-us\r\nUser-Agent:
Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR
2.0.50727; InfoPath.1)\r\nAccept-Encoding: gzip,
deflate\r\nProxy-Connection: Keep-Alive\r\nCookie:
TSN=NameKey={ffc1186b-54bb-47ef-b072-097f5fafc5f2};
__utma=54771374.1383136889.1323806167.1324305925.1324309890.7;
__utmz=54771374.1323806167.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(n
one); __utmb=54771374.1.10.1324309890\r\nProxy-Authorization: NTLM
TlRMTVNTUAABB4IIogAFASgKDw==\r\nHost:
www.tsn.ca\r\n] [HTTP/1.0 407 Proxy Authentication Required\r\nServer:
squid/3.0.STABLE19\r\nMime-Version: 1.0\r\nDate: Mon, 19 Dec 2011
16:09:51 GMT\r\nContent-Type: text/html\r\nContent-Length:
2583\r\nX-Squid-Error: ERR_CACHE_ACCESS_DENIED 0\r\nProxy-Authenticate:
NTLM
TlRMTVNTUAACEgASADAFgomid3FHZLqI7WsAAIoAigBCQwBPAE4A
VgBFAEMAVABPAFIAAgASAEMATwBOAFYARQBDAFQATwBSAAEACgBTAFEAVQBJAEQABAAmAGEA
cwBzAG8AYwBpAGEAdABlAGQAYgByAGEAbgBkAHMALgBjAGEAAwA0AHUAYgB1AG4AdAB1AC4A
YQBzAHMAbwBjAGkAYQB0AGUAZABiAHIAYQBuAGQAcwAuAGMAYQAA\r\n\r]


RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-05 Thread Terry Dobbs
The internet line is DSL, and does use a username/password (PPPoE).
However, on the actual DSL router (provided by the ISP) I don't see any MTU
options.

I will have to look into iptables. I can add static routes via the
interface card which are permanent, however doing it this way doesn't
give me any options for mss, mtu, etc. All I can enter this way is
Source, Destination, Gateway.

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Friday, April 04, 2008 6:19 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

Fri 2008-04-04 at 13:56 -0400, Terry Dobbs wrote:
 Thanks so much, the advmss worked like a charm. How do I make it so
 this route stays there? When I restart networking it seems to vanish.

Some things first.. you should figure out if the MTU is local or remote.
As it's mostly you having issues I would suspect it's local. In such
case you should have a lower mss on the default route to make TCP/IP
work better.

How are you connected to the Internet? ADSL with PPPoE, or some other
tunneling method which has a lower MTU than the default 1500?

How to set the routing is quite distribution dependent, and I am not
very familiar with SuSE. But on the good side you can use iptables to
achieve the same thing, or maybe rules in your router.
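
For the iptables route mentioned here, the usual technique is TCP MSS clamping on SYN packets; a sketch (the fixed 496 just mirrors the mss value suggested earlier in the thread, and the clamp-mss-to-pmtu form is the more general alternative):

  # Clamp the MSS advertised in outgoing/forwarded SYN packets
  iptables -t mangle -A POSTROUTING -p tcp --tcp-flags SYN,RST SYN -j TCPMSS --set-mss 496

  # Or derive it from the discovered path MTU instead of a fixed value:
  # iptables -t mangle -A POSTROUTING -p tcp --tcp-flags SYN,RST SYN -j TCPMSS --clamp-mss-to-pmtu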

Regards
Henrik



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-04 Thread Terry Dobbs
Thanks so much, the advmss worked like a charm. How do I make it so this
route stays there? When I restart networking it seems to vanish.

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Friday, April 04, 2008 1:13 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.


Thu 2008-04-03 at 12:36 -0400, Terry Dobbs wrote:

 Also, the second command gives me an error and says "mss" is a
 garbage.

Sorry, should be advmss

  /sbin/ip route add 63.148.24.5 via your.internet.gateway advmss 496

to replace an already existing route use replace instead of add..
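
That is, something along the lines of the following (keeping the placeholder gateway from the command above):

  /sbin/ip route replace 63.148.24.5 via your.internet.gateway advmss 496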

Regards
Henrik



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-03 Thread Terry Dobbs
I tried adding the first route, but it didn't seem to make a difference.
The ethereal capture still shows my squid box sending a window size of
1460? Do I need to restart networking for it to take effect? When I do this
it wipes out the route.

Also, the second command gives me an error and says "mss" is a garbage.

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, April 02, 2008 7:44 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

Wed 2008-04-02 at 15:43 -0400, Terry Dobbs wrote:
 Ok folks, here is my packet capture; I included only the transmissions
 between the 2 relevant devices (SUSE Server and the problematic
 website).

The capture looks very much like the issues seen with window scaling, but
there is no window scaling in this trace... A bit confused..

Guessing wildly here, but my first action would be to upgrade the kernel
just in case it's a known tcp problem which has been worked around
already..

Another thing you can try is to decrease the window size to a very small
size

  /sbin/ip route add 63.148.24.5 via your.internet.gateway window 1480

this isn't optimal for performance, but may work around certain broken
firewalls if there is packet reordering at play..

You can also try lowering the MSS, in case there is an MTU blackhole...

  /sbin/ip route add 63.148.24.5 via your.internet.gateway mss 496

Regards
Henrik



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-02 Thread Terry Dobbs
Hey,

I did the command you mentioned and it didn't seem to make a difference.
Is there anything special I need to do after running the command?

Also, when running ethereal it doesn't seem to be capturing web traffic;
it's catching lots of ARP, but nothing web related. When running on Windows
behind the SUSE box I can capture web traffic, so is there something
obvious I am missing here?

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 01, 2008 7:07 PM
To: Terry Dobbs
Cc: J Beris; squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

Tue 2008-04-01 at 18:00 -0400, Terry Dobbs wrote:
 Would you want the trace from the squid server, or from a client
 behind the squid server?

 Also, the TCP scaling fix, it was just to add a record to the file,
 right?

 Also, I tried doing the window scaling again. Is it just as simple as
 creating the file tcp_default_win_scale in /proc/sys/net/ipv4?

The simplest way to test if it's window scaling biting the host (or to
be correct, its firewall) is to disable window scaling:

echo 0 > /proc/sys/net/ipv4/tcp_window_scaling
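
The same toggle can also be set with sysctl, and made persistent across reboots via /etc/sysctl.conf (a sketch, assuming the standard procps sysctl tool is installed):

  # One-off, equivalent to the echo above
  /sbin/sysctl -w net.ipv4.tcp_window_scaling=0

  # Persistent: add this line to /etc/sysctl.conf
  net.ipv4.tcp_window_scaling = 0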

The sysctls have changed somewhat since the lwn.net article was written
many years ago.

Regards
Henrik



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-02 Thread Terry Dobbs
Hi, I got the capture working, and sent you the file earlier on. When I
tried sending it to the list it kept bouncing back. It is very small,
and I zipped it up.

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, April 02, 2008 5:54 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

Wed 2008-04-02 at 11:56 -0400, Terry Dobbs wrote:

 Also, when running ethereal it doesn't seem to be capturing web
 traffic, catching lots of ARP, but nothing web related. When running
 on Windows behind the SUSE box I can capture web traffic, is there
 something obvious I am missing here?

Should just work.

Try capturing on the Any interface, in case traffic isn't going the
direction you think..

Regards
Henrik



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-01 Thread Terry Dobbs
Yeah, I understand that this issue really isn't squid related; I was just
hoping someone running squid on SUSE Linux has had a similar issue. I am
running SUSE Linux 10 and I can ping the domain from the server. I just
can't browse to it; I get an error box in Mozilla saying "Document
contains no data".

This is obviously why the squid users can't access it. I thought it might be
a DNS issue, but that's crossed off as I can ping the domain, and it
resolves to the correct address.


-Original Message-
From: J Beris [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 01, 2008 9:36 AM
To: Terry Dobbs; Henrik Nordstrom
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

 Thanks for checking. Odd, not sure why this won't work here; it's the only
 problem like this that I have had in the few years I've used it.

Hi Terry/Henrik,

No problem, little effort to click the link :-)
I made one small mistake, our proxy runs on openSUSE 10.2, not 10.3 as
reported earlier.

Which release of openSUSE do you run? Perhaps there's a difference
between those 2 versions (although, having used both, I can't think of
anything related to this case...)

Regards,

Joop

 



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-01 Thread Terry Dobbs
Thanks for checking. Odd, not sure why this won't work here; it's the only
problem like this that I have had in the few years I've used it.

-Original Message-
From: J Beris [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 01, 2008 3:46 AM
To: Henrik Nordstrom; Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

  Can other people here access this site using Suse Linux?

Yes, works perfectly here behind a squid-2.6.STABLE6-0.8 proxy on
openSUSE 10.3. Both Firefox and IE.
 
 What was the site again?

http://www.franklintraffic.com/

Regards,

Joop

 



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-01 Thread Terry Dobbs
Yeah, I'm lost on this one. Ethereal doesn't show anything strange, just
the initial connection request; it just doesn't seem to get anything back.

Doesn't really make sense that only this one site (at least that I know
of) is having this issue. The SUSE firewall is turned off, the network card
is configured properly, etc...

-Original Message-
From: J Beris [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 01, 2008 10:05 AM
To: Terry Dobbs; Henrik Nordstrom
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

 This is obviously why the squid users can't access it. I thought it might
 be a DNS issue, but that's crossed off as I can ping the domain, and it
 resolves to the correct address.

Yes, if you can ping and resolve, it's not DNS related.
I'd fire up wireshark/ethereal and grab the communication that way, see
if that clears things up a bit more. Like this, it's hard to
troubleshoot.

Regards,

Joop

 



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-04-01 Thread Terry Dobbs
Would you want the trace from the squid server, or from a client behind
the squid server?

Also, the TCP scaling fix, it was just to add a record to the file
right?

Also, I tried doing the window scaling again. Is it just as simple as
creating the file tcp_default_win_scale in /proc/sys/net/ipv4?

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 01, 2008 5:37 PM
To: Terry Dobbs
Cc: J Beris; squid-users@squid-cache.org
Subject: RE: [squid-users] Unable to access a website through
Suse/Squid.

Tue 2008-04-01 at 17:29 -0400, Terry Dobbs wrote:
 Yeah, I'm lost on this one. Ethereal doesn't show anything strange, just
 the initial connection request; it just doesn't seem to get anything
 back.

 Doesn't really make sense that only this one site (at least that I
 know of) is having this issue. The SUSE firewall is turned off, the
 network card is configured properly, etc...

Post the trace somewhere and we may take a look if something can be
identified.

My bet is still TCP window scaling... it's the most common source of this
problem these days.

Regards
Henrik



[squid-users] Unable to access a website through Suse/Squid.

2008-03-31 Thread Terry Dobbs
Hi,

 

Some users in our company need to access a website
(http://www.franklintraffic.com/). Any
user that is going through the squid proxy (running on SUSE Linux) is
unable to get to this site; it just kind of times out. When I try to get to
this site directly from the SUSE machine I am unable to; it just says
"Document contains no data". This site however works fine from Internet
Explorer on a machine open to the internet (not going through the proxy).

 

I have been racking my brain over this one. I am able to ping the
website from the SUSE machine, just can't www to it. Anyone know why this
is? Is it a configuration issue on the server, or on the website?

I understand this may not be 100% squid related, but I'm sure others
running squid on SLE have experienced a similar issue?



RE: [squid-users] Unable to access a website through Suse/Squid.

2008-03-31 Thread Terry Dobbs
Yea, I did stumble across those a few days ago, and tried doing what it said to 
no avail.

Can other people here access this site using Suse Linux?

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Monday, March 31, 2008 3:15 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Unable to access a website through Suse/Squid.

Mon 2008-03-31 at 11:30 -0400, Terry Dobbs wrote:


 I have been racking my brain over this one. I am able to ping the
 website from the SUSE machine, just can't www to it. Anyone know why this
 is? Is it a configuration issue on the server, or on the website?

There are quite a lot of broken firewalls out on the Internet which fall
down when clients & servers have modern TCP/IP implementations such as
Linux..

The Squid FAQ has workarounds for most of them.

http://wiki.squid-cache.org/SquidFaq/SystemWeirdnesses#head-699d810035c099c8b4bff21e12bb365438a21027
and
http://wiki.squid-cache.org/SquidFaq/SystemWeirdnesses#head-4920199b311ce7d20b9a0d85723fd5d0dfc9bc84

There are more, but these two are the most common ones..

some sites have also been seen having problems with tcp timestamping,
but these are very rare today..

Regards
Henrik



RE: [squid-users] Anyone Use wbinfo_group.pl?

2007-11-28 Thread Terry Dobbs
What exactly do you mean?

Should I set it up like this?
external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoInternet external ntgroup NoInternet

http_access deny NoInternet ALL

So by default the last thing on the line is AUTH? What exactly does the
ALL do to make it not pop up (it appears to work, btw)?

Also, when changing group membership in AD, for the changes to take
effect would you have to reload squid, samba, and winbind? Is there
any way (other than editing the default squid error page) to redirect
users if they are blocked? I do this with squidGuard; not sure if it's
possible with this script/squid.

Thanks
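
On the redirect question: squid's deny_info directive can point a denied request at a URL instead of the standard error page, keyed on the ACL that caused the denial. A sketch using the ACL name already defined in this thread (the blocked-page URL is a placeholder):

  # When NoInternet is the ACL that denies the request, redirect instead of
  # showing the built-in access-denied page.
  deny_info http://intranet.example.com/blocked.html NoInternet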


-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 28, 2007 3:15 AM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Anyone Use wbinfo_group.pl?

Terry Dobbs wrote:
 Hey
 
 I have a transparent proxy setup using squid, winbind, samba, etc... I
 got sick of manually blocking IP addresses from accessing the internet
 and stumbled across an article (thank god for google!) that allows
 access based on AD Group.
 
 It pretty much looks like...
 
 external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
 acl NoInternet external ntgroup NoInternet
 
 Then there is the http_access deny line that denies the NoInternet
 group.
 
 This seems to work fine, if a user belongs to the NoInternet group
they
 are prompted for Username/Password and even if they put in the correct
 credentials they aren't allowed to go anywhere. 
 
 My question is, instead of prompting for username/password if a user
 belongs to the group, how do I just redirect them to a page? No other
 time is my users prompted for authentication as it uses the NT pass
 through credentials, so not sure why it wants to prompt now.
 
 Hoping someone out there is doing something similar? 

The credentials are asked for again because auth is the last option to
complete the http_access rule.

There is a hack/workaround of adding 'all' as the last item on the line 
which apparently prevents the credentials being sought if they fail the 
first time.

I suspect your other rules go something like
   http_access !noauth localnet
which has the same effect of not requesting again on failure.
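
Put together, the ordering being described comes out roughly like this (a sketch built from the ACL lines quoted in this thread; the names are illustrative, not the poster's actual config):

  external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
  acl NoInternet external ntgroup NoInternet
  acl AuthorizedUsers proxy_auth REQUIRED

  # The trailing 'all' stops squid from re-challenging for credentials
  # when this deny line is what terminates the lookup.
  http_access deny NoInternet all
  http_access allow AuthorizedUsers
  http_access deny all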

Amos


[squid-users] wbinfo_group.pl - This ever happen to anyone?

2007-11-28 Thread Terry Dobbs
Ok, so I have wbinfo_group.pl working nicely on our local squid box, it
blocks users belonging to a particular group.

However, I have done the exact same thing on a remote box in the USA and
it doesn't want to work. When I run wbinfo -r username I get no results;
I used to get "could not get groups for user", now I get nothing
returned. I am pretty sure this is what is causing wbinfo_group.pl
not to work. The logs don't give me much useful information.

On the local box that wbinfo_group.pl is working I get the error Could
Not convert SID='S-1-5-21-1122444-424242525-5353622-42124- User(1) to
gid when I do it manually, but I thought nothing of it as it works. The
same thing happens on the box that isn't working as well, but that box
will not even do a wbinfo -r.

I have verified that I do indeed have idmap uid and gid mappings in
smb.conf. I have used the local Domain Controller at the remote site to
authenticate, thinking it may be timing out or something (is there a
timeout?). I have googled like crazy; I know this is an SMB issue but I'm
wondering if anyone has ever had a similar issue in their squid setup.

I would be VERY grateful if anyone has any kind of insight.


FW: [squid-users] wbinfo_group.pl - This ever happen to anyone?

2007-11-28 Thread Terry Dobbs
Ok, for anyone with a similar issue, the problem was in the smb.conf file.

I had:

idmap uid 1 - 2

There cannot be any spaces... ugh, the time I spent looking at
completely different things!
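
For reference, the working form has a single unbroken range on each idmap line; the numeric ranges below are illustrative placeholders, not the poster's actual values:

  # smb.conf [global] - no spaces around the dash in the range
  idmap uid = 10000-20000
  idmap gid = 10000-20000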

Also, when adding a user to a group in a wbinfo_group.pl access list,
does squid need to be reloaded?

-Original Message-
From: Terry Dobbs [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 28, 2007 1:38 PM
To: squid-users@squid-cache.org
Subject: [squid-users] wbinfo_group.pl - This ever happen to anyone?

Ok, so I have wbinfo_group.pl working nicely on our local squid box, it
blocks users belonging to a particular group.

However, I have done the exact same thing on a remote box in the USA and
it doesn't want to work. When I run wbinfo -r username I get no results;
I used to get "could not get groups for user", now I get nothing
returned. I am pretty sure this is what is causing wbinfo_group.pl
not to work. The logs don't give me much useful information.

On the local box that wbinfo_group.pl is working I get the error Could
Not convert SID='S-1-5-21-1122444-424242525-5353622-42124- User(1) to
gid when I do it manually, but I thought nothing of it as it works. The
same thing happens on the box that isn't working as well, but that box
will not even do a wbinfo -r.

I have verified that I do indeed have idmap uid and gid mappings in
smb.conf. I have used the local Domain Controller at the remote site to
authenticate, thinking it may be timing out or something (is there a
timeout?). I have googled like crazy; I know this is an SMB issue but I'm
wondering if anyone has ever had a similar issue in their squid setup.

I would be VERY grateful if anyone has any kind of insight.


[squid-users] Anyone Use wbinfo_group.pl?

2007-11-27 Thread Terry Dobbs
Hey

I have a transparent proxy setup using squid, winbind, samba, etc... I
got sick of manually blocking IP addresses from accessing the internet
and stumbled across an article (thank god for google!) that allows
access based on AD Group.

It pretty much looks like...

external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoInternet external ntgroup NoInternet

Then there is the http_access deny line that denies the NoInternet
group.

This seems to work fine: if a user belongs to the NoInternet group they
are prompted for a username/password, and even if they put in the correct
credentials they aren't allowed to go anywhere.

My question is: instead of prompting for a username/password if a user
belongs to the group, how do I just redirect them to a page? At no other
time are my users prompted for authentication, as it uses the NT
pass-through credentials, so I'm not sure why it wants to prompt now.

Hoping someone out there is doing something similar? 

Thanks!


[squid-users] RE: [Bulk] Re: [squid-users] Anyone Use wbinfo_group.pl?

2007-11-27 Thread Terry Dobbs
Sorry, I just mean the authentication is transparent: the users just
open up IE and don't need to log in; it passes the credentials from
Windows...

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, November 27, 2007 9:36 PM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: [Bulk] Re: [squid-users] Anyone Use wbinfo_group.pl?

How do you mean transparent proxy? Are you referring to the
authentication being transparent, or are you referring to using port 80
TCP redirection rather than statically controlled proxy configurations
in browsers?



Adrian

On Tue, Nov 27, 2007, Terry Dobbs wrote:
 Hey
 
 I have a transparent proxy setup using squid, winbind, samba, etc... I
 got sick of manually blocking IP addresses from accessing the internet
 and stumbled across an article (thank god for google!) that allows
 access based on AD Group.
 
 It pretty much looks like...
 
 external_acl_type ntgroup %LOGIN /usr/lib/squid/wbinfo_group.pl
 acl NoInternet external ntgroup NoInternet
 
 Then there is the http_access deny line that denies the NoInternet
 group.
 
 This seems to work fine, if a user belongs to the NoInternet group they
 are prompted for Username/Password and even if they put in the correct
 credentials they aren't allowed to go anywhere. 
 
 My question is, instead of prompting for username/password if a user
 belongs to the group, how do I just redirect them to a page? No other
 time is my users prompted for authentication as it uses the NT pass
 through credentials, so not sure why it wants to prompt now.
 
 Hoping someone out there is doing something similar? 
 
 Thanks!

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid
Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -






RE: [squid-users] Confusing about login name in AD-proxy authentication?

2007-09-19 Thread Terry Dobbs
Couldn't you just use seamless authentication, where users don't have
to authenticate if they are already logged in?

-Original Message-
From: chowalit.lab Chowalit Lab Linux [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, September 19, 2007 12:26 PM
To: Kinkie
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Confusing about login name in AD-proxy
authentication?

Oh.. Bad news for me... Oh coding 

Thank you


On 9/17/07, Kinkie [EMAIL PROTECTED] wrote:
 On 9/15/07, chowalit.lab Chowalit Lab Linux [EMAIL PROTECTED] wrote:
  Dear All
   First of all I will explain my system. I have an authenticating
   proxy using accounts from a Windows 2003 server. I use NTLM.
   On the login pop-up, I must put MYDOMAIN\username into the login
   box. My question is: how do I configure my system (both Windows
   and squid) to support a login name like [EMAIL PROTECTED]?

 It's doable but it requires some coding on the auth helpers to parse
 and normalize the user name.


 --
 /kinkie



RE: [squid-users] Squid + WPAD issues

2007-06-06 Thread Terry Dobbs
Yes, you're right. I need the myIpAddress(), however like you said it
doesn't always work as desired. I also read somewhere that not all
browsers support that particular function. Right now that's what I'm
using (in theory I really don't care what proxy they use as they can
authenticate to either, but it makes logical and geographical sense to
distinguish between the two), but your idea seems pretty cool.

What exactly do you do though? What kind of script do you point them to;
is it the .pac JavaScript? (Any way we can see a sample?) I'm assuming
you do it in the Automatic Configuration Script field in Internet
Explorer, or do you still use the WPAD.dat file?

Thanks for any input.

-Original Message-
From: K K [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, June 06, 2007 4:30 AM
To: Terry Dobbs
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid + WPAD issues

On 6/5/07, Terry Dobbs [EMAIL PROTECTED] wrote:
 We have been using a proxy server with a WPAD.dat file for a year or 
 two. Now, we have setup another squid server in a remote site. I need 
 to configure the WPAD.dat file in a way where if you are on subnet A 
 use Proxy Server A and if you are on subnet B user proxy server B.

In my environment, I've solved this by having a single proxy script and
setting all browsers to use the same URL, but the server where the file
is hosted actually generates the contents on the fly.

This way the script can be customized by the server in ways not
supported in the client, including providing a different default proxy
server/port to different clients.
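
A minimal sketch of that idea as a CGI shell script (the subnets and proxy addresses are the ones from the wpad.dat quoted further down this thread, but the script itself, its name, and the matching patterns are illustrative assumptions, not Kevin's actual setup):

  #!/bin/sh
  # wpad.cgi - emit a proxy script whose default proxy depends on the client subnet.
  case "$REMOTE_ADDR" in
    192.168.*)     PROXY="192.168.10.14:3128" ;;
    192.150.170.*) PROXY="192.150.170.120:3128" ;;
    *)             PROXY="192.150.170.120:3128" ;;
  esac
  echo "Content-Type: application/x-ns-proxy-autoconfig"
  echo ""
  echo "function FindProxyForURL(url, host) {"
  echo "  if (isPlainHostName(host)) return \"DIRECT\";"
  echo "  return \"PROXY $PROXY\";"
  echo "}"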

The other reason I do this is to eliminate 99.9% of the DNS lookups by
the client -- in theory, we could disable Internet resolution by
internal workstations (we've done this once or twice, mostly by
accident) and so long as the proxy server was able to resolve, browsers
would never notice.


 For the life of me, I cannot get this to work. For example, I am using

 what is seen below, and it seems the only line that works is the
else
 statement so everyone is using the same server?

Where you say:
  if (isInNet(host, "192.168.0.0", "255.255.0.0"))

I think you meant:
  if (isInNet(myIpAddress(), "192.168.0.0", "255.255.0.0"))

While myIpAddress() is documented in the original Netscape
specification, it doesn't have provisions for hosts with multiple
interfaces. In the past I've seen false negatives, where the above test
returns false when it really should have been true.  That's one reason
we instead have the web server hosting the script look at REMOTE_ADDR
instead.
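
Putting that correction into the function from the original post gives roughly the following client-side sketch (with the same caveat as above about myIpAddress() on multi-homed machines):

  function FindProxyForURL(url, host)
  {
      if (isPlainHostName(host))
          return "DIRECT";
      // Choose the proxy by the *client's* address, not the destination host.
      if (isInNet(myIpAddress(), "192.168.0.0", "255.255.0.0"))
          return "PROXY 192.168.10.14:3128";
      return "PROXY 192.150.170.120:3128";
  }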


Kevin
--
http://wiki.squid-cache.org/Technology/WPAD
^Watch this space^




[squid-users] Squid + WPAD issues

2007-06-05 Thread Terry Dobbs
Hi All,

We have been using a proxy server with a WPAD.dat file for a year or
two. Now, we have set up another squid server in a remote site. I need to
configure the WPAD.dat file in a way where if you are on subnet A you use
proxy server A, and if you are on subnet B you use proxy server B.

For the life of me, I cannot get this to work. For example, I am using
what is seen below, and it seems the only line that works is the "else"
statement, so everyone is using the same server?

function FindProxyForURL(url, host)
{
    if (isPlainHostName(host))
        return "DIRECT";
    else if (isInNet(host, "192.168.0.0", "255.255.0.0"))
        return "PROXY 192.168.10.14:3128";
    else if (isInNet(host, "192.150.170.0", "255.255.255.0"))
        return "PROXY 192.150.170.120:3128";
    else
        return "PROXY 192.150.170.120:3128";
}

Any help would be GREATLY appreciated!! All machines run IE 6 or 7 and
are on Win2K/WinXP.

Thanks


[squid-users] Fw: [Bulk] [squid-users] Certain sites (mainly java) causing problems

2006-12-17 Thread Terry Dobbs

Nobody with similar issues?

- Original Message - 
From: Terry Dobbs [EMAIL PROTECTED]

To: squid-users@squid-cache.org
Sent: Friday, December 15, 2006 2:30 PM
Subject: [Bulk] [squid-users] Certain sites (mainly java) causing problems



Hi All,

I have been using squid for about 2 years and it has been working fairly 
well. We run a transparent proxy with NTLM authentication. The only 
headaches I get are when users are using certain websites. Sites that use 
java (such as banking sites, shipping sites, etc...) often cause these 
problems. The users get a prompt box asking for their credentials; I can
make this go away by setting up an ACL for each dstdomain, but sometimes
the websites are using external java applets. Another common problem when 
accessing these sites is I get a blank page and then about 2 minutes later 
the page loads.


I was wondering if many people are experiencing these types of issues, and 
what they have done to get around them?


Thanks for any help.
















[squid-users] Certain sites (mainly java) causing problems

2006-12-15 Thread Terry Dobbs

Hi All,

I have been using squid for about 2 years and it has been working fairly 
well. We run a transparent proxy with NTLM authentication. The only 
headaches I get are when users are using certain websites. Sites that use 
java (such as banking sites, shipping sites, etc...) often cause these 
problems. The users get a prompt box asking for their credentials; I can
make this go away by setting up an ACL for each dstdomain, but sometimes the
websites are using external java applets. Another common problem when 
accessing these sites is I get a blank page and then about 2 minutes later 
the page loads.


I was wondering if many people are experiencing these types of issues, and 
what they have done to get around them?


Thanks for any help. 







[squid-users] Squid+NTLM picks on a single user.. Cant figure it out.

2006-09-27 Thread Terry Dobbs

Hey,

We are running squid with NTLM authentication. We have WPAD set up in DHCP 
and everything has running fairly smoothly for the past 6 months.


Yesterday one user couldn't access the internet: the login box would always
pop up asking for his credentials. If he enters them it works fine, it just
doesn't do it quietly in the background like it should. This user's AD account
looks right, it is not locked out/disabled, etc... and 80 other users can
access the internet fine.


Now today, that user is not in the office, and another user is having the 
same issue. The only way for him to get on the internet is to manually type 
his credentials in the login prompt box that appears.


I have searched all the logs and can't find anything insightful, so you
guys/gals are my last resort. What could be happening??







Re: [squid-users] Is LDAP better than NTLM?

2006-09-14 Thread Terry Dobbs
Is there a guide somewhere that explains using NTLM Authentication via squid 
and restricting based on Winbindd groups?


- Original Message - 
From: Henrik Nordstrom [EMAIL PROTECTED]

To: Terry Dobbs [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Thursday, September 14, 2006 4:04 AM
Subject: Re: [squid-users] Is LDAP better than NTLM?







[squid-users] Is LDAP better than NTLM?

2006-09-13 Thread Terry Dobbs
Currently I am using NTLM authentication (with winbindd) to authenticate
users accessing the internet. This works pretty well after the initial
setup; however, there are nuances, like once the DC is restarted or loses
connectivity you need to restart the squid server (or winbindd) to get up
and running again.


My question is whether LDAP is a better option. Will using LDAP require a
user to log in to access the internet? The thing I like about NTLM is that it
uses the currently logged-on credentials, so the user doesn't need to
log in. I assume that by using LDAP I won't need to reboot the squid server if
the connection to the DC is temporarily lost? It would also be nice to
restrict users based on their AD group, which I will be able to do with LDAP.


Any opinions are appreciated, as well as any guides people may have. 







Re: [squid-users] thoughts about squidGuard?

2006-05-20 Thread Terry Dobbs
Sorry to butt in, but you can use squidGuard without any database. We have
been blocking sites in a production environment for over a year using text
files downloaded from the web.


- Original Message - 
From: Con Tassios [EMAIL PROTECTED]

To: squid-users@squid-cache.org
Sent: Saturday, May 20, 2006 3:25 AM
Subject: Re: [squid-users] thoughts about squidGuard?



On Fri, 19 May 2006, Philip Hachey wrote:


That's what I was hoping for: a package with all of the patches.

Unfortunately, I read this:
It needs a recent version of Berkeley Database (> 3.2 but < 4.x).
Since I'm using DB 4.2 and I do not wish to downgrade, I think I'll pass.


squidGuard can be made to use Berkeley DB 4.2 with the following patch


--- src/sgDb.c.orig 2004-03-09 03:45:59.0 +0100
+++ src/sgDb.c  2004-03-09 03:48:43.0 +0100
@@ -98,13 +98,13 @@
     if(createdb)
       flag = flag | DB_TRUNCATE;
     if ((ret =
-         Db->dbp->open(Db->dbp, dbfile, NULL, DB_BTREE, flag, 0664)) != 0) {
+         Db->dbp->open(Db->dbp, NULL, dbfile, NULL, DB_BTREE, flag, 0664)) != 0) {
       (void) Db->dbp->close(Db->dbp, 0);
       sgLogFatalError("Error db_open: %s", strerror(ret));
     }
   } else {
     if ((ret =
-         Db->dbp->open(Db->dbp, dbfile, NULL, DB_BTREE, DB_CREATE, 0664)) != 0) {
+         Db->dbp->open(Db->dbp, NULL, dbfile, NULL, DB_BTREE, DB_CREATE, 0664)) != 0) {
       sgLogFatalError("Error db_open: %s", strerror(ret));
     }
   }












Re: [squid-users] Updating Block Lists

2006-02-24 Thread Terry Dobbs
From my experience you have to reload squid to get it to recognize any
changes...

squid -k reconfigure never worked for me, but I use the command
/etc/init.d/squid reload and it seems to work as desired.


- Original Message - 
From: Joseph Zappacosta [EMAIL PROTECTED]

To: squid-users@squid-cache.org
Sent: Friday, February 24, 2006 11:11 AM
Subject: [squid-users] Updating Block Lists



Hello,
I have come into a situation where we are using squid to block adult web
sites.  I searched and searched and none of the previous posts seem to apply
to my situation.  squidGuard is not installed, but there exists a list in
the /etc directory called xxxsites.  I tested it and these are the sites
that are being blocked.  However, when I add sites to the list, they are
not then blocked.  Do I need some type of command to reinstate the list?


Thanks,

--
Joseph Zappacosta
Reading Public Library (Reading Consortium)
100 South Fifth Street
Reading, Pennsylvania 19602
Voice : 610-655-6350
FAX : 610-478-9035











Re: [squid-users] no auth for one domain?

2006-02-24 Thread Terry Dobbs
The dstdomain workaround works perfectly. I had a training site users needed
to access that contained WMPlayer streams, and users couldn't hear the
background speech and would get prompted for the userid/passwd.


I did the following... First, add an ACL for the domain.
acl NTLM_Bypass dstdomain foobar.com

Then allow the domain access, then the Authorized Users
http_access allow NTLM_Bypass
http_access allow AuthorizedUsers


- Original Message - 
From: nairb rotsak [EMAIL PROTECTED]

To: Mark Elsen [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Friday, February 24, 2006 3:57 PM
Subject: Re: [squid-users] no auth for one domain?



We ended up using AD Group policy to not go through
the proxy for that site... not ideal, but just to make
sure I understand the other way to do it

You can put the http_access with the acl before the
http_access allow_ntlm and it should work?

--- Mark Elsen [EMAIL PROTECTED] wrote:


 Is it possible to have my ntlm users go around 1
 domain? We can't seem to get a state web site (which
 uses a weird front end to its client... but it ends
 up on the web) to go through the proxy. When we sniff
 the traffic locally, it is popping up a 407, but there
 isn't any way to log in.

 I tried to put an acl and http_access higher in the
 list in the .conf, but that didn't seem to matter?


It would have been more productive to show the
line which you put for that domain in squid.conf;
offhand, it should probably resemble something like this:

acl ntlm_go_around dstdomain name-excluded-domain
...

http_access allow ntlm_go_around
http_access allow ntlm_users (provided proxy
AUTH ACL is named 'ntlm_users')

M.















Re: [squid-users] no auth for one domain?

2006-02-23 Thread Terry Dobbs
Reading this, would it be possible to not require AUTH for a certain MIME 
header?


http_access allow header_type
http_access allow ntlm_users (provided proxy AUTH ACL is named 
'ntlm_users')


Sorry for butting in, just wondering..

Thanks
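
What squid can match at http_access time is the request's User-Agent header, via the browser ACL type. A sketch of letting media-player requests through without authentication (the regex and the localnet ACL are illustrative assumptions, not a confirmed working config):

  # Match the User-Agent sent by Windows Media Player
  acl mediaplayer browser -i NSPlayer
  acl localnet src 192.168.0.0/16

  # Allow these requests from the LAN without credentials, before the
  # rule that requires proxy authentication.
  http_access allow mediaplayer localnet
  http_access allow ntlm_users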


- Original Message - 
From: Mark Elsen [EMAIL PROTECTED]

To: nairb rotsak [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Thursday, February 23, 2006 7:44 PM
Subject: Re: [squid-users] no auth for one domain?



Is it possible to have my ntlm users go around 1
domain?  We can't seem to get a state web site (which
uses a weird front end to its client... but it ends
up on the web) to go through the proxy.  When we sniff
the traffic locally, it is popping up a 407, but there
isn't any way to log in.

I tried to put an acl and http_access higher in the
list in the .conf, but that didn't seem to matter?



It would have been more productive to show the line which you put
for that domain in squid.conf; offhand, it should probably
resemble something like this:

acl ntlm_go_around dstdomain name-excluded-domain
...

http_access allow ntlm_go_around
http_access allow ntlm_users (provided proxy AUTH ACL is named 
'ntlm_users')


M.









[squid-users] squid + windows media player

2006-02-22 Thread Terry Dobbs
Has anyone got this working properly? When users access a page that plays a
.wav/mp3 there is a userid/password prompt. If you click cancel it goes away
until you go to the next page. These pages with the audio have an NSPlayer
user-agent header.


I don't have any rules set up to allow only header X. Surely someone has
this working right?


If not, I guess the users will just have to get in the habit of clicking 
cancel!


Thanks for any help 







[squid-users] Re: [Bulk] Re: [squid-users] Certain header not authenticating

2006-02-20 Thread Terry Dobbs
My squid server has apache on it as well. The wpad.dat (and proxy.pac) is on 
the squid/apache server.


The only time this prompt appears is when users are using sites with the 
NSPlayer/9.00.00.2980. I'm getting the header from the squid useragent.log 
file. The prompt is attached. The squid prompt box asks for the domain name 
right?


Guess I may be barking up the wrong tree here.





- Original Message - 
From: Mark Elsen [EMAIL PROTECTED]

To: Terry Dobbs [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Monday, February 20, 2006 6:27 AM
Subject: [Bulk] Re: [squid-users] Certain header not authenticating



Hey,

I am running squid/squidGuard with NTLM authentication. Everything works
perfectly, except there is a site that some employees use for interactive
training. It seems when these employees go to this site they are 
continually

prompted for username/password with wpad.domainname.com as the realm.

After investigating, the useragent log is showing this:
192.168.12.102 - - [15/Feb/2006:15:30:49 -0500] GET /wpad.dat HTTP/1.1 
200

251 - NSPlayer/9.00.00.2980

It seems that the NSPlayer header is somehow not retrieving the wpad file
correctly? If the users click cancel nothing happens - they can continue,
but it pops up when they click to go to the next page. This is a major
annoyance for some users, and has become a headache for me. I haven't
explicitly set anything in squid.conf to only allow certain headers, im 
not
even sure if you can. I have searched hell and high-water, but to no 
avail.


Does anyone have any ideas?



 - WPAD is normally used for automatic proxy detection in IE.
 - I don't understand why the remote webserver would offer a WPAD.dat
file for 'internet users'; that doesn't make sense.
 - The remote webserver may be broken.
 - The remote webserver may be broken.

M.



attachment: prompt.JPG


[squid-users] Re: [Bulk] Re: [squid-users] Re: [Bulk] Re: [squid-users] Certain header not authenticating

2006-02-20 Thread Terry Dobbs
Well, NTLM authentication is working perfectly for users accessing every
site except for a claritynet training site. I am able to report via SARG
using userids, and restrict sites using squidGuard.


There isn't any authentication setup on the webserver, its only use is to 
view SARG reports, SARG Realtime, and serve the wpad.dat file. Anyone 
(internally) can access these pages.


The part I don't understand is why when a user uses a page with the NSPlayer 
header it gives me the prompt. I'm not sure if it is squid giving me the 
problem, or Apache. I suspect it may be Apache, so I will look into that if 
nobody has encountered similar issues.



- Original Message - 
From: Mark Elsen [EMAIL PROTECTED]

To: Terry Dobbs [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Monday, February 20, 2006 9:35 AM
Subject: [Bulk] Re: [squid-users] Re: [Bulk] Re: [squid-users] Certain 
header not authenticating



On 2/20/06, Terry Dobbs [EMAIL PROTECTED] wrote:
My squid server has apache on it as well. The wpad.dat (and proxy.pac) is 
on

the squid/apache server.

The only time this prompt appears is when users are using sites with the
NSPlayer/9.00.00.2980. I'm getting the header from the squid useragent.log
file. The prompt is attached. The squid prompt box asks for the domain 
name

right?

Guess I may be barking up the wrong tree here.




- Note also that if the remote webserver uses NTLM, things won't
work either, as this protocol is not proxyable.

M.









[squid-users] Certain header not authenticating

2006-02-19 Thread Terry Dobbs

Hey,

I am running squid/squidGuard with NTLM authentication. Everything works 
perfectly, except there is a site that some employees use for interactive 
training. It seems when these employees go to this site they are continually 
prompted for username/password with wpad.domainname.com as the realm.


After investigating, the useragent log is showing this:
192.168.12.102 - - [15/Feb/2006:15:30:49 -0500] GET /wpad.dat HTTP/1.1 200 
251 - NSPlayer/9.00.00.2980


It seems that the NSPlayer header is somehow not retrieving the wpad file 
correctly? If the users click cancel nothing happens (they can continue),
but it pops up again when they click to go to the next page. This is a major
annoyance for some users, and has become a headache for me. I haven't
explicitly set anything in squid.conf to only allow certain headers; I'm not
even sure if you can. I have searched high and low, but to no avail.


Does anyone have any ideas?

I really appreciate it.


