[squid-users] Kerb auth with LDAP groups

2010-11-01 Thread Kelly, Jack
Hi everyone,
I've successfully set up authentication to my proxy with squid_kerb_auth
to get us away from using basic LDAP authentication for everything. I
used the config guide from the squid-cache wiki (below) which worked
perfectly.
http://wiki.squid-cache.org/ConfigExamples/Authenticate/Kerberos


One thing I'd like to do is continue using LDAP groups and/or
Organizational Units to grant access to certain websites. So my
question is in two parts:

Is there a way to use squid_ldap_auth such that it will only prompt for
credentials when you try to visit a certain website? (Previously I've
had it set up so it would prompt you right when the browser opens.)

Alternatively: Is there a straightforward equivalent to squid_ldap_group
when using Kerberos authentication?
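
For what it's worth, this is roughly the shape of config I'm imagining
for the second option; the squid_kerb_ldap helper and the group/realm
names are guesses on my part rather than something I've tested:

# Guessed sketch: look up the Kerberos-authenticated user's AD group
external_acl_type kerb_group ttl=3600 %LOGIN /usr/lib/squid3/squid_kerb_ldap -g InternetFull@EXAMPLE.COM
acl internetfull external kerb_group
acl restrictedsites dstdomain .example.org
http_access allow internetfull restrictedsites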

Running 3.1.1 on Ubuntu x64, installed from Synaptic.

Thanks!
Jack
 




[squid-users] Slowness in downloading files, but not web browsing

2010-05-20 Thread Kelly, Jack
Hi everyone,
We're running Squid 3.1.1 on a virtual Ubuntu x64 server sitting on a
fiber LUN. It's been up for a couple of months without any issues
until recently.

Over the past week or so I've had users calling in to report that
downloading files from the internet has been very slow. They'll start
out at a fast download speed, but it quickly drops to about 5 kb/sec.

If I circumvent the proxy server and connect to these sites directly,
the download goes through with no problem.

I've tried restarting the Squid service with no luck. Any suggestions?

Thanks
Jack
 




[squid-users] Keeping & archiving access.log

2010-01-12 Thread Kelly, Jack
Hi everyone,
Incredibly dumb question; I'm almost embarrassed to ask it.

My access.log only seems to store a day's worth of proxy traffic data.
Do I just need to add a squid3 -k rotate task to my crontab?

Also, when creating the VM to run Squid, I sized the disk to hold about
a month's worth of log data before it gets pulled off and archived by
our file server. 

So given that I fill up a logfile in about a day and want to keep a
month's worth of them, I should set logfile_rotate to something like
35, right? Are there any other config changes I need to make?
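
In other words, something like this is what I have in mind; the paths
are my guesses based on the Ubuntu squid3 package, so correct me if
they're off:

# squid.conf: keep roughly a month's worth of rotated logs on disk
logfile_rotate 35

# crontab entry: ask Squid to rotate its logs once a day at midnight
0 0 * * * /usr/sbin/squid3 -k rotate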

What I'm describing probably sounds *very* unusual, but so are the
requirements I've been given for this project :)

Thanks!
Jack
 




RE: [squid-users] Keeping & archiving access.log

2010-01-12 Thread Kelly, Jack
Logical, but the requirements of this project call for me to keep the
logfiles uncompressed while we store them.
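
So I'm guessing I'll end up with something very close to your config
below, just without the compress line and with enough rotations to
cover a month. Untested sketch, with paths adjusted to what I believe
the Ubuntu squid3 package uses:

/var/log/squid3/*.log {
 daily
 rotate 35
 missingok
 nocreate
 sharedscripts
 postrotate
 test ! -e /var/run/squid3.pid || /usr/sbin/squid3 -k rotate
 endscript
}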

-Original Message-
From: Guido Marino Lorenzutti [mailto:glorenzu...@jusbaires.gov.ar] 
Sent: Tuesday, January 12, 2010 11:32 AM
To: Kelly, Jack
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Keeping & archiving access.log

You should consider compressing the logs.
My two cents: this is my logrotate config for Squid.

/var/log/squid/*.log {
 daily
 compress
 rotate 31
 missingok
 nocreate
 sharedscripts
 postrotate
 test ! -e /var/run/squid.pid || /usr/sbin/squid -k rotate
 endscript
}




[squid-users] Simple pass-through authentication?

2009-12-01 Thread Kelly, Jack
Hi everyone,

I have a very broad question that I simply can't seem to find enough
documentation on. 

In my environment, my users authenticate against a Windows Server 2003
domain controller using squid_ldap_auth. I'm doling out permissions to
access certain websites this way and it's working splendidly.

Each time they open a web browser, they're prompted for their
credentials. This is fine, but to reduce the annoyance factor I'd
really like to find a way to implement a pass-through solution. I've
been able to find tidbits here and there on how to accomplish this
with NTLM, but I haven't seen any concrete examples of what to put in
my conf file.

Below are the relevant lines. Simply put, could someone briefly describe
what packages I need to add + configure, and what lines need to be added
to my conf file? (Running Squid 3.1 on Debian.) Any articles you think
would help would also be appreciated.

Thanks!

auth_param basic program /usr/lib/squid3/squid_ldap_auth -b redacted -D redacted -w redacted -d -f sAMAccountName=%s -h redacted
auth_param basic children 5
auth_param basic realm Proxy Service
auth_param basic credentialsttl 2 hours
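
For concreteness, this is roughly the shape of thing those NTLM
tidbits suggest to me; the Samba/winbind ntlm_auth helper below is a
guess on my part, not something I've tested:

# Guessed sketch: NTLM pass-through via Samba's ntlm_auth, listed ahead
# of the basic LDAP block above so browsers that speak NTLM never see a prompt
auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 10
auth_param ntlm keep_alive on

Presumably that also means installing winbind and joining the box to
the domain, but that's exactly the part I'd love confirmation on.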

Jack
 




[squid-users] Squid 2.7 with ICAP

2009-10-29 Thread Kelly, Jack
Hi everyone,
Another quick question I hope you don't mind answering. We're running
Squid 2.7 here and a need has recently arisen to have it talk to an ICAP
server. Squid 3.1 and 3.0 appear to explicitly support ICAP in
squid.conf, but 2.7 doesn't seem to have the lines in the config file
that you can uncomment to configure it to talk to an ICAP server. Is
upgrading to 3.0 or 3.1 our only option, or do plugins/patches exist
that would make this possible?
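
For reference, this is the sort of block I mean, copied as best I can
from the 3.1 examples, with a made-up service name and ICAP URL:

icap_enable on
icap_service myservice reqmod_precache bypass=0 icap://icap.example.com:1344/reqmod
adaptation_access myservice allow all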

Thanks! 
Jack
 




[squid-users] Skipping logging certain traffic in access.log?

2009-10-28 Thread Kelly, Jack
Hi everyone,
I have what will probably be a pretty simple question... unfortunately I
need to provide a few details to help explain what I'm trying to do and
why.
 
One of the big uses of Squid for our managers is seeing how much time
employees are spending on the internet. To that end, we've got Squint
installed to analyze our logs and generate a shiny report that does
exactly that, viewable as an HTML document hosted right on the Squid
box. Works great. We also authenticate with LDAP so requests can be
tied to user credentials in Squid. Again, works great.
 
Here's where the minor hiccup comes in:
I have an acl called 'passthrough' which is basically a list of
domains/keywords/etc that the proxy server will allow requests for
without prompting the user for their credentials. This comes in handy
for programs that like to check for updates online, like Adobe Reader
and iTunes. Unfortunately for my purposes, requests that go through
unauthenticated are recorded in access.log by requestor IP address,
which subsequently gets parsed by Squint and adds gobs of useless
information to the report.
 
So, my question:
Is there any way to get Squid to exclude certain types of records from
access.log? Or would I be better off just beefing up our PAC file to
send these 'passthrough' requests around the proxy?
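
Something like this is what I'm hoping exists; syntax guessed from
skimming the docs, so it may well be off:

# Guessing: skip writing 'passthrough' requests to access.log entirely
log_access deny passthrough
log_access allow all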
 
On second thought, I suppose I could just write and cron a perl script
that nukes lines containing an IP in our DHCP range right before Squint
updates. That feels messy though :)
 
Thanks everyone!
Jack
 




[squid-users] Strange issues with accessing facebook and other php driven sites via proxy

2009-10-08 Thread Kelly, Jack
Hi everyone,
At my office I've implemented a Squid server which uses LDAP credentials
to give certain users access to certain websites. Basically, everyone
belongs to a base 'Filtered' group, and individual users can be added to
a 'FacebookAccess' group for access to facebook. This is mainly because
some departments (read: marketing) need access to facebook while others
do not. 
 
I've only been working with Squid for about a month, and although I've
gotten pretty proficient at getting it to do what I want, I've
encountered what seems to be a higher-level problem.
 
Here's the relevant section of my conf file:
 
acl Unfiltered external InetGroup Unfiltered
acl FacebookAccess external InetGroup FacebookAccess
acl Filtered external InetGroup Filtered
 
acl blocksites url_regex "/etc/squid3/block.acl"
acl whitelist url_regex "/etc/squid3/whitelist.acl"
acl facebook url_regex .facebook.
acl fbcdn url_regex .fbcdn.
 
#Note: these two lines were added to troubleshoot
always_direct allow fbcdn
always_direct allow facebook  
 
http_access allow Unfiltered
http_access allow Filtered whitelist
http_access allow FacebookAccess facebook
http_access allow FacebookAccess whitelist
http_access deny Filtered blocksites
http_access deny FacebookAccess blocksites
http_access allow FacebookAccess
http_access allow Filtered

And here's the problem:
Users in the FacebookAccess group can get to www.facebook.com without
a problem, and users who are only in the Filtered group cannot. So
that's great. However, when they log in and reach
www.facebook.com/home.php?, they just get a white screen - sometimes.
Occasionally it works and occasionally it doesn't; there appears to be
no rhyme or reason to it. I've added .fbcdn. to my whitelist.acl file,
because I saw that content from that domain was getting denied when
facebook loads... but even after that, no go.
 
When I visit the site and log in, the access.log just shows:
 
jackk 08/Oct/2009 11:54:30 TCP_MISS/200 GET http://www.facebook.com/
jackk 08/Oct/2009 11:54:36 TCP_MISS/200 CONNECT login.facebook.com:443
jackk 08/Oct/2009 11:54:36 TCP_MISS/200 GET
http://www.facebook.com/home.php?
 
And to troubleshoot, I tried accessing facebook as a member of the
'Unfiltered' group, to which no restrictive acl policies apply. Same
problem. Meanwhile, a direct, proxy-free connection to facebook from
my office obviously works just fine.

I'm very, very stuck. Any advice on what to try next would be hugely
appreciated.
 
Thanks!
 
Jack Kelly
Network Services Administrator
W/S Development Associates, LLC
Chestnut Hill, MA
 




RE: [squid-users] Strange issues with accessing facebook and other php driven sites via proxy

2009-10-08 Thread Kelly, Jack
Erg, I should've mentioned: I'm running Squid 3.0. I've pored over a
lot of documentation and I haven't been able to decipher whether 3.0
natively supports HTTP/1.1, or has no support whatsoever because of
the differences in code between 2.7 and 3.1.

Regardless, I went back and added incoming and outgoing headers to my
access.log format to see what the deal is. Headers from facebook are
coming in as HTTP/1.0.

Is it still possible that my problem lies in needing to find a way to
enable HTTP/1.1?

-Original Message-
From: Chudy Fernandez [mailto:chudy_fernan...@yahoo.com] 
Sent: Thursday, October 08, 2009 1:04 PM
To: Kelly, Jack; squid-users@squid-cache.org
Subject: Re: [squid-users] Strange issues with accessing facebook and
other php driven sites via proxy

server_http11 on will do the trick

