Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread K K
On 3/22/08, Sadiq Walji <[EMAIL PROTECTED]> wrote:
> When squid fails, all the users cannot browse and we have to manually stop
> squid to bypass it. Is there any way/feature that enables to bypass squid
> automatically if and when it fails or has some problems?

Yes, use a PAC (Proxy Automatic Configuration) set in the browser.

 http://wiki.squid-cache.org/Technology/ProxyPac

The PAC script instructs the browser which explicit (non-transparent)
proxy or proxies to use, and can fall back to DIRECT.  For
Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
GPO.  For non-Microsoft browsers, this needs to be configured manually
on each client.
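
As a minimal sketch (the proxy name and port below are placeholders for
your own squid box), a PAC file that falls back to DIRECT looks like:

  function FindProxyForURL(url, host) {
    // Try the squid proxy first; if it is unreachable the browser
    // falls back to the next entry, i.e. a direct connection.
    return "PROXY proxy.example.com:3128; DIRECT";
  }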

PAC is supported in all modern graphical browsers.

Kevin


Re: [squid-users] Squid Future (was Re: [squid-users] Squid-2, Squid-3, roadmap)

2008-03-23 Thread Adrian Chadd
On Sat, Mar 22, 2008, Henrik Nordstrom wrote:

> Squid-3 is different and uses a splay tree for the memory nodes of the
> object, and should behave a lot better in this regard.

The bounds are probably saner but the runtime hit for small objects
is noticeable.

The real solution is a tree for offset lookups, and a linear walk of order
O(1) for subsequent sequential accesses.




Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread Tim Bates

K K wrote:

For
Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
GPO.  For non-microsoft, this needs to be configured manually on each
client.
  
For non-MS browsers you can often still use WPAD (Firefox on Linux, for 
example, can still do that).
You can also get a modified version of Firefox (made by FrontMotion) 
that supports GPO for certain settings.


TB


Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread Amos Jeffries

Tim Bates wrote:

K K wrote:

For
Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
GPO.  For non-microsoft, this needs to be configured manually on each
client.
  
For non-MS browsers you can often still use WPAD (Firefox on Linux for 
example can do that still).
You can also get a modified version of Firefox (made by FrontMotion) 
that supports GPO for certain settings.


TB


The only real trouble with WPAD is that it has never been formally 
standardised.

 Microsoft products use only the 'obsolete' DHCP methods of WPAD.
 Linux/Mac/*BSD products use the easier but non-official DNS methods of 
WPAD.


So you need to configure both methods for it to work properly on the 
network for all clients.
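
As a rough sketch of covering both (the hostnames, address and dhcpd
syntax are only examples, assuming ISC dhcpd and an example.com zone):

  # dhcpd.conf: the DHCP method, option 252 carries the PAC URL
  option wpad code 252 = text;
  option wpad "http://wpad.example.com/wpad.dat";

  ; example.com zone: the DNS method, clients simply try
  ; http://wpad.example.com/wpad.dat
  wpad    IN  A    192.0.2.10

A web server on that address then serves wpad.dat with the
application/x-ns-proxy-autoconfig MIME type.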


Amos
--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


[squid-users] Help on squid Installation

2008-03-23 Thread Anand K
Hi All,
I am trying to install Squid 2.6 on my SPARC system with the Solaris 10 OS.
When I run ./configure it creates the necessary files,
but when I try executing
# make

I get the following error:


/usr/ccs/bin//ar cru libbasic.a basic/auth_basic.o
ranlib libbasic.a
make[3]: Leaving directory `/squid/squid-2.6.STABLE18/src/auth'
make[3]: Entering directory `/squid/squid-2.6.STABLE18/src'
if gcc -DHAVE_CONFIG_H
-DDEFAULT_CONFIG_FILE=\"/usr/local/squid/etc/squid.conf\" -I. -I.
-I../include -I. -I. -I../include -I../include-Wall -g -O2 -MT
access_log.o -MD -MP -MF ".deps/access_log.Tpo" -c -o access_log.o
access_log.c; \
then mv -f ".deps/access_log.Tpo" ".deps/access_log.Po"; else
rm -f ".deps/access_log.Tpo"; exit 1; fi
if gcc -DHAVE_CONFIG_H
-DDEFAULT_CONFIG_FILE=\"/usr/local/squid/etc/squid.conf\" -I. -I.
-I../include -I. -I. -I../include -I../include-Wall -g -O2 -MT
acl.o -MD -MP -MF ".deps/acl.Tpo" -c -o acl.o acl.c; \
then mv -f ".deps/acl.Tpo" ".deps/acl.Po"; else rm -f
".deps/acl.Tpo"; exit 1; fi
if gcc -DHAVE_CONFIG_H
-DDEFAULT_CONFIG_FILE=\"/usr/local/squid/etc/squid.conf\" -I. -I.
-I../include -I. -I. -I../include -I../include-Wall -g -O2 -MT
asn.o -MD -MP -MF ".deps/asn.Tpo" -c -o asn.o asn.c; \
then mv -f ".deps/asn.Tpo" ".deps/asn.Po"; else rm -f
".deps/asn.Tpo"; exit 1; fi
if gcc -DHAVE_CONFIG_H
-DDEFAULT_CONFIG_FILE=\"/usr/local/squid/etc/squid.conf\" -I. -I.
-I../include -I. -I. -I../include -I../include-Wall -g -O2 -MT
authenticate.o -MD -MP -MF ".deps/authenticate.Tpo" -c -o
authenticate.o authenticate.c; \
then mv -f ".deps/authenticate.Tpo" ".deps/authenticate.Po";
else rm -f ".deps/authenticate.Tpo"; exit 1; fi
if gcc -DHAVE_CONFIG_H
-DDEFAULT_CONFIG_FILE=\"/usr/local/squid/etc/squid.conf\" -I. -I.
-I../include -I. -I. -I../include -I../include-Wall -g -O2 -MT
cache_cf.o -MD -MP -MF ".deps/cache_cf.Tpo" -c -o cache_cf.o
cache_cf.c; \
then mv -f ".deps/cache_cf.Tpo" ".deps/cache_cf.Po"; else rm
-f ".deps/cache_cf.Tpo"; exit 1; fi
cache_cf.c: In function `requirePathnameExists':
cache_cf.c:3047: error: `SIGHUP' undeclared (first use in this function)
cache_cf.c:3047: error: (Each undeclared identifier is reported only once
cache_cf.c:3047: error: for each function it appears in.)
make[3]: *** [cache_cf.o] Error 1
make[3]: Leaving directory `/squid/squid-2.6.STABLE18/src'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/squid/squid-2.6.STABLE18/src'
make[1]: *** [all] Error 2
make[1]: Leaving directory `/squid/squid-2.6.STABLE18/src'
make: *** [all-recursive] Error 1
bash-3.00#

What could be the problem? Kindly guide me on this and on how to resolve it.
Thanks in advance.

regards,
Andy


Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread ian j hart
On Sunday 23 March 2008 11:12:22 Amos Jeffries wrote:
> Tim Bates wrote:
> > K K wrote:
> >> For
> >> Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
> >> GPO.  For non-microsoft, this needs to be configured manually on each
> >> client.
> >
> > For non-MS browsers you can often still use WPAD (Firefox on Linux for
> > example can do that still).
> > You can also get a modified version of Firefox (made by FrontMotion)
> > that supports GPO for certain settings.
> >
> > TB
>
> The only real trouble with WPAD is that it has never been formally
> standardised.
>   Microsoft products use only the 'obsolete' DHCP methods of WPAD.

Are you sure about this?

IIRC I'm using only DNS. Which is clunky, but it works. (XP)

>   Linux/Mac/*BSD products use the easier but non-official DNS methods of
> WPAD.
>
> So you need to configure both methods for it to work properly on the
> network for all clients.
>
> Amos



-- 
ian j hart


Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread Amos Jeffries

ian j hart wrote:

On Sunday 23 March 2008 11:12:22 Amos Jeffries wrote:

Tim Bates wrote:

K K wrote:

For
Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
GPO.  For non-microsoft, this needs to be configured manually on each
client.

For non-MS browsers you can often still use WPAD (Firefox on Linux for
example can do that still).
You can also get a modified version of Firefox (made by FrontMotion)
that supports GPO for certain settings.

TB

The only real trouble with WPAD is that it has never been formally
standardised.
  Microsoft products use only the 'obsolete' DHCP methods of WPAD.


Are you sure about this?

IIRC I'm using only DNS. Which is clunky, but it works. (XP)


I'm not 100% on anything to do with WPAD, despite a few months 
experimenting with it and various setups.


I last tried it with XP and 2k running IE 5.5 SP1 and WindowsUpdate 
3-something or MicrosoftUpdate 1-something.


What versions of IE, WindowsUpdate/MicrosoftUpdate have you seen working 
with WPAD-DNS?






  Linux/Mac/*BSD products use the easier but non-official DNS methods of
WPAD.

So you need to configure both methods for it to work properly on the
network for all clients.

Amos




Amos
--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


[squid-users] ACLs and localhost

2008-03-23 Thread paul cooper
4 users, 1 machine, with squid running and a GUI



I'm having problems getting the time-based ACLs sorted. To test it I've
added a sat/sun ACL which should allow access between 08:00 and 10:00



 Config 1

hepworth emma # cat /etc/squid/squid.conf |grep ^acl
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1/255.255.255.255
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 22 # ssh
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 901 # SWAT
acl purge method PURGE
acl CONNECT method CONNECT
acl andrew proxy_auth REQUIRED
acl emma proxy_auth REQUIRED
acl QUERY urlpath_regex cgi-bin \?
acl apache rep_header Server ^Apache
acl weekends time SA 08:00-10:00
acl beforeschool  time MTWHF 07:30-09:00
acl afterschool  time  MTWHF 16:00-20:00
hepworth emma # cat /etc/squid/squid.conf |grep  ^http
http_port 3128
http_access allow emma weekends
http_access allow Safe_ports
http_access allow andrew
http_access deny localhost
http_access deny all


it asks me for a login (emma) and  then gives access

2008/03/23 16:05:44| aclCheckFast: list: 0x82a7748
2008/03/23 16:05:44| aclMatchAclList: checking all
2008/03/23 16:05:44| aclMatchAcl: checking 'acl all src 0.0.0.0/0.0.0.0'
2008/03/23 16:05:44| aclMatchIp: '127.0.0.1' found
2008/03/23 16:05:44| aclMatchAclList: returning 1
2008/03/23 16:05:44| aclCheck: checking 'http_access allow emma weekends'
2008/03/23 16:05:44| aclMatchAclList: checking emma
2008/03/23 16:05:44| aclMatchAcl: checking 'acl emma proxy_auth REQUIRED'
2008/03/23 16:05:44| aclMatchAcl: returning 0 sending authentication
challenge.
2008/03/23 16:05:44| aclMatchAclList: no match, returning 0
2008/03/23 16:05:44| aclCheck: requiring Proxy Auth header.
2008/03/23 16:05:44| aclCheck: match found, returning 2
2008/03/23 16:05:44| aclCheckCallback: answer=2
2008/03/23 16:05:44| The request GET http://grolma.no-ip.org/ is DENIED,
because it matched 'emma'
2008/03/23 16:05:44| The reply for GET http://grolma.no-ip.org/ is
ALLOWED, because it matched 'emma'
2008/03/23 16:05:49| aclCheckFast: list: 0x82a7748
2008/03/23 16:05:49| aclMatchAclList: checking all
2008/03/23 16:05:49| aclMatchAcl: checking 'acl all src 0.0.0.0/0.0.0.0'
2008/03/23 16:05:49| aclMatchIp: '127.0.0.1' found
2008/03/23 16:05:49| aclMatchAclList: returning 1
2008/03/23 16:05:50| aclCheck: checking 'http_access allow emma weekends'
2008/03/23 16:05:50| aclMatchAclList: checking emma
2008/03/23 16:05:50| aclMatchAcl: checking 'acl emma proxy_auth REQUIRED'
2008/03/23 16:05:50| aclMatchAcl: returning 0 sending credentials to helper.
2008/03/23 16:05:50| aclMatchAclList: no match, returning 0
2008/03/23 16:05:50| aclCheck: checking password via authenticator
2008/03/23 16:05:50| aclCheck: checking 'http_access allow emma weekends'
2008/03/23 16:05:50| aclMatchAclList: checking emma
2008/03/23 16:05:50| aclMatchAcl: checking 'acl emma proxy_auth REQUIRED'
2008/03/23 16:05:50| aclMatchUser: user is emma, case_insensitive is 0
2008/03/23 16:05:50| Top is (nil), Top->data is Unavailable
2008/03/23 16:05:50| aclMatchUser: user REQUIRED and auth-info present.
2008/03/23 16:05:50| aclMatchAclList: checking weekends
2008/03/23 16:05:50| aclMatchAcl: checking 'acl weekends time SA 08:00-10:00'
2008/03/23 16:05:50| aclMatchTime: checking 965 in 480-600, weekbits=41
2008/03/23 16:05:50| aclMatchAclList: no match, returning 0
2008/03/23 16:05:50| aclCheck: checking 'http_access allow Safe_ports'
2008/03/23 16:05:50| aclMatchAclList: checking Safe_ports
2008/03/23 16:05:50| aclMatchAcl: checking 'acl Safe_ports port 80 # http'
2008/03/23 16:05:50| aclMatchAclList: returning 1
2008/03/23 16:05:50| aclCheck: match found, returning 1
2008/03/23 16:05:50| aclCheckCallback: answer=1
2008/03/23 16:05:50| The request GET http://grolma.no-ip.org/ is ALLOWED,
because it matched 'Safe_ports'
2008/03/23 16:05:50| aclCheck: checking 'cache deny QUERY'
2008/03/23 16:05:50| aclMatchAclList: checking QUERY
2008/03/23 16:05:50| aclMatchAcl: checking 'acl QUERY urlpath_regex
cgi-bin \?'
2008/03/23 16:05:50| aclMatchRegex: checking '/'
2008/03/23 16:05:50| aclMatchRegex: looking for 'cgi-bin'
2008/03/23 16:05:50| aclMatchRegex: looking for '\?'
2008/03/23 16:05:50| aclMatchAclList: no match, returning 0
2008/03/23 16:05:50| aclCheck: NO match found, returning 1
2008/03/23 16:05:50| aclCheckCallback: answer=1
2008/03/23 16:05:50| clientProcessHit: HIT
2008/03/23 16:05:50| aclCheckFast: list: 0x82a7df8
2008/03/23 16:05:50| aclMatchAclList: checking all
2008/03/23 16:05:50| aclMatchAcl: checking 'acl all src 0.0.0.0/0.0.0.0'
2008/03/23 16:05:50| aclMatchIp: '127.0.0.1' found
2008/03/23 16:05:50| aclM

RE: [squid-users] When will be the bug 2206 included into the 3.0-HEAD project.

2008-03-23 Thread S.KOBAYASHI
Amos, 
I appreciate your quick answer. I got it, and I'm looking
forward to the 3.1 release.

Thanks a lot,
Seiji Kobayashi



-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Friday, March 21, 2008 5:49 PM
To: S.KOBAYASHI
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] When will be the bug 2206 included into the
3.0-HEAD project.

S.KOBAYASHI wrote:
> Hello there,
> 
Greetings.

> When will be the bug 2206 "drops Proxy-Authenticate header" included into
> the 3.0-HEAD project?

Okay two things here;

Firstly, 3.0 and 3-HEAD are two separate sets of code.
   There is a 3.0, already released and in STABLE cycles.
   3-HEAD will soon become a separate 3.1 release.

Secondly, bug 2206 was caused by a mistake made while fixing a less-important 
bug. We have temporarily re-broken that minor bug to fix 3.0.STABLE.

That has already been done, and daily snapshots of 3.0 already contain 
that fix. STABLE3 is due out end of next weekend with all that inside.

I would have released it by now with the fix, but our VCS change slowed 
the maintenance job down slightly during the changeover. We are just 
doing the countdown to see if any last-minute bugs occur or can be fixed.

> I don't understand the schedule yet when some bugs are found and fixed.

We don't have a schedule of when we are going to find or fix bugs. They 
are found by people when they are found. They are fixed as soon as 
anybody can identify both the problem and a good fix for that problem.

Usually that happens in the development version first, and is then ported 
back through the current stable releases.

This time we were lucky enough to find a quick-fix that can keep 3.0 
stable and working before having to find the much more difficult 
permanent fix.

Right now Christos is working on the fix for 3-HEAD (3.1) that will fix 
both 2206 and the other bug. We developers have decided that since 
3-HEAD is going to be in development for some months yet before it's ready 
for production use, we can spare the time for a better-quality patch there.

When the permanent fix is created we'll decide whether it's simple 
enough to port back to 3.0 as well.

Hope this helps.

Amos
-- 
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.



Re: [squid-users] BYPASS UPON FAILURE

2008-03-23 Thread Joel Jaeggli

Amos Jeffries wrote:

ian j hart wrote:

On Sunday 23 March 2008 11:12:22 Amos Jeffries wrote:

Tim Bates wrote:

K K wrote:

For
Windows/MSIE the setting can be done automatically by WPAD, DHCP, or
GPO.  For non-microsoft, this needs to be configured manually on each
client.

For non-MS browsers you can often still use WPAD (Firefox on Linux for
example can do that still).
You can also get a modified version of Firefox (made by FrontMotion)
that supports GPO for certain settings.

TB

The only real trouble with WPAD is that it has never been formally
standardised.
  Microsoft products use only the 'obsolete' DHCP methods of WPAD.


Are you sure about this?

IIRC I'm using only DNS. Which is clunky, but it works. (XP)


internet exploder will use dns...

firefox won't without configuration, which means effectively half your 
users won't.


The draft expired eons ago (it was already expired when I first taught how 
to use it in a workshop in 1999).


I'm not 100% on anything to do with WPAD, despite a few months 
experimenting with it and various setups.


I last tried it with XP and 2k running IE 5.5 SP1 and WindowsUpdate 
3-something or MicrosoftUpdate 1-something.


What versions of IE, WindowsUpdate/MicrosoftUpdate have you seen working 
with WPAD-DNS?






  Linux/Mac/*BSD products use the easier but non-official DNS methods of
WPAD.

So you need to configure both methods for it to work properly on the
network for all clients.

Amos




Amos




RE: [squid-users] ntlm_auth seems to have losts it mind

2008-03-23 Thread Martin, Jeremy
Ok, I logged in as the squid user and executed the command 
/usr/local/samba/bin/ntlm_auth --helper-protocol=squid-2.5-basic; I entered my 
domain\username and password and the reply was OK.  Squid still gives the error message.

Jeremy


-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
Sent: Fri 3/21/2008 6:32 PM
To: Martin, Jeremy
Cc: squid-users@squid-cache.org
Subject: RE: [squid-users] ntlm_auth seems to have losts it mind
 
On Fri, 2008-03-21 at 11:25 -0400, Martin, Jeremy wrote:
> 2.  This time I changed the /usr/local/samba/bin/ntlm_auth to run as
> root using chmod, just to make sure it has rights.

Don't do that.

> 3.  Created a squid user and a service group.  I made squid and the
> service group the owner of both the squid and samba folders in the
> /usr/local.

It's the privileged_pipe directory that this group needs access to.
Quite often found in /var/samba/ or similar locations..
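
Something along these lines usually does it (a sketch only; the directory
is normally called winbindd_privileged, its location varies by build, and
"squid" here stands in for whatever group your proxy runs as):

  # ask samba where its lock directory is
  /usr/local/samba/sbin/smbd -b | grep LOCKDIR
  # then let the squid group traverse the privileged pipe directory
  chgrp squid /usr/local/samba/var/locks/winbindd_privileged
  chmod 750   /usr/local/samba/var/locks/winbindd_privileged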

> 4.  wbinfo -t, -g and -u all do what they are supposed to, and ntlm_auth at
> the command prompt works correctly.

You need to run the ntlm_auth test as your cache_effective_user set in
squid.conf.
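
For example (a sketch only; substitute whatever cache_effective_user you
actually run squid as, assumed to be "squid" here, and use su -s /bin/sh
or sudo -u if that account has no login shell):

  su - squid -c "/usr/local/samba/bin/ntlm_auth --helper-protocol=squid-2.5-basic"
  mydomain\myuser mypassword
  OK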

Regards
Henrik



RE: [squid-users] How squid does Src/Dst IP address matching

2008-03-23 Thread Saurabh Agarwal
Thanks Amos, I have one follow-up question on your reply, though.

src - performs an OS call to retrieve the IP of the other end of the TCP

connection socket it's been given.

dst - retrieves the FQDN being looked up from the request headers, and 
performs a DNS lookup on it to retrieve the address.

>> To determine the dst IP address, why don't we perform an OS call
to retrieve the destination IP address? Is it technically possible? If
yes, how? If we can do it, then we can save some time on the DNS lookup
that squid performs.

Thanks
Saurabh
-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Monday, March 17, 2008 4:01 PM
To: Saurabh Agarwal
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] How squid does Src/Dst IP address matching

Saurabh Agarwal wrote:
> Hi 
> 
> Can someone please tell how does squid does the acl evaluation related
> to Src/Dst IP address? Like "acl myNet dst 10.0.0.0/255.255.0.0"
> 
> As I understand squid does not get to know the IP layer information
> which has the destination IP address field.
> 
> But in the HTTP header we have the name of the server like 
> "Host mail.yahoo.com", which can be used to determine the destination
IP
> Address.
> 
> Does squid resolves the IP address of mail.yahoo.com before it does
the
> Dst Address acls matching or evaluation?


With src and dst it differs in the methods of attaining the IP. But the 
evaluation is identical.

src - performs an OS call to retrieve the IP of the other end of the TCP

connection socket it's been given.

dst - retrieves the FQDN being looked up from the request headers, and 
performs a DNS lookup on it to retrieve the address.

Both then pass the IP to the ACL processing to be checked.

Amos
-- 
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


Re: [squid-users] How squid does Src/Dst IP address matching

2008-03-23 Thread Amos Jeffries

Saurabh Agarwal wrote:

Thanks Amos, I have one follow-up question on your reply, though.

src - performs an OS call to retrieve the IP of the other end of the TCP

connection socket it's been given.

dst - retrieves the FQDN being looked up from the request headers, and 
performs a DNS lookup on it to retrieve the address.



To determine the dst IP address, why don't we perform an OS call
to retrieve the destination IP address? Is it technically possible? If
yes, how? If we can do it, then we can save some time on the DNS lookup
that squid performs.


It's possible. Most OS provide getsockopt() calls to retrieve them.
Squid does not use these, in order to protect its cache against 
compromised users.
When trusting the user's requested dst-IP, a single infected web client 
retrieving a bad web page could poison the cache and pass the infection 
on to all other users.
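
For the curious, the OS call in question looks something like this (a
sketch only, assuming Linux with netfilter NAT interception; getsockopt()
and SO_ORIGINAL_DST are real Linux APIs, but this is not a quote of
Squid's own code, and for an explicit proxy the socket only knows the
proxy's own address anyway):

  #include <string.h>
  #include <netinet/in.h>
  #include <sys/socket.h>
  #include <linux/netfilter_ipv4.h>   /* SO_ORIGINAL_DST */

  #ifndef SOL_IP
  #define SOL_IP IPPROTO_IP           /* both are 0 on Linux */
  #endif

  /* Fetch the destination the client originally dialled, as recorded
   * by the NAT interception rule, from an accepted client socket. */
  int original_dst(int client_fd, struct sockaddr_in *dst)
  {
      socklen_t len = sizeof(*dst);
      memset(dst, 0, sizeof(*dst));
      if (getsockopt(client_fd, SOL_IP, SO_ORIGINAL_DST, dst, &len) != 0)
          return -1;   /* not an intercepted connection, or not Linux */
      return 0;
  }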


Amos



Thanks
Saurabh
-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Monday, March 17, 2008 4:01 PM

To: Saurabh Agarwal
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] How squid does Src/Dst IP address matching

Saurabh Agarwal wrote:
Hi 


Can someone please tell how does squid does the acl evaluation related
to Src/Dst IP address? Like "acl myNet dst 10.0.0.0/255.255.0.0"

As I understand squid does not get to know the IP layer information
which has the destination IP address field.

But in the HTTP header we have the name of the server like 
"Host mail.yahoo.com", which can be used to determine the destination

IP

Address.

Does squid resolves the IP address of mail.yahoo.com before it does

the

Dst Address acls matching or evaluation?



With src and dst it differs in the methods of attaining the IP. But the 
evaluation is identical.


src - performs an OS call to retrieve the IP of the other end of the TCP

connection socket it's been given.

dst - retrieves the FQDN being looked up from the request headers, and 
performs a DNS lookup on it to retrieve the address.


Both then pass the IP to the ACL processing to be checked.

Amos



--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


Re: [squid-users] ACLs and localhost

2008-03-23 Thread Amos Jeffries

paul cooper wrote:

4 users, 1 machine, with squid running and a GUI



I'm having problems getting the time-based ACLs sorted. To test it I've
added a sat/sun ACL which should allow access between 08:00 and 10:00




Your time ACL appears to be working. It's your usage of http_access 
that's screwing things up. Check the lines saying "request ALLOWED 
because it matched".




 Config 1

hepworth emma # cat /etc/squid/squid.conf |grep ^acl
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1/255.255.255.255
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 22 # ssh
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 901 # SWAT
acl purge method PURGE
acl CONNECT method CONNECT
acl andrew proxy_auth REQUIRED
acl emma proxy_auth REQUIRED
acl QUERY urlpath_regex cgi-bin \?
acl apache rep_header Server ^Apache
acl weekends time SA 08:00-10:00
acl beforeschool  time MTWHF 07:30-09:00
acl afterschool  time  MTWHF 16:00-20:00
hepworth emma # cat /etc/squid/squid.conf |grep  ^http
http_port 3128
http_access allow emma weekends

- fails on first test sequence
- allow request on second sequence

http_access allow Safe_ports

- allow request on first sequence
- never reached on second

http_access allow andrew

- never reached

http_access deny localhost

- never reached

http_access deny all

- never reached.



it asks me for a login (emma) and  then gives access




2008/03/23 16:05:44| The request GET http://grolma.no-ip.org/ is DENIED,
because it matched 'emma'


... bounce for login.




2008/03/23 16:05:50| The request GET http://grolma.no-ip.org/ is ALLOWED,
because it matched 'Safe_ports'


... bingo!




so I negate the time, and it still gives me access

hepworth emma # cat /etc/squid/squid.conf |grep ^http
http_port 3128
http_access allow emma !weekends
http_access allow Safe_ports
http_access allow andrew
http_access deny localhost
http_access deny all
hepworth emma #



2008/03/23 16:10:41| The request GET http://grolma.no-ip.org/ is DENIED,
because it matched 'emma'


... bounce for login again.



2008/03/23 16:10:47| The request GET http://grolma.no-ip.org/ is ALLOWED,
because it matched 'weekends'


... boing!




so I try denying emma and it gives me access without asking for a username

hepworth emma # cat /etc/squid/squid.conf |grep ^http
http_port 3128
http_access allow Safe_ports

- accepts all port 80 requests.

http_access allow andrew

- never reached

http_access deny localhost

- never reached

http_access deny emma

- never reached

http_access deny all

- never reached

hepworth emma #




2008/03/23 16:14:32| The request GET http://grolma.no-ip.org/ is ALLOWED,
because it matched 'Safe_ports'


.. bingo! on the first line.





I think it's giving me access from localhost.
I've commented out all the default localhost configs and added http_access
deny localhost but it's not stopping it.
How do I configure this?




Drop the global access to Safe_ports. And I do mean GLOBAL. You have an 
open-proxy on your hands.


It's best to use:
http_access deny !Safe_ports

to only use Safe_ports for blocking unsafe port usage.
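
In other words, something along these lines (a sketch only, keeping your
existing ACL names; adjust the allow lines to whatever policy you want):

  http_access deny !Safe_ports
  http_access deny CONNECT !SSL_ports
  http_access allow emma weekends
  http_access allow andrew
  http_access deny all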

Amos
--
Please use Squid 2.6STABLE17+ or 3.0STABLE1+
There are serious security advisories out on all earlier releases.


[squid-users] [Fwd: Squid + ClamAV]

2008-03-23 Thread Tarak Ranjan
Sorry, this question has already been answered.
Thank you

/\
Tarak

--- Begin Message ---
Hi List,
Has anyone done the integration of ClamAV with Squid?


/\
Tarak
--- End Message ---


[squid-users] Squid + ClamAV

2008-03-23 Thread Tarak Ranjan
Hi List,
Has anyone done the integration of ClamAV with Squid?


/\
Tarak




[squid-users] how to bypass firewall for some sites

2008-03-23 Thread Mr Crack
Is there any way in squid to bypass some sites that are banned by a firewall,
or special tools such as a squid plug-in ...?
Because my ISP has banned some sites such as GMail.
I don't want to use Windows software e.g. YourFreedom, UltraSurf.
I want to install it as server software with squid.


Any help is appreciated

Mr. Crack 007