RE: [squid-users] bind

2003-02-14 Thread Reckhard, Tobias
 http://cr.yp.to/djbdns/run-cache-bind-2.html
 
 Indicates bind and a cache won't get along. I don't quite get 
 it. Can you

Could you please point out what on that page you believe makes that indication? A
short glance revealed nothing to do with WWW caches. I should think all mentions
of the word 'cache' on that page refer to a DNS cache, i.e. a caching DNS proxy,
the server that goes out and looks up DNS RRs for clients. Check
http://homepages.tesco.net./~J.deBoynePollard/FGA/dns-server-roles.html.

Cheers,
Tobias



[squid-users] R: [squid-users] Client Computer Name in access.log

2003-02-14 Thread FRANCO Battista (Baky)
I set:
log_fqdn on
then ran
squid -k reconfigure
but it doesn't work :o

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Friday, 14 February 2003 1:43
To: FRANCO Battista (Baky)
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Client Computer Name in access.log

log_fqdn

Regards
Henrik


FRANCO Battista (Baky) wrote:
 
 In my access.log I find the client IP address and its URL links. Can I modify my
 configuration file to write the client computer name instead of the IP address?
 Thank You




RE: [squid-users] NTLM authentication in Cache Hierachy

2003-02-14 Thread Mark A Lewis
It would seem that option A would be the best one. Just set the parent
proxy to only accept requests from the child proxies. This would also
spread the load a bit. If logging is an issue, NFS is a possible
solution.

I am not very familiar with NFS, but is it possible for multiple proxies
to share one central log file? The Windows admin in me keeps screaming about
file locks, but AFAIK this isn't a problem on *nix. If this is not possible,
then each could at least keep its own log centrally.
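A minimal squid.conf sketch of that on the parent (the child addresses are made
up for illustration):

   acl child_proxies src 10.0.0.2 10.0.0.3
   http_access allow child_proxies
   http_access deny all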

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of
Henrik Nordstrom
Sent: Thursday, February 13, 2003 7:20 PM
To: Chris Vaughan
Cc: '[EMAIL PROTECTED]'
Subject: Re: [squid-users] NTLM authentication in Cache Hierachy


The browser can only authenticate to the first proxy. This is a
limitation of the HTTP protocol. It is then the responsibility of this
proxy to authenticate to any upstream proxy if required.

When using Basic HTTP authentication you can chain the authentication on
multiple proxies IFF all of them share the same password database. See
the cache_peer login= option. This also works for Digest if the first
proxy is not doing any authentication, but cannot be used for proxying
the NTLM authentication scheme.

If using the NTLM or Digest scheme on the first proxy you cannot forward the
authentication of the client to the upstream proxy. Your alternatives
are then to either

 a) Reconfigure the upstream to allow requests from the sibling without
requiring authentication

 b) Use the login= cache_peer option on the sibling to specify which
user the sibling should authenticate as to the upstream proxy.
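A minimal sketch of (b) on the downstream proxy, with a placeholder host and
credentials (ports are the usual 3128 HTTP / 3130 ICP):

   cache_peer upstream.example.com parent 3128 3130 login=childproxy:secret

For the shared-password Basic case mentioned above, login=PASS forwards the
user's own credentials instead.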

Regards
Henrik


Chris Vaughan wrote:
 
 Greetings.
 
 I am trying to authenticate from a sibling cache using ntlm, sending 
 requests out through a parent.
 
 If the parent uses NCSA auth, the sibling serves back pages that 
 cannot be navigated due to authentication failures.
 
 If the parent is also using ntlm, then a password/userid prompt, that 
 will not accept any input, appears.
 
 Any Ideas?
 





Re: [squid-users] R: [squid-users] Client Computer Name in access.log

2003-02-14 Thread Henrik Nordstrom
And are the names of your client stations registered for their IP addresses
in your DNS servers?

(if not, how do you expect Squid to be able to know the computer name..)
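A quick way to check is to ask for the PTR record of one of the client
addresses (the address here is just an example):

   host 192.168.1.23
   dig -x 192.168.1.23

If no name comes back, log_fqdn has nothing it can log.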

Regards
Henrik


FRANCO Battista (Baky) wrote:
 
 I set :
 log_fqdn on
 after
 squid -k reconfigure
 but it doesn't work :o
 
 -Messaggio originale-
 Da: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
 Inviato:venerdi 14 febbraio 2003 1.43
 A:  FRANCO Battista (Baky)
 Cc: [EMAIL PROTECTED]
 Oggetto:Re: [squid-users] Client Computer Name in access.log
 
 log_fqdn
 
 Regards
 Henrik
 
 FRANCO Battista (Baky) wrote:
 
  In my access.log i find client Ip address and its url links can i modify my
  confiuratin file to write client computer name instad of IP address.
  Thank You



Re: [squid-users] problem assessing IP address using iptables DNAT

2003-02-14 Thread Henrik Nordstrom
You need an OUTPUT DNAT rule as well (and NAT of local connections
enabled in your kernel config), or to use a redirector helper rewriting
the IP address to the real destination..
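For the rule quoted below that would be roughly (a sketch reusing the same
addresses):

   iptables -t nat -A OUTPUT -p tcp -d 2.2.2.1 --dport 80 -j DNAT --to 192.168.100.4:80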

Regards
Henrik



Siew Wing Loon wrote:
 
 Hi,
 
 I have my squid running fine but am having a problem
 accessing an IP address using iptables DNAT.
 
 FW + Squid using External IP Address 2.2.2.1
 
 iptables -t nat -A PREROUTING -p tcp -d 2.2.2.1
 --dport 80 -j DNAT --to 192.168.100.4:80
 
 When I try to access 2.2.2.1 via Squid, it displays
 the error message below:
 
 ERROR
 The requested URL could not be retrieved
 
 --
 
 While trying to retrieve the URL: http://2.2.2.1/test/
 
 The following error was encountered:
 
 Connection Failed
 The system returned:
 
 (111) Connection refused
 The remote host or network may be down. Please try the
 request again.
 
 Rgds,
 Siew
 



[squid-users] R: [squid-users] R: [squid-users] Client Computer Name in access.log

2003-02-14 Thread FRANCO Battista (Baky)
Yes, it is: from my Linux server I can ping clientcomputername.


-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Friday, 14 February 2003 11:06
To: FRANCO Battista (Baky)
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] R: [squid-users] Client Computer Name in access.log

And is the name of your client stations registered on their IP addresses
in your DNS servers?

(if not, how do you expect Squid to be able to know the computer name..)

Regards
Henrik


FRANCO Battista (Baky) wrote:
 
 I set :
 log_fqdn on
 after
 squid -k reconfigure
 but it doesn't work :o
 
 -Messaggio originale-
 Da: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
 Inviato:venerdi 14 febbraio 2003 1.43
 A:  FRANCO Battista (Baky)
 Cc: [EMAIL PROTECTED]
 Oggetto:Re: [squid-users] Client Computer Name in access.log
 
 log_fqdn
 
 Regards
 Henrik
 
 FRANCO Battista (Baky) wrote:
 
  In my access.log i find client Ip address and its url links can i modify my
  confiuratin file to write client computer name instad of IP address.
  Thank You




[squid-users] File descriptor with squid2.5

2003-02-14 Thread Niti Lohwithee
Dear ALL,

I plan to build a new proxy using squid 2.5 on Redhat 7.2 (kernel
2.4.7-10). There will be 3,000 users accessing the new box. I am not sure about
the file descriptor settings for squid 2.5.

(1) If I use squid 2.5, do I still need to adjust the file descriptor limit or not?
(2) If (1) is required, what should ulimit -HSn be set to?
(3) Are there any other related settings?

Anyone recommend me ☺


Regards and Thanks
Niti : )







Re: [squid-users] File descriptor with squid2.5

2003-02-14 Thread Marc Elsen


Niti Lohwithee wrote:
 
 Dear ALL,
 
 I have plan to build new proxy using squid2.5 with Redhat 7.2 
(kernel 2.4.7-10). There are 3,000 user to access the new box. I am not sure about 
file descriptor setting on squid2.5.
 
 (1) If I use squid 2.5 I still setting file descriptor or not ?

 I would recommend observing the default 'FD usage' first  using
cachemgr.
 There may not be an initial need for raising the default (1024).
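 From the command line the same information can be read via the cache
 manager interface, for example:

    squidclient mgr:info | grep -i 'file desc'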
 
 M.

 (2) If (1) is required . ulimit -HSn # should be ?
 (3) What is another setting about it?
 
 Anyone recommend me ☺
 
 Regards and Thanks
 Niti : )
 
 
 

-- 

 'Time is a consequence of Matter thus
 General Relativity is a direct consequence of QM
 (M.E. Mar 2002)



Re: [squid-users] ssl in accelerator

2003-02-14 Thread Emilio Casbas
Henrik Nordstrom wrote:


Emilio Casbas wrote:

 

i can use the https_port directive successfully, i.e. i use squid as a
ssl gateway:

client -- (https) --- Squid -- (http) -- origin server
Accelerator

is it possible to use encrypted connections on both legs
(Squid to the origin server as well)? I don't think so, but why?
   


Squid-2.5.STABLE does not know how to initiate SSL connections.

This will be possible in Squid-3.

The functionality is also available as a patch to Squid-2.5 from
http://devel.squid-cache.org/

 

Thanks Henrik!  

but...
What I have seen is that the redirector
(I use a redirector in accelerator mode) gets an http:// URL instead of an
https:// URL. How can I make the redirector receive an https:// URL?
Is that possible?

Thanks in advance.
Emilio.


 






Re: [squid-users] squid and webalizer

2003-02-14 Thread Edward D. Millington
Once squid is configured with its default logging options,
webalizer can process the squid logs normally.

Just read the webalizer conf for more config options.

Thank you very much.

Best regards

Edward Millington
BSc, Network+, I-Net+, CIW Associate
Systems Administrator, Sr
Cariaccess Communications Ltd.
Palm Plaza
Wildey
St. Michael
Barbados

Phone:  1 246 430 7435
Mobile: 1 246 234 6278
Fax:    1 246 431 0170

[EMAIL PROTECTED]
www.cariaccess.com


-Original Message-
From: Siew Wing Loon [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Date: Thu, 13 Feb 2003 19:21:48 -0800 (PST)
Subject: [squid-users] squid and webalizer

 Hi,
 
 How can I configure squid to allow webalizer to
 analyse the access.log file?  Does they both work
 together?
 
 Rgds,
 Siew
 




Re: [squid-users] squid and webalizer

2003-02-14 Thread Lucas Brasilino
Hi



How can I configure squid to allow webalizer to
analyse the access.log file?  Does they both work
together?

Rgds,
Siew


	
	Yes. You just have to configure squid to generate
its logs in NCSA style.
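	For example, in squid.conf (this switches the access log to the common/NCSA format):

	emulate_httpd_log on

	followed by squid -k reconfigure.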

--

[]'s
Lucas Brasilino
[EMAIL PROTECTED]
http://www.recife.pe.gov.br
Emprel -	Empresa Municipal de Informatica (pt_BR)
		Municipal Computing Enterprise (en_US)
Recife - Pernambuco - Brasil
Fone: +55-81-34167078




[squid-users] IP based access control through restricting password reuse

2003-02-14 Thread Mr. Singh

Hi Users

 My local network IP addresses are as follows (fictitious):

156.160.1.1 to 156.160.45.255.  I have configured user authentication
too. What I am planning is to allow a user to browse the
internet from a particular range of computers only. Can I achieve this
arrangement through access control lists? If so, what is the way to
achieve it?

T. Singh



-- 




[squid-users] squid and php-sites

2003-02-14 Thread alp
hi,
i am not sure if squid is required to not cache sites without suitable
headers (lastmod, expires,...).
does anybody know?
it seems as if for such objects the refresh-patterns are NOT used in
squid.conf.
is this right?

thx in advance,
alp

- Original Message -
From: SSCR Internet Admin [EMAIL PROTECTED]
To: alp [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Friday, February 14, 2003 1:11 AM
Subject: RE: [squid-users] question concerning php-sites and caching -still
some questions


 some sites dont want their pages to cached, so i guess squid will
eventually
 reload pages.

 -Original Message-
 From: alp [mailto:[EMAIL PROTECTED]]
 Sent: Wednesday, February 12, 2003 11:01 PM
 Cc: [EMAIL PROTECTED]
 Subject: Re: [squid-users] question concerning php-sites and caching
 -still some questions


 thanks marc,

 i knowed this page already, it's a really nice one.
 but my problem is: does squid never caches an object without validation
 headers (expires, max-age, lastmod,...)?
 if i have a refresh-pattern like
 refresh_pattern . 0 20% 5
 such an object should retain at most 5 minutes in cache, shouldn't it?
 or is refresh_pattern only used if an object has validation headers?

 thx in advance,
 alp

 - Original Message -
 From: Marc Elsen [EMAIL PROTECTED]
 To: alp [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Sent: Wednesday, February 12, 2003 5:05 PM
 Subject: Re: [squid-users] question concerning php-sites and caching


 
 
  alp wrote:
  
   hi,
   i have on my webserver a simple php site which i query via squid 2.5.
   this works (of course) and i see that no last_modified or
expiry-header
 is
   replied, which is correct for dynamic sites, too, as far as i know
   i have no cache_deny for php-sites and only the usual refresh_patterns
 of
   default squid.conf.
  
   squid does not cache this php side (also ok), but my question is: why?
   is it hardcoded into squid not to cache php-sites, or is the missing
of
   expiry and last_mod headers the reason for this?
 
Most probably, you may,for instance check objects (urls)
with :
 
http://www.ircache.net/cgi-bin/cacheability.py
 
M.
 
  
   thx in advance,
   alp
 
  --
 
   'Time is a consequence of Matter thus
   General Relativity is a direct consequence of QM
   (M.E. Mar 2002)
 





Re: [squid-users] File descriptor with squid2.5

2003-02-14 Thread Henrik Nordstrom
For Squid-2.5 and later, running ulimit -HSn N (with N the desired limit) both
before you run configure AND before you start Squid should be sufficient.

See also http://devel.squid-cache.org/hno/linux-lfd.html
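For example (the limit value and paths are placeholders):

   ulimit -HSn 8192
   ./configure ...
   make all install

and again in the script that starts Squid:

   ulimit -HSn 8192
   /usr/local/squid/sbin/squid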

Regards
Henrik

Niti Lohwithee wrote:
 
 Dear ALL,
 
 I have plan to build new proxy using squid2.5 with Redhat 7.2 
(kernel 2.4.7-10). There are 3,000 user to access the new box. I am not sure about 
file descriptor setting on squid2.5.
 
 (1) If I use squid 2.5 I still setting file descriptor or not ?
 (2) If (1) is required . ulimit -HSn # should be ?
 (3) What is another setting about it?
 
 Anyone recommend me ☺
 
 Regards and Thanks
 Niti : )
 
 




Re: [squid-users] FTP CLIENT.

2003-02-14 Thread Marc Elsen


Ampugnani, Fernando wrote:
 
 Marc:
 I fixed it by adding port 21 to the SSL ports. It isn't recommended, but it is the
 only option I have.
 
 What do you think about this?

  It could be dangerous, in security terms.
  Remember the threads about port 25 being open for CONNECT and the SPAM
  relaying abuse of squid.

  Meaning that it would be wise to use calm ftp clients in
  security terms, and/or to secure this access to squid from unintended use.

 M.

 
 Regards.
 
 -Original Message-
 From: Marc Elsen [mailto:[EMAIL PROTECTED]]
 Sent: Friday, February 14, 2003 4:32 AM
 To: Ampugnani, Fernando
 Subject: Re: [squid-users] FTP CLIENT.
 
 Ampugnani, Fernando wrote:
 
  If in the program option I can configure that go for http port neither ?
 
  No because ftp is a different protocol at the client side.
  Squid does know the ftp protocol however and can fetch files
  via ftp, but only for web browser requests which always use
  the http protocol at the client side.
 
  M.
 
  Regards.
 
  -Original Message-
  From: Marc Elsen [mailto:[EMAIL PROTECTED]]
  Sent: Thursday, February 13, 2003 1:49 PM
  To: Ampugnani, Fernando
  Cc: [EMAIL PROTECTED]
  Subject: Re: [squid-users] FTP CLIENT.
 
  Ampugnani, Fernando wrote:
  
   Hi all,
   Is there any ftp client that work with squid?, because I need
 that
   my squid users make ftp through squid.
   Ws_ftp, Cuteftp and Cupertino don't work.
 
   No , because squid deals with http.
 
   Squid can not be used as a native ftp proxy
 
   M.
 
  
   Thanks in advance.
 
  --
 
   'Time is a consequence of Matter thus
   General Relativity is a direct consequence of QM
   (M.E. Mar 2002)
 
 --
 
  'Time is a consequence of Matter thus
  General Relativity is a direct consequence of QM
  (M.E. Mar 2002)

-- 

 'Time is a consequence of Matter thus
 General Relativity is a direct consequence of QM
 (M.E. Mar 2002)



Re: [squid-users] R: [squid-users] R: [squid-users] Client Computer Name in access.log

2003-02-14 Thread Jason M. Kusar
You may be able to do this, but this really has nothing to do with squid
finding the name.  You need REVERSE DNS.  This is set up totally independent
of normal DNS.  Reverse DNS allows you to map IP addresses back to names.
If your computers are on a private network, you can set this up for yourself
using bind or some other DNS server.  If you are on a public network, you
need to talk to your ISP about getting reverse DNS set up for your IP
addresses.

--Jason

- Original Message - 
From: FRANCO Battista (Baky) [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, February 14, 2003 5:09 AM
Subject: [squid-users] R: [squid-users] R: [squid-users] Client Computer
Name in access.log


 Yes it's because from my Server Linux i can ping clientcomputername 


 -Messaggio originale-
 Da: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
 Inviato: venerdi 14 febbraio 2003 11.06
 A: FRANCO Battista (Baky)
 Cc: [EMAIL PROTECTED]
 Oggetto: Re: [squid-users] R: [squid-users] Client Computer Name in
access.log

 And is the name of your client stations registered on their IP addresses
 in your DNS servers?

 (if not, how do you expect Squid to be able to know the computer name..)

 Regards
 Henrik


 FRANCO Battista (Baky) wrote:
 
  I set :
  log_fqdn on
  after
  squid -k reconfigure
  but it doesn't work :o
 
  -Messaggio originale-
  Da: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
  Inviato:venerdi 14 febbraio 2003 1.43
  A:  FRANCO Battista (Baky)
  Cc: [EMAIL PROTECTED]
  Oggetto:Re: [squid-users] Client Computer Name in access.log
 
  log_fqdn
 
  Regards
  Henrik
 
  FRANCO Battista (Baky) wrote:
  
   In my access.log i find client Ip address and its url links can i
modify my
   confiuratin file to write client computer name instad of IP address.
   Thank You




[squid-users] Re: NT Authentication

2003-02-14 Thread Henrik Nordstrom
Yes. See the Squid FAQ entry on configuring Squid to use winbind for
authentication.

If using the ntlm authentication scheme then passwords are somewhat
encrypted on the wire.
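A minimal Squid 2.5 sketch (the helper name and path depend on how Squid was
built, so treat them as placeholders):

   auth_param ntlm program /usr/local/squid/libexec/wb_ntlmauth
   auth_param ntlm children 5
   acl authenticated proxy_auth REQUIRED
   http_access allow authenticated
   http_access deny all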

Regards
Henrik

Cildemac Marques wrote:
 
 Hi Henrik!
 
 I was browsing Squid-list when I saw your msg about authenticating users
 based on an NT PDC.
 I know it's possible to do it, but I would like to know if it's possible
 to do it by group, I mean.
 
 1. Not even all authenticated users would have the right to access the
 internet, so it opens a door for creating groups: if the user
 belongs to that group, he's able to access the internet.
 2. The second point in my scenario might be harder, because I have a lot
 of different groups of people; how do I set it so that my boss can see
 porn and chat while others cannot?
 3. And last but not least, as my network has about 25,000 users all
 over the country and I have 8 different Squids in 8 different geographic
 areas, it's quite possible that some smart guy will try to steal passwords,
 and by doing this he will also have the login/password of the
 NT domain. So the point is, would it be possible to encrypt it or something?
 
 Thank you very much in advance for reading my email.
 And once again thank you very much for supporting me with reverse proxy
 6 months ago; I made some modifications and now it's working in a
 spectacular way! ;)
 
 Kind regards,
 
 Mac
 Brazil



Re: [squid-users] R: [squid-users] R: [squid-users] Client Computer Name in access.log

2003-02-14 Thread Henrik Nordstrom
Jason M. Kusar wrote:
 
 You may be able to do this, but this really has nothing to do with squid
 finding the name.  You need REVERSE DNS.  This is set up totally independent
 of normal DNS.  Reverse DNS allows you to map IP addresses back to names.
 If your computers are on a private network, you can set this up for yourself
 using bind or some other DNS server.  If you are on a public network, you
 need to talk to your ISP about getting reverse DNS set up for your IP
 addresses.

For private use you can do so even without help of the ISP. Just create
a DNS zone in the DNS server(s) used by Squid mapping the IP addresses
to names with suitable PTR records.. This will only be known to the
applications using your DNS servers however..
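A minimal BIND sketch of such a zone, assuming clients in 192.168.1.0/24 and
made-up names:

   ; zone "1.168.192.in-addr.arpa" in named.conf points at this file
   $TTL 86400
   @    IN  SOA  ns1.example.lan. hostmaster.example.lan. (
                 2003021401 3600 900 604800 86400 )
        IN  NS   ns1.example.lan.
   10   IN  PTR  pc10.example.lan.
   11   IN  PTR  pc11.example.lan.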

Regards
Henrik



Re: [squid-users] squid and php-sites

2003-02-14 Thread alp
thx henrik,

is it possible to change squid's behaviour to use a refresh pattern for such
sites, too (without changing the source code)? I mean sites without any
validation headers.

I know this may cause a lot of problems, but it may also be useful sometimes.

- Original Message -
From: Henrik Nordstrom [EMAIL PROTECTED]
To: alp [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Friday, February 14, 2003 2:30 PM
Subject: Re: [squid-users] squid and php-sites


 By default it does not. The RFC does not require it either way, as there are
 explicit headers for instructing caches, but common sense recommends not
 to, as such pages are often dynamically generated by programs not aware
 of caching.

 The refresh patterns are used, but only if there are no headers denying
 the object from being cached.

 If unsure use the cacheability engine to check the status of the page in
 question.

 Regards
 Henrik


 alp wrote:
 
  hi,
  i am not sure if squid is required to not cache sites without suitable
  headers (lastmod, expires,...).
  does anybody know?
  it seems as if for such objects the refresh-patterns are NOT used in
  squid.conf.
  is this right?
 
  thx in advance,
  alp
 
  - Original Message -
  From: SSCR Internet Admin [EMAIL PROTECTED]
  To: alp [EMAIL PROTECTED]
  Cc: [EMAIL PROTECTED]
  Sent: Friday, February 14, 2003 1:11 AM
  Subject: RE: [squid-users] question concerning php-sites and
caching -still
  some questions
 
   some sites dont want their pages to cached, so i guess squid will
  eventually
   reload pages.
  
   -Original Message-
   From: alp [mailto:[EMAIL PROTECTED]]
   Sent: Wednesday, February 12, 2003 11:01 PM
   Cc: [EMAIL PROTECTED]
   Subject: Re: [squid-users] question concerning php-sites and caching
   -still some questions
  
  
   thanks marc,
  
   i knowed this page already, it's a really nice one.
   but my problem is: does squid never caches an object without
validation
   headers (expires, max-age, lastmod,...)?
   if i have a refresh-pattern like
   refresh_pattern . 0 20% 5
   such an object should retain at most 5 minutes in cache, shouldn't it?
   or is refresh_pattern only used if an object has validation headers?
  
   thx in advance,
   alp
  
   - Original Message -
   From: Marc Elsen [EMAIL PROTECTED]
   To: alp [EMAIL PROTECTED]
   Cc: [EMAIL PROTECTED]
   Sent: Wednesday, February 12, 2003 5:05 PM
   Subject: Re: [squid-users] question concerning php-sites and caching
  
  
   
   
alp wrote:

 hi,
 i have on my webserver a simple php site which i query via squid
2.5.
 this works (of course) and i see that no last_modified or
  expiry-header
   is
 replied, which is correct for dynamic sites, too, as far as i know
 i have no cache_deny for php-sites and only the usual
refresh_patterns
   of
 default squid.conf.

 squid does not cache this php side (also ok), but my question is:
why?
 is it hardcoded into squid not to cache php-sites, or is the
missing
  of
 expiry and last_mod headers the reason for this?
   
  Most probably, you may,for instance check objects (urls)
  with :
   
  http://www.ircache.net/cgi-bin/cacheability.py
   
  M.
   

 thx in advance,
 alp
   
--
   
 'Time is a consequence of Matter thus
 General Relativity is a direct consequence of QM
 (M.E. Mar 2002)
   
  





Re: [squid-users] squid and php-sites

2003-02-14 Thread Henrik Nordstrom
As I said in the previous message: refresh_pattern IS USED for replies
with no validation headers.

Only if the content is EXPLICITLY MARKED AS NOT CACHEABLE (or NOT
CACHEABLE) by the server is refresh_pattern not used by Squid.

Regards
Henrik

alp wrote:
 
 thx henrik,
 
 is it possible to change squid's behaviour to use a refresh-pattern for such
 sites, too? (without changing the source code) I mean sites without any
 validation headers.
 
 I know this may cause a lot problems, but it may also be useful sometimes.
 
 - Original Message -
 From: Henrik Nordstrom [EMAIL PROTECTED]
 To: alp [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Sent: Friday, February 14, 2003 2:30 PM
 Subject: Re: [squid-users] squid and php-sites
 
  By default it does not. The RFC does not require either way as there is
  explicit headers for instructing caches, but common sense recommends not
  to as such pages are often dynamically generated by programs not aware
  of caching.
 
  The refresh patterns are used, but only if there is no headers denying
  the object from being cached.
 
  If unsure use the cacheability engine to check the status of the page in
  question.
 
  Regards
  Henrik
 
 
  alp wrote:
  
   hi,
   i am not sure if squid is required to not cache sites without suitable
   headers (lastmod, expires,...).
   does anybody know?
   it seems as if for such objects the refresh-patterns are NOT used in
   squid.conf.
   is this right?
  
   thx in advance,
   alp
  
   - Original Message -
   From: SSCR Internet Admin [EMAIL PROTECTED]
   To: alp [EMAIL PROTECTED]
   Cc: [EMAIL PROTECTED]
   Sent: Friday, February 14, 2003 1:11 AM
   Subject: RE: [squid-users] question concerning php-sites and
 caching -still
   some questions
  
some sites dont want their pages to cached, so i guess squid will
   eventually
reload pages.
   
-Original Message-
From: alp [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, February 12, 2003 11:01 PM
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] question concerning php-sites and caching
-still some questions
   
   
thanks marc,
   
i knowed this page already, it's a really nice one.
but my problem is: does squid never caches an object without
 validation
headers (expires, max-age, lastmod,...)?
if i have a refresh-pattern like
refresh_pattern . 0 20% 5
such an object should retain at most 5 minutes in cache, shouldn't it?
or is refresh_pattern only used if an object has validation headers?
   
thx in advance,
alp
   
- Original Message -
From: Marc Elsen [EMAIL PROTECTED]
To: alp [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Wednesday, February 12, 2003 5:05 PM
Subject: Re: [squid-users] question concerning php-sites and caching
   
   


 alp wrote:
 
  hi,
  i have on my webserver a simple php site which i query via squid
 2.5.
  this works (of course) and i see that no last_modified or
   expiry-header
is
  replied, which is correct for dynamic sites, too, as far as i know
  i have no cache_deny for php-sites and only the usual
 refresh_patterns
of
  default squid.conf.
 
  squid does not cache this php side (also ok), but my question is:
 why?
  is it hardcoded into squid not to cache php-sites, or is the
 missing
   of
  expiry and last_mod headers the reason for this?

   Most probably, you may,for instance check objects (urls)
   with :

   http://www.ircache.net/cgi-bin/cacheability.py

   M.

 
  thx in advance,
  alp

 --

  'Time is a consequence of Matter thus
  General Relativity is a direct consequence of QM
  (M.E. Mar 2002)

   
 



[squid-users] squid and rpms with winbindd - group auth?

2003-02-14 Thread Markus Feilner
Hello list,
I have successfully configured my squid and samba to use wbinfo_group.pl to 
let only members of the AD group WWW_Benutzer access the Internet.
Therefore i used samba 2.2.7 rpms from gd.tuwien.ac.at and the tarball of 
squid-2.5.-STABLE1 from squid-cache.org.
Now: does anyone know if there are RPMs out there which provide squid 2.5
with wbinfo_group.pl or wb_group for SuSE?
I only found lots of RPMs without that feature, and some only for Red Hat which
didn't work on my system.
I am using SuSE 8.1.
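For reference, the squid.conf side of that working setup is roughly this (the
helper path is a placeholder; WWW_Benutzer is the AD group mentioned above):

   external_acl_type nt_group %LOGIN /usr/local/squid/libexec/wbinfo_group.pl
   acl InternetUsers external nt_group WWW_Benutzer
   http_access allow InternetUsers
   http_access deny all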
-- 
With kind regards
Markus Feilner

May you always grok in fullness!

Please note our new email address!

-
Feilner IT Linux  GIS Erlangerstr. 2 93059 Regensburg
fon: +49 941 70 65 23  - mobil: +49 170 302 709 2 
web: http://feilner-it.net mail: [EMAIL PROTECTED]





Re: [squid-users] squid and rpms with winbindd - group auth?

2003-02-14 Thread Markus Feilner
Am Freitag, 14. Februar 2003 16:16 schrieb Markus Feilner:
 Hello list,
 I have successfully configured my squid and samba to use wbinfo_group.pl to
 let only members of the AD group WWW_Benutzer access the Internet.
 Therefore i used samba 2.2.7 rpms from gd.tuwien.ac.at and the tarball of
 squid-2.5.-STABLE1 from squid-cache.org.
 Now: Does anyone know, if there are rpms out there which do provide squid
 2.5 with wbinfo_group.pl or wb_group
  for SuSE?
 I only found lots of rpms without that feature, and some only for redhat
 which didn`t work on my system.
 I am Using SuSE 8.1
BTW, I am working on an easy-to-use HOWTO on binding squid authentication to
an AD / NT domain based on membership in a particular group.
It will be posted on my webpage soon - any help is welcome, PM me!
thanks 
-- 
With kind regards
Markus Feilner

May you always grok in fullness!

Please note our new email address!

-
Feilner IT Linux  GIS Erlangerstr. 2 93059 Regensburg
fon: +49 941 70 65 23  - mobil: +49 170 302 709 2 
web: http://feilner-it.net mail: [EMAIL PROTECTED]





Re: [squid-users] squid and webalizer

2003-02-14 Thread Edward D. Millington
Which is default

-Original Message-
From: Lucas Brasilino [EMAIL PROTECTED]
To: Siew Wing Loon [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Date: Fri, 14 Feb 2003 09:02:59 -0300
Subject: Re: [squid-users] squid and webalizer

 Hi
 
  
  How can I configure squid to allow webalizer to
  analyse the access.log file?  Does they both work
  together?
  
  Rgds,
  Siew
 
   
   Yes. You just have to configure squid generate
 its logs in NCSA style.
 
 -- 
 
 []'s
 Lucas Brasilino
 [EMAIL PROTECTED]
 http://www.recife.pe.gov.br
 Emprel -  Empresa Municipal de Informatica (pt_BR)
   Municipal Computing Enterprise (en_US)
 Recife - Pernambuco - Brasil
 Fone: +55-81-34167078




Re: [squid-users] Strange behavior using winbind and sibling caches

2003-02-14 Thread Henrik Nordstrom
Hugo Monteiro wrote:

 2) Another thing i noticed is that on some sites through HTTPS with authentication
(authentication with HTML forms) the session immediately expires once i've logged in.
 I suppose this is because of the round-robin parent caches, because different
objects are fetched by different caches and the webserver thinks it's a different
connection (?), or maybe it's because of poor session handling scripts on the
client or webserver part, i don't know. Does this happen to someone else?

round-robin parents have a tendency to break web servers doing session
management based on the client IP address..

Regards
Henrik



Re: [squid-users] squid and webalizer

2003-02-14 Thread Henrik Nordstrom
Not in any Squid I have seen... default is Squid native log format which
is inherently different from the NCSA style (or common) log format.

Regards
Henrik


Edward D. Millington wrote:
 
 Which is default
 
 -Original Message-
 From: Lucas Brasilino [EMAIL PROTECTED]
 To: Siew Wing Loon [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Date: Fri, 14 Feb 2003 09:02:59 -0300
 Subject: Re: [squid-users] squid and webalizer
 
  Hi
 
  
   How can I configure squid to allow webalizer to
   analyse the access.log file?  Does they both work
   together?
  
   Rgds,
   Siew
 
 
Yes. You just have to configure squid generate
  its logs in NCSA style.
 
  --
 
  []'s
  Lucas Brasilino
  [EMAIL PROTECTED]
  http://www.recife.pe.gov.br
  Emprel -  Empresa Municipal de Informatica (pt_BR)
Municipal Computing Enterprise (en_US)
  Recife - Pernambuco - Brasil
  Fone: +55-81-34167078



Re: [squid-users] squid and php-sites

2003-02-14 Thread Henrik Nordstrom
Two questions:

1. What is your refresh_pattern settings?

2. What are the full headers returned by your server?

Just tested this with Squid-2.5 and a reply with only a Date header and
some content is cached if your refresh_pattern says it should be.


Note: The default refresh_pattern settings do not cache such replies
for the reasons indicated before.

Regards
Henrik


alp wrote:
 
 sorry, i misunderstood your first reply.
 BUT:
 i have a site test.php (without any php-code, just for testing the suffix)
 on an apache server.
 it sends this site only with the DATE-header. no lastmod, no expires. it
 also does not mark the object as not cacheable.
 so the refresh-pattern IS used, as you say.
 
 so, first call:
  echo -e "GET /test.php HTTP/1.0\nHost:myhost\n\n" | netcat squidserver 80
  gives the file together with the above header (date)
  second call:
  echo -e "GET /test.php HTTP/1.0\nHost:myhost\ncache-control:only-if-cached\n\n" | netcat squidserver 80
 it says: object is not in cache.
 
 ???
 doing the same with a file test.html i see the lastmod header and it is of
 course cached.
 
 i still seem to miss some important point in understanding this, i guess.
 but for me it seems as if the refresh-pattern is not used.



[squid-users] Ignore

2003-02-14 Thread Richard StClair


How can you get squid to ignore sites that have the 'Cache-Control: no-cache' 
option set in the initial HTTP packets so that they'll cache anyway??


-- 
Regards,
Richard Saint Clair,
Co-Founder Technical Manager
Internet Users Society Niue
Chairman, Pacific Island Chapter ISOC

[EMAIL PROTECTED] www.niue.nu
Voice (68 3) 4630 Fax (68 3) 4237
Internet Service Provider, Niue Island

ISP/C, ISOC, APIA, NCOC, ISOCNZ, PICISOC, ARRL

Niue Island South Pacific 169 West 19 South

Don't forget, Nuts feed the squirrels.





Re: [squid-users] squid and webalizer

2003-02-14 Thread Edward Millington
Just as I said.

Leave the squid option at its default:
#Default:
# emulate_httpd_log off

and configure webalizer to use the squid log format.

In webalizer.conf, change #LogType apache to:

LogType squid


Read the webalizer config for help.

- Original Message -
From: Jason M. Kusar [EMAIL PROTECTED]
To: Edward D. Millington [EMAIL PROTECTED]; Lucas Brasilino
[EMAIL PROTECTED]; Siew Wing Loon [EMAIL PROTECTED];
[EMAIL PROTECTED]
Sent: Friday, February 14, 2003 4:12 PM
Subject: Re: [squid-users] squid and webalizer


 Actually, you can just use webalizer to analyze squid logfiles.  I use
 version 2.01 and it has an option to read the squid standard format.  No
 changes to squid are necessary.

 --Jason

 - Original Message -
 From: Edward D. Millington [EMAIL PROTECTED]
 To: Lucas Brasilino [EMAIL PROTECTED]; Siew Wing Loon
 [EMAIL PROTECTED]; [EMAIL PROTECTED]
 Sent: Friday, February 14, 2003 1:22 PM
 Subject: Re: [squid-users] squid and webalizer


  Which is default
 
  -Original Message-
  From: Lucas Brasilino [EMAIL PROTECTED]
  To: Siew Wing Loon [EMAIL PROTECTED]
  Cc: [EMAIL PROTECTED]
  Date: Fri, 14 Feb 2003 09:02:59 -0300
  Subject: Re: [squid-users] squid and webalizer
 
   Hi
  
   
How can I configure squid to allow webalizer to
analyse the access.log file?  Does they both work
together?
   
Rgds,
Siew
  
  
   Yes. You just have to configure squid generate
   its logs in NCSA style.
  
   --
  
   []'s
   Lucas Brasilino
   [EMAIL PROTECTED]
   http://www.recife.pe.gov.br
   Emprel - Empresa Municipal de Informatica (pt_BR)
   Municipal Computing Enterprise (en_US)
   Recife - Pernambuco - Brasil
   Fone: +55-81-34167078





[squid-users] Delivering cached websites when internet is unreachable

2003-02-14 Thread Paul Cox
Hello,

I'm looking for a way to have a squid http proxy completely cache a web 
page like cnn.com and other major pages and be able to serve that cached 
page to clients even when the proxy's internet connection is no longer 
available (interface down, default gateway down, or some other failure). 
  In other words, if squid is unable to establish a connection to a web 
server after a defined time-out period, it would serve the page as it 
last appears in the cache. Is this a possible configuration for Squid? 
Thanks.

Paul



RE: [squid-users] IP based access control through restricting password reuse

2003-02-14 Thread SSCR Internet Admin
You can create an acl for it... Like

acl privilege_ip src "/etc/squid/ip_add"

where the contents of ip_add are one address per line:
156.160.1.1/32
156.160.45.5/32
and so on

then

http_access allow privilege_ip
http_access deny all


Nats

-Original Message-
From: Mr. Singh [mailto:[EMAIL PROTECTED]]
Sent: Friday, February 14, 2003 2:20 AM
To: [EMAIL PROTECTED]
Subject: [squid-users] IP based access control through restricting
password reuse



Hi Users

 My local network  ip address is as follows(however fictitious)

156.160.1.1 to 156.160.45.255 .  I have configured user authentication
too. Now  What I am planning is to allow a  user  to browse the
internet  from a particular range of computers only. Can I achieve this
arrangement through access control list ?? If so what is the way to
achieve this? 

T. Singh



-- 




Re: [squid-users] Ignore

2003-02-14 Thread Henrik Nordstrom
You can't without modifying the source.

Regards
Henrik


Richard StClair wrote:
 
 How can you get squid to ignore sites that have the 'Cache-Control: no-cache'
 option set in the initial HTTP packets so that they'll cache anyway??
 
 --
 Regards,
 Richard Saint Clair,
 Co-Founder Technical Manager
 Internet Users Society Niue
 Chairman, Pacific Island Chapter ISOC
 
 [EMAIL PROTECTED] www.niue.nu
 Voice (68 3) 4630 Fax (68 3) 4237
 Internet Service Provider, Niue Island
 
 ISP/C, ISOC, APIA, NCOC, ISOCNZ, PICISOC, ARRL
 
 Niue Island South Pacific 169 West 19 South
 
 Don't forget, Nuts feed the squirrels.



RE: [squid-users] Ignore

2003-02-14 Thread SSCR Internet Admin
Hmm, that sounds interesting. Henrik, can you provide us step-by-step code
for this? This is for a non-programmer like me...

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On Behalf Of
Henrik Nordstrom
Sent: Friday, February 14, 2003 5:27 PM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Ignore


You can't without modifying the source.

Regards
Henrik


Richard StClair wrote:

 How can you get squid to ignore sites that have the 'Cache-Control:
no-cache'
 option set in the initial HTTP packets so that they'll cache anyway??

 --
 Regards,
 Richard Saint Clair,
 Co-Founder Technical Manager
 Internet Users Society Niue
 Chairman, Pacific Island Chapter ISOC
 
 [EMAIL PROTECTED] www.niue.nu
 Voice (68 3) 4630 Fax (68 3) 4237
 Internet Service Provider, Niue Island
 
 ISP/C, ISOC, APIA, NCOC, ISOCNZ, PICISOC, ARRL
 
 Niue Island South Pacific 169 West 19 South
 
 Don't forget, Nuts feed the squirrels.




RE: [squid-users] using jesred with squid

2003-02-14 Thread Tushar Gupta

 

Hi,

I am trying to use jesred as a redirection program with squid. I have
installed jesred and added the appropriate entries in squid.conf. But, for
some reason jesred doesn't seem to work. It looks like requests are not
getting redirected to jesred at all. No logs that might give a hint are being
generated, either in the messages file or by jesred.
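For reference, the squid.conf entries usually amount to something like this
(the jesred path is an assumption; cache.log should show whether the helpers
start at all):

   redirect_program /usr/local/bin/jesred
   redirect_children 5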

Can anybody provide any tips on how to go about debugging this?

Tushar








RE: [squid-users] Password resuse

2003-02-14 Thread khiz code
Hie

i tried the suggestions
my config is 

authenticate_ip_ttl  1 hour  
authenticate_ip_ttl_is_strict on

However i have observed that the user name and password can be reused on some
other client machine within the authenticate_ip_ttl time period ??


 have i missed something here?

pls do get back
TIA
Khiz
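For reference, the Squid 2.5 equivalent of the strict 2.4 behaviour discussed
below is roughly (a sketch only):

   acl one_ip_per_user max_user_ip -s 1
   http_access deny one_ip_per_user

together with authenticate_ip_ttl to control how long an IP binding is
remembered.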
 

--- Prasanta kumar Panda [EMAIL PROTECTED] wrote:
 
 
 Hi Khiz
 
 Don't use strict then.
 
 For 2.4
 authenticate_ip_ttl_is_strict off
 
 For 2.5
 Don't use -s for max_user_ip.
 
 This will prompt for a second time password every time the IP gets
 changed. If some one else is using the username/password of your (valid
 user) the (valid user) will be prompted for password frequently which
 will make him not to share his credential to other. But this will not
 help if you have some sort of tools where you can hardcode the
 credential.
 Reg.
 Prasanta
 
 
 
 -Original Message-
 From: khiz code [mailto:[EMAIL PROTECTED]] 
 Sent: Tuesday, February 11, 2003 7:23 PM
 To: Prasanta kumar Panda; [EMAIL PROTECTED]
 Subject: RE: [squid-users] Password resuse
 
 
 thanks for the reply
 well this will bind the user to  that specific IP address
 what if the  (valid user) were to move to another PC during that period
 itself .. i guess im talking non sense 
 
 henrick ..any pointers ???
 
 TIA KHiz
 
 --- Prasanta kumar Panda [EMAIL PROTECTED] wrote:
  
  
  Hi Khiz
  
  If using 2.4 squid:
  Just set the time for authenticate_ip_ttl and make 
  authenticate_ip_ttl_is_strict on ( is default)
  Ex:
  authenticate_ip_ttl 2 hour
  authenticate_ip_ttl_is_strict on
  
  For 2.5 Squid
  
  authenticate_ip_ttl_is_strict option is served by acl aclname 
  max_user_ip [-s] number
  
  Use this acl to match and then deny the request. Also you can give a 
  custom error page as supported by 2.5
  
  Reg.
  Prasanta
  
  
  
  -Original Message-
  From: khiz code [mailto:[EMAIL PROTECTED]]
  Sent: Tuesday, February 11, 2003 6:20 PM
  To: [EMAIL PROTECTED]
  Subject: [squid-users] Password resuse
  
  
  Hie gurus

  i ve got a peculiar requirement
   
  after a user authenticates himeslf to squid (using any of the 
  available
  mechanisms) i need to be able to restrict the user  to that particular
  machine as such  time that he is browsing using that machine. SO
 during
  such time , no other user should be able to use the same user name and
  password on some other machine ..
  
  
  however once he has logged off (??) , the user name and password can 
  be re used on some other machine
  
  I know this is more of a policy issue, wherein passwods should  not be
 
  revealed, but wondering if Technology could do the rescue act :-0)
  
  Thanks in advance
  khiz
  
  
  
  
  
  
  
  
  
 
 
 





[squid-users] Re: anyone know why this is blocked?

2003-02-14 Thread Jeff Donovan
Rick you are my hero!
is there any way to find out which entries in the expressionlist are
the culprit?

thanks for the tips. The dual log is awesome.


--jeff

On Thursday, February 13, 2003, at 05:11 PM, Rick Matthews wrote:

Jeff Donovan wrote:


i have a transparent proxy running squid 2.5 and squidguard.
everything is working fine.
however when I was surfing around i came to :
http://www.netbsd.org

now that domain loads fine. but when i click on   Documentation/FAQ 

I get redirected to my Denied file.
I greped my blacklists for the domain, url, and ip and nothing came
back. Then I manually searched ( what a bugger)


It's not blocked here.

As Darren has already mentioned, there are a few things that you can
do when you are setting up squidGuard that will greatly simplify your
research efforts:

- Use squidGuard.cgi (from the /samples folder) for redirects.  That
will give you a redirect page that resembles this:
http://home1.gte.net/res0pj61/squidguard/redirect-sample.gif

- If you can't (or would prefer not to) run cgi, you can still
redirect to a different page from each group.  For example, you might
redirect the porn group to http://home1.gte.net/res0pj61/403prn.html
and the drugs group to http://home1.gte.net/res0pj61/403drgs.html.

- For clarity and ease of use, add a redirect statement to every
destination block.  They could all point to the same location, or
they might all be different.  For starters, I'd recommend pointing
everything but the ads group to the squidGuard.cgi page.  The ads
group should be redirected to a transparent 1x1.gif (or png).

- For clarity and ease of use, add a log statement to every
destination block.  For starters, I'd recommend logging everything
but the ads group to blocked.log.  The ads group should be
logged to ads.log.  This will log the important information
about every block, to greatly simplify research.

- If you use the logic presented in the first 2 tips above, you do
not need a redirect statement in any acl sections where the
pass statement ends with all.  You do need a redirect statement
in the acl sections where the pass statement ends with none.

- If you are using an allowed destination group, remember that any
domains entered there have a free pass, even if the domain or
subdomains are listed in blocked destination groups.  The allowed
group should be listed first in your acl, pass allowed !porn 
It is not necessary to have a redirect and log statement in your
allowed group.

- Be extremely careful with expressionlists!  As an example,
remember that your porn expressionlist will define a combination
that, if it appears in a url, will cause it to be classified as a
porn url.  Therefore, that combination should never appear in a
non-porn url.  (Repeat the previous two sentences for each group
that contains an expressionlist, replacing porn with the name
of the destination group.)  I only use 2 expressionlists, both in
areas where the terminology is fairly unique - porn and ads.

- My expressionlists are not in the same destination groups with
domains and urls.  I have a porn group and a pornexp group, the latter
containing only the porn expressionlist.  I also have ads and adsexp
groups.  This is extremely helpful in debugging and correcting
false blocks.  Knowing the destination group that caused the block
immediately tells you whether you have a database or expressionlist
problem.

- Separating the database files from the expressionlists also allows
you to gauge the effectiveness of your expressionlist.  Put the
database before the expressionlist in your pass statement
(pass !porn !pornexp...).  You can then examine your blocked.log
file knowing that if a url was blocked by pornexp, it was not in
the porn databases and would have been approved except for the
expressionlist.

- More information on isolating expressionlist blocks for easier
problem identification:

Here's a small change that you can make to your squidGuard.conf file
so that you will immediately know if you've been blocked by the porn
database or by the porn expressionlist.

Instead of setting up your porn destination group like this:

 not this way --
dest porn {
	domainlist		porn/domains
	urllist		porn/urls
	expressionlist	porn/expressions
	redirect		http://yourserver.com/whatever...
	logfile		blocked.log
}
-  end  

Break out the expressionlist and set it up like this:

-- Recommended --
dest porn {
	domainlist		porn/domains
	urllist		porn/urls
	redirect		http://yourserver.com/whatever...
	logfile		blocked.log
}

dest pornexp {
	expressionlist	expressions
	redirect		http://yourserver.com/whatever...
	logfile		blocked.log
}
-  end  -

Then replace [!porn] with [!porn !pornexp] in your acl and you'll
have exactly the same coverage as before, but now your redirect
page and blocked log will show:

Target group = porn
or
Target group = pornexp

I hope these help!

Rick













[squid-users] anyone have a good expressions list

2003-02-14 Thread Jeff Donovan
greetings

I'm looking for a good expressions list. Something that only targets 
porn sites. I had been using the default exp list that comes with the 
blacklists, but it seems to block out many sites that are not adult 
related.

I'm pretty much REGEX illiterate.

--jeff



Re: [squid-users] anyone have a good expressions list

2003-02-14 Thread Henrik Nordstrom
Building a good regex list which blocks only porn is an almost impossible
task, if you also want it to actually block porn..

In almost all cases you will need a whitelist when using regex patterns
for blocking, to exclude things which are not wanted to be blocked but
which resemble too closely a name which normally should be blocked..

Writing regex expressions is not that hard. Some quick guidelines (a short example follows the list):

1. regex matches are partial string matches, not word matches.

2. . is a special character matching any character. To match . you need
to use \.

3. ^ and $ are also special characters, matching the beginning and end of
the string respectively. This means that a regex pattern starting with
^  starts matching only at the beginning of the string (i.e. ^www\. 
matches www.anything), and a pattern ending in $ matches only if it
matches the end of the string (i.e. \.com$ matches anything.com)

4. * and {nn} make repetitions. * repeats the previous atom 0 to
infinity times, {nn} exactly nn times. There is also {min,max}
repetition count.

5. To group things you can use (). i.e. (ab){4} matches abababab  (4
times ab)

6. To make different alternatives you can use |. i.e. a(b|c|de)f matches
abf or acf or adef

7. There is a number of magic constructs such as word boundary matches
etc.. see the man 7 regex manual for a full list of regex
capabilities.  (squid uses what is referred to in most documentation as
modern or extended regex syntax)
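As a small illustration of the points above (the pattern and hostnames are
made up, not a recommendation):

   (^|[-.?&=/_])(porn|xxx)([-.?&=/_]|$)

matches http://www.xxx-movies.example/ but not http://www.exxonvalley.example/;
a separate whitelist group still catches the inevitable false positives.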

Regards
Henrik

Jeff Donovan wrote:
 
 greetings
 
 I'm looking for a good expressions list. Something that only targets
 porn sites. I had been using the default exp list that comes with the
 blacklists, but it seems to block out many sites that are not adult
 related.
 
 I'm pretty much REGEX illiterate.
 
 --jeff



Re: [squid-users] squid and php-sites

2003-02-14 Thread alp
hi henrik,

1) refresh_pattern . 0 20% 4320
if i understand your note correctly, this is not correct for php-sites???
but it should, since the dot finds any object, doesn't it?
nevertheless, i also tried
refresh_pattern \.php 0 20% 4320
with the same effect (see 2)

2)
if i do the request for test.php i see:
HTTP/1.0 200 OK
Date: Sat, 15 Feb 2003 08:00:00 GMT
Server: Apache/1.3.27 PHP/4.1.2
X-Powered-By: PHP/4.1.2
Content-Type: text/html
X-Cache: MISS from test.de
Connection: close

hi-php

(the last line is the content of the file)
and still it is not in the cache

does this help in finding an explanation?

- Original Message -
From: Henrik Nordstrom [EMAIL PROTECTED]
To: alp [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Friday, February 14, 2003 8:04 PM
Subject: Re: [squid-users] squid and php-sites


 Two questions:

 1. What is your refresh_pattern settings?

 2. What is the full headers returend by your server?

 Just tested this with Squid-2.5 and a reply with only a Date header and
 some content is cached if your refresh_pattern says it should be.


 Note: The default refresh_pattern settings does not cache such replies
 for the reasons indicated before.

 Regards
 Henrik


 alp wrote:
 
  sorry, i misunderstood your first reply.
  BUT:
  i have a site test.php (without any php-code, just for testing the
suffix)
  on an apache server.
  it sends this site only with the DATE-header. no lastmod, no expires. it
  also does not mark the object as not cacheable.
  so the refresh-pattern IS used, as you say.
 
  so, first call:
  echo -e "GET /test.php HTTP/1.0\nHost:myhost\n\n" | netcat squidserver 80
  gives the file together with the above header (date)
  second call:
  echo -e "GET /test.php HTTP/1.0\nHost:myhost\ncache-control:only-if-cached\n\n" | netcat squidserver 80
  it says: object is not in cache.
 
  ???
  doing the same with a file test.html i see the lastmod header and it is
of
  course cached.
 
  i still seem to miss some important point in understanding this, i
guess.
  but for me it seems as if the refresh-pattern is not used.