[squid-users] Plz help... urls problem

2004-03-01 Thread Deepa D
Hi,
I have collected some information from Squid and the
redirector program with respect to the malformed URLs
problem. It is a long mail; kindly spare some time to
read through.
By way of premise, the redirector code comprises
two modules: the client is the one started by
Squid. It forwards requests to the redirector server,
which takes care of processing the request. The server
then sends the response back to the client, which in
turn forwards it to Squid.
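For readers unfamiliar with it, the stdin/stdout protocol between Squid and a redirector helper is line-based. The following is a minimal illustrative sketch of such a helper; the rewrite target mirrors the login1.jsp URLs in the logs below, and all names are assumptions for illustration, not the poster's actual code:

```python
#!/usr/bin/env python3
# Illustrative Squid redirector helper (NOT the poster's actual code).
# Squid writes one request per line on stdin:
#     URL ip/fqdn ident method
# and expects exactly one line back per request: the rewritten URL
# (or an empty line to leave the URL unchanged).
import sys


def rewrite(url: str, ip: str) -> str:
    # Hypothetical rewrite modeled on the login1.jsp URLs in the logs.
    return ("http://localhost:8080/contentfilter/login1.jsp"
            f"?url=({url})ip={ip}")


def main() -> None:
    for line in sys.stdin:
        parts = line.split()
        if len(parts) < 4:
            print()                  # malformed input: pass through unchanged
            sys.stdout.flush()
            continue
        url, ip_fqdn, _ident, _method = parts[:4]
        ip = ip_fqdn.split("/", 1)[0]
        print(rewrite(url, ip))
        sys.stdout.flush()           # replies must not be buffered

if __name__ == "__main__":
    main()
```

Flushing after every reply matters: Squid pairs each request line with exactly one response line, so a buffered or missing reply stalls or desynchronizes the helper, which is one way corrupted responses like those below can appear.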
   The bug is that malformed URLs are being
generated. The Squid log for debug_options ALL,1 61,9
33,5 is also pasted below:

1)
 Client :-
 12304 Thu Feb 26 18:11:04 2004:  Client - read from
stdin =  http://www.dictionary.com/
10.10.10.106/bhadra - GET
   12305 Thu Feb 26 18:11:04 2004:  Client - wrote to
server =  http://www.dictionary.com/
10.10.10.106/bhadra - GET
   12306 Thu Feb 26 18:11:04 2004:  Client - the value
read from server = 
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET

 Server :-
 72060  Thu Feb 26 18:11:04 2004:  Server - the value
read =  http://www.dictionary.com/ 10.10.10.106/bhadra
- GET
  72061
  72062 Thu Feb 26 18:11:04 2004:  In loadInBuff, buff
=  http://www.dictionary.com/ 10.10.10.106/bhadra -
GET
  72098 Thu Feb 26 18:11:04 2004:  in_buff.url = 
http://www.dictionary.com/
  72100 Thu Feb 26 18:11:04 2004:  Server - wrote to
client = 
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET
  
 Cache.log :-
  2004/02/26 18:11:04| parseHttpRequest: Method is
'GET'
  2004/02/26 18:11:04| parseHttpRequest: URI is
'http://www.dictionary.com/'
  2004/02/26 18:11:04| clientSetKeepaliveFlag:
http_ver = 1.0
  2004/02/26 18:11:04| clientSetKeepaliveFlag: method
= GET
  2004/02/26 18:11:04| The request GET
http://www.dictionary.com/ is ALLOWED, because it
matched 'all'
  2004/02/26 18:11:04| redirectStart:
'http://www.dictionary.com/'
  2004/02/26 18:11:04| redirectHandleRead:
{http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
10.10.10.106 mani GET}
  2004/02/26 18:11:04| clientRedirectDone:
'http://www.dictionary.com/'
result=http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
 
 2) 
 Client :-
 12307 Thu Feb 26 18:11:04 2004:  Client - read from
stdin = 
http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
   12308 Thu Feb 26 18:11:04 2004:  Client - wrote to
server = 
http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
   12309 Thu Feb 26 18:11:05 2004:  Client - read from
stdin =  ñ^RBñ^RBww.dictionary.com/css/console.css
10.10.10.106/bhadra - GET
   12310 Thu Feb 26 18:11:05 2004:  Client - wrote to
server = 
ñ^RBñ^RBww.dictionary.com/css/console.css
10.10.10.106/bhadra - GET
   12311 Thu Feb 26 18:11:05 2004:  Client - the value
read from server =
   12312 Thu Feb 26 18:11:05 2004:  Client - the value
read from server = 
http://localhost:8080/contentfilter/login1.jsp?url=(ñ^RBñ^RBww.dictionary.com/css/console.css)ip=10.10.10.106
10.10.10.106 mani GET
 
 Server :-
 72708  Thu Feb 26 18:11:04 2004:  Server - the value
read =  http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
  72710 Thu Feb 26 18:11:04 2004:  In loadInBuff, buff
=  http://update.messenger.yahoo.com/msgrcli.html
10.10.10.109/sharavathi - GET
  72134 Thu Feb 26 18:11:04 2004:  in_buff.url = 
http://update.messenger.yahoo.com/msgrcli.html
  72135 Thu Feb 26 18:11:04 2004:  After doAuth,
in_buff.url =  http://update.me   
ssenger.yahoo.com/msgrcli.html
  Thu Feb 26 18:11:05 2004:  Allowed , wrote to client
=

 Cache.log :-
 2004/02/26 18:11:04| parseHttpRequest: Method is
'GET'
 2004/02/26 18:11:04| parseHttpRequest: URI is
'http://update.messenger.yahoo.com/msgrcli.html'
 2004/02/26 18:11:04| clientSetKeepaliveFlag: http_ver
= 1.0
 2004/02/26 18:11:04| clientSetKeepaliveFlag: method =
GET
 2004/02/26 18:11:04| The request GET
http://update.messenger.yahoo.com/msgrcli.html is
ALLOWED, because it matched 'all'
 2004/02/26 18:11:04| The request GET
http://update.messenger.yahoo.com/msgrcli.html is
ALLOWED, because it matched 'all'
 2004/02/26 18:11:04| redirectStart:
'http://update.messenger.yahoo.com/msgrcli.html'
 2004/02/26 18:11:04| clientSendMoreData:
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106,
3881 bytes
 2004/02/26 18:11:04| clientSendMoreData: FD 16
'http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106',
out.offset=0 
 2004/02/26 18:11:04| clientBuildReplyHeader: can't
keep-alive, unknown body size
 2004/02/26 18:11:04| clientSendMoreData: Appending
3584 bytes after 297 bytes of headers
 2004/02/26 18:11:04| The reply for GET
http://localhost:8080/contentfilter/login1.jsp?url=(http://www.dictionary.com/)ip=10.10.10.106
is ALLOWED, because it matched 'all'
 

[squid-users] caching problem ?

2004-03-01 Thread Jérôme PHILIPPE
Hi,

I have Trend's IWSS and I try to use it with squid.

IE -> Squid with cache -> IWSS proxy -> Internet

I've configured Squid with options :

cache_peer parent  @IP of  IWSS 8080 7 no-query default
acl all src 0.0.0.0/0.0.0.0
never_direct allow all

With this configuration I am always redownloading my files and never
using Squid's cache. Is that normal?

Without this, it's OK: IE -> Squid with cache -> Internet

Bests regards.



[squid-users] Squid and Firewall rules

2004-03-01 Thread GG BB
Hi List!

I'm currently working with
squid-2.5.STABLE3 installed on Slackware 7.2.

this box acts as a Gateway, Firewall and VPN(FreeSWAN)
so I've set up my own private LAN and users 

It's all working fine now (Squid, Firewall, and so on);
I just need to ensure that all users on the private LAN -MUST-
go through the Squid-Firewall box to surf the Web.

at the moment I've added the Transparent Proxy
iptables rule on my Firewall settings, through which
all traffic passing through port 80 is then redirected
to my Squid-Firewall box, on port 3128.

-- iptables -t nat -A PREROUTING -i eth1 -p tcp
--dport 80 -j REDIRECT --to-port 3128 --

But with this rule in place, all users, even if
they don't set their browsers to use a proxy, can surf
the Web without being authenticated by Squid, while
still passing through the proxy (in fact I can see
them in my access.log file).

What I wish to do is to configure Squid or the firewall
to impose Squid authentication even if my
users don't set their browsers to use a proxy, so:

USER1 Browser-configured -- Authentication = Allowed

USER2 NoBrowser-configured -- Authentication or ERROR
You are not allowed to ...

I hope I've been clear enough; if not, please ask for
more information.
Here are my Squid settings:

## GENERIC SETTINGS

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
emulate_httpd_log on
auth_param basic program /etc/webmin/squid/squid-auth.pl /etc/webmin/squid/users
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320

## ACLs

acl myPwd proxy_auth REQUIRED
acl all src 0.0.0.0/0.0.0.0
acl mylan src 10.4.4.4/24
acl manager proto cache_object
acl localhost src 192.168.1.80 
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

## HTTP_ ACCESS SETTINGS

http_access deny to_localhost
http_access deny !mylan
http_access allow myPwd
http_access allow mylan
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny all

Thanks !!



[squid-users] Squid-2.5.STABLE5 released [minor security / major bugfix release]

2004-03-01 Thread Henrik Nordstrom
The Squid HTTP Proxy team is pleased to announce the availability of the
Squid-2.5.STABLE5 bugfix release.

This new release can be downloaded from our HTTP or FTP servers

   http://www.squid-cache.org/Versions/v2/2.5/
   ftp://ftp.squid-cache.org/pub/squid-2/STABLE/

or the mirrors (may take a while before all mirrors are updated).
For a list of mirror sites see

   http://www.squid-cache.org/Mirrors/http-mirrors.html
   http://www.squid-cache.org/Mirrors/ftp-mirrors.html


Squid-2.5.STABLE5 is a major bugfix release of Squid-2.5 and corrects one
minor security issue in url_regex access controls and several major
non-security related bugs found in the earlier Squid-2.5 releases. Users
are recommended to upgrade to this new release, especially if using any of
the features mentioned below.


The most important bug-fixes in this release are:

[security] %00 in a URL could be used to bypass url_regex and urlpath_regex
access controls in certain configurations. Other acl directives are not
affected. More information on this issue can be found in the SQUID-2004:1
security advisory distributed separately
url:http://www.squid-cache.org/Advisories/SQUID-2004_1.txt

[major] Several NTLM related bugfixes and improvements fixing the problem
of random auth popups and account lockouts. Optional support for the
NEGOTIATE NTLM packet is also added to allow Samba-3.0.2 or later to
negotiate the use of NTLMv2 or NTLM2.

[major] Several authentication related bugfixes to allow authentication to
work in additional acl driven directives outside of http_access, and a
number of corrections to assertion or segmentation faults and some memory
leaks.


In addition there are a small number of new features and improvements
which enhance the functionality of Squid:

[medium] redirector interface modified to work with login names containing
spaces or other odd characters. This is accomplished by URL-encoding the
login name before it is sent to redirectors. Note: existing redirectors or
their configuration may need to be slightly modified in how they process
the ident column to support the new username format (this only applies to
redirectors that look at the username).
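For redirectors that do inspect the username, the adjustment amounts to URL-decoding the ident column before use. A small sketch of the idea (the field layout follows the standard redirector input line; the function name is invented for illustration):

```python
# Sketch: handling the URL-encoded ident column of the redirector input.
# Input line format: URL ip/fqdn ident method
from urllib.parse import unquote


def parse_helper_line(line: str):
    url, ip_fqdn, ident, method = line.split()[:4]
    # A login such as "john doe" used to break the whitespace split;
    # as of 2.5.STABLE5 it arrives as "john%20doe" and is decoded here.
    return url, ip_fqdn, unquote(ident), method
```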

[medium] various timeouts adjusted: connect_timeout 1 minute (was 2 
minutes which is now forward_timeout), negative_dns_ttl 1 minute (was 5 
minutes) and is now also used as minimum positive dns ttl, dns_timeout 2 
minutes (was 5 minutes)

[minor] short_icon_urls on can be used to simplify the URLs used for 
icons etc to avoid issues with proxy host naming and authentication when 
requesting icons.

[minor] A new urllogin ACL type has been introduced, allowing regex 
matches against the login component of Internet-style URLs 
(protocol://user:password@host/path/to/file).
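A hypothetical usage sketch (the acl name and pattern are invented for illustration):

```
# Deny requests whose URL login component matches a suspicious pattern,
# e.g. http://user:secret@host/ style URLs carrying a brand name.
acl phish_login urllogin -i paypal
http_access deny phish_login
```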

[minor] Squid now respects the Telnet protocol on connections to FTP 
servers. The ftp_telnet_protocol directive can be used to revert to 
the old, incorrect implementation if required.

[minor] The default mime.conf has been updated with many new mime types 
and a few minor corrections. In addition, the download and view links are 
used more frequently to allow viewing/downloading of different ftp:// 
contents regardless of their mime type assignment.


In addition there is a large number of minor and cosmetic bugfixes not
included in the above list. For a complete list of changes see the
ChangeLog and the Squid-2.5 Patches page
url:http://www.squid-cache.org/Versions/v2/2.5/bugs/


It is recommended to read the release notes when upgrading from an earlier 
Squid release (including Squid-2.5.STABLE4), as there have been some minor 
changes in the configuration.


Thanks go to MARA Systems AB, which has been actively sponsoring this
bugfix release of Squid as part of its continuing effort to provide both
free and commercial support to the Squid community, and to all users who
have provided valuable bug reports and feedback via the Squid bug
reporting tool.


Regards
The Squid HTTP Proxy developer team




RE: [squid-users] caching problem ?

2004-03-01 Thread Elsen Marc

 
 
 Hi,
 
 I have Trend's IWSS and I try to use it with squid.
 
 IE -> Squid with cache -> IWSS proxy -> Internet
 
 I've configured Squid with options :
 
 cache_peer parent  @IP of  IWSS 8080 7 no-query default
 acl all src 0.0.0.0/0.0.0.0
 never_direct allow all
 
 With this configuration, I always redownloading my files and 
 I don't use my
 squid's cache. It's normal ?

  No, it's not normal (indeed).
  Post an example (URL) with excerpts from access.log in both cases:
  the case mentioned and the case below. The access code is important.
 
 Without this, it's OK: IE -> Squid with cache -> Internet

  Also provide squid version.

  M.
 
 Bests regards.
 
 


Re: [squid-users] Squid and Firewall rules

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, GG BB wrote:

 But with this rule in, I get that all users, even if
 they don't set their Browsers to use a Proxy, can surf
 the WEB withouth being authenticated by Squid, but
 passing through the Proxy anyway (in fact I can see
 them on my Access.log file)

This is most likely due to the fact that you can not combine
authentication and transparent interception. For proxy authentication to 
be used the browser MUST be configured to use a proxy.

You should notice this through quite massive complaints in cache.log if 
there are users who do not have proxy settings in their browser.

 ## HTTP_ ACCESS SETTINGS
 
 http_access deny to_localhost
 http_access deny !mylan
 http_access allow myPwd
 http_access allow mylan
 http_access allow manager localhost
 http_access deny manager
 http_access deny !Safe_ports
 http_access deny CONNECT !SSL_ports
 http_access deny all


The above should read

http_access allow manager localhost
http_access deny manager
http_access deny !mylan
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow myPwd mylan
http_access deny all
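Squid evaluates http_access lines top to bottom and applies the first rule that matches, which is why the order matters. Annotated (the comments are added here for illustration), the corrected list reads:

```
# First match wins, so checks go from most specific to the final deny.
http_access allow manager localhost   # cache manager only from localhost
http_access deny manager              # ...and from nowhere else
http_access deny !mylan               # reject anything outside the LAN early
http_access deny !Safe_ports          # block unsafe destination ports
http_access deny CONNECT !SSL_ports   # CONNECT only to SSL ports
http_access allow myPwd mylan         # LAN users who have authenticated
http_access deny all                  # everything else is denied
```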


For details on how http_access works see the Squid FAQ chapter 10, Access
Controls. url:http://www.squid-cache.org/Doc/FAQ/FAQ-10.html

Regards
Henrik



RE: [squid-users] Squid and Firewall rules

2004-03-01 Thread Elsen Marc

 
 
 Hi List!
 
 I'm actually working with
 squid-2.5.STABLE3 installed on a Slackware 7.2
 
 this box acts as a Gateway, Firewall and VPN(FreeSWAN)
 so I've set up my own private LAN and users 
 
 It's all working fine now, Squid, Firewall, and so on,
 I just need that all users on the private LAN -MUST-
 go through the Squid-Firewall Box to surf the WEB..
 
 at the moment I've added the Transparent Proxy
 iptables rule on my Firewall settings, through which
 all traffic passing through port 80 is then redirected
 to my Squid-Firewall box, on port 3128.
 
 -- iptables -t nat -A PREROUTING -i eth1 -p tcp
 --dport 80 -j REDIRECT --to-port 3128 --
 
 But with this rule in, I get that all users, even if
 they don't set their Browsers to use a Proxy, can surf
 the WEB withouth being authenticated by Squid, but
 passing through the Proxy anyway (in fact I can see
 them on my Access.log file)
 
 what I wish to do is to set the Squid or Firewall
 settings to impose a Squid Authentication even if my
 users don't set their Browsers to use a Proxy, so 
 
 USER1 Browser-configured -- Authentication = Allowed
 
 USER2 NoBrowser-configured -- Authentication or ERROR
 You are not allowed to ...
 
  You can't, at least not in the Squid context:

 http://www.squid-cache.org/Doc/FAQ/FAQ-17.html#ss17.15

  M.


Re: [squid-users] caching problem ?

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, Jérôme PHILIPPE wrote:

 IE  Squid with cache  IWSS proxy Internet
 
 I've configured Squid with options :

Make sure to use server_persistent_connections off in this type of
configuration. There have been many reports about the Trend proxy being
buggy and causing cache pollution when persistent connections are enabled.

 With this configuration, I always redownloading my files and I don't use my
 squid's cache. It's normal ?

Depends.

Most objects should be cached as normal, but if you are using the
progress window function in IWSS then the download will not be cached.
To be precise, when the progress window is used, IWSS changes the URL
of the downloaded file to a unique URL for this request only, which will
never be requested again.

Regards
Henrik



[squid-users] Squid Log Analyzer

2004-03-01 Thread Endre Szekely-Bencedi
   Hi List,

 I am trying to set up a log analyzer for Squid. I've set up
Webalizer, as I've used it before for Apache log analysis, and it's pretty
fancy and I like it. However, now that I'm using it for Squid, there is
something it doesn't log that would be pretty important for me: I
don't see the hit/miss ratio, or anything related to it.
 Now, is that possible to see with Webalizer and I'm just not smart enough
to figure out how? Or should I use some other tool for this? Note that I
also like the top hits, top kB, all URLs (very important), etc., so I'd
like to see all of these too if possible. Also, if I have to change tools,
I wouldn't want something that generates a binary file that I need to
parse with vi and work on for hours to find what I'm interested in.
HTML is very nice; I'd like something similar to Webalizer if possible.

Thanks for any input on this matter.

Endre

THIS E-MAIL MESSAGE ALONG WITH ANY ATTACHMENTS IS INTENDED ONLY FOR THE
ADDRESSEE and may contain confidential and privileged information. If the
reader of this message is not the intended recipient, you are notified that
any dissemination, distribution or copy of this communication is strictly
prohibited. If you have received this message by error, please notify us
immediately, return the original mail to the sender and delete the message
from your system.



RE: [squid-users] user_cert ACL in accel mode

2004-03-01 Thread David Hajek
 The timeframe is when I (or MARA Systems) have a customer 
 requiring the functionality, or someone else submits a patch 
 implementing the function.
 I have not yet studied how complex it would be to add the 
 renegotiation requirements to request SSL certificates in the 
 ACL code but it probably isn't all trivial. SSL certificate 
 negotiation is quite different from all other forms of 
 authentication or acl checks.

OK, makes sense. I just wonder why there are certificate-related ACLs in
squid.conf already.
I think it's a bit confusing.

-David




RE: [squid-users] Squid and Firewall rules

2004-03-01 Thread Mark Cooke
On Mon, 2004-03-01 at 12:01, Elsen Marc wrote in reply to:
  
  
  -- iptables -t nat -A PREROUTING -i eth1 -p tcp
  --dport 80 -j REDIRECT --to-port 3128 --
  
  But with this rule in, I get that all users, even if
  they don't set their Browsers to use a Proxy, can surf
  the WEB withouth being authenticated by Squid, but
  passing through the Proxy anyway (in fact I can see
  them on my Access.log file)
  
  what I wish to do is to set the Squid or Firewall
  settings to impose a Squid Authentication even if my
  users don't set their Browsers to use a Proxy, so 
  
  USER1 Browser-configured -- Authentication = Allowed
  
  USER2 NoBrowser-configured -- Authentication or ERROR
  You are not allowed to ...
  
   You can't, at least not in the Squid context:
 
  http://www.squid-cache.org/Doc/FAQ/FAQ-17.html#ss17.15

But the workaround is to set up the redirect to a web server you control
that explains how to set up the browser to use your proxy, instead of
trying to transparently direct it to squid.

I.e., use DNAT with --to-destination instead of REDIRECT (so you don't
have to run a web server on your firewall):

iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 \
  -j DNAT --to-destination my.proxyinstruction.server:80

When you set up the web server, just map all URLs to the proxy setup
instructions (because iptables can't change the requested URL). If you
have a machine already running as a web server, just use a different
port number and a virtual host, or similar.
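The catch-all mapping can be done with, for example, an Apache virtual host. This is an illustrative sketch (paths are invented; my.proxyinstruction.server is the placeholder from the rule above):

```
<VirtualHost *:80>
    ServerName my.proxyinstruction.server
    DocumentRoot /var/www/proxyhelp
    # Serve the proxy-setup instructions no matter what URL was requested.
    AliasMatch ^.* /var/www/proxyhelp/index.html
</VirtualHost>
```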

Cheers,

Mark

-- 
Mark Cooke [EMAIL PROTECTED]



RE: [squid-users] Squid Log Analyzer

2004-03-01 Thread Scott Phalen
Try Calamaris.  Excellent tool!!

http://cord.de/tools/squid/calamaris/Welcome.html

Regards,
Scott


-Original Message-
From: Endre Szekely-Bencedi [mailto:[EMAIL PROTECTED]
Sent: Monday, March 01, 2004 5:46 AM
To: [EMAIL PROTECTED]
Subject: [squid-users] Squid Log Analyzer

   Hi List,

 I am trying to set up some log analyzer for squid. I've set up
Webalizer as I've used it before for Apache log analyzing, and it's pretty
fancy and I like it. However, now I'm using it for Squid, and there is
something that doesn't logs, and it would be pretty important for me. I
don't see the hit/miss ratio, or anything related to this.
 Now, is that possible to see with Webalizer, and I'm not smart enough
to find it out how? Or I should use some other tool for this. Note that I
also like the top hits, top kbs, all URLs (very important) etc., so I'd
like to see all these too if possible. Also if I have to change the tool, I
wouldn't want something that generates some binary file that I need to
parse with vi and work for hours to be able to find what I'm interested in.
Html is very nice, I'd like something similar to Webalizer if possible.

Thanks for any input on this matter.

Endre




RE: [squid-users] Squid Log Analyzer

2004-03-01 Thread Endre Szekely-Bencedi


Thanks, will try this.





   
 
Scott Phalen [EMAIL PROTECTED]
To: Endre Szekely-Bencedi [EMAIL PROTECTED], [EMAIL PROTECTED]
Subject: RE: [squid-users] Squid Log Analyzer
03/01/2004 02:11 PM




Try Calamaris.  Excellent tool!!

http://cord.de/tools/squid/calamaris/Welcome.html

Regards,
Scott





RE: [squid-users] user_cert ACL in accel mode

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, David Hajek wrote:

 OK, makes sense. Just wonder why there is certificate related ACLs in
 squid.conf already.

The certificate ACLs work, and allow you to match specific user 
certificates or user CAs.

It is just the option to have the request for the client certificate 
delayed until required by acl processing which does not.

Regards
Henrik



RE: [squid-users] user_cert ACL in accel mode

2004-03-01 Thread David Hajek
  OK, makes sense. Just wonder why there is certificate 
 related ACLs in 
  squid.conf already.
 
 The certificate ACLs work, and allow you to match specific 
 user certificates or user CAs.
 
 It is just the option to have the request for the client 
 certificate delayed until required by acl processing which does not.
 

Got it now. Thank you for the help and for a great Squid. ;)

-D




Re: [squid-users] caching problem ?

2004-03-01 Thread Jérôme PHILIPPE
Thanks Henrik,

I'm using the progress window and you're right.
I changed my IWSS configuration to use the progress window only for files
larger than 5 MB, and the cache works fine ;o)


- Original Message - 
From: Henrik Nordstrom [EMAIL PROTECTED]
To: Jérôme PHILIPPE [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Monday, March 01, 2004 1:00 PM
Subject: Re: [squid-users] caching problem ?


 On Mon, 1 Mar 2004, Jérôme PHILIPPE wrote:

  IE -> Squid with cache -> IWSS proxy -> Internet
 
  I've configured Squid with options :

 Make sure to use server_persistent_connections off in this type of
 configuration. There have been many reports about the Trend proxy being
 buggy and causing cache pollution when persistent connections are enabled.

  With this configuration, I always redownloading my files and I don't use
my
  squid's cache. It's normal ?

 Depends.

 Most objects should be cached like normal, but if you are using the
 progress window function in IWSS then the download will not be cached.
 Or to be precise, when using the progress window IWSS then changes the URL
 of the downloaded file to a unique URL for this request only, which will
 never ever be requested again.

 Regards
 Henrik




Re: [squid-users] Problem Starting Squid

2004-03-01 Thread adrian.wells
Hi Mihai,

Just grab the nearest Unix/Linux for dummies manual for this
 stuff.
Got me! :-)

When I cd /usr/local/squid/sbin and then pwd, it returns the same path, so
does this not mean that the path is correct? It was not a problem before,
but I have upgraded SuSE Linux 7.2 to 9.0, so maybe it's down to a change
in the version?


  Means '.' was not in your current path.
 
  I don't understand. I set the path via cd /usr/local/squid/sbin

 No, you did not set $PATH, you only changed the current/working
 directory.

 When you type a command, it is looked for in the directories listed in
 $PATH; if . (the working directory) is in $PATH (which is a bad idea)
 then the command will also be looked for in the working directory. This
 is different from DOS, where the command is first looked for in . and
 then in $PATH.

 Just grab the nearest Unix/Linux for dummies manual for this
 stuff.

 Mihai Buha




[squid-users] A problem with URL truncation

2004-03-01 Thread mrflora
Hello, I am a Squid neophyte and I'm having a problem with URLs.  Squid
seems to be truncating all URLs to a single / and gives me the error
'The requested URL could not be retrieved'. I can't find anything wrong
in the config file. Any ideas would be appreciated.

Regards,
M.R.F.

P.S. Thanks again to those who responded to my earlier post.



Re: [squid-users] A problem with URL truncation

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004 [EMAIL PROTECTED] wrote:

 Hello, I am a Squid neophyte and I'm having a problem with URLs.  Squid
 seems to be truncating all URLs to a single / and gives me the error
 The requested URL could not be retrieved.  I can't find anything wrong
 in the config file.  Any ideas would be appreciated.


Are you trying to run Squid as a transparently intercepting proxy, not 
requiring the browser to be configured to use the proxy?

If so, have you made the required squid.conf changes for transparent 
interception operation? (see the Squid FAQ)

And does it work if you configure your browser to use the proxy?
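For reference, the squid.conf changes the FAQ describes for transparent interception on Squid 2.5 are along these lines (an illustrative fragment of a typical setup, not the poster's actual configuration):

```
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
```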

Regards
Henrik



Re: [squid-users] Problem Starting Squid

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, adrian.wells wrote:

 When I cd /usr/local/squid/sbin and then pwd it returns the same path, so
 does this not mean that the path is correct?

No, it still only tells you the current working directory, not the search 
path for starting applications. (pwd = Print Working Directory).

Applications in the current directory will only be started if either

a) You start them with ./filename  (recommended)

b) You have . in your search path (not recommended due to security 
implications)
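The difference is easy to demonstrate from a shell; the throwaway directory and script name below are purely illustrative:

```shell
# Create a script in a directory that is not on $PATH.
mkdir -p /tmp/pathdemo
printf '#!/bin/sh\necho hello\n' > /tmp/pathdemo/hello.sh
chmod +x /tmp/pathdemo/hello.sh

cd /tmp/pathdemo
./hello.sh       # runs: an explicit ./ path bypasses the $PATH search
command -v hello.sh || echo "hello.sh is not found via PATH"
```

The bare name fails (unless . is in $PATH) even though the shell's working directory contains the script, which is exactly why ./squid works and squid does not.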

 It was not a problem before, but I have upgraded Linux SuSE 7.2 to 9.0
 maybe it's down to a change in the version?

All UN*X:es work the same way. No point in changing only because of this.

But it is possible the security of your previous installation had been 
relaxed to have . in the default search path.

Regards
Henrik



[squid-users] Calamaris

2004-03-01 Thread Endre Szekely-Bencedi
Hello List,

I have a problem with Calamaris (v2.58).

I am using squid 2.5.STABLE3, compiled from sources, with the SmartFilter
plugin.
As far as I know, I have to use the squid-extended input type for this,
but it gives some errors:

[EMAIL PROTECTED] logs]# date; cat test.log | /usr/local/squid/bin/calamaris
-f squid-extended -F html > /var/www/html/calamaris2.html; date
Mon Mar  1 17:44:08 CET 2004
Malformed UTF-8 character (unexpected non-continuation byte 0x31,
immediately after start byte 0xf3) in split at (eval 1) line 20, <> line
369578.
Malformed UTF-8 character (unexpected non-continuation byte 0x31,
immediately after start byte 0xf3) in split at (eval 1) line 20, <> line
369578.
Split loop at (eval 1) line 20, <> line 369578.
Mon Mar  1 17:48:05 CET 2004
[EMAIL PROTECTED] logs]#

The generated HTML file shows:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HTML><HEAD>
<META http-equiv="Content-Type" content="text/html;
charset=iso-8859-1"></HEAD>
<BODY></BODY></HTML>

Which is an empty page.

A sample from the logfile:

1077780471.441     93 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.466     64 3.227.65.74 TCP_MISS/200 1722 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.479     72 3.227.65.74 TCP_MISS/302 477 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.508     59 3.227.65.74 TCP_MISS/302 477 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.699     73 3.227.65.74 TCP_MISS/200 1585 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.713     83 3.227.65.74 TCP_MISS/200 1607 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.726     86 3.227.65.74 TCP_MISS/200 1589 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780471.885    256 3.227.65.74 TCP_MISS/200 726 GET http://as.fotexnet.hu/adserver.ads/153/0///937480 - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW
1077780473.212    229 3.227.65.74 TCP_MISS/200 23713 GET http://index.hu/ad/lipton/banner1_120x240.swf? - DEFAULT_PARENT/10.20.20.254 application/x-shockwave-flash application/x-shockwave-flash ALLOW Portal Sites
1077780473.298     72 3.227.65.74 TCP_MISS/302 477 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.388    279 3.227.65.74 TCP_MISS/200 17697 GET http://index.hu/ad/microsoft_wss.swf? - DEFAULT_PARENT/10.20.20.254 application/x-shockwave-flash application/x-shockwave-flash ALLOW Portal Sites
1077780473.439    106 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.458     47 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.480    368 3.227.65.74 TCP_MISS/200 4292 GET http://as.fotexnet.hu/adserver.ads/196/0///27236 - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW
1077780473.643    162 3.227.65.74 TCP_MISS/302 477 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.646    144 3.227.65.74 TCP_MISS/302 477 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.673    487 3.227.65.74 TCP_MISS/200 10319 GET http://as.fotexnet.hu/adserver.ads/200/0///378158 - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW
1077780473.799    280 3.227.65.74 TCP_MISS/200 26216 GET http://index.hu/ad/teluzoallo_120x240.swf? - DEFAULT_PARENT/10.20.20.254 application/x-shockwave-flash application/x-shockwave-flash ALLOW Portal Sites
1077780473.819    122 3.227.65.74 TCP_MISS/200 216 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.824    124 3.227.65.74 TCP_MISS/200 355 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.842    136 3.227.65.74 TCP_MISS/200 1603 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites
1077780473.846     47 3.227.65.74 TCP_MISS/200 353 GET http://sher.index.hu/get? - DEFAULT_PARENT/10.20.20.254 text/html text/html ALLOW Portal Sites

Am I doing something wrong?

Thanks,
Endre.


Re: [squid-users] Calamaris

2004-03-01 Thread Kirk Schneider
Endre,

I have contacted the Calamaris author about this before, and he
suggested filtering out the extra fields that SmartFilter adds at the
end. Now I run this on all my logs before piping to Calamaris:

awk '{print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10}' access.log | calamaris
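On a single SmartFilter-style line (the sample line below is fabricated for illustration), that awk command keeps only the ten standard squid-log fields and drops the trailing SmartFilter columns:

```shell
# Fabricated access.log line: ten native squid fields plus the two extra
# columns SmartFilter appends (action and category).
line='1077780471.441 93 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html ALLOW Portal'
# Print only the first ten fields:
echo "$line" | awk '{print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10}'
# prints: 1077780471.441 93 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html
```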

--
Kirk Schneider  972-952-4645 (work)
Raytheon Corporate IT Security  214-912-8679 (cell)
[EMAIL PROTECTED] 888-431-7621 (pager)
If you think the problem is bad now just wait until we've solved it.



 Original Message 
Subject: [squid-users] Calamaris
Date: Mon, 1 Mar 2004 17:43:52 +0100
From: Endre Szekely-Bencedi [EMAIL PROTECTED]
To: [EMAIL PROTECTED]

Re: [squid-users] Calamaris

2004-03-01 Thread Kirk Schneider
Endre,

You might also try this filter script I've written to clean up the logs
before passing them to Calamaris; it removes some characters that cause
Calamaris problems.
#!/usr/local/bin/gawk -f
{
    # Strip non-printable characters that confuse Calamaris.
    if ( $0 ~ /[^\x20-\x7E]/ ) {
        gsub( /\x20\x0C/, "" )
        gsub( /[\x00-\x1F]/, "" )
        gsub( /[\x7F-\xFF]/, "" )
    }
    # Some entries carry a negative elapsed time; make it positive.
    if ( $5 < 0 ) {
        $5 *= -1
    }
    # Keep only well-formed lines, and drop the SmartFilter columns.
    if ( $2 !~ /[^0-9]/ && $5 !~ /[^0-9]/ && ( $11 == "ALLOW" || $11 == "DENY" ) ) {
        print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10
    }
    else {
        next
    }
}
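The script's sanity check can be exercised on its own with portable awk; both sample lines below are fabricated, and only the well-formed one survives:

```shell
# Line 1 is well-formed; line 2 has corrupted numeric fields and is dropped.
printf '%s\n' \
  '1077780471.441 93 3.227.65.74 TCP_MISS/302 476 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html ALLOW Portal' \
  '1077780471.4xx 9y 3.227.65.74 TCP_MISS/302 4z6 GET http://sher.index.hu/ad? - DEFAULT_PARENT/10.20.20.254 text/html ALLOW Portal' |
awk '$2 !~ /[^0-9]/ && $5 !~ /[^0-9]/ && ( $11 == "ALLOW" || $11 == "DENY" ) { print $1 }'
# prints: 1077780471.441
```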


--
Kirk Schneider  972-952-4645 (work)
Raytheon Corporate IT Security  214-912-8679 (cell)
[EMAIL PROTECTED] 888-431-7621 (pager)
If you think the problem is bad now just wait until we've solved it.



 Original Message 
Subject: [squid-users] Calamaris
Date: Mon, 1 Mar 2004 17:43:52 +0100
From: Endre Szekely-Bencedi [EMAIL PROTECTED]
To: [EMAIL PROTECTED]

[squid-users] Question: squidclient in ping mode

2004-03-01 Thread OTR Comm
Hello,

When squidclient is used in ping mode, e.g.

/usr/local/squid/bin/squidclient -g 5 -h 209.145.208.8 -p 8939
http://216.19.43.110
2004-03-01 11:33:11 [1]: 0.131 secs, 7.557252 KB/s
2004-03-01 11:33:12 [2]: 0.001 secs, 997.00 KB/s
2004-03-01 11:33:13 [3]: 0.001 secs, 997.00 KB/s
2004-03-01 11:33:14 [4]: 0.001 secs, 997.00 KB/s
2004-03-01 11:33:15 [5]: 0.001 secs, 997.00 KB/s

what is actually being pinged here?

I know this site (http://216.19.43.110) is not in the squid cache on
209.145.208.8.

When I ping 216.19.43.110 from the host, I get:

[EMAIL PROTECTED] root]# ping 216.19.43.110
PING 216.19.43.110 (216.19.43.110) from 209.145.208.8 : 56(84) bytes of
data.
64 bytes from 216.19.43.110: icmp_seq=1 ttl=247 time=48.5 ms
64 bytes from 216.19.43.110: icmp_seq=2 ttl=247 time=34.5 ms
64 bytes from 216.19.43.110: icmp_seq=3 ttl=247 time=35.6 ms

Can someone explain what squidclient in ping mode is doing, and how I can
interpret its output?


Thanks,

Murrah Boswell


[squid-users] SquidGuard and DB 4

2004-03-01 Thread Carlos Simbaña
Hi. I have Squid 2.5 STABLE4 with squidGuard. I am updating my server,
and squidGuard says it does not work with DB 4. Is there another program
like squidGuard, but newer? The last update for squidGuard was on Dec 18,
2001. Thanks

Carlos S.





[squid-users] Squid Block the Internet Access

2004-03-01 Thread Hernan Dario Arredondo
Hi everyone

Since last week, when I try to start Squid it blocks the Internet
access... I try to ping any host (when Squid is up) and it simply does
not respond. I have a transparent proxy configuration using iptables.

When I start Squid it blocks the Internet both on the host where it is
installed and for the clients too; when I shut Squid down, the machine
(where Squid is installed) can surf normally.

Is it a problem with my ISP, or did Squid cause the problem?

I tried cleaning the cache... and nothing happened.
I tried modifying the squid.conf file, and nothing.
CAN ANYONE HELP ME !!



Hernan Dario Arredondo



RE: [squid-users] Squid and Firewall rules

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, GG BB wrote:

 So I guess mine is not the 'standard' architecture for
 a NAT-VPN and Proxy ...
 Maybe the best solution would be keep them separate,
 and setting up a box that act 'only' as a Proxy ?

Sorry, you are mixing two problems.

There is no technical problem in running the proxy on a NAT-VPN gateway,
but you need to be careful with the source IP address assignment if the
proxy is to access servers over LAN IPSec VPN tunnels, as the gateway
itself is usually not on an address that is part of the tunnel. The
firewalling also gets a little complex, as you suddenly have an
alternative path whereby traffic may be forwarded by the gateway, so you
had better know the access controls of the proxy at least as well as
your firewalling rules.

Interception caching also works fine, at least with Free-S/WAN /
Open-S/WAN tunnels, and soon also with Linux-2.6 IPSec.


However, NAT-VPN gateway or not, you cannot use interception
caching/proxying in combination with HTTP proxy authentication. If you
want to use HTTP proxy authentication there is no option but to have the
browsers configured to use the proxy, as neither the browsers nor the
proxy will accept authentication otherwise. When using interception the
browsers have no clue who they are supposed to authenticate to; for all
they know they are talking to the origin web server and not a proxy, and
because of the security implications of this, HTTP REQUIRES the browser
not to respond to any HTTP proxy authentication challenges in such a
case.

 could someone provide their own experience in HOW and
 WHERE build a Proxy on a NET having a
 NAT-Firewall-VPN that is already working ?!

The logical place is on the local network, like any other local server. A
proxy is just a server which is accessed by your clients, and which then
goes out to the Internet to fetch the required content as a client in its
own right.

So a proxy can be placed anywhere that the following conditions are
fulfilled:

a) All the clients who need to use the proxy can reach it

b) From where your policy allows access to the Internet.

c) Where it is suitable for your overall traffic flow. On the far side
(counted from the clients) of a heavily overloaded connection is for
example not a good place.

A transparently intercepting proxy has stricter requirements; the easiest
place to run it is on the gateway, but if you can I would recommend
against that, due to the security implications if this gateway is also
acting as a firewall.
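For reference, interception setups like those mentioned above are usually wired up on a Linux gateway with a NAT redirect of roughly this shape; the interface, network and port below are placeholders for your own environment, not a recommendation for any particular topology:

```shell
# Redirect web traffic arriving from the LAN interface to a Squid listening
# on port 3128 of this gateway (placeholder interface/network; needs root).
iptables -t nat -A PREROUTING -i eth1 -s 192.168.0.0/24 \
  -p tcp --dport 80 -j REDIRECT --to-ports 3128
```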

Regards
Henrik



Re: [squid-users] Squid Block the Internet Access

2004-03-01 Thread Henrik Nordstrom
On Mon, 1 Mar 2004, Hernan Dario Arredondo wrote:

 When I start squid it block Internet in the host where is installed and 
 block the clients too, when I shutdown squid the machine (where squid is 
 installed) can surf normally

Unless your "when I start Squid" also includes "when I install the
iptables rules", something is seriously wrong.

Just starting Squid without modifying the firewalling should have no
impact whatsoever on Internet connectivity, other than that with Squid
running you will be able to use the proxy.

But if the sentence does include adding the iptables rules for
interception caching, then the error is likely to be found there.


Please describe your setup in a little more detail.

Regards
Henrik



[squid-users] ldap_auth example

2004-03-01 Thread none none
Is anyone able to help me with a real example of squid_ldap_auth.exe if
my Active Directory looks like this:
domain.com
 -Builtin
 -Computers
 ...
 -Group1
  -Computers
  -Folders
  -Groups
  -Users
  -Someone LastName
  -Anotherone LastName
 -Group2

etc.
So what would be the configuration if I need to grant Internet access to
all users in Group1, or to all users in a group inside Group1, etc.?

I'm using these lines with no luck:

auth_param basic program C:/squid/libexec/squid_ldap_auth.exe -u cn -b
cn=Group1,cn=Users,dc=myDomain,dc=com -h

external_acl_type ldap_group %LOGIN C:/squid/libexec/squid_ldap_group.exe -u cn -b
cn=Group1,cn=users,dc=myDomain,dc=com -h

acl G1 external ldap_group cn=Group1,cn=Users,dc=myDomain,dc=com

http_access allow G1


Any thoughts?
Thank you
VerMan




[squid-users] squid source code for AUFS

2004-03-01 Thread Ko Jong Hyun
I compiled Squid with the '--enable-async-io' option, and modified
squid.conf to operate Squid with AUFS.

But I could not see the 16 (default value) Squid threads. There is only
one Squid process and its child.

So I inserted some debug messages into the Squid source code.
(One of them is inserted into the function that calls 'pthread_create'.)
Then I found that 'pthread_create' is never called.
(Exactly: the function that calls 'pthread_create' is never called.)
So I want to track the sequence of function calls related to AUFS, but it
seems unclear to me.

What can I do to fix a problem like this?

_
   , ... 
http://www.msn.co.kr/money/interlotto/  



RE: [squid-users] squid source code for AUFS

2004-03-01 Thread Elsen Marc

 
 
 I compiled Squid with '--enable-async-io' option, and modify 
 squid.conf to 
 operate Squid with AUFS.

   Which changes did you make to squid.conf in order to use
   aufs ?

   Which OS/platform/version are you using ?
 
 But I could not see the 16(default value) squid threads. 
 There is only one 
 Squid process and its child.
 
 So, I inserted some debug messages into Squid source code.
 (One of them is inserted to the function that call 'pthread_create')
 
 Then, I found that the 'pthread_create' function is never called.
 (Exactly, the function that call 'pthread_create' is never called)
 
 So, I want to track the sequence of function calls related to 
 AUFS, but it 
 seems unclear to me.
 
 What can i do to fix the problem like this?
 
 _
, ... 
 http://www.msn.co.kr/money/interlotto/