[squid-users] Re: Squid and Tomcat in one machine running WinXP

2005-02-18 Thread Joost de Heer
 acl myNet src 10.0.0.0-200.0.0.1/255.255.255.0

This is wrong. What exactly are you trying to do? You want all class C
subnets between 10.0.0.0/24 and 200.0.0.0/24? That's not a range that can
be easily caught with a single subnet-mask. You're probably better off to
state only the subnets you want, like

acl myNet src 10.0.0.0/255.255.255.0
acl myNet src 169.254.0.0/255.255.0.0
acl myNet src 200.0.0.0/255.255.255.0
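For anyone who wants to sanity-check which client addresses a given set of src acls actually covers, here is a small Python sketch using the subnets from this thread (the CIDR forms are equivalent to the dotted masks above):

```python
import ipaddress

# Joost's three suggested subnets, in CIDR form
# (10.0.0.0/255.255.255.0 == 10.0.0.0/24, etc.)
nets = [
    ipaddress.ip_network("10.0.0.0/24"),
    ipaddress.ip_network("169.254.0.0/16"),
    ipaddress.ip_network("200.0.0.0/24"),
]

def matches(ip: str) -> bool:
    """True if ip would match any of the 'myNet' acl lines."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in nets)

print(matches("169.254.243.112"))  # True  (link-local client)
print(matches("192.168.1.5"))      # False (not in any listed subnet)
```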

Joost



Re: [squid-users] Two squid instances based on file types? Is it good?

2005-02-18 Thread Marco Crucianelli
On Thu, 2005-02-17 at 12:52 -0600, Kevin wrote:

 
 What mechanism are you using to set expire times?

Well, I'm still not sure what I should use! I mean: should I use
refresh_pattern? Or what? I mean, refresh_pattern lets me change the
refresh period based on site URL, right? What else could I use?
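For reference, a minimal refresh_pattern sketch along those lines; the patterns and lifetimes (min minutes, percent, max minutes) are illustrative assumptions only, not recommendations:

```
# illustrative values: min(minutes)  lm-age%  max(minutes)
refresh_pattern -i \.(mpg|avi|wmv|mp3|iso)$  10080  90%  43200
# catch-all rule; must come last
refresh_pattern .  0  20%  4320
```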

 
 
  2) supposing to have two different cache dir, the first one for normal
  web doc and the second one for big multimedia files, whenever squid
  needs space to cache, let's say, another web doc (small file) does it
  start applying the replacement policy only on the small file cache_dir
  or even in the big file cache dir? I don't want it to purge big
  multimedia files, when it needs to cache only a small web doc!!!
 
 That's a good question.
 
 I guess it depends on how the code is implemented; the squid.conf comments
 say "It is used to initially choose the storedir", but not what happens if the
 initial storedir is full.

Right, this is interesting to understand, but I couldn't find info
anywhere! :/

 
 
   You might also consider setting the maximum_object_size_in_memory
   relatively low, even if you have quite a bit of RAM to work with.  I have
   caches with cache_mem set to 2GB, yet I set m_o_s_i_m to 128KB.
  
  Well, if I need to cache very big files, let's say about 1GB in size, I
  can't set m_o_s to 128kb or I would never cache files bigger than
  128kb...Am I wrong?
 
 From a production cache:
$ egrep '^(cache_dir|cache_mem|maximum_)' squid.conf
cache_mem 2100 MB
maximum_object_size 16383 KB
maximum_object_size_in_memory 128 KB
cache_dir aufs /squid 1600 16 256
 
 Kevin Kadow

Well, maybe I misunderstood! I was thinking about m_o_s and not
m_o_s_i_m, sorry! Anyway, looking at your configuration, can I ask
why you have 2.1GB of RAM dedicated to squid, while you have only
1.6GB of cache storage? I've read that normally you should dedicate
about 10MB of RAM for each GB of storage space...right?
Well, considering that I could cache very big files, let's say up to
1GB per file, what do you suggest? I was thinking something like
this:

cache_mem 1000 MB
maximum_object_size 1000MB
maximum_object_size_in_memory ??
cache_dir aufs /web ?? 16 256
cache_dir aufs /multimedia ?? 16 256

where the cache_dir in total could be up to 1TB (I could even use more
disk space, but then, I think, I would have problems with RAM)
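As a quick sanity check on the 10MB-of-RAM-per-GB-of-cache rule of thumb mentioned above, applied to a 1 TB cache_dir plus the proposed cache_mem (a rough estimate; real index overhead depends on mean object size):

```python
# Rough RAM estimate for the proposed setup. Assumption: ~10 MB of
# in-core index per GB of cache_dir, the rule of thumb cited above.
cache_dir_gb = 1000          # ~1 TB of total cache_dir
cache_mem_mb = 1000          # proposed cache_mem

index_ram_mb = cache_dir_gb * 10
total_ram_mb = index_ram_mb + cache_mem_mb

print(index_ram_mb)   # 10000 -> ~10 GB just for the in-core index
print(total_ram_mb)   # 11000 -> ~11 GB before the OS and buffers
```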

Thanks!

Marco



[squid-users] HEAD method, NTLM authentication and browser cache policy setting

2005-02-18 Thread Gilles Hamel
Hello,
We use squid V2.5-STABLE7. We have some trouble with NTLM 
authentication (samba 3.0.2) and browsers which do HEAD requests.
On some http sites (e.g. http://www.windowsupdate.com, 
www.shopathomeselect.com, www.agoraplus.com ...), client browsers do 
HEAD requests but don't send their authentication tokens:

1108489166.618  8 10.2.10.27 TCP_DENIED/407 425 HEAD 
http://www.shopathomeselect.com/GR_check_site.html? - NONE/- text/html
1108489166.720  7 10.2.10.27 TCP_DENIED/407 424 HEAD 
http://www.shopathomeselect.com/GR_check_site.html? - NONE/- text/html
1108489166.757 10 10.2.10.27 TCP_DENIED/407 424 HEAD 
http://www.shopathomeselect.com/GR_check_site.html? - NONE/- text/html

If we don't allow the HEAD method, we can't use these sites correctly. The 
workaround is:

acl HEAD method HEAD
http_access allow HEAD
The www.agoraplus.com site uses an ActiveX control to display technical 
plans. If HEAD requests are denied, the application doesn't work.

We had a similar issue with the POST method and NTLM prior to STABLE5 (see 
http://www.squid-cache.org/bugs/show_bug.cgi?id=267 and
http://www.squid-cache.org/bugs/show_bug.cgi?id=757); it is now fixed.

HEAD requests are not common in the squid log file and are always 
related to the same sites. In some cases, a browser does HEAD requests to 
check for recent modifications of objects. Why do we see so few HEAD 
requests in the log file?
I have tried changing my browser cache setting to "check for newer 
versions of stored pages: every visit to the page", but the browser 
doesn't do HEAD requests to squid for expired objects. Why?

I looked in the bug database and found nothing.
Is it a known issue?
Thank you


[squid-users] http_reply_access and windows groups (again, please!!)

2005-02-18 Thread Carlos

Hi !!
I sent this message to the list yesterday but, as I didn't receive 
any answers, I thought it may not have been received...

We are trying to prevent the download of software by some of our users, 
and we have managed to do that, for test purposes, using http_reply_access 
combined with user acls.

Now that everything is ok, we would like to apply these rules combined 
with windows groups (we use ntlm authentication).

We have read a message posted by Henrik Nordstrom stating that 
http_reply_access cannot wait for external acls, but suggesting the 
following workaround:

"You can work around this quite well (but not 100%) by making sure the 
same acls are evaluated in http_access, allowing Squid to cache the result 
before processing your http_reply_access rules. A simple method to have 
acls evaluated in http_access without affecting the http_access outcome is 
to combine them with a dummy acl that will never match anything:

acl nothing src 0.0.0.0/32
http_access deny acl_that_needs_to_be_evaluated nothing

somewhere before where access is allowed."
I didn't really understand how this works... By doing this, can I use 
acl_that_needs_to_be_evaluated, which, in our case, would be an external 
acl using wbinfo_group.pl, in an http_reply_access rule? Or, better yet, is 
there a simpler way to do that?
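For what it's worth, here is a hedged sketch of how Henrik's dummy-acl trick could combine with a wbinfo_group.pl external acl; the helper path, group name, and the "authenticated" and "downloads" acls are invented for illustration, not taken from a working config:

```
# external acl via wbinfo_group.pl (path and group name are examples)
external_acl_type nt_group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl NoDownloads external nt_group RestrictedGroup
acl nothing src 0.0.0.0/32

# evaluated (and cached) here, but never matches thanks to 'nothing'
http_access deny NoDownloads nothing
http_access allow authenticated

# by now the group lookup result is cached, so it can be used here
http_reply_access deny NoDownloads downloads
http_reply_access allow all
```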

Thanks again,
Carlos Zottmann.






[squid-users] squidrunner v 1.5 information

2005-02-18 Thread squidrunner team
Hello All,

Greetings.

A new version of squidrunner, a script to build or upgrade
the Squid proxy server, is available at:

http://www.geocities.com/squidrunner_dev/squidrunnerv15.txt

Squid Runner's updated functionality:

If you don't have squid proxy server then,
1. Automatically Get new source from the squid portal
and patches
2. get configuration details from user on
a. ICMP PINGING
b. Delay pools
c. SNMP monitoring
d. ARP ACL lists
e. HTCP protocol
f. CARP protocol
g. Useragent logging
h. Referer header logging
i. Automatic call backtrace on fatal errors
j. Store I/O Modules (aufs, coss, diskd, null, ufs)
k. Removal Policies (heap,lru)
l. Cache Digests
3. build and install based on the input 
4. configure using customer input
a. access only to your network
b. cache_dir
c. cache_mem
5. Create cache directories and start the service

If you have squid proxy server then,
1. Ask for the location of the existing configuration so it can be reused
2. get new source and patches
3. Build and install the previous configuration
4. startup the proxy server 

Try it yourself and give feedback to help enrich this.

home page : http://freshmeat.net/projects/squidrunner

Have a nice week-end there.




[squid-users] Adjusting web pages delivered by the server

2005-02-18 Thread Rodrigo de Oliveira
Hello guys!

I would like Squid to listen to all the HTTP traffic in
the middle of the host-server conversation so it could
take the HTML page delivered by the server to the host
and make some adjustments to it (let's say, putting a logo
at the end of the page, shrinking the page, etc.). Can
it be done? Or else, can a module be implemented for
squid in any programming language that could act,
maybe, I don't know, like a CGI or something else? If
not, what do you suggest to make these adjustments
on the web pages delivered by the server?

Thanks in advance to you all

Rodrigo







[squid-users] Problem with unparseable HTTP header field

2005-02-18 Thread Ralf Hildebrandt
When I surf to http://www.abstractserver.de/da2005/avi/e/Abs_revi.htm
and enter any number/character and click "Submit my query", I get an
error page ("Invalid Response: The HTTP Response message received from
the contacted server could not be understood or was otherwise
malformed. Please contact the site operator. Your cache administrator
may be able to provide you with more details about the exact nature of
the problem if needed.") and in my cache.log:

-- snip ---
2005/02/18 14:42:23| ctx: exit level  0
2005/02/18 14:42:23| ctx: enter level  0: 
'http://www.abstractserver.de/bin46/da2005/avi/e/avicgi.exe'
2005/02/18 14:42:23| WARNING: unparseable HTTP header field near {HTTP/1.1 200 
OK
content-type: text/html
content-length: 2205
Connection: Keep-Alive
}
-- snip ---

with squid/2.5.STABLE7 (Debian/testing)

How can I work around this?
Without a proxy I can access the site fine (well, I can; my users MUST
use the proxy).

-- 
Ralf Hildebrandt (i.A. des IT-Zentrum)  [EMAIL PROTECTED]
Charite - Universitätsmedizin BerlinTel.  +49 (0)30-450 570-155
Gemeinsame Einrichtung von FU- und HU-BerlinFax.  +49 (0)30-450 570-962
IT-Zentrum Standort CBF send no mail to [EMAIL PROTECTED]


Re: [squid-users] Problem with unparseable HTTP header field

2005-02-18 Thread Ralf Hildebrandt
* Ralf Hildebrandt [EMAIL PROTECTED]:
 When I surf to http://www.abstractserver.de/da2005/avi/e/Abs_revi.htm
 and enter any number/character and click Submit my query, I get an
 error page (Invalid Response The HTTP Response message received from
 the contacted server could not be understood or was otherwise
 malformed. Please contact the site operator. Your cache administrator
 may be able to provide you with more details about the exact nature of
 the problem if needed. ) and in my cache.log:
 
 -- snip ---
 2005/02/18 14:42:23| ctx: exit level  0
 2005/02/18 14:42:23| ctx: enter level  0: 
 'http://www.abstractserver.de/bin46/da2005/avi/e/avicgi.exe'
 2005/02/18 14:42:23| WARNING: unparseable HTTP header field near {HTTP/1.1 
 200 OK
 content-type: text/html
 content-length: 2205
 Connection: Keep-Alive
 }
 -- snip ---
 
 with squid/2.5.STABLE7 (Debian/testing)

# squid -v
Squid Cache: Version 2.5.STABLE7
configure options:  --prefix=/usr --exec_prefix=/usr
--bindir=/usr/sbin --sbindir=/usr/sbin --libexecdir=/usr/lib/squid
--sysconfdir=/etc/squid --localstatedir=/var/spool/squid
--datadir=/usr/share/squid --enable-async-io --with-pthreads
--enable-storeio=ufs,aufs,diskd,null --enable-linux-netfilter
--enable-arp-acl --enable-removal-policies=lru,heap --enable-snmp
--enable-delay-pools --enable-htcp --enable-poll
--enable-cache-digests --enable-underscores --enable-referer-log
--enable-useragent-log --enable-auth=basic,digest,ntlm --enable-carp
--enable-large-files i386-debian-linux

# uname -a
Linux spiderboy 2.6.10 #1 SMP Mon Jan 3 16:22:38 CET 2005 i686 GNU/Linux

-- 
Ralf Hildebrandt (i.A. des IT-Zentrum)  [EMAIL PROTECTED]
Charite - Universitätsmedizin BerlinTel.  +49 (0)30-450 570-155
Gemeinsame Einrichtung von FU- und HU-BerlinFax.  +49 (0)30-450 570-962
IT-Zentrum Standort CBF send no mail to [EMAIL PROTECTED]


Re: [squid-users] Problem with unparseable HTTP header field

2005-02-18 Thread Ralf Hildebrandt
* Ralf Hildebrandt [EMAIL PROTECTED]:

  with squid/2.5.STABLE7 (Debian/testing)
 
 # squid -v
 Squid Cache: Version 2.5.STABLE7

I also tried 2.5stable8 on Solaris:

# sbin/squid -v
Squid Cache: Version 2.5.STABLE8-20050218
configure options:  --libexecdir=/usr/local/squid/2.5s8/libexec
--exec-prefix=/usr/local/squid/2.5s8 --enable-err-language=German
--enable-cachemgr-hostname=spiderman.charite.de --enable-dlmalloc
--enable-poll --disable-ident-lookups --disable-wccp --enable-snmp
--enable-useragent-log --enable-referer-log --enable-async-io
--enable-cache-digests ' '

with the same results

-- 
Ralf Hildebrandt (i.A. des IT-Zentrum)  [EMAIL PROTECTED]
Charite - Universitätsmedizin BerlinTel.  +49 (0)30-450 570-155
Gemeinsame Einrichtung von FU- und HU-BerlinFax.  +49 (0)30-450 570-962
IT-Zentrum Standort CBF send no mail to [EMAIL PROTECTED]


Re: [squid-users] blocking internet application files?

2005-02-18 Thread Michael Pophal
Hi,
use the acl 

#   acl aclname browser  [-i] regexp ...
# # pattern match on User-Agent header

see squid.conf on this.

Regards Michael

On Wed, 2005-02-16 at 21:01, Shiraz Gul Khan wrote:
 dear list,
 
 is there a way to allow only iexplorer.exe application for my user to access 
 squid box.
 
 suppose i only want to run internet explorer on my user computers. no msn no 
 yahoo no any other internet application. only and only iexplorer for 
 browsing internet.
 
 what is the best config for squid.conf what and where i add/edit in 
 squid.conf
 
 ==
 squid.conf
 ==
 acl all src 0.0.0.0/0.0.0.0
 acl myusers src 192.168.100.0/255.255.255.0
 http_access allow myusers
 http_access deny all
 ==
 
 
 
 
 
 Thankyou  best regards,
 Shiraz Gul Khan (03002061179)
 Onezero Inc.
 
 _
 It's fast, it's easy and it's free. Get MSN Messenger today! 
 http://www.msn.co.uk/messenger
-- 
Mit freundlichen Grüssen / With kind regards

Michael Pophal
--
Topic Manager
Internet Access Services  Solutions
--
Siemens AG, ITO AS 4
Telefon: +49(0)9131/7-25150
Fax: +49(0)9131/7-43344
Email:   [EMAIL PROTECTED]
--



[squid-users] Zero Sized Reply

2005-02-18 Thread David Mercier
I get a "Zero Sized Reply" on some sites, with or without redirects.

I've read many FAQs but the problem still exists.

---
/usr/sbin/squidclient -v http://businessgateway.ca


headers: 'GET http://businessgateway.ca HTTP/1.0
Accept: */*

'
HTTP/1.0 502 Bad Gateway
Server: squid/2.5.STABLE7
Mime-Version: 1.0
Date: Fri, 18 Feb 2005 14:57:40 GMT
Content-Type: text/html
Content-Length: 985
Expires: Fri, 18 Feb 2005 14:57:40 GMT
X-Squid-Error: ERR_ZERO_SIZE_OBJECT 0
X-Cache: MISS from goyette.local
Proxy-Connection: close

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<HTML><HEAD><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
<TITLE>ERROR: The requested URL could not be retrieved</TITLE>
<STYLE type="text/css"><!--BODY{background-color:#ffffff;font-family:verdana,sans-serif}PRE{font-family:sans-serif}--></STYLE>
</HEAD><BODY>
<H1>ERROR</H1>
<H2>The requested URL could not be retrieved</H2>
<HR noshade size="1px">
<P>
While trying to retrieve the URL:
<A HREF="http://businessgateway.ca">http://businessgateway.ca</A>
<P>
The following error was encountered:
<UL>
<LI>
<STRONG>
Zero Sized Reply
</STRONG>
</UL>
<P>
Squid did not receive any data for this request.
<P>Your cache administrator is <A HREF="mailto:[EMAIL PROTECTED]">[EMAIL PROTECTED]</A>.
<BR clear="all">
<HR noshade size="1px">
<ADDRESS>
Generated Fri, 18 Feb 2005 14:57:40 GMT by goyette.local (squid/2.5.STABLE7)
</ADDRESS>
</BODY></HTML>
---

I'd rather post only the relevant lines from squid.conf, because the file
is very big!!!

On squid/access.log I got :

1108739958.435295 127.0.0.1 TCP_MISS/502 1280 GET http://businessgateway.ca 
- DIRECT/192.197.183.241 text/html

When it's OK I get:

1108740019.961162 127.0.0.1 TCP_MISS/200 2735 GET http://www.google.ca - 
DIRECT/64.233.161.99 text/html


Anyone have hints ???

Tks




Re: [squid-users] Squid and Tomcat in one machine running WinXP

2005-02-18 Thread Rodrigo de Oliveira
Hello Denis!

I did what you told me. Dumped all the http_access rules and
allowed just everything, as in "http_access allow all".
It still doesn't work. In another forum they told me I
should change the http server to listen on port 80 and
squid on 8080, and change squid.conf like this:

http_port 8080
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

I did it and now I can reach the http server. But
Squid isn't intercepting the requests/responses. No
entries are added to the access.log file, unless I
make calls specifically through the squid port, like
http://169.254.243.112:8080/index.jsp. But again I get
the error:

Access denied. Access control configuration prevents
your request from being allowed at this time. Please
contact your service provider if you feel this is
incorrect.

What should I do? Thanks for the attention.

Rodrigo

 --- Denis Vlasenko
[EMAIL PROTECTED] escreveu: 
 On Friday 18 February 2005 03:08, Rodrigo de
 Oliveira wrote:
  Hello! I´m new here and would be very thankful if
  someone could solve my problem.
   
  I want Squid to intercept HTTP requests, deliver
 them
  to the server, receive them from the server, make
 some
  adjusts on the HTML file and deliver them to the
  hosts. I got a PC running both a HTTP server
 (Apache
  Tomcat 4.1.24) and Squid 2.5 STABLE 3
 

(http://www.adrenalin.to/bofi/setup_squid_2_5_stable_3_eng.exe)
  under Windows XP. For test purposes, Tomcat is
  listening port 8080 and Squid port 80. Among other
  tags, mainly, my squid.conf is:
   
  http_port 80
  httpd_accel_host 127.0.0.1
  httpd_accel_port 8080
  acl acceleratedHost dst 127.0.0.1/255.255.255.255
  acl acceleratedPort port 8080
  acl all src 0.0.0.0/0.0.0.0
  acl myNet src 10.0.0.0-200.0.0.1/255.255.255.0
  http_access allow acceleratedHost acceleratedPort
  http_access allow myNet
  http_access deny all
   
  This way, Squid makes the interception correctly
 for
  localhost's tests, like calling
  http://127.0.0.1/index.jsp on a browser. But when
 I
  connect a laptop to it, and supposing the IP of the
 PC
  server is 169.254.243.112 in this small LAN, Squid
  rejects because of an access denied problem. On
 the
  laptop, I can only reach the server bypassing
 Squid
  through a calling like
  http://169.254.243.112:8080/index.jsp on the
 browser.
  What am I doing wrong?
 
 Does it work if you dump all http_access except
 http_access allow all?
 --
 vda
 
  







[squid-users] Re: Squid and Tomcat in one machine running WinXP

2005-02-18 Thread Rodrigo de Oliveira
Sorry, Joost. That was a desperate attempt to
allow any incoming requests through that IP range. Now
I did what you told me (which is quite better), but it
still doesn't work. As I told Denis in another e-mail,
people from another squid forum are suggesting that I
put the http server on port 80 and squid caching on
8080 and use "virtual" as the httpd_accel_host. To tell
the truth, I don't really want to cache things. I want
something to intercept the http server responses and
make some adjustments on the web page delivered to the
host. So squid MUST be the one that delivers the page
in the end. And I'm hoping there's a way to implement,
in any programming language, a module for squid to make
these adjustments on the page being delivered. Is that
possible? If so, how can I do it? Or better, where can
I find full information about doing that?

Thanks a lot for your attention, guys. This forum is
my hope.

Rodrigo

 --- Joost de Heer [EMAIL PROTECTED] escreveu: 
  acl myNet src 10.0.0.0-200.0.0.1/255.255.255.0
 
 This is wrong. What exactly are you trying to do?
 You want all class C
 subnets between 10.0.0.0/24 and 200.0.0.0/24? That's
 not a range that can
 be easily caught with a single subnet-mask. You're
 probably better off to
 state only the subnets you want, like
 
 acl myNet src 10.0.0.0/255.255.255.0
 acl myNet src 169.254.0.0/255.255.0.0
 acl myNet src 200.0.0.0/255.255.255.0
 
 Joost
 
  







[squid-users] Authenticated proxy for other protocols

2005-02-18 Thread Pablo Gietz
Hi all
We have a Squid proxy, but we need to pass traffic through the firewall 
based on authenticated user rather than IP address.
We use ldap_auth with Windows 2000.

Do you know of any how-to about Squid + iptables + authenticated gateway?
We need to open some ports for some special people and redirect their 
traffic directly to the router rather than passing it to squid. I hope 
my English was clear.

Thanks
--
Pablo A. C. Gietz
Jefe de Seguridad Informática
Nuevo Banco de Entre Ríos S.A.
Te.: 0343 - 4201351
Fax: 0343 - 4201329


RE: [squid-users] How to serve directory index files...?

2005-02-18 Thread Peter Yohe
We have a collection of mirrored web sites that we distribute to schools in
developing countries where Internet access is limited or not available. I'm
looking into different proxies that provide off-line browsing support to see
if they can be used to let students type in a URI and if it's in the
collection it will be served as if they were on the Internet.

I would like to know how squid responds while in offline mode to a request
for http://www.foo.com/ with no default page specified versus a request
where the default page is included in the request as in
http://www.foo.com/index.html. It would make sense to me that the second
request would work if the resource was in the cache already. But say I use
an address without specifying a file. Does Squid keep track of which files in
its cache are the default document for some address?

I understand that normally the web server is configured to serve certain
pages as a default document or directory index page. When such a page does
not exist in the requested directory path, the server will respond with a
directory listing or a "directory listing denied" error. It seems to me that it
would not be possible to define ahead of time what documents are to be
served as default, if they exist, in an offline cache and that some other
technique would need to be used.

Thanks,

Peter


-Original Message-
From: Jeff Donovan [mailto:[EMAIL PROTECTED] 
Sent: Thursday, February 17, 2005 9:15 PM
To: [EMAIL PROTECTED]
Cc: Squid List
Subject: Re: [squid-users] How to serve directory index files...?


On Feb 17, 2005, at 3:46 PM, Peter Yohe wrote:

 Hello,

 When Squid is in offline mode, how does it know what a default 
 document in a site or directory is if a client does not provide the 
 name of the file?
If the client has not requested information, why would squid need to know
the default document (assuming default.html) of any site or directory?
No request = squid does nothing.

What are you trying to do with squid? Post your squid.conf and we may be
better able to answer your questions.

 Thanks,

 Peter Yohe

 The WiderNet Project


---
jeff donovan
basd network operations
(610) 807 5571 x41
AIM  xtdonovan




Re: [squid-users] Two squid instances based on file types? Is it good?

2005-02-18 Thread H Matik
On Friday 18 February 2005 08:34, Marco Crucianelli wrote:
 On Thu, 2005-02-17 at 12:52 -0600, Kevin wrote:
  What mechanism are you using to set expire times?

 Well, I'm still not sure what I shall use! I mean: should I use
 refresh_pattern!? Or what? I mean, refresh_pattern can let me change
 refresh period based on sire url right? What' else could I use?!


when I suggested the choice of two caches, one for small objects and one for 
large objects, the focus was not on refresh patterns.

The goal is that you can, as the very first priority, use an OS and especially 
an HD system tuned for serving small or large files. This certainly will not be 
possible running two squids on one machine.

The second point is that you can use max|min_object_size to limit the 
file size served by each server. My experience showed best results 
breaking at 512K on modern PCs.

The third step is to use cache_replacement_policy LFUDA/GDSF accordingly, and if 
using diskd you may play with Q1 and Q2, which will make a difference.

And for this to make sense, you push from the large_obj_cache with proxy-only set.

To achieve this correctly you may (or should) set additional always|never_direct 
rules for known mpg, avi, wmv, mp3, iso and other files, so that the 
small_obj_cache really pulls them from the large_obj_cache.
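To make the split concrete, here is a minimal sketch of the two instances described above, using the 512K break point; the port, file patterns, and the exact config split are assumptions for illustration, not a tested setup:

```
# squid-small.conf: small objects only; big files come from the large cache
maximum_object_size 512 KB
cache_peer 127.0.0.1 parent 3129 0 no-query proxy-only
acl bigmedia urlpath_regex -i \.(mpg|avi|wmv|mp3|iso)$
never_direct allow bigmedia

# squid-large.conf: large objects only, listening on its own port
http_port 3129
minimum_object_size 512 KB
```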

Hans





-- 
___
Infomatik
(18)8112.7007
http://info.matik.com.br
Mensagens não assinadas com GPG não são minhas.
Messages without GPG signature are not from me.
___




[squid-users] RE: Squid and OWA

2005-02-18 Thread Guy Speier
Since I haven't heard any responses, can someone at least confirm this is
possible?


Thanks,
Guy

-Original Message-
From: Guy Speier 
Sent: Friday, February 18, 2005 10:33 AM
To: squid-users@squid-cache.org
Subject: Squid and OWA

Hello All,

I am trying to get squid to work with OWA behind a firewall.  I would really
appreciate seeing someone's squid.conf to get an idea what I am doing wrong.

Would anyone be willing to show me the relevant squid.conf entries?

Thank,
Guy


[squid-users] Re: Problem with unparseable HTTP header field

2005-02-18 Thread M A Young
On Fri, 18 Feb 2005, Ralf Hildebrandt wrote:

 When I surf to http://www.abstractserver.de/da2005/avi/e/Abs_revi.htm
 and enter any number/character and click Submit my query, I get an
 error page (Invalid Response The HTTP Response message received from
 the contacted server could not be understood or was otherwise
 malformed.
See bug 1242
http://www.squid-cache.org/bugs/show_bug.cgi?id=1242
The issue is that with 2.5S8 (or a well-patched 2.5S7) squid has become less
tolerant of illegal behaviour from web servers in the headers they serve
before the contents of the web page. If you fetch that page by hand (e.g.
with wget -S) you can see the HTTP headers:
 1 HTTP/1.0 200 OK
 2 Server: Microsoft-IIS/3.0
 3 Date: Fri, 18 Feb 2005 19:54:50 GMT
 4 HTTP/1.1 200 OK
 5 content-type: text/html
 6 content-length: 2617
 7 Connection: Keep-Alive
which are difficult to make sense of if you actually try to understand them;
is the answer HTTP/1.1 or HTTP/1.0?
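The double status line is easy to spot programmatically; a small sketch that flags a second HTTP/ status line inside a header block (the header text is the wget output quoted above):

```python
# The malformed header block quoted above, as one string.
raw_headers = """HTTP/1.0 200 OK
Server: Microsoft-IIS/3.0
Date: Fri, 18 Feb 2005 19:54:50 GMT
HTTP/1.1 200 OK
content-type: text/html
content-length: 2617
Connection: Keep-Alive"""

# A well-formed reply has exactly one status line, at the very top.
status_lines = [l for l in raw_headers.splitlines() if l.startswith("HTTP/")]
print(len(status_lines))   # 2 -> malformed: two conflicting status lines
print(status_lines)        # ['HTTP/1.0 200 OK', 'HTTP/1.1 200 OK']
```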

Michael Young


[squid-users] newbie help

2005-02-18 Thread richard
Hi All
Although I've been using Linux for many years, I've only just got round to using squid and trying to 
set up a transparent proxy parsed by dansguardian.
OS is MDK 10.1, firewall is Shorewall, 3 ports: ppp0, eth0 and wlan0; the latter is redirected to 
port 3128.
I've set up squid.conf, I think, to do as a few docs on the net suggest, but when I start a remote 
browser on the wlan, everything gets stopped.

At this stage, if someone can point me in the right direction it'll help a lot.
I know there's a FAQ and I've been through it, but haven't found the relevant bit.
The access.log looks like this:
1108744208.159  1 127.0.0.1 TCP_DENIED/400 1453 GET /search?q=%s -
NONE/- text/html
1108744382.947 11 127.0.0.1 TCP_DENIED/400 1453 GET /search?q=%s -
NONE/- text/html
1108744395.413 41 127.0.0.1 TCP_DENIED/400 1431 GET / - NONE/- text/html
1108744779.957150 127.0.0.1 TCP_DENIED/400 1545 GET
/cgi-bin/check-updates?run_as=check_updatesprotocol=1 - NONE/- text/html
1108745611.882183 127.0.0.1 TCP_DENIED/400 1475 POST
/Config/MsgrConfig.asmx - NONE/- text/html
1108745686.231 41 127.0.0.1 TCP_DENIED/400 1549 GET
/firefox?client=firefox-arls=org.mozilla:en-GB:official - NONE/-
text/html
1108745686.399  1 127.0.0.1 TCP_DENIED/400 1449 GET /rss20.xml -
NONE/- text/html
1108745686.445 17 127.0.0.1 TCP_DENIED/400 1453 GET /favicon.ico -
NONE/- text/html
1108745694.663  1 127.0.0.1 TCP_DENIED/400 1489 GET
/products/firefox/central.html - NONE/- text/html
1108745694.817  1 127.0.0.1 TCP_DENIED/400 1453 GET /favicon.ico -
NONE/- text/html
1108745700.387  1 127.0.0.1 TCP_DENIED/400 1453 GET /search?q=%s -
NONE/- text/html
1108745779.488  1 127.0.0.1 TCP_DENIED/400 1545 GET
/cgi-bin/check-updates?run_as=check_updatesprotocol=1 - NONE/- text/html
This was an attempt to get to the firefox home page.
OK, it's a TCP denial, but what's the 400??
TIA
Richard


[squid-users] Sidewinder WebCache and ICP

2005-02-18 Thread Kevin
I learned something today, but if you have no interest in Secure Computing's
Sidewinder G2 firewall, you probably want to stop reading now.


While the WebProxy service in Sidewinder is based on Squid 2.4.STABLE6,
the actual caching functionality is *very* limited, and even though the firewall
has a 'cf' command to enable ICP, the service cannot reply to ICP queries.

On the firewall, the cache.log file will show errors transmitting the UDP
reply packet, like this:
comm_udp_sendto: FD 18, 192.168.42.7, port 34467: (1) Operation not permitted


I should have known: this limitation is documented in the man pages:

$ man squid
 . . .
 At this time Sidewinder does not support any of squid's hierarchical
 caching capability.
 . . . 
SIDEWINDERNovember 14, 2003 1
$ uname -a
SecureOS . . . 6.1.0.05 SW_OPS Fri Nov 12 14:19:42 CST 2004   i386
$ exit

I realize that the Squid community cannot support the Sidewinder firewall,
and that Secure Computing cannot support Squid.  I just thought it'd be
useful to mention this limitation so the next person attempting this
doesn't have to waste as much time as I did in trying (and failing) to get
ICP working.

Kevin Kadow

(P.S. No support for Cache Digests either.  When they say the proxy
does not support features needed for cache hierarchy, they really 
mean it.)


[squid-users] Re: squid + winbind weird behavior

2005-02-18 Thread Adam Aube
Please don't top post (which is replying above the original message) - it
makes the thread hard to follow.

Paulo Pires wrote:
 Qui, 2005-02-17 às 00:40 +0100, Henrik Nordstrom escreveu:
 On Wed, 16 Feb 2005, Paulo Pires wrote:
 
  chown nobody /usr/local/samba-3.0.10/var/locks/winbindd_privileged
 
  This solved the thing. We can't change the perms cause it's a socket,
  so it's better to change the owner to the user which runs squid.
 
 You should change the group, not the owner..
 
  http://www.squid-cache.org/Doc/FAQ/FAQ-23.html#ss23.5
  http://us4.samba.org/samba/docs/man/winbindd.8.html
 
 Changing the owner will make Samba quite upset about the security.

 chgrp squid /path/to/winbind_privileged

 I've added squid group, added user nobody into it and put it in my
 squid.conf. But as you can see below, there's only read perms for squid
 group, so the error is still there.
 
  4 drwxr-s---  2 root squid  4096 2005-02-17 14:15 winbindd_privileged
 
 I don't know how the hell this worked for others, since other users from
 squid will only have read access to the dir, when they should have
 execute permissions too.

They do have execute permissions - the s in that position means the
directory is group executable and SetGID.

Adam



[squid-users] RE: Squid and OWA

2005-02-18 Thread Guy Speier
Pardon my idiocy, but by OWA I was referring to Outlook Web Access. Many
have e-mailed me about Oracle web access.

If you can share your expertise about Outlook Web Access via squid,
I would be most appreciative.

Thanks,
Guy

-Original Message-
From: Guy Speier 
Sent: Friday, February 18, 2005 1:37 PM
To: squid-users@squid-cache.org
Subject: RE: Squid and OWA

Since I haven't heard any responses, can someone even confirm this is even
possible?


Thanks,
Guy

-Original Message-
From: Guy Speier
Sent: Friday, February 18, 2005 10:33 AM
To: squid-users@squid-cache.org
Subject: Squid and OWA

Hello All,

I am trying to get squid to work with OWA behind a firewall.  I would really
appreciate seeing someone's squid.conf to get an idea what I am doing wrong.

Would anyone be willing to show me the relevant squid.conf entries?

Thank,
Guy


[squid-users] Re: Is there a way to bypass squid for any destination ip address ?

2005-02-18 Thread Adam Aube
Please don't ask a new question by replying to another post - instead, post
a new message to the list.

Nont Banditwong wrote:

 My transparency squid box redirect packet which has destination port 80 to
 3128 by this iptable command
 
 iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT
 --to-port 3128
 
 but I don't want clients access some destination ip address through squid,
 Is there a way to bypass squid by add some iptables command ?

(This question really belongs on an iptables list.)

Before the REDIRECT line above, add iptables rules similar to this:

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -d a.b.c.d -j ACCEPT

where a.b.c.d is the IP address of the server to bypass Squid for. Also,
be sure to allow port 80 traffic in your FORWARD chain.

Adam



[squid-users] Re: Newbie question....

2005-02-18 Thread Adam Aube
Jean Cantarutti wrote:

 then I look at access.log but it doesn't show downloaded files, only the
 little files: .gif, .jpg, .js ... etc.
 
 Where are the .exe, .zip, .doc, .pdf files logged (the big ones)?

Squid logs all requests it receives. The requests for .exe, .zip, etc. files
will be in access.log like all the others.

Adam



[squid-users] What did I do to make squid behave this way??

2005-02-18 Thread Ray Charles
Hi,

My setup-
squid-stable8-20050218 w/collapsed_forwarding patch.
RHEL-3.1 on 3.6Ghz w/1Gig of RAM. SW Raid.

I've been trying to understand how squid works for the
past week or so, and I'm looking for what causes the
following behavior when squid receives a request to
fetch a URI not associated with my network:


Squid stops w/SEGV message

2005/02/18 20:54:34| storeLateRelease: released 0
objects
FATAL: Received Segment Violation...dying.
2005/02/18 21:10:56| storeDirWriteCleanLogs:
Starting...
2005/02/18 21:10:56| WARNING: Closing open FD   12


Any insights or pointers much appreciated,

Ray



