Re: [squid-users] Squid 2.5.STABLE9 and Kernel 2.6.11 SMP

2006-02-21 Thread Schelstraete Bart
On Tue, 2006-02-21 at 08:30 +0100, Christian Herzberg wrote:
> We are running the above configuration. Everything works fine until
> the squid
> stops working.
> Nothing in the squid logs and nothing in the messages.

Hello,

What exactly do you mean by 'stops working'? Is the program crashing, or
does it just stop responding?



Bart
 
--
Schelstraete Bart 
W: http://www.schelstraete.org   |   E: [EMAIL PROTECTED]
12:15:51 up 12:19, 1 user, load average: 0.40, 0.96, 1.19 
'Not all men are annoying. Some are dead.'



Re: [squid-users] Squid Slow Downloads problem

2006-02-16 Thread Schelstraete Bart
On Thu, 2006-02-16 at 12:14 -0600, Hesham Shakil wrote:

> 
> - Tried apache+mod_proxy+mod_cache on the same linux machine, worked
> perfectly with 28+KB/s transfer rate
> - Tried Squid-2.5.STABLE12 compiled for windows on a Windows XP machine
> running on the same Internet connection and it worked fine at speeds of
> 28+KB/s
> - Tried ufs/aufs/diskd but none improved the speed
> - Tried recompiling squid with NONE but the very basic options
> - Tried recompiling squid with and without pthreads/aio etc.

For the moment I don't see anything wrong.
What you could try, just as a test, is disabling caching for a
certain site or file type, and then downloading such a file via the
proxy.

If that is faster, then the problem is with the disks (the cache) or
something similar.
If not, then the problem lies elsewhere (duh :) ). In that case, increase the
log level; maybe something can be discovered there.
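For the test, something like this in squid.conf could do (the domain and
extension are only placeholders, not from your setup):

   # hypothetical test: never cache one site and one file type
   acl testsite dstdomain .example.com
   acl testfiles urlpath_regex -i \.iso$
   no_cache deny testsite
   no_cache deny testfiles

After a 'squid -k reconfigure' you can download such a file through the
proxy again and compare the speed.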


Note: Do you have an antivirus running on that box? Because that will
slow down the downloads.




Bart




--
Schelstraete Bart 
http://www.schelstraete.org
[EMAIL PROTECTED]
22:21:18 up 5 days, 4:32, 3 users, load average: 0.42, 0.83, 1.11



Re: [squid-users] Squid Slow Downloads problem

2006-02-16 Thread Schelstraete Bart
On Thu, 2006-02-16 at 14:38 -0600, Hesham Shakil wrote:
> > On Thu, 2006-02-16 at 12:14 -0600, Hesham Shakil wrote:
> > > Downloading files via squid (on Linux ONLY.. read below) is slower than
> > > without squid. The normal download speed bypassing squid is 28+KB/s while
> > > through squid it reduces to 16-17KB/s. Browsing seems fine, and the
> > > internet bandwidth tests also give 28+KB/s when run through squid, so the
> > > problems seems only with files 1MB+ in size.
> >
> > Hi,
> >
> > You said that the download is faster without Squid.
> > Is that on the same box? So are you downloading from the Squid box?
> >
> 
> Downloads on the Squid box itself or on other machines on the network are
> all slow when connecting through squid proxy.

So: you're downloading on the Squid box, without using the Squid
proxy, and that's also slow? Correct?

If that's the case, it has nothing to do with Squid but with the box
itself (network cable, duplex settings).



Bart


--
Schelstraete Bart 
http://www.schelstraete.org
[EMAIL PROTECTED]
21:44:54 up 5 days, 3:56, 3 users, load average: 0.69, 0.93, 0.92



Re: [squid-users] Squid Slow Downloads problem

2006-02-16 Thread Schelstraete Bart
On Thu, 2006-02-16 at 12:14 -0600, Hesham Shakil wrote:
> Downloading files via squid (on Linux ONLY.. read below) is slower than
> without squid. The normal download speed bypassing squid is 28+KB/s while
> through squid it reduces to 16-17KB/s. Browsing seems fine, and the
> internet bandwidth tests also give 28+KB/s when run through squid, so the
> problems seems only with files 1MB+ in size.

Hi,

You said that the download is faster without Squid.
Is that on the same box? So are you downloading from the Squid box?




B
--
Schelstraete Bart 
http://www.schelstraete.org
[EMAIL PROTECTED]
21:17:00 up 5 days, 3:28, 3 users, load average: 0.18, 0.51, 0.86



Re: [squid-users] Squid+auth+Yahoo messenger with voice 7,5,0,333

2006-02-15 Thread Schelstraete Bart
On Tue, 2006-02-14 at 22:07 +0800, Liew Toh Seng wrote:


> Hi List,
> 
> Is there anyone know how to configure yahoo messenger to use
> squid 
> proxy auth ? I can use msn with squid proxy auth but not yahoo 
> messenger. Any idea ? 


Yahoo Messenger works fine with an HTTP proxy, so it will work with
Squid.
BUT you need to configure your Yahoo Messenger first so that it
connects to the proxy server (look under 'connection preferences' or
something similar).



Bart



--
Schelstraete Bart 
http://www.schelstraete.org 
[EMAIL PROTECTED]
15:59:05 up 3 days, 22:10, 3 users, load average: 1.30, 1.59, 1.69



Re: [squid-users] Problem understanding acl

2006-02-15 Thread Schelstraete Bart
On Wed, 2006-02-15 at 10:23 -0400, Chris Mason (Lists) wrote:
> 
> I want to have the following scenario but I can't understand how to do
> it
> 
> # Employee general access to a list of sites
> acl allowed-sites dstdomain .thisdomain.com .thatdomain.com
> http_access allow allowed-sites
> 
> # Some employees listed get access to all EXCEPT the banned sites
> acl banned_sites dstdomain .abc.com .msn.com .hotmail.com .go.com 
> .playboy.com
> acl password_access proxy_auth someone someone-else anotheruser
> http_access allow password_access but deny the banned_lists
> 
> # And finally deny all other access to this proxy
> http_access allow localhost
> http_access deny all

Hi,

As far as I understand, it's just like this:


acl allowed-sites dstdomain .thisdomain.com .thatdomain.com
acl banned_sites dstdomain .abc.com .msn.com .hotmail.com .go.com .playboy.com
acl password_access proxy_auth someone someone-else anotheruser

## Order is important: Squid uses the first http_access line that matches
http_access allow allowed-sites
http_access deny banned_sites
http_access allow password_access
http_access deny all



Bart



--
Schelstraete Bart 
http://www.schelstraete.org 
[EMAIL PROTECTED]
15:43:27 up 3 days, 21:54, 3 users, load average: 1.83, 1.88, 1.78



Re: [squid-users] pop-up authentication window question

2006-02-15 Thread Schelstraete Bart
On Wed, 2006-02-15 at 12:20 +0100, Sándor Zsolt wrote:
> Hello,
> 
> I faced a strange problem after upgrading from squid-2.4.STABLE1 -> 
> 2.5.STABLE12. 
> 
> I have authentication for all users, but there are a few sites which
> do not 
> need authentication. For the authentication the conf file contains
> the 
> followings:
> 
> acl all src 0.0.0.0/255.255.255.255
> acl authenticate proxy_auth_regex \*
> acl PROXY src /255.255.255.255
> acl noauth dstdomain "/usr/local/squid/etc/noauth_web.txt"
> 
> http_access allow noauth
> http_access allow authenticate !PROXY
> http_access allow all
> 
> The noauth_web.txt contains the web sites which don't need
> authentication.
> 
> (The authentication goes well, and the sites, which don't need
> authentication, 
> could be reached too without authentication through the login window.)
> 
> My problem is: when the user logs in and calls for a site which
> doesn't need 
> authentication the new squid pops up the login window, to reach the
> content 
> of this site the user has to to click on the "cancel" button. In the
> old 
> version of squid there was no pop up window if a site, listed in the 
> noauth_web.txt file, was called. The content of the site was
> displayed 
> directly.
> 
> I tried to find out what could cause the problem, but can't.
> 
> Could someone help me please, how to configure squid to eliminate the
> pop up 
> window when it is not needed?

Hi,

This just means that Squid didn't match the site against the one in
the ACL.
Check in the access log which URLs the user is really requesting when
connecting to that site.

I guess the page the user tries to access pulls in some kind of
external content (banners, Google ads, etc.) which is not in your list.
That's why the user can press cancel and still see the 'needed'
stuff.
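If that is the case, you can simply add those extra domains to your noauth
file. For example, /usr/local/squid/etc/noauth_web.txt could contain
(the ad domains below are just placeholders):

   .yoursite.example.com
   .ads.example.net
   .banners.example.org

The acl is of type dstdomain, so one domain per line is enough, and a
leading dot matches all subdomains.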




B



--
Schelstraete Bart 
http://www.schelstraete.org
[EMAIL PROTECTED]
12:42:48 up 3 days, 18:54, 3 users, load average: 0.23, 1.06, 1.62



Re: [squid-users] cpu usage increases over time, squid performance declines

2006-02-15 Thread Schelstraete Bart
On Tue, 2006-02-14 at 22:31 -0800, Mike Solomon wrote:
> Hi all,
> 
> I've been having some problems with squid in reverse proxy mode. I  
> will preface this by saying that I realize I'm pushing squid to the  
> limits and I've been following the recent discussions on  
> performance.  I'll try to be as clear as possible to expose the crux  
> of my problem, but please bear with me as the explanation is rather  
> long.

Hi,

Can you send us your squid.conf file?



Bart




--
Schelstraete Bart 
http://www.schelstraete.org
[EMAIL PROTECTED]
10:12:26 up 3 days, 16:23, 3 users, load average: 0.22, 0.52, 0.56



Re: [squid-users] Check mail using Proxy server

2005-06-04 Thread Schelstraete Bart
Squid is an HTTP proxy; it cannot be used to check POP3/IMAP mail.


Bart

On Sat, 2005-06-04 at 17:04 +0600, Shahnawaz Iqbal wrote:
> 
> 
> 
> 



Re: [squid-users] ERROR : While trying to retrieve the URL:

2004-12-24 Thread Schelstraete Bart

On Fri, 2004-12-24 at 03:23 -0800, ads squid wrote:
> I am running squid "Version 2.5.STABLE6" on Redhat
> Linux 9.0. 
> Squid is running on Gateway. DNS, Web server are
> running on DMZ. 
> Everything was working smoothly so far.
> 
> Suddenly It started giving following error when try to
> click on URL on the web page of site hosted on DMZ
> server.
> 
> 
> The requested URL could not be retrieved
> 
> 
> 
> While trying to retrieve the URL:
> /Energy_Solutions.htm 
> 
> The following error was encountered: 
> 
> Invalid URL 

Let me guess... you're using Internet Explorer to connect to a 'secure'
site?


Bart



Re: [squid-users] Problems with proxy.pac

2004-11-07 Thread Schelstraete Bart
Sorry, but this has nothing to do with Squid.

Can you tell me what exactly isn't working in Mozilla? Maybe you just forgot
the 'http://' in front of the autoconfig URL... (it is required).


Bart

On Sat, 2004-11-06 at 19:23 +0530, Manoj Kumar Chitlangia wrote:
> Hello,
> 
> My network has two proxy servers and i need to balance the traffic load
> between these two. I used the following proxy.pac file.
> 
> function FindProxyForURL(url, host)
> {
>   if (isPlainHostName(host) || dnsDomainIs(host, ".iiita.ac.in") ||
> dnsDomainIs(host, ".local") || (host.substring(0,4) == "172.") || host ==
> "iiita.ac.in" || host == "127.0.0.1")
>   return "DIRECT";
>   if(myIpAddress.substring(0,6) == "172.19" || myIpAddress.substring(0,6)
> == "172.24")
>   {
>   if (url.substring(0, 5) == "http:") {
>   return "PROXY 172.31.1.8:8080";
>   }
>   else if (url.substring(0, 4) == "ftp:") {
>   return "PROXY 172.31.1.8:8080";
>   }
>   else if (url.substring(0, 7) == "gopher:") {
>   return "PROXY 172.31.1.8:8080";
>   }
>   else if (url.substring(0, 6) == "https:" || url.substring(0, 6)
> == "snews:";) {
>   return "PROXY 172.31.1.8:8080";
>   }
>   else {
>   return "DIRECT";
>   }
>}
>   if (url.substring(0, 5) == "http:") {
>   return "PROXY 172.31.1.1:8080";
>   }
> else if (url.substring(0, 4) == "ftp:") {
>return "PROXY 172.31.1.1:8080";
> }
> else if (url.substring(0, 7) == "gopher:") {
> return "PROXY 172.31.1.1:8080";
> }
> else if (url.substring(0, 6) == "https:" || url.substring(0, 6) ==
> "snews:";) {
> return "PROXY 172.31.1.1:8080";
> }
> else {
> return "DIRECT";
> }
> }
> 
> The problem is that the above file works fine with Internet Explorer but
> does not work with Mozilla, Opera and other browsers. Please suggest me
> how to get rid of this problem and share the load of the netwok between
> these two proxies.
> NOTE: 172.19.X.X n 172.24.X.X are two VLANs on my network.
> 
> Manoj Chitlangia
> 
> 
> 



Re: [squid-users] streaming content

2004-10-31 Thread Schelstraete Bart
On Sun, 2004-10-31 at 19:23 +0100, Hendrik Voigtländer wrote: 
> Hello list,
> 
> traffic analysis shows a lot of webradio traffic going through our squid 
> (up to 10%).
>  From the logs and the cachemgr it seems like squid is not caching this 
> at all - similar to ssl-traffic.
> Is there any way to achieve caching of this content?
> A seperate streaming proxy configured to work with the most popular 
> sites would be acceptable, but of course the best way would be some 
> squid 'magic'. Blocking this traffic is not really an option.

Hello Hendrik,

If the content is truly 'streaming' (i.e. not 'static' audio files), then
it cannot be cached, because the data is always changing.

Normal audio files, on the other hand, can of course be cached
(.wma, .wav, .mp3, etc.).
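If you want to keep such audio files in the cache longer, a refresh_pattern
can help. Just a sketch, the times (in minutes) are arbitrary:

   # hypothetical: keep audio files up to 30 days
   refresh_pattern -i \.(mp3|wma|wav)$ 1440 90% 43200

Put it before the default 'refresh_pattern .' line in squid.conf.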


Bart




Re: [squid-users] GAIM Error...

2004-10-01 Thread Schelstraete Bart
Hello,
Try upgrading your GAIM.
Bart
Rick Whitley wrote:
I have a user that is using GAIM to talk with yahoo messenger. We have
configured the proxy but when she tries to connect she gets the error:
'GAIM Error: Access denied proxy server forbids port 5050 tunnelling'.
If I look at the conf file I can't see where that is being blocked. We
are running squid-2.5-stable5 on suse 9. I have used the default
safe-port list. Where do I need to look? Thanks for any help.
rick...
Rom.5:8
 


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] is it possible for squid to redirect http to another proxy?

2004-09-30 Thread Schelstraete Bart
Hi,
You need to specify a parent cache.
See: http://squid.visolve.com/squid/squid24s1/neighbour.htm#cache_peer
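Just a sketch, assuming the AV proxy runs on the same machine on port 8080
(adjust host and port to your setup):

   cache_peer 127.0.0.1 parent 8080 0 no-query default
   never_direct allow all

'never_direct allow all' (using the standard 'all' acl that is already in
squid.conf) forces Squid to always go through the parent instead of
connecting to the internet directly.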
Bart
[EMAIL PROTECTED] wrote:
Hi
i have this situation
a browser which connects to an antivirus (hbedv) proxy
which connects to squid
then squid  connect to internet
but i like browsers to connect to squid
then squid will connect to proxy AV
then proxy AV will go to the internet
how to tell squid to redirect TCP 80 to proxy AV?
thx
 


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] refresh my cache

2004-09-30 Thread Schelstraete Bart
Hello Jeff,
On the Squid web site there is a 'scripts' section, and on that page you
will find a tool to clean URLs from the Squid cache
(under 'third party software').
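For a single URL you can also use the PURGE method instead of that script;
just a sketch, it only works if purging is allowed in squid.conf:

   acl PURGE method PURGE
   http_access allow PURGE localhost
   http_access deny PURGE

and then, on the Squid box (the URL is only a placeholder):

   squidclient -m PURGE http://www.example.com/stuck-page.html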

Bart
Jeff Donovan wrote:
greetings
I am having an issue where some browsers can access a site and others 
can't. All browsers can access the site outside squid.

how can i force squid to refresh it's cache? or just refresh the 
contents for this url?

TIA
--j
---
jeff donovan
basd network operations
(610) 807 5571 x41
AIM  xtdonovan


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Blocking Trillian with Squid

2004-08-09 Thread Schelstraete Bart
Babidii wrote:
Hi,
I would like to know if anyone knows how to block trillian
users (software) using Squid? But few users still will need
to use it.
 

Hello,
You can block the user agent
(Trillian/...),
and you can add an extra rule for the ones who still need to use it.
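A sketch of how that could look (the exception address is just a placeholder):

   # hypothetical: block the Trillian user agent, except for one workstation
   acl trillian browser ^Trillian
   acl trillian_ok src 192.168.1.50
   http_access deny trillian !trillian_ok

The browser acl matches the User-Agent header with a regular expression, so
this only works as long as Trillian sends its own agent string.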
   Bart
--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Re: Web Scanning

2004-06-17 Thread Schelstraete Bart
Norman Zhang wrote:
I'm currently scanning web traffic through TrendMicro VirusWall 
using the following options. Are there shorter ways of specifying 
the extensions?

cache_peer 127.0.0.1 parent 80 7 default no-query
acl binaries urlpath_regex -i \.bin$ \.com$ \.cmd$ \.doc$ \.dot$ 
\.drv$ \.exe$ \.sys$ \.xls$ \.xla$ \.xlt$ \.vbs$ \.js$ \.htm$ 
\.html$ \.cla$ \.class$ \.scr$ \.mdb$ \.ppt$ \.dll$ \.ocx$ \.ovl$ 
\.pot$ \.shs$ \.pif$ \.hlp$ \.hta$ \.mpp$ \.mpt$ \.msg$ \.oft$ 
\.pps$ \.rtf$ \.vsd$ \.vst$ \.386$ \.arj$ \.cab$ \.gz$ \.lzh$ \.rar$ 
\.tar$ \.swf$ \.zip$

cache_peer_access 127.0.0.1 allow binaries
never_direct allow binaries

> As far  as I know there's no shorter way to do this. What you can do
> is put those extensions in a file, and call that file with squid.
> That makes the squid.conf not that dirty.
Would the following do?
cache_peer 127.0.0.1 parent 80 7 no-query default
acl all src 0.0.0.0/0.0.0.0
never_direct allow all
Then all requests will go through the parent, not just the binaries. I don't know if that is desired.
   B
--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Web Scanning

2004-06-17 Thread Schelstraete Bart
As far as I know there's no shorter way to do this. What you can do is
put those extensions in a file and reference that file from squid.conf.
That keeps squid.conf a lot cleaner.
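A sketch of how that could look (paths are just examples):

   acl binaries urlpath_regex -i "/usr/local/squid/etc/binaries.txt"
   cache_peer_access 127.0.0.1 allow binaries
   never_direct allow binaries

with /usr/local/squid/etc/binaries.txt containing one expression per line:

   \.exe$
   \.zip$
   \.doc$

Squid re-reads the file on 'squid -k reconfigure'.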

   B
Norman Zhang wrote:
Hi,
I'm currently scanning web traffic through TrendMicro VirusWall using 
the following options. Are there shorter ways of specifying the 
extensions?

Regards,
Norman
cache_peer 127.0.0.1 parent 80 7 default no-query
acl binaries urlpath_regex -i \.bin$ \.com$ \.cmd$ \.doc$ \.dot$ 
\.drv$ \.exe$ \.sys$ \.xls$ \.xla$ \.xlt$ \.vbs$ \.js$ \.htm$ \.html$ 
\.cla$ \.class$ \.scr$ \.mdb$ \.ppt$ \.dll$ \.ocx$ \.ovl$ \.pot$ 
\.shs$ \.pif$ \.hlp$ \.hta$ \.mpp$ \.mpt$ \.msg$ \.oft$ \.pps$ \.rtf$ 
\.vsd$ \.vst$ \.386$ \.arj$ \.cab$ \.gz$ \.lzh$ \.rar$ \.tar$ \.swf$ 
\.zip$

cache_peer_access 127.0.0.1 allow binaries
never_direct allow binaries


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Squid on Win32

2004-06-17 Thread Schelstraete Bart
Yes, it is, and there are binaries available.
   B
Bhat, Satish wrote:
Hi,
 I was wondering if it's possible to compile and run squid on Win32
platform. 
 Any suggestions?

Cheers,
Satish
-Original Message-
From: Andreas Pettersson [mailto:[EMAIL PROTECTED] 
Sent: Thursday, June 17, 2004 11:08 AM
To: [EMAIL PROTECTED]
Subject: Re: [squid-users] Punching a hole in redirector

Wouldn't it be better to make exceptions for Windows Update in
squid.conf ?
acl NoRedirect url_regex windowsupdate\.com
acl NoRedirect url_regex download\.microsoft\.com
redirector_access deny NoRedirect
redirector_access allow all
/Andreas
- Original Message - 
From: "Matt Ashfield (UNB)" <[EMAIL PROTECTED]>

 

My question is what if I want to allow a specific request through, in 
my case www.windowsupdate.microsoft.com The problem becomes that on 
that
   

site,
 

the images, etc, are all at different url's, so when I try to allow 
just that link through, when it goes to grab images and other things 
it completely messes up.
   


 


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Odd IE 6 Squid SSL Error

2004-06-17 Thread Schelstraete Bart
It may sound strange, but for the last few weeks I have had the same problem,
and I didn't change anything in the Squid configuration.
It happens with a lot of HTTPS pages, but not in Mozilla/Netscape,
so I think something has changed in IE.
   B
Osborne, Mike wrote:
Thank You,
I know it is not a Squid issue. I was just wondering if anyone in this community had 
come across this issue. I was looking for a solution. Do you know of a fixed 
wininet.dll? Location it can be downloaded from?
Thank you
Mike Osborne
Sr. Network Analyst
St. Mary's General Hospital
Kitchener Ontario
(519) 749-6451
-Original Message-
From: Tim Neto [mailto:[EMAIL PROTECTED]
Sent: June 17, 2004 12:14 PM
To: Osborne, Mike
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Odd IE 6 Squid SSL Error

Hello,
This is not a Squid problem.   Check your access log.  You will see an 
initial https request, then the client switches to regular http 
requests.   It is a problem (bug) in Microsoft's wininet.dll.

A wininet.dll version like "6.0.2800.1405" on Windows 2000 SP4 does not 
work correctly. A wininet.dll version like "6.0.3790.118" on Windows 
2003 does work correctly. Because the problem is in a system library, 
any application making an https call is affected.

Please see Microsoft for a fixed version of wininet.dll.
Tim
--
Timothy E. Neto
Computer Systems Engineer  Komatsu Canada Limited
Ph#: 905-625-6292 x265 1725B Sismet Road
Fax: 905-625-6348  Mississauga, Ontario, Canada
E-Mail: [EMAIL PROTECTED]   L4W 1P9
--

Osborne, Mike wrote:
 

Hi,
I was wondering if anyone has seen an issue like this. I have SQUID 2.5.STABLE1 on RedHat 9. I have always been able to access SSL pages. Since a recent Windows Update I get a Squid error. If I refresh I get the site. Other PC's that do not have the latest updates work fine. 

Any pointers would be appreciated.
Here is the URL used for this example:
https://secure.e2rm.com/registrant/solicitOthers.aspx?EventID=838&LangPref=en-CA&RegistrationID=35295
Here is the Error:
ERROR
The requested URL could not be retrieved
While trying to retrieve the URL:  
The following error was encountered: 
*	Invalid URL 
Some aspect of the requested URL is incorrect. Possible problems: 
*	Missing or incorrect access protocol (should be `http://'' or similar) 
*	Missing hostname 
*	Illegal double-escape in the URL-Path 
*	Illegal character in hostname; underscores are not allowed 
Your cache administrator is ProxyAdmin.
Generated Thu, 17 Jun 2004 12:31:10 GMT by proxy.abc (squid/2.5.STABLE1) 

Thank You
Mike Osborne
Sr. Network Analyst
St. Mary's General Hospital
Kitchener Ontario
(519) 749-6451

   

 


--
====
You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"

Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Re: Re: Help with GAIM through squid!

2004-05-23 Thread Schelstraete Bart
Gents,
I think this is not a Squid problem.
I'm having the same problem with the new versions of Gaim, while it
worked with older versions.
So I suggest you contact the Gaim developers.

   B
Adam Aube wrote:
Boniforti Flavio wrote:
 

Do these two lines do the job?
acl SSL_ports port 22 443 460 563 1863 5190 1
acl Safe_ports port 1025-65535  # unregistered ports
   

Yes, those lines work - unless there is traffic on other ports that also
needs to be allowed. Check your access.log to see.
Adam
 


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] --> Squid, SquidGuard and selective filtering

2004-04-10 Thread Schelstraete Bart
Henrik,

For people who aren't used to working with Squid, Squid's ACLs can be
very tricky. That's where such external programs can help them.
Regarding performance: like you said, on an SMP system an
external program can 'improve' performance if you have a lot of
ACLs. Not on UP systems, I agree.

But I don't think most users of those external programs use them for
performance, but simply because they are sometimes easier to use.
At least, that's what I think.
(I'm also using a lot of ACLs, but I'm using, of course, the Squid ACLs.)



   Bart
Henrik Nordstrom wrote:
On Fri, 9 Apr 2004, Schelstraete Bart wrote:

 

If you have A LOT of ACL's, and you're not really familiar with Squid I 
think you better to use an  external program for this.
   

Why?

If you have SMP and use a lot of regex acls then yes, at least until Squid
can scale on SMP. But for UP I cannot agree. But very large regex
performance sucks using any approach as a list of regex expresions can not 
be optimized much..

Regards
Henrik
 



--
====
You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"

Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] --> Squid, SquidGuard and selective filtering

2004-04-09 Thread Schelstraete Bart
Christopher,

If you have A LOT of ACLs and you're not really familiar with Squid, I
think you are better off using an external program for this.

rgrds,

   Bart
Christopher Weimann wrote:
On 04/09/2004-11:36AM, Henrik Nordstrom wrote:
 

Note: I would seriously recommend looking into using the Squid ACLs 
instead of SquidGuard for filtering..

   

I thought squidGuard existed because loading a huge list of URLs
into Squid ACLs was inefficient?  Has this changed?
 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] PAM and Squid problem

2004-03-13 Thread Schelstraete Bart
Henrik Nordstrom wrote:

On Wed, 10 Mar 2004, Jim Gifford wrote:

 

I have been trying to get squid to work with PAM.
   

Don't use PAM unless there is no other options.
 

Henrik,

Why did you say 'Don't use PAM unless there is no other options'?
We use PAM without any problems, and it's easy to configure...at least 
that's what I think.

rgrds,

Bart



Re: [squid-users] PAM and Squid problem

2004-03-10 Thread Schelstraete Bart
Hello Jim,

Did you specify the pam authentication helper in your squid.conf file?
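For Squid 2.5 that would be something like this (the helper path is the
default install location, adjust if needed):

   auth_param basic program /usr/local/squid/libexec/pam_auth
   auth_param basic children 5
   auth_param basic realm Squid proxy
   acl authenticated proxy_auth REQUIRED
   http_access allow authenticated

This is only a sketch; the essential part is the auth_param basic program
line pointing at the pam_auth helper.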

   Bart
Jim Gifford wrote:
I have been trying to get squid to work with PAM. Here is my squid file for
pam
# Begin /etc/pam.d/squid

auth     required    pam_unix.so

account  required    pam_unix.so

# End /etc/pam.d/squid

Here is my error message
Mar 10 11:57:23 server squid(pam_unix)[23002]: authentication failure;
logname= uid=117 euid=117 tty= ruser= rhost=  user=jim
Now for the strange part, the uid is for the squid proxy server and not the
user jim
Any suggestions would be welcomed.

This is PAM 0.77 and Squid 2.5

Jim Gifford
[EMAIL PROTECTED]
 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Squid speed issue after authentication enabled

2004-03-10 Thread Schelstraete Bart
Hello,

* What exact authentication method are you using?
* How many authentication helpers are you running?
* Can you post your ACLs? (Are you using an external program for the ACLs?)
rgrds,

  Bart
[EMAIL PROTECTED] wrote:
Hello,

It takes a long time to get any page if enable authentication in the SQUID.
I am using IE 5.5 on Winxp. Without the authentication speed is normal. I
tried both SMB and LDAP authentication and in both authentication it takes
a long time to display the first page after the authentication. Is there
any setting or special configuration is to be done make it normal. Can any
one help me to solve this problem.
regards,

ravi



 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Squid in a multi-processors Server

2004-03-10 Thread Schelstraete Bart
Hello,

The Squid process can only take advantage of one CPU, but the redirectors
and other helpers can use the other CPU.
You can only bind one Squid instance to a given IP/port combination.
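If you really want a second instance on the same box, it needs its own
configuration file with at least a different port, pid file, cache directory
and log files; a rough sketch (all values hypothetical):

   # squid2.conf
   http_port 3129
   pid_filename /usr/local/squid/var/logs/squid2.pid
   cache_dir ufs /usr/local/squid/var/cache2 1000 16 256

started with 'squid -f /usr/local/squid/etc/squid2.conf'.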

rgrds,

  Bart
[EMAIL PROTECTED] wrote:
Hello everyone,

I'm running squid in a multi-proccessed server, and i've checked that squid
consumes only one processor, my machine's configuration are:
Dell Power Edge 2550
2 Gb RAM
2 Processors Pentium III 933 Mhz, 256kb cache
Is that a limitation of squid ? Can i make more than you process run and
bind to the same ip/port ?
Obs. I've updated squid but it doesnt balance over the processors.

Thanks in advance,
Moacir
 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] I have an problem withe the squid and redhat 9

2004-03-10 Thread Schelstraete Bart
You need to install the GNU compiler, GCC.

   Bart
Tomàs Rodriguez Orta wrote:
Hello, people.

I have a problem, when I going to install the squid2.5stable5.tar.gz, and
execute the scripts ./configure
this scripts stoping the installation  and showing the following error.
loading cache ./config.cache
checking for a BSD compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for mawk... no
checking for gawk... gawk
checking whether make sets ${MAKE}... yes
checking whether to enable maintainer-specific portions of Makefiles... no
checking for gcc... no
checking for cc... no
configure: error no aceptable cc found in $path
why?, what should I do?.
please any help.
TOMÁS
Tomás Rodriguez Orta
Administrador de Red
Agencia de Cruceros Selecmar
Telef: 862-81-50
   866-44-08
 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] problem with beta3 release february 16th.

2004-02-16 Thread Schelstraete Bart
Elias,

First, increase the debug_options value in your squid.conf.
That will give you more information when things go wrong.
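For example (ALL,1 is the default, higher numbers give more detail):

   debug_options ALL,3

Don't leave it that high for long; cache.log grows quickly.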
   Bart
Éliás Tamás wrote:
Hy. I've downloaded and recompiled my squid (originating from early january) 
with the version in the subject. Since then the proxy is shutting down itself 
after a small time of correct operations. Sometimes with signal 11, but this 
time it received a signal 6 (abort) wihich I can't imagine. I haven't aborted 
it, so who does?
Here is my cache.log, might help:

Feb 16 22:31:53 localhost squid[11253]: Starting Squid Cache version 3.0-PRE3-
20040216 for i686-pc-linux-gnu...
Feb 16 22:31:53 localhost squid[11253]: Process ID 11253
Feb 16 22:31:53 localhost squid[11253]: With 1024 file descriptors available
Feb 16 22:31:53 localhost squid[11253]: DNS Socket created at 192.168.2.1, port 
32820, FD 5
Feb 16 22:31:53 localhost squid[11253]: Adding nameserver 192.168.2.1 from 
squid.conf
Feb 16 22:31:53 localhost squid[11253]: Adding nameserver 193.225.122.115 from 
squid.conf
Feb 16 22:31:53 localhost squid[11253]: Adding nameserver 193.225.12.58 from 
squid.conf
Feb 16 22:31:53 localhost squid[11253]: Adding nameserver 193.225.86.4 from 
squid.conf
Feb 16 22:31:53 localhost squid[11253]: Unlinkd pipe opened on FD 10
Feb 16 22:31:53 localhost squid[11253]: Swap maxSize 6144000 KB, estimated 
472615 objects
Feb 16 22:31:53 localhost squid[11253]: Target number of buckets: 23630
Feb 16 22:31:53 localhost squid[11253]: Using 32768 Store buckets
Feb 16 22:31:53 localhost squid[11253]: Max Mem  size: 102400 KB
Feb 16 22:31:53 localhost squid[11253]: Max Swap size: 6144000 KB
Feb 16 22:31:53 localhost squid[11253]: Local cache digest enabled; 
rebuild/rewrite every 3600/3600 sec
Feb 16 22:31:53 localhost squid[11253]: Store logging disabled
Feb 16 22:31:53 localhost squid[11253]: Rebuilding storage in /var/cache/squid 
(DIRTY)
Feb 16 22:31:53 localhost squid[11253]: Using Least Load store dir selection
Feb 16 22:31:53 localhost squid[11253]: Set Current Directory 
to /var/cache/squid
Feb 16 22:31:53 localhost squid[11253]: Loaded Icons.
Feb 16 22:31:53 localhost squid[11253]: Accepting transparently proxied HTTP 
connections at 192.168.1.1, port 3113, FD 11.
Feb 16 22:31:53 localhost squid[11253]: Accepting  HTTP connections at 
192.168.2.1, port 3113, FD 12.
Feb 16 22:31:53 localhost squid[11253]: Accepting ICP messages at 192.168.2.1, 
port 3130, FD 13.
Feb 16 22:31:53 localhost squid[11253]: Accepting HTCP messages on port 4827, 
FD 14.
Feb 16 22:31:53 localhost squid[11253]: WCCP Disabled.
Feb 16 22:31:53 localhost squid[11253]: Pinger socket opened on FD 15
Feb 16 22:31:53 localhost squid[11253]: Ready to serve requests.
Feb 16 22:31:53 localhost squid[11253]: Store rebuilding is  2.6% complete
Feb 16 22:31:54 localhost squid[11253]: icmpSend: send: (111) Connection refused
Feb 16 22:31:54 localhost squid[11253]: Closing Pinger socket on FD 15
Feb 16 22:31:56 localhost squid[11253]: Done reading /var/cache/squid swaplog 
(155511 entries)
Feb 16 22:31:56 localhost squid[11253]: Finished rebuilding storage from disk.
Feb 16 22:31:56 localhost squid[11253]:155511 Entries scanned
Feb 16 22:31:56 localhost squid[11253]: 0 Invalid entries.
Feb 16 22:31:56 localhost squid[11253]: 0 With invalid flags.
Feb 16 22:31:56 localhost squid[11253]:155511 Objects loaded.
Feb 16 22:31:56 localhost squid[11253]: 0 Objects expired.
Feb 16 22:31:56 localhost squid[11253]: 0 Objects cancelled.
Feb 16 22:31:56 localhost squid[11253]: 0 Duplicate URLs purged.
Feb 16 22:31:56 localhost squid[11253]: 0 Swapfile clashes avoided.
Feb 16 22:31:56 localhost squid[11253]:   Took 2.8 seconds (55849.9 
objects/sec).
Feb 16 22:31:56 localhost squid[11253]: Beginning Validation Procedure
Feb 16 22:31:56 localhost squid[11253]:   Completed Validation Procedure
Feb 16 22:31:56 localhost squid[11253]:   Validated 155511 Entries
Feb 16 22:31:56 localhost squid[11253]:   store_swap_size = 3185036k
Feb 16 22:31:56 localhost squid[11253]: storeLateRelease: released 0 objects
Feb 16 22:32:02 localhost squid[11253]: Preparing for shutdown after 23 requests
Feb 16 22:32:02 localhost squid[11253]: Waiting 30 seconds for active 
connections to finish
Feb 16 22:32:02 localhost squid[11253]: FD 11 Closing HTTP connection
Feb 16 22:32:02 localhost squid[11253]: FD 12 Closing HTTP connection
Feb 16 22:32:16 localhost squid[11253]: Shutting down...
Feb 16 22:32:16 localhost squid[11253]: FD 13 Closing ICP connection
Feb 16 22:32:16 localhost squid[11253]: FD 14 Closing HTCP socket
Feb 16 22:32:16 localhost squid[11253]: Closing unlinkd pipe on FD 10
Feb 16 22:32:16 localhost squid[11253]: storeDirWriteCleanLogs: Starting...
Feb 16 22:32:16 localhost squid[11253]: 65536 entries written so far.
Feb 16 22:32:16 localhost squid[11253]:131072 entries written so far.
Feb 16 22:32:16 localhost squid[11253]:   Finished.  Wrote 155511 entries.
Feb 16 22:

Re: [squid-users] Access deny page

2003-12-31 Thread Schelstraete Bart
Increase the debugging in Squid. As far as I know this is the only way
to find that out.
If you have more than 50,000 domains/URLs you should consider using a
third-party filter (SquidGuard, DansGuardian).

  Bart
Paulo Ricardo wrote:
Hi guys

Just a simple question. How can i known which word in ACL type is
blocking access from some user? I'm asking that because I have 10 lists
and some of them w/ more than 50.000 domains/url...
Is there a way to insert the word/url wich is blocked in error message?

as example:
__
ERROR
The requested URL could not be retrieved


While trying to retrieve the URL:
http://mirror.phy.bnl.gov/debian-iso/gluck.debian.org/cdimage/testing/netinst/i386/beta-1/
The following error was encountered:

 * Access Denied.  list /etc/squid/blacklist/government/.gov
   ^^  
   Access control configuration prevents your request from being
   allowed at this time. Please contact your service provider if
   you feel this is incorrect.
   
Your cache administrator is webmaster. 




Generated Wed, 31 Dec 2003 12:54:29 GMT by cerberusint.intranet
(squid/2.5.STABLE4)
__


cheers

 



--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] bug or compilation problem?

2003-12-26 Thread Schelstraete Bart
I forgot to mention that this is Squid 2.5.STABLE4 on HP-UX 11i.


   Bart
Quoting Schelstraete Bart <[EMAIL PROTECTED]>:

> cache.log: 
> ---
> 2003/12/26 12:53:44| Process ID 12955
> 2003/12/26 12:53:44| With 2048 file descriptors available
> 2003/12/26 12:53:44| Performing DNS Tests...
> 2003/12/26 12:53:44| Successful DNS name lookup tests...
> 2003/12/26 12:53:44| DNS Socket created at 0.0.0.0, port 49821, FD 4
> 2003/12/26 12:53:44| Adding nameserver 198.141.87.188 from /etc/resolv.conf
> 2003/12/26 12:53:44| Adding nameserver 198.141.87.36 from /etc/resolv.conf
> 2003/12/26 12:53:44| Adding nameserver 198.141.87.41 from /etc/resolv.conf
> 2003/12/26 12:53:44| helperOpenServers: Starting 5 'pam_auth' processes
> 2003/12/26 12:53:44| Unlinkd pipe opened on FD 15
> 2003/12/26 12:53:44| Swap maxSize 10485760 KB, estimated 806596 objects
> 2003/12/26 12:53:44| Target number of buckets: 40329
> 2003/12/26 12:53:44| Using 65536 Store buckets
> 2003/12/26 12:53:44| Max Mem  size: 36864 KB
> 2003/12/26 12:53:44| Max Swap size: 10485760 KB
> 2003/12/26 12:53:44| Rebuilding storage in /squid/var/cache (DIRTY)
> 2003/12/26 12:53:44| Using Least Load store dir selection
> 2003/12/26 12:53:44| Set Current Directory to /squid/var/cache
> 2003/12/26 12:53:44| Loaded Icons.
> 2003/12/26 12:53:45| Accepting HTTP connections at 0.0.0.0, port 3128, FD
> 16.
> 2003/12/26 12:53:45| Accepting HTCP messages on port 4827, FD 17.
> 2003/12/26 12:53:45| WCCP Disabled.
> 2003/12/26 12:53:45| Pinger socket opened on FD 19
> 2003/12/26 12:53:45| Ready to serve requests.
> 2003/12/26 12:53:45| storeSwapMetaUnpack: insane length (268435456)!
> 2003/12/26 12:53:45| assertion failed: MemPool.c:275: "pool->meter.idle.level
> <=
> pool->meter.alloc.level"
> --
> 
> 
> Anybody an idea? A bug? 
> 
> 
>  Bart
> 



[squid-users] bug or compilation problem?

2003-12-26 Thread Schelstraete Bart
cache.log: 
---
2003/12/26 12:53:44| Process ID 12955
2003/12/26 12:53:44| With 2048 file descriptors available
2003/12/26 12:53:44| Performing DNS Tests...
2003/12/26 12:53:44| Successful DNS name lookup tests...
2003/12/26 12:53:44| DNS Socket created at 0.0.0.0, port 49821, FD 4
2003/12/26 12:53:44| Adding nameserver 198.141.87.188 from /etc/resolv.conf
2003/12/26 12:53:44| Adding nameserver 198.141.87.36 from /etc/resolv.conf
2003/12/26 12:53:44| Adding nameserver 198.141.87.41 from /etc/resolv.conf
2003/12/26 12:53:44| helperOpenServers: Starting 5 'pam_auth' processes
2003/12/26 12:53:44| Unlinkd pipe opened on FD 15
2003/12/26 12:53:44| Swap maxSize 10485760 KB, estimated 806596 objects
2003/12/26 12:53:44| Target number of buckets: 40329
2003/12/26 12:53:44| Using 65536 Store buckets
2003/12/26 12:53:44| Max Mem  size: 36864 KB
2003/12/26 12:53:44| Max Swap size: 10485760 KB
2003/12/26 12:53:44| Rebuilding storage in /squid/var/cache (DIRTY)
2003/12/26 12:53:44| Using Least Load store dir selection
2003/12/26 12:53:44| Set Current Directory to /squid/var/cache
2003/12/26 12:53:44| Loaded Icons.
2003/12/26 12:53:45| Accepting HTTP connections at 0.0.0.0, port 3128, FD 16.
2003/12/26 12:53:45| Accepting HTCP messages on port 4827, FD 17.
2003/12/26 12:53:45| WCCP Disabled.
2003/12/26 12:53:45| Pinger socket opened on FD 19
2003/12/26 12:53:45| Ready to serve requests.
2003/12/26 12:53:45| storeSwapMetaUnpack: insane length (268435456)!
2003/12/26 12:53:45| assertion failed: MemPool.c:275: "pool->meter.idle.level <=
pool->meter.alloc.level"
--


Anybody an idea? A bug? 


 Bart


Re: [squid-users] Blocking Adult sites ( SOrry for the empty mail)

2003-12-24 Thread Schelstraete Bart
Kareem Mahgoub wrote:

The problem is that it is outdated.
 

I thought this list was updated frequently.

  Bart

--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Blocking Adult sites ( SOrry for the empty mail)

2003-12-24 Thread Schelstraete Bart
Walid Abd ElDayem wrote:

Hi list,
Sorry for the repeated subject.
I have searched google and the list archive on this topic and all what i
have got that this can be done either by blocking specific domains or by
using regular expression ACL
Now my question is:
1- Is there any database for those sites that i can add on my access list
( Either free or with fees)
2- Is there anyone willing to share his reg list with me?
 

Walid Abd ElDayem ,

You can use the list from Squidguard.



rgds,
 bart
--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Bizzaro error

2003-12-22 Thread Schelstraete Bart
Hello Ryan,

Try to start Squid directly, without that start script,
for example from /usr/local/squid.
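A sketch of how to do that (assuming the default layout under that prefix;
adjust the path to wherever your squid binary lives):

   /usr/local/squid/sbin/squid -N -d1

-N keeps Squid in the foreground and -d1 prints debug output to stderr, so
you can see why it aborts.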
   Bart
Ryan Nix wrote:
Has anyone ever seen this error before?

Nothing has changed in my conf file.

Executing /etc/rc.d/init.d/squid restart ..

Stopping squid: /etc/rc.d/init.d/squid: line 162:  2868 Aborted $SQUID 
-k check >/dev/null 2>&1
[FAILED]
Starting squid: /etc/rc.d/init.d/squid: line 162:  2869 Aborted $SQUID 
$SQUID_OPTS 2>/dev/null
[FAILED]






Re: [squid-users] Re: Windows Update Problem

2003-12-11 Thread Schelstraete Bart
olbar_curve.gif 
- NONE/- image/gif
Tue Dec  2 15:30:38 2003168 10.10.14.113 TCP_HIT/200 6059 GET 
http://v4.windowsupdate.microsoft.com/shared/images/mstoolbar_icp.gif 
- NONE/- image/gif
Tue Dec  2 15:30:38 2003 82 10.10.14.113 TCP_HIT/200 874 GET 
http://v4.windowsupdate.microsoft.com/shared/images/mstoolbar_ms.gif 
- NONE/- image/gif
Tue Dec  2 15:30:38 2003192 10.10.14.113 TCP_MISS/200 340 HEAD 
http://windowsupdate.microsoft.com/v4/iuident.cab? - 
DIRECT/207.46.134.92 application/octet-stream
Tue Dec  2 15:30:39 2003260 10.10.14.113 TCP_MISS/200 3672 GET 
http://v4.windowsupdate.microsoft.com/en/splash.asp? - 
DIRECT/65.54.249.61 text/html
Tue Dec  2 15:30:39 2003 73 10.10.14.113 TCP_HIT/200 1007 GET 
http://v4.windowsupdate.microsoft.com/shared/images/protect.gif - 
NONE/- image/gif
Tue Dec  2 15:30:39 2003 50 10.10.14.113 TCP_HIT/200 1418 GET 
http://v4.windowsupdate.microsoft.com/shared/images/arrow.gif - 
NONE/- application/octet-stream
Tue Dec  2 15:30:40 2003116 10.10.14.113 TCP_MISS/200 1820 GET 
http://v4.windowsupdate.microsoft.com/en/news.asp? - 
DIRECT/65.54.249.61 text/html
Tue Dec  2 15:31:26 2003 87 10.10.14.113 TCP_HIT/200 443 GET 
http://v4.windowsupdate.microsoft.com/shared/images/toc_expanded.gif 
- NONE/- image/gif
Tue Dec  2 15:31:26 2003364 10.10.14.113 TCP_HIT/200 72555 GET 
http://v4.windowsupdate.microsoft.com/shared/js/top2.js - NONE/- 
application/x-javascript
Tue Dec  2 15:31:27 2003 82 10.10.14.113 TCP_MISS/000 0 GET 
http://v4.windowsupdate.microsoft.com/en/error.asp? - 
DIRECT/207.46.244.222 -
Tue Dec  2 15:31:27 2003114 10.10.14.113 TCP_MISS/200 2515 GET 
http://v4.windowsupdate.microsoft.com/en/error.asp? - 
DIRECT/207.46.244.222 text/html
Tue Dec  2 15:31:27 2003 79 10.10.14.113 TCP_HIT/200 567 GET 
http://v4.windowsupdate.microsoft.com/shared/images/info_icon.gif - 
NONE/- application/octet-stream
 

 





.::DAMK::.


--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"

Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org


Re: [squid-users] Squid Question

2003-11-19 Thread Schelstraete Bart
Arnie Bose wrote:

Hello,

I have a quick question. I have downloaded Squid for my Red Hat Linux 
9.0. My question to you is are there any good management utility to 
set up squid graphically.

I can always edit squid.conf file but I would prefer to use a graphic 
utility to make the changes in squid. For instance I use commanche for 
Apache.

Hello Arnie,

Webmin has a module for Squid; that's a possibility (but I have personally
never used it).

rgrds,

 Bart

--
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] HTTP/1.1

2003-11-17 Thread Schelstraete Bart
Trigge, Graham wrote:

Guys (and gals),

A colleague of mine is wanting to find out which version of SQUID has
HTTP/1.1 compliance (if at all at this point in time). He has searched
the web to no avail.
Any information would be grateful.

 

Hello,

As far as I know: None.
Squid is HTTP/1.0
rgrds,

  Bart

--
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



[squid-users] ntlm

2003-11-13 Thread Schelstraete Bart
Hello,

I'm just starting to use NTLM authentication with Squid and I have some 
questions:

a)   Do you really need to use winbind in order to use NTLM
authentication? If I check the Squid FAQ, they are using
wb_ntlmauth, but I don't see any example with ntlm_auth.

b)   If I try:
   auth_param ntlm program /usr/local/squid/libexec/ntlm_auth
DOMAIN/controller.hostname
  (but without winbind)
  the IE browsers just can't connect; they just keep looking for
the page, with no special error message in the Squid log file.


c)   fakeauth works without any problems, but the userid is then
domain/userid. Is there any way to get only the 'userid', i.e. to drop
the 'domain'?



rgrds,

  Bart



Re: [squid-users] Error while starting squid : Could not determine fully qualified hostname.

2003-11-03 Thread Schelstraete Bart
Hello,

You're using visible_hostname, as you should.
But your 'visible_hostname' line starts with a space, and I'm not 100% sure
that this is valid.
Can you remove the space before visible_hostname?
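In other words, the line should start in the first column, e.g. (Taj1 taken
from your mail):

   visible_hostname Taj1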


  Bart

Quoting Deep Ganatra <[EMAIL PROTECTED]>:

> Hi,
> I have configured my squid.conf settings and kept most of the settings 
> default to test the squid.
> but now when i run the squid using squid service start or i try squid -z 
> it shows following error :
> 
> 
> [EMAIL PROTECTED] root]# squid -z
> FATAL: Could not determine fully qualified hostname. Please set 
> 'visible_hostname'
> 
> Squid Cache (Version 2.5.STABLE1): Terminated abnormally.
> CPU Usage: 0.030 seconds = 0.010 user + 0.020 sys
> Maximum Resident Size: 0 KB
> Page faults with physical i/o: 409
> Aborted
> 
> my visual_hostname is Taj1
> 
> since i m running this on local network so didnt give any .com or any 
> other domain.
> 
> I have uploaded my squid.conf on my server  
> (http://www.techbeta.com/4boards/squid.conf)
> 
> I still dont know what its showing me that error.
> 
> can anyone please help me out.
> 
> regards
> Deep
> 
> 





Re: [squid-users] Blocking Yahoo Messenger

2003-11-03 Thread Schelstraete Bart
Hello Rinnaldy,

Blocking that port doesn't work, because Yahoo Messenger falls back to the
HTTP proxy, so it goes out over the Squid port itself.
What you should do instead is deny access to the site shttp.msg.yahoo.com.
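A sketch of how that looks in squid.conf (put the deny before your general
allow rules):

   acl ym dstdomain shttp.msg.yahoo.com
   http_access deny ym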



rgrds,

Bart
Quoting Rinnaldy <[EMAIL PROTECTED]>:

> Dea All,
> 
> How to block Yahoo messenger with squid ?? I already use iptables to
> block YM port but it is not working . with this line  :
> -A INPUT -s 192.168.0.0/255.255.255.255 -d 0.0.0.0/0.0.0.0 -p tcp -m tcp
> --dport 5050 -j DROP
> 
> in squid.conf I add this line :
> acl YM_port port 5050
> http_access deny YM_port
> 
> but users still can access YM ,pls advice what should I do with my squid
> and iptables ..
> 
> rgds
> 
> 
> 
> 




Re: [squid-users] Round robin upstream parents....

2003-11-01 Thread Schelstraete Bart
Hello Donovan,


1.	Have my local squid server use a collection of upstream servers by
default as the next host when serving a HTTP or FTP request
 

You need to use the cache_peer option in the squid configuration file.

For example:

   cache_peer proxy.parent.com parent 3128 3130 round-robin
   cache_peer proxy.parent2.com parent 3128 3130 round-robin
If one goes down, it will be marked as 'invalid' and will not be used
until it's available again.
But please note that round-robin proxies can cause problems with 'shopping
sites' or similar, something I discovered a few weeks ago:
most such sites 'store' the client IP, and because the IP can change,
some big problems can occur. (You can solve this for particular
websites with the cache_peer_access option, but then you won't have
round robin for those websites.)
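A sketch of that workaround for one hypothetical shop site:

   acl shopsite dstdomain .shop.example.com
   cache_peer_access proxy.parent2.com deny shopsite

Requests for that site are then only forwarded to proxy.parent.com, so they
always leave with the same IP.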

For more info regarding this:  
http://squid.visolve.com/squid24s1/neighbour.htm

2.	Obscure the external address of my proxy to the outside world
 

The IP address of your local proxy or of the parent proxy? The IP visible
to the outside world is that of the 'highest' proxy, i.e. the one that is
connected to the internet.



rgrds,

 Bart

--
====
You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"

Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] auth TTL

2003-11-01 Thread Schelstraete Bart
Ilya wrote:

Hello!

What are differences between: authenticate_ttl & auth_param basic 
credentialsttl?

According with squid.conf it sounds as they were the same. Is it so?

wbr,
Ilya
Hello,

I quote from a previous Squid mail to the mailing list:

'In squid-2.5 the meaning of this directive has apparently changed
quite drastically, with the meaning of authenticate_ttl moved to
auth_param credentialsttl'



rgds,
  Bart
--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] NCSA Module

2003-11-01 Thread Schelstraete Bart
melvin melvin wrote:

Hello,

I am using Squid 2.4 stable 3 on a Linux Suse 8.0. Recently i'm 
starting to set up a user authentication system and would like to use 
the NCSA external application.
However i can't seem to find the NCSA authentication module for the 
user authentication.
Anyone knows where to get it?


Hello,

Pls check the Squid manual:  
http://squid.visolve.com/squid24s1/externals.htm#authenticate_program
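For Squid 2.4 the relevant directive is authenticate_program; a sketch with
hypothetical paths:

   authenticate_program /usr/local/squid/libexec/ncsa_auth /usr/local/squid/etc/passwd

where the password file is created with htpasswd.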

rgrds,

  Bart

--

You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] block ICQ

2003-10-29 Thread Schelstraete Bart
Li Wei wrote:

>hi, all
>
>Recently, I found many users use ICQ through proxy server(Squid2.5.STABLE2).
>such as www.icq.com:80
>
>I set one ACL to block it, like following:
>acl QQ dstdom_regex -i www.icq.com
>
>But it seem not to take effect.
>
>  
>

That's correct. The messenger itself is not connecting to www.icq.com;
that's just the web page.
What you should do is check the access.log file and see which addresses
are actually requested.
As far as I could see, ICQ was connecting to *.icq.aol.com and *.icq.com
over here (but you need to verify this in your own Squid log file).
So you can try the following acl:

acl QQ dstdom_regex -i icq.com icq.aol.com
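Don't forget the matching deny rule, placed before your general allow rules:

   http_access deny QQ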



rgrds,

Bart

-- 
 Schelstraete Bart
 http://www.hansbeke.com
 email: bart at schelstraete.org




Re: [squid-users] Problem accessing some sites

2003-10-27 Thread Schelstraete Bart
Lo,

This is your problem:

acl BANDOMAIN urlpath_regex www .com .net

  => 
http://mis3.home.company/inhouse/COMmon/login.asp?goto=/inhouse/leave/Default.asp&fnum. 

In this acl you're blocking every URL whose path contains 'www', '.com' or '.net'!
You should use dstdomain instead.
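A sketch of the dstdomain version:

   acl BANDOMAIN dstdomain .com .net
   http_access deny BANDOMAIN

This matches on the destination host name only, so a path like
/inhouse/common/ no longer triggers the ban.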


rgrds,

  Bart
squid squid wrote:
Hi,

I have just compiled Squid 2.5 Stable 4 and running it on Solaris 8 on 
an Intranet environment. However I am having problem accessing sites 
with URL like 
http://mis3.home.company/inhouse/common/login.asp?goto=/inhouse/leave/Default.asp&fnum. 

The error message is as follows:

The requested URL could not be retrieved.
While trying to retrieve the URL: 
http://mis3.home.company/inhouse/common/login.asp?
The following error was encountered:
Access Denied.
Access control configuration prevents your request from being alloed 
at this time. Pls contact your service provider if you feel this is 
incorrect.

On the access logfile, I got 403 TCP_DENIED:NONE.

Pls advise what could have gone wrong. Thank you.

My squid.conf is as follows:

# NETWORK OPTIONS
http_port 3128
icp_port 0
# OPTION WHICH AFFECT NEIGHBOUR SELECTION ALGORITHM
cache_peer 123.45.1.30 parent 3128 0 no-query proxy-only
acl query urlpath_regex cgi-bin \?
acl dynamic_contents urlpath_regex \*\.asp
acl dynamic_contents urlpath_regex \*\.jsp
no_cache deny query dynamic_contents
# OPTIONS WHICH AFFECT THE CACHE SIZE
cache_mem  10 MB
maximum_object_size 1024 KB
maximum_object_size_in_memory 1024 KB
# LOGFILE PATHNAMES & CACHE DIRECTORIES
cache_dir ufs /usr/local/squid/var/cache 3000 16 256
cache_access_log /usr/local/squid/var/logs/access.log
cache_log /usr/local/squid/var/logs/cache.log
pid_filename /usr/local/squid/var/logs/squid.pid
cache_store_log none
emulate_httpd_log on
log_ip_on_direct off
mime_table /usr/local/squid/etc/mime.conf
log_mime_hdrs off
debug_options ALL,1
log_fqdn off
# OPTIONS FOR TUNING THE CACHE
request_header_max_size 1 KB
negative_ttl 5 minutes
positive_dns_ttl 30 minutes
negative_dns_ttl 1 minutes
# TIMEOUTS
connect_timeout 120 seconds
peer_connect_timeout 120 seconds
read_timeout 5 minutes
request_timeout 5 minutes
half_closed_clients off
pconn_timeout 15 seconds
shutdown_lifetime 10 seconds
# DEFAULT ACCESS CONTROLS
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl SSL_PORTS port 343 443 7002 8000 9000 15000
acl Safe_ports port 80 21 443 563 70 210 1025-65535
acl SSL method CONNECT
# Only allow administrator access from localhost
http_access allow manager localhost
http_access deny manager
# Deny requests to unknown ports
http_access deny !Safe_ports
#Deny CONNECT to other than SSL ports and no direct connection for SSL
http_access deny SSL !SSL_ports
never_direct allow SSL
# Ban on file types and domain
acl BANFILE urlpath_regex \.bmp$ \.mp3$ \.mpg$ \.avi$
acl BANDOMAIN urlpath_regex www .com .net
http_access deny BANFILE
http_access deny BANDOMAIN
# For the cache purge
acl PURGE method purge
http_access allow PURGE localhost
http_access deny PURGE
# Commom application/web servers in local
acl direct-svr dstdomain mis3.home.company
always_direct allow direct-svr
# Commom application/web servers housed remote and access thru' 
123.45.1.30
acl remote-svr dst 123.45.1.31
cache_peer_access 123.45.1.30 allow remote-svr
never_direct allow remote-svr

# Allow requests to proxy
http_access allow all
# HTTPD-ACCELERATOR OPTIONS
# For Squid to run as transparent proxy
httpd_accel_uses_host_header on
# ADMINISTRATIVE PARAMETERS
cache_mgr [EMAIL PROTECTED]
cache_effective_user nobody
visible_hostname proxy.inet.company
# MISCELLANEOUS
dns_testnames home.company mis3.home.company
memory_pools off
cachemgr_passwd none all
snmp_port 0
client_db off
_
Get 10mb of inbox space with MSN Hotmail Extra Storage 
http://join.msn.com/?pgmarket=en-sg




--
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] customize access denied error page for a particular link

2003-10-27 Thread Schelstraete Bart
Raja R wrote:

Hi All,
Can anyone kindly tell me on how to do the following ?
I want to block a site and want to have a separate customized error page for
that .
That error page shud be only for that website.
 

Hello,

See deny_info

First create an ACL for that site, then use deny_info for that ACL.
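A sketch (the acl name, domain and page name are placeholders):

   acl blockedsite dstdomain .blocked.example.com
   deny_info ERR_BLOCKED_SITE blockedsite
   http_access deny blockedsite

ERR_BLOCKED_SITE is then a custom page you put in Squid's errors directory.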

rgrds,
  bart
--
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] Stuck every starting swap

2003-10-26 Thread Schelstraete Bart
Awie,

* How big is your Squid cache?
* What kind of Squid cache are you using? (aufs, ufs, diskd)
* What filesystem are you using? (ext2, ext3, reiser, ...)
Bart

Awie wrote:

My system has 256 MB of RAM and I put 8 MB at the cache_mem section

Thx & Rgds,

Awie

- Original Message -
From: "Henrik Nordstrom" <[EMAIL PROTECTED]>
To: "Awie" <[EMAIL PROTECTED]>
Cc: "Squid-users" <[EMAIL PROTECTED]>
Sent: Sunday, October 26, 2003 10:05 PM
Subject: Re: [squid-users] Stuck every starting swap
 

On Sun, 26 Oct 2003, Awie wrote:

   

All,

My Squid (2.5S1) seems always stuck every time system start a swapping
progress. After I restart the program, it run normal again. My linux is
 

RH
 

7.3 with kernel 2.40.18.
 

How much memory do you have?

Have you read the squid FAQ section on memory usage and cache sizing?

Regards
Henrik
   



 



--
====
You can find me on Google or Yahoo...
search for "Schelstraete Bart" or "Bart Schelstraete"
====
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] Stuck every starting swap

2003-10-26 Thread Schelstraete Bart
Awie wrote:

All,

My Squid (2.5S1) seems always stuck every time system start a swapping
progress. After I restart the program, it run normal again. My linux is RH
7.3 with kernel 2.40.18.
Is there any parameter in Linux or Squid that I should apply?
 

You need to tune your Linux swapping, and try decreasing the cache_mem
option.
Then watch what Squid is doing after you did that.
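
A minimal sketch of the squid.conf side, assuming a 256 MB box like yours (the value is only a conservative starting point):

--
# keep Squid's in-memory object cache small so the box does not start swapping
cache_mem 8 MB
--

Keep in mind that cache_mem only limits the memory used for in-memory objects; Squid's total footprint (cache index, buffers) will still be larger than that.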



   Bart

--
Schelstraete Bart
http://www.hansbeke.com
email: bart at schelstraete.org



Re: [squid-users] Proxy Authentication and Java Applets

2003-10-22 Thread Schelstraete Bart
[EMAIL PROTECTED] wrote:

Hi !! 

You should do the following: 

acl java_jvm browser Java 

then, before your http_access for the authenticated users, use: 

http_access allow java_jvm 

 

Ohhh!
Be aware that you're then allowing every Java client to access the proxy 
WITHOUT authentication!
There are a lot of Java programs which use proxy servers, and with that 
acl you'll allow them all.
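
If those applets really cannot do authentication, a somewhat safer variant - only a sketch, where 'localnet' is a hypothetical acl for your own address range - is to combine the browser acl with a src acl, so at least only Java clients inside your network skip authentication:

--
acl localnet src 192.168.0.0/255.255.255.0
http_access allow java_jvm localnet
--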

rgrds,

 Bart



Re: [squid-users] Help with running 2 instances of squid

2003-10-18 Thread Schelstraete Bart
Hello,

Are you also starting the second squid with the other configuration file?
(squid -f [configfile])
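
A short sketch, assuming the second configuration file lives at /etc/squid/squid2.conf and defines its own pid_filename, ports, logs and cache_dir:

--
squid -f /etc/squid/squid2.conf -z    # create the second cache directories (once)
squid -f /etc/squid/squid2.conf       # start the second instance
--

The 'Squid is already running!' message usually means the squid2 init script is still checking the first instance's PID file, so make sure it reads /var/run/squid2.pid as well.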
  Bart

Chris Wilcox wrote:

Hi all,

As far as I can tell, I have followed everything I'm supposed to have 
to make this work.  I wish to run two instances of Squid on the same 
machine: Squid/2.4.STABLE6 on Debian Woody stable.

I have squid.conf and squid2.conf with the relevant settings altered.

Squid.conf listens on port 3128 on the external IP.  I have not 
altered the PID file path for this one.
Squid2.conf listens on port 8085 on 127.0.0.1  I have altered the PID 
file path to be /var/run/squid2.pid  I have created a different cache 
directory etc, and specified these within the squid2.conf file, along 
with checkong relevant permissions on these directories.

I have then copied the squid init.d script and called this squid2.  I 
have altered squid2 to have a different name string, and altered all 
relevant entries within this script to point to the squid2 details eg 
cache directory etc.  This squid2 script is set to use 'squid2.pid' as 
I've specified in squid2.conf

Yet still, when I do 'squid2 start' it always returns 'Squid is 
already running!  Process ID 213'

I'm now lost.  Am I missing something really obvious here?  I've been 
trying to get this going for a good few weeks now!

Major thanks for any advice and suggestions!

Regards,

nry







Re: [squid-users] SUPPORT ASSISTENCE

2003-10-18 Thread Schelstraete Bart
Henrik Nordstrom wrote:

On Fri, 17 Oct 2003, Schelstraete Bart wrote:

 

Why do you want to use an access.log file which is bigger than 1.5 GB??
This is NOT good for the performance.
   

Squid performance does not really care, 

Yes , and no.

Like you said, the squid performance itself won't decrease because it's 
just 'adding' entries to that logfile.
But after a while you need to do something with that logfile (I think 
everybody needs to look at the access logfiles to check something)... it 
can't keep growing forever. One reason is for example the 2 GB 
filesize limit on most systems.
Another thing: most users want to check the logfile, and Squid isn't 
always running on 'state of the art' machines. If you then want to 
do something with that huge logfile, you can have some serious 
problems: it will take ages to open it (some programs will fail 
to open the file at all), and it will take a very, very long time to compress 
it, etc.
I don't see any advantage in keeping such big access logfiles 
(but maybe there are some reasons, I don't know).

I'm rotating my logfiles *every day*, and keep them for one week 
(compressed). After that, the logfiles older than one week are deleted 
from the server, but are still available on backup (and on the reporting 
server). I - personally - think that this is the best way.
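
A minimal sketch of such a nightly rotation, assuming Squid is installed under /usr/local/squid and logfile_rotate in squid.conf is set to the number of old logs to keep:

--
# squid.conf
logfile_rotate 7

# crontab entry - rotate the logs every night at 00:05
5 0 * * * /usr/local/squid/sbin/squid -k rotate
--

Compressing the rotated files and copying them to the reporting server can be done by the same cron job.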

rgrds,

 Bart



Re: [squid-users] SUPPORT ASSISTENCE

2003-10-17 Thread Schelstraete Bart
Frederico Guilherme Capute de Oliveira - DATAPREVDF wrote:

I'm using Linux Conectiva 8 with Squid on it. When the access.log hits 1.5
GB, Squid just stops for no reason. Does somebody know if a patch exists for
that?
Thank you, Fred.
 

Hello,

Why do you want to use an access.log file which is bigger than 1.5 GB??
This is NOT good for the performance.
rgrds,
 Bart


Re: [squid-users] Bandwith limit in LAN

2003-10-17 Thread Schelstraete Bart
Aristarchus wrote:

Hello! Allow me to get right to the point. I have 15 computers connected in an 100 Mbps Lan and a 512 Kbps DSL connection to the internet. The DSL modem also functions as a router. The question is how can I Limit the bandwith to every computer so that the connection from one computer does not influence another? Is that possible to do with squid? If it is not please recommend another way of doing this. Thank you in advance... 04423
 

Hello,

You need to configure delay pools in squid.
Check the FAQ on the Squid website for more information how to configure 
delay pools.
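
A rough sketch of a class 2 pool (one aggregate bucket plus one bucket per client IP). The subnet and the bytes-per-second values below are only assumptions for a 512 Kbps line, and delay pools must have been compiled in with --enable-delay-pools:

--
acl lan src 192.168.0.0/255.255.255.0
delay_pools 1
delay_class 1 2
# roughly 64 KB/s for everybody together, roughly 8 KB/s per client IP
delay_parameters 1 64000/64000 8000/8000
delay_access 1 allow lan
delay_access 1 deny all
--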



 Bart



Re: [squid-users] Running two instances of Squid from one binary

2003-10-05 Thread Schelstraete Bart
Chris Wilcox wrote:

Hi all,

Our current project currently requires the use of two seperate squid 
instances with a web filter in the middle.  Clients would connect to 
Squid1 and be authorised then passed to the filter which would use 
Squid2 as the cache.  Squid1 would not cache, just log.

You can run two separate squids by using two separate config files (see
the commandline parameters for squid, particularly -f ).
And that's it. So you need to use 2 different configuration files.
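
A sketch of the handful of directives that must differ between the two configuration files (the names and paths are just examples):

--
# squid2.conf
http_port 8085
pid_filename /var/run/squid2.pid
cache_dir ufs /var/spool/squid2 1000 16 256
cache_access_log /var/log/squid2/access.log
cache_log /var/log/squid2/cache.log
--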
rgrds,

 Bart




Re: [squid-users] acl matching

2003-10-04 Thread Schelstraete Bart
Esteban Ribicic wrote:

is there any pratical way to distinbguish if acl's are matching?
thx
Esteban
 

Euhm... increase the logging.
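
One concrete way, sketched here: raise the debug level for Squid's access-control code, and cache.log will show which acl list each request matched (section 28 should be the ACL code, if memory serves):

--
debug_options ALL,1 28,3
--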



  Bart



Re: [squid-users] setting up a blacklist

2003-09-19 Thread Schelstraete Bart
Bill,

--
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
--
As far as I know this is not correct.
Other Squid users: Pls correct me if I'm wrong.
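
What was probably intended - only a sketch, keeping the same file paths - is to give every line an explicit acl type, and to use separate acl names, because all lines of one acl name must share the same type:

--
acl porn_domains dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn_urls url_regex "/usr/share/squid/blacklists/porn/urls"
acl porn_expr url_regex "/usr/share/squid/blacklists/porn/expressions"
http_access deny porn_domains
http_access deny porn_urls
http_access deny porn_expr
--

(The deny_info ERR_NO_PORNO line further down would then have to list these acl names.)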
rgrds,

		Bart

Bill McCormick wrote:

Squid brings my dual Xeon Dell to it's knees on startup and shutdown.

Can you post your squid.conf (without comments or blank lines)?

Adam

Here ya go ...

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl homenet src 192.168.212.0/24
http_access allow homenet
http_access allow localhost
http_access deny all
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
deny_info ERR_NO_PORNO porn
http_access deny porn
http_reply_access allow all
icp_access allow all
visible_hostname billinux
coredump_dir /var/spool/squid
---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.518 / Virus Database: 316 - Release Date: 9/11/2003
 





Re: [squid-users] Why is Squid restarting?

2003-09-17 Thread Schelstraete Bart
Joao Coutinho wrote:

Do you think this is causing Squid to restart?
I will have about 200 people under this Proxy. Today I'm testing it 
with only about 20 people. What do you think would be a good number of 
children process?


This depends on how many concurrent connections you have and how clients are 
connecting to your system.
First try to increase it to 10 or so, and check the results 
(it also depends on your hardware).
Also use the cache manager to look at the authenticator statistics.
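
The relevant squid.conf knobs for the winbind helpers look like this - only a sketch, and 10 is just a starting point to adjust against the statistics:

--
auth_param ntlm children 10
auth_param basic children 10
--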

rgrds,

 Bart



Re: [squid-users] Why is Squid restarting?

2003-09-17 Thread Schelstraete Bart
Hello,

(wb_auth)[11310](wb_basic_auth.c:110): fgets() failed! dying. 
errno=0 (Success)
2003/09/17 11:56:21| parseConfigFile: line 1006 unrecognized: 'default 
all requests # are sent. # #Default: # none' 


Can you send us your squid.conf file?



Bart

Joao Coutinho wrote:

Hi all,

I installed Squid 2.5 stable 3 (--enable-auth="ntlm,basic" 
--enable-basic-auth-helpers="winbind" 
--enable-ntlm-auth-helpers="winbind" 
--enable-external-acl-helpers="winbind_group"). Samba version is
samba-2.2.8a (--with-winbind --with-winbind-auth-challenge)

Everything seems to work fine, my users are browsing the internet, 
squid is authenticating, but once in a while, a popup appears asking 
for loginname and passord. We just click Cancel and continue to browse 
without problems.
Looking at cache.log, I saw that Squid is restarting more than twice a 
day. I have no Idea why.
Above is part of my cache.log. Please help me. Thanks in advance, Joao

2003/09/17 11:24:55| helperStatefulDefer: None available.
2003/09/17 11:24:55| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| WARNING: All ntlmauthenticator processes are busy.
2003/09/17 11:25:30| WARNING: 5 pending requests queued
2003/09/17 11:25:30| Consider increasing the number of 
ntlmauthenticator processes in your config file.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.
2003/09/17 11:25:30| helperStatefulDefer: None available.

2003/09/17 11:56:21| Restarting Squid Cache (version 2.5.STABLE3)...

2003/09/17 11:56:21| FD 6 Closing HTTP connection
2003/09/17 11:56:21| FD 28 Closing ICP connection
(wb_ntlmauth)[11305](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid argument)
(wb_auth)[11307](wb_basic_auth.c:110): fgets() failed! dying. 
errno=0 (Success)
(wb_auth)[11308](wb_basic_auth.c:110): fgets() failed! dying. 
errno=0 (Success)
(wb_auth)[11309](wb_basic_auth.c:110): fgets() failed! dying. 
errno=0 (Success)
(wb_auth)[11311](wb_basic_auth.c:110): 
(wb_ntlmauth)[11303](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=0 (Success)
fgets() failed! dying. errno=22 (Invalid argument)
(wb_ntlmauth)[11304](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid argument)
(wb_ntlmauth)[11306](wb_ntlm_auth.c:273): 
(wb_ntlmauth)[11302](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid ar$
fgets() failed! dying. errno=22 (Invalid argument)
(wb_auth)[11310](wb_basic_auth.c:110): fgets() failed! dying. 
errno=0 (Success)
2003/09/17 11:56:21| parseConfigFile: line 1006 unrecognized: 'default 
all requests # are sent. # #Default: # none'
2003/09/17 11:56:21| Store logging disabled
2003/09/17 11:56:21| DNS Socket created at 0.0.0.0, port 32783, FD 5
2003/09/17 11:56:21| Adding nameserver 172.17.10.1 from /etc/resolv.conf
2003/09/17 11:56:21| helperOpenServers: Starting 5 'redir.pl' processes
2003/09/17 11:56:22| helperStatefulOpenServers: Starting 10 
'wb_ntlmauth' processes
(wb_ntlmauth)[12510](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12511](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12512](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12513](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12514](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12515](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12516](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12517](wb_ntlm_auth.c:355): target domain is domain
2003/09/17 11:56:22| helperOpenServers: Starting 5 'wb_auth' processes
(wb_ntlmauth)[12518](wb_ntlm_auth.c:355): target domain is domain
(wb_ntlmauth)[12519](wb_ntlm_auth.c:355): target domain is domain
2003/09/17 11:56:22| helperOpenServers: Starting 5 'wb_group' processes
2003/09/17 11:56:22| Accepting HTTP connections at 0.0.0.0, port 3128, 
FD 6.
2003/09/17 11:56:22| Accepting ICP messages at 0.0.0.0, port 3130, FD 36.
2003/09/17 11:56:22| WCCP Disabled.
2003/09/17 11:56:22| Loaded Icons.
2003/09/17 11:56:22| Ready to serve requests.
2003/09/17 12:20:38| AuthenticateNTLMHandleReply: invalid callback 
data. Releasing helper '0x8469e28'.

2003/09/17 12:29:20| Restarting Squid Cache (version 2.5.STABLE3)...

2003/09/17 12:29:20| FD 6 Closing HTTP connection
2003/09/17 12:29:20| FD 36 Closing ICP connection
(wb_ntlmauth)[12511](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid argument)
(wb_ntlmauth)[12512](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid argument)
(wb_ntlmauth)[12510](wb_ntlm_auth.c:273): fgets() failed! dying. 
errno=22 (Invalid argument)
(wb_ntlmauth)[12514](wb_ntlm_auth.c:273): 

Re: [squid-users] blocking yahoo messenger?

2003-08-25 Thread Schelstraete Bart
Louie Miranda wrote:

well the funny thing is..

i added

acl yahoo dstdom_regex messenger.yahoo.com
http_access deny yahoo
and still on access.log yahoo web messenger is still passing thru.
 

Hello,
Pls check previous posts to this list. This has been discussed many, 
many times in the past.
(just check the access log to see what the client is connecting to)



rgrds,
 Bart


Re: [squid-users] page can not be displayed

2003-08-15 Thread Schelstraete Bart
Andy Dean wrote:

Hi people

Just wondering if someone can point me in the right direction. Squid from
time to time gives "page cannot be displayed" errors, but if users press
F5/refresh then they will get the page. What setting is this in squid? I
should imagine it's a DNS timeout error or something; if anyone can give us
some help it would be much appreciated.
 

Andy,

Read the FAQ or the history of this mailinglist.

rgrds,
 Bart


Re: [squid-users] Kazaa - ICQ - FTP through Squid 2.4 STABLE 7

2003-08-14 Thread Schelstraete Bart
Chris Wilcox wrote:

Squid wouldn't affect the working of these programs and can't since 
Squid is a http proxy, not a proxy for other protocols such as ftp, 
pop3 and smtp.  If the software worked before Squid and you haven't 
touched the configuration of these programs since, then there's no 
reason for them not to work now.  If you have configured these 
programs to somehow use squid then you need to reset the configuration 
to what you had before.  As far as I'm aware (and I have played with 
this at home a fair bit) the only thing that would stop your programs 
from working would be firewall related.

In short words, what you want is not possible with Squid.  Squid can 
only cache and handle http requests.

???
ICQ (for example via Netscape 7 or Trillian) works with Squid, and FTP 
works with Squid (for example via Netscape).
I don't use those other programs, so I cannot say.

rgrds,

 Bart



Re: [squid-users] Problem with load balanced site

2003-08-14 Thread Schelstraete Bart
Mick Reichelt wrote:

Hi There,

I am running squid 2.4STABLE1 and was wondering if anyone has
experienced any problems with squid accessing load balanced sites. I am
having this problem, where a particular site (national.com.au) which
squid just doesn't want to have a bar of. I have tried the no_cache
option and still no joy. Any help would be appreciated.
Hello Mick,

I never experienced any problems with this. Do you know how that site is 
'loadbalanced'?
Is it DNS round robin or something?
Maybe it can be caused by some DNS caching.

rgrds,
 Bart


Re: [squid-users] always_direct dont work

2003-08-14 Thread Schelstraete Bart
Jordi Vidal wrote:

Hi,

	I'm trying to setup a rule to avoid Nagios from fetching web pages
from the cache of my squid transparent proxy, forcing to check directly
with remote server, but squids seems to ignore completely the rule. 

	My question is: is the rule "always_direct"  usable in a 
transparent proxy configuration?

	My squid version is 2.4.STABLE7. Relevant parts of squid.conf 
follows:

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
cachemgr_passwd cebolla all
acl local-servers dstdomain .wtn
acl nagios browser check_http
always_direct allow nagios
always_direct allow local-servers


Hello,

You should use the no-cache attribute for this.
For example:
acl local-servers dstdomain nagios.com
no-cache deny local-servers
...
rgrds,

	Bart

//



Re: [squid-users] always_direct dont work

2003-08-14 Thread Schelstraete Bart
Adam Aube wrote:

Thank you for your reply, but it doesn't work in my version
of squid:

# squid -k reconfigure
2003/08/11 20:01:58| parseConfigFile: line 1717 unrecognized:
'no-cache deny local-servers'

Seems to me there was a small typo - try "no_cache"

Damn... it's my keyboard that made this typo!
:)
  Bart



Re: [squid-users] CPU utilization performance issue

2003-08-14 Thread Schelstraete Bart
Adam Aube wrote:

I recall a discussion on ufs vs aufs vs diskd some time ago.

(If I recall correctly)

With a single cache disk, aufs is better on Linux and diskd
is better on other OSes.
However, when you have multiple cache disks, diskd is better
regardless of platform.
 

Adam,

'Multiple cache disks' - does that include hardware RAID? That is 
also 'multiple disks'
(but only one disk as seen by the OS).

I will do some tests regarding this in the future.

  Bart



Re: [squid-users] How to cycle thru a pool of IPs for outgoing traffic?

2003-08-14 Thread Schelstraete Bart
Andre Tomás wrote:

I have Squid running on a machine that has a class C network bound to it. I'd like Squid to randomly cycle thru the whole range of IPs for outgoing traffic. I found how to route traffic based on ACL but that's not exactly what I need. I simply need to randomly select an address out of a pool and use it for a particular session. Someone please guide me to the proper doc.

Hello,

For OUTGOING traffic only?  Does this mean that you have multiple nic's?
Pls explain a little bit more.
(and also: what OS, and which Squid version?)
rgrds,
 Bart


Re: [squid-users] CPU utilization performance issue

2003-08-14 Thread Schelstraete Bart
Hello Tay,

I used both on my live server.
Reiser - aufs/diskd
Ext3 - aufs/diskd
And I'm now using Reiser with Diskd.
I switched between these a lot of times, because the disks are the 
real bottleneck of my squid. They slow down the traffic almost 4 times:
direct connection = 1.6 Mb/s, via Squid with caching enabled = 
400 Kb/s (with ext3/ufs).
So that was really not good! And now I'm waiting for the figures for 
reiser/diskd. But it seems to be a lot faster.



  Bart
Tay Teck Wee wrote:
Hi Bart,

I'm using reiserfs. aufs coz its more suitable for
linux. 

--
Wolf
--- Schelstraete Bart <[EMAIL PROTECTED]> wrote:
 

Hzllo,

Why not using Reiser instead of ext3 with diskd?
I read a lot of articles that are saying that reiser
is much fast for 
Squid. (a lot of 'small files')



  Bart

Zand, Nooshin wrote:

   

Hi,
I am just wonder why you are not using diskd.
Based on Benchmarking that I read, diskd provides faster I/O performance.
I am planning to run squid on Linux Redhat 9.0 and thinking to use ext3 and diskd.
Thanks,
Nooshin
-Original Message-
From: Tay Teck Wee [mailto:[EMAIL PROTECTED]
Sent: Friday, August 08, 2003 2:22 AM
To: squid-users
Subject: Re: [squid-users] CPU utilization performance issue

Hi everyone,

thanks for the input. The ACL list have since been
slightly altered, using only src(22 entries),
dstdomain(114 entries) and url_regex(20 entries). I am
currently on kernel 2.4.20-19.9 so the Hyperthreading
might hv been optimized.

Now the machine is handling about 110 req/s but again
the CPU will climb to abt 90-95%. Is it possible for
my squid box to go beyond 180 req/s, which is the peak
for each proxies in the existing pool(ISP env)? I am
trying to replace my existing NetCaches with
squids...one box for one box.

I am wondering if its because reiserfs will consume
more CPU than other fs like ext3? Will changing my
cache partitions to reiserf lower down the CPU usage?
Or can anyone suggest other possible improvements?
Thanks.
my 3 caching partitions are on 3 separate disks:-
/dev/sdb1  /cdata1  reiserfs notail,noatime 1 2
/dev/sdc1  /cdata2  reiserfs notail,noatime 1 2
/dev/sdd1  /cdata3  reiserfs notail,noatime 1 2
--
Wolf
--- Tay Teck Wee <[EMAIL PROTECTED]> wrote:

Hi,

when I'm getting about 90 req/s or 800 concurrent
connection(according to my foundry L4) to my
squid(RedHat 8.0/2.5 Stable3 w deny_info patch), the
CPU utilization avg abt 80%. How do I lower the CPU
utilization of my squid? Thanks.

Below is my machine specs:-

Intel Xeon single-processor 2.4GHz(DELL 2650)
2G physical RAM(w 2G swap under linux)
2X 33G for everything except caching (mirror)
3X 33G for caching (volume)

/dev/sda7     505605    68437    411064  15% /
/dev/sda1     124427     9454    108549   9% /boot
/dev/sdb1   35542688   201248  35341440   1% /cdata1
/dev/sdc1   35542688   200888  35341800   1% /cdata2
/dev/sdd1   35542688   200940  35341748   1% /cdata3
/dev/sda3    1035692    49796    933284   6% /home
none         1032588        0   1032588   0% /dev/shm
/dev/sda5    1035660   691648    291404  71% /usr
/dev/sda6     505605    76236    403265  16% /usr/local
/dev/sda8   29695892    83456  28103936   1% /var

Below is my squid.conf(only the essential).

For ACL, basically I hv 3 acl list(in 3 separate
files), one containing allowable IPs while the other
contains deny IPs. I also hv 3 list of banned sites
list(in 3 separate files).:-

http_port 8080
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 400 MB
cache_swap_low 92
cache_swap_high 95
maximum_object_size 2 MB
maximum_object_size_in_memory 100 KB
cache_replacement_policy heap GDSF
memory_replacement_policy heap GDSF
cache_dir aufs /cdata1 16000 36 256
cache_dir aufs /cdata2 16000 36 256
cache_dir aufs /cdata3 16000 36 256
cache_access_log /var/log/cachelog/cache.access.log
cache_log /var/log/cachelog/cache.log
cache_store_log none
quick_abort_min -1 KB
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
#3 banned list files
acl SBA dstdomain "/usr/local/squid/etc/SBA.txt"
acl CNB dstdomain "/usr/local/squid/etc/CNB.txt"
acl CNB2 url_regex "/usr/local/squid/etc/CNB2.txt"
#3 access list files
acl NetTP src "/usr/local/squid/etc/NetTPsrc.acl"
acl NetDeny src "/usr/local/squid/etc/deny.acl"
acl NetAllow src "/usr/local/squid/etc/allow.acl"
http_access deny SBA
http_access deny CNB
http_access deny CNB2
http_access deny NetDeny

Re: [squid-users] CPU utilization performance issue

2003-08-14 Thread Schelstraete Bart
Adam Aube wrote:

Can somebody explain to me why it's worth considering putting
a Squid cache onto a Raid setup anyway?
   

RAID isn't just for precious data - it's to keep a disk failure
from taking down your system. Without RAID, if your cache disk
crashed, so would Squid.
 

Correct, We're using Squid as main proxy server. So it should survive a 
disk failure.

 Bart



Re: [squid-users] 2003/08/13 08:20:21| httpAccept: FD 15: accept failure: (24) Too many open files

2003-08-14 Thread Schelstraete Bart
Just increase the number of open files in your OS.

(ulimit)
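
A sketch of what that looks like when starting Squid by hand, assuming an installation under /usr/local/squid (the limit itself is a guess - size it to your load):

--
ulimit -n 8192                      # raise the per-process file descriptor limit
/usr/local/squid/sbin/squid
--

Squid may also need to be rebuilt or reconfigured to actually use more descriptors, depending on how it was compiled.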


rgrds,

BArt
Quoting Brian Hechinger <[EMAIL PROTECTED]>:

> i get a TON of these.  had to restart squid.  is this related to my other
> issue
> with Resource temporarily unavailable?  is this the solaris ufs thing biting
> me
> in the you know what?
> 
> if it's the UFS issue, i can drop VxFS on there to fix it, if it's the
> other
> thing, it might be time to port the SiteMinder patch to STABLE3
> 
> thanks!
> 
> -brian
> -- 
> "You know, evil comes in many forms, be it a man-eating cow or Joseph
> Stalin.
> But you can't let the package hide the pudding. Evil is just plain bad! You
> don't cotton to it! You gotta smack it on the nose with the rolled up
> newspaper
> of goodness! Bad dog! Bad dog!"   -- The Tick
> 


Schelstraete Bart
[EMAIL PROTECTED] - http://www.schelstraete.org
  http://langmixer.mozdev.org


Re: [squid-users] Newbie at Squid.

2003-08-11 Thread Schelstraete Bart
Alvaro Gordon-Escobar wrote:

I just got squid up and running.

I want to block .exe .zip .msi and .vbs cmd.exe root.exe etc.
I don't want people to download these files, especially because my file server is 
running low on space.
I want to block these files from HTTP and FTP.
I tried to block FTP downloads, but that blocked all FTP downloads, including PDF and legit Word docs.

I have read in some doc that a txt file can be created to use as a filter.

 

Hello ,

Pls read the FAQ first, which is available on the Squid website:   
http://www.squid-cache.org
(what you should always do)
This will explain a lot.

rgrds,

 Bart



Re: [squid-users] LAG !!!

2003-08-11 Thread Schelstraete Bart
squid_user wrote:

Hello everyone,

I've been using squid for about 1 year. I didn't notice it earlier,
but lately
I found that when I want to open some WWW pages I have to
wait sometimes 10-15 sec before the browser shows me something.
Is that normal? Or maybe I should add something to squid.conf to
avoid this lag... I don't know, please help me to solve that problem.
When I turn off squid then web browsing works much quicker.

will be thankful for any advice

 

Maybe DNS problem on the Squid proxy server?



  Bart



Re: [squid-users] deny_info and http_reply_access

2003-08-10 Thread Schelstraete Bart
Joshua Brindle wrote:

after trying to use deny_info with my http_reply_access
acl and being unsuccessful i searched the web and found
that others had that problem and that it was a known limitation.
My question is, what kind of limitation is it? one where the code
just hasn't been written or is it a design limitation? (in squid-3)
 



3. Known limitations

There are a few limitations to this version of Squid that we hope to 
correct in a later release:

*deny_info*

   deny_info only works for http_access, not for the acls listed in
   http_reply_access


I didn't see anything about this in version 3 -yet.

rgrds,

  Bart










Re: [squid-users] Log files too large

2003-08-10 Thread Schelstraete Bart
Schelstraete Bart wrote:

Gator wrote:

I am finding that Squid (2.5.STABLE2) will fail when the log files reach
a certain size.  I moved them off to access.log.2 and store.log.2 and
life was fine again.
1624135928 Aug  8 10:36 access.log.2
2147483647 Aug  8 09:02 store.log.2
How do I set up these files to rotate automatically so this doesn't
happen again?
 

You cannot do that from within Squid itself. What I'm doing is creating a cronjob 
that rotates the logfiles every night and creates statistics for 
that day.
Squid doesn't have a limit on the file size, but the filesystem has a 
2Gb filesize limit.
Sorry, my mistake. Squid would have to be modified to allow files bigger than 2 
GB... but the question is: who wants that?
I think nobody wants to use this.



rgrds,

 Bart



Re: [squid-users] Log files too large

2003-08-09 Thread Schelstraete Bart
Gator wrote:

I am finding that Squid (2.5.STABLE2) will fail when the log files reach
a certain size.  I moved them off to access.log.2 and store.log.2 and
life was fine again.
1624135928 Aug  8 10:36 access.log.2
2147483647 Aug  8 09:02 store.log.2
How do I set up these files to rotate automatically so this doesn't
happen again?
 

You cannot do that from within Squid itself. What I'm doing is creating a cronjob 
that rotates the logfiles every night and creates statistics for 
that day.
Squid doesn't have a limit on the file size, but the filesystem has a 
2Gb filesize limit.



  Bart



Re: [squid-users] CPU utilization performance issue

2003-08-08 Thread Schelstraete Bart
Hello,

Why not use Reiser instead of ext3 with diskd?
I read a lot of articles saying that reiser is much faster for 
Squid (a lot of 'small files').



  Bart

Zand, Nooshin wrote:

Hi,
I am just wonder why you are not using diskd.
Based on Benchmarking that I read, diskd provides faster I/O performance.
I am planning to run squid on Linux Redhat 9.0 and thinking to use ext3 and diskd.
Thanks,
Nooshin
-Original Message-
From: Tay Teck Wee [mailto:[EMAIL PROTECTED]
Sent: Friday, August 08, 2003 2:22 AM
To: squid-users
Subject: Re: [squid-users] CPU utilization performance issue
Hi everyone,

thanks for the input. The ACL list have since been
slightly altered, using only src(22 entries),
dstdomain(114 entries) and url_regex(20 entries). I am
currently on kernel 2.4.20-19.9 so the Hyperthreading
might hv been optimized. 

Now the machine is handling about 110 req/s but again
the CPU will climb to abt 90-95%. Is it possible for
my squid box to go beyond 180 req/s, which is the peak
for each proxies in the existing pool(ISP env)? I am
trying to replace my existing NetCaches with
squids...one box for one box.  

I am wondering if its because reiserfs will consume
more CPU than other fs like ext3? Will changing my
cache partitions to reiserf lower down the CPU usage?
Or can anyone suggest other possible improvements?
Thanks.
my 3 caching partitions are on 3 separate disks:-
/dev/sdb1  /cdata1  reiserfs notail,noatime 1 2
/dev/sdc1  /cdata2  reiserfs notail,noatime 1 2
/dev/sdd1  /cdata3  reiserfs notail,noatime 1 2
--
Wolf
--- Tay Teck Wee <[EMAIL PROTECTED]> wrote: >
Hi,
 

when I'm getting about 90 req/s or 800 concurrent
connection(according to my foundry L4) to my
squid(RedHat 8.0/2.5 Stable3 w deny_info patch), the
CPU utilization avg abt 80%. How do I lower the CPU
utilization of my squid? Thanks.  

Below is my machine specs:-

Intel Xeon single-processor 2.4GHz(DELL 2650)
2G physical RAM(w 2G swap under linux)
2X 33G for everything except caching (mirror)
3X 33G for caching (volume) 

/dev/sda7     505605    68437    411064  15% /
/dev/sda1     124427     9454    108549   9% /boot
/dev/sdb1   35542688   201248  35341440   1% /cdata1
/dev/sdc1   35542688   200888  35341800   1% /cdata2
/dev/sdd1   35542688   200940  35341748   1% /cdata3
/dev/sda3    1035692    49796    933284   6% /home
none         1032588        0   1032588   0% /dev/shm
/dev/sda5    1035660   691648    291404  71% /usr
/dev/sda6     505605    76236    403265  16% /usr/local
/dev/sda8   29695892    83456  28103936   1% /var

Below is my squid.conf(only the essential). 

For ACL, basically I hv 3 acl list(in 3 separate
files), one containing allowable IPs while the other
contains deny IPs. I also hv 3 list of banned sites
list(in 3 separate files).:-
http_port 8080
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 400 MB
cache_swap_low 92
cache_swap_high 95
maximum_object_size 2 MB
maximum_object_size_in_memory 100 KB
cache_replacement_policy heap GDSF
memory_replacement_policy heap GDSF
cache_dir aufs /cdata1 16000 36 256
cache_dir aufs /cdata2 16000 36 256
cache_dir aufs /cdata3 16000 36 256
cache_access_log /var/log/cachelog/cache.access.log
cache_log /var/log/cachelog/cache.log
cache_store_log none
quick_abort_min -1 KB
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
#3 banned list files
acl SBA dstdomain "/usr/local/squid/etc/SBA.txt"
acl CNB dstdomain "/usr/local/squid/etc/CNB.txt"
acl CNB2 url_regex "/usr/local/squid/etc/CNB2.txt"
#3 access list files
acl NetTP src "/usr/local/squid/etc/NetTPsrc.acl"
acl NetDeny src "/usr/local/squid/etc/deny.acl"
acl NetAllow src "/usr/local/squid/etc/allow.acl"
http_access deny SBA
http_access deny CNB
http_access deny CNB2
http_access deny NetDeny
http_access allow NetAllow
http_access allow NetTP
http_access deny all
http_reply_access allow all
cache_effective_user squid
cache_effective_group squid
logfile_rotate 10
deny_info ERR_SBA_DENIED SBA
deny_info ERR_CNB_DENIED CNB CNB2
memory_pools off
coredump_dir /var/log/cachelog
Thanks again!

Regards,
Wolf
__
Do You Yahoo!?
Send free SMS from your PC!
http://sg.sms.yahoo.com 
   

__
Do You Yahoo!?
Send free SMS from your PC!
http://sg.sms.yahoo.com
 





Re: [squid-users] security concern

2003-08-08 Thread Schelstraete Bart
Tay,

Tay Teck Wee wrote:

Hi,

I did a telnet to my squid port 8080 and input an
invalid request n got the following reply(truncated):-
HTTP/1.0 400 Bad Request
Server: squid/2.5.STABLE3line 2
Mime-Version: 1.0
Date: Fri, 08 Aug 2003 17:52:22 GMT
Content-Type: text/html
Content-Length: 1213
Expires: Fri, 08 Aug 2003 17:52:22 GMT
X-Squid-Error: ERR_INVALID_REQ 0
X-Cache: MISS from cr
Proxy-Connection: close
 

You cannot change that; you would need to modify the source code for this.

  Bart




Re: [squid-users] CPU utilization performance issue

2003-08-08 Thread Schelstraete Bart
I know that ext3 uses a lot more CPU on a lot of disk I/O. This is a 
known issue.
Btw, did you compile the Squid with the gnu-regex option? Because that 
one is faster on Linux systems. (if you're using  a lot of acl's)

rgrds,
 BArt
Tay Teck Wee wrote:
Hi Bart,

I'm using reiserfs. aufs coz its more suitable for
linux. 

--
Wolf
--- Schelstraete Bart <[EMAIL PROTECTED]> wrote:
 

Hzllo,

Why not using Reiser instead of ext3 with diskd?
I read a lot of articles that are saying that reiser
is much fast for 
Squid. (a lot of 'small files')



  Bart

Zand, Nooshin wrote:

   

Hi,
I am just wonder why you are not using diskd.
Based on Benchmarking that I read, diskd provides faster I/O performance.
I am planning to run squid on Linux Redhat 9.0 and thinking to use ext3 and diskd.
Thanks,
Nooshin
-Original Message-
From: Tay Teck Wee [mailto:[EMAIL PROTECTED]
Sent: Friday, August 08, 2003 2:22 AM
To: squid-users
Subject: Re: [squid-users] CPU utilization performance issue

Hi everyone,

thanks for the input. The ACL list have since been
slightly altered, using only src(22 entries),
dstdomain(114 entries) and url_regex(20 entries). I am
currently on kernel 2.4.20-19.9 so the Hyperthreading
might hv been optimized.

Now the machine is handling about 110 req/s but again
the CPU will climb to abt 90-95%. Is it possible for
my squid box to go beyond 180 req/s, which is the peak
for each proxies in the existing pool(ISP env)? I am
trying to replace my existing NetCaches with
squids...one box for one box.

I am wondering if its because reiserfs will consume
more CPU than other fs like ext3? Will changing my
cache partitions to reiserf lower down the CPU usage?
Or can anyone suggest other possible improvements?
Thanks.
my 3 caching partitions are on 3 separate disks:-
/dev/sdb1  /cdata1  reiserfs notail,noatime 1 2
/dev/sdc1  /cdata2  reiserfs notail,noatime 1 2
/dev/sdd1  /cdata3  reiserfs notail,noatime 1 2
--
Wolf
--- Tay Teck Wee <[EMAIL PROTECTED]> wrote:

Hi,

when I'm getting about 90 req/s or 800 concurrent
connection(according to my foundry L4) to my
squid(RedHat 8.0/2.5 Stable3 w deny_info patch), the
CPU utilization avg abt 80%. How do I lower the CPU
utilization of my squid? Thanks.

Below is my machine specs:-

Intel Xeon single-processor 2.4GHz(DELL 2650)
2G physical RAM(w 2G swap under linux)
2X 33G for everything except caching (mirror)
3X 33G for caching (volume)

/dev/sda7     505605    68437    411064  15% /
/dev/sda1     124427     9454    108549   9% /boot
/dev/sdb1   35542688   201248  35341440   1% /cdata1
/dev/sdc1   35542688   200888  35341800   1% /cdata2
/dev/sdd1   35542688   200940  35341748   1% /cdata3
/dev/sda3    1035692    49796    933284   6% /home
none         1032588        0   1032588   0% /dev/shm
/dev/sda5    1035660   691648    291404  71% /usr
/dev/sda6     505605    76236    403265  16% /usr/local
/dev/sda8   29695892    83456  28103936   1% /var

Below is my squid.conf(only the essential).

For ACL, basically I hv 3 acl list(in 3 separate
files), one containing allowable IPs while the other
contains deny IPs. I also hv 3 list of banned sites
list(in 3 separate files).:-

http_port 8080
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 400 MB
cache_swap_low 92
cache_swap_high 95
maximum_object_size 2 MB
maximum_object_size_in_memory 100 KB
cache_replacement_policy heap GDSF
memory_replacement_policy heap GDSF
cache_dir aufs /cdata1 16000 36 256
cache_dir aufs /cdata2 16000 36 256
cache_dir aufs /cdata3 16000 36 256
cache_access_log /var/log/cachelog/cache.access.log
cache_log /var/log/cachelog/cache.log
cache_store_log none
quick_abort_min -1 KB
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
#3 banned list files
acl SBA dstdomain "/usr/local/squid/etc/SBA.txt"
acl CNB dstdomain "/usr/local/squid/etc/CNB.txt"
acl CNB2 url_regex "/usr/local/squid/etc/CNB2.txt"
#3 access list files
acl NetTP src "/usr/local/squid/etc/NetTPsrc.acl"
acl NetDeny src "/usr/local/squid/etc/deny.acl"
acl NetAllow src "/usr/local/squid/etc/allow.acl"
http_access deny SBA
http_access deny CNB
http_access deny CNB2
http_access deny NetDeny
http_access allow NetAllow
http_access allow NetTP
http_access deny all
http_reply_access allow all
cache_effective_user squid
cache_effective_group squid
logfile_rotate 10
deny_info ERR_SBA_DENIED SBA
deny_info ERR_CNB_DENIED CNB CNB2

Re: [squid-users] squid authentication

2003-08-05 Thread Schelstraete Bart
Hello,

In that example, users that were created with the password file can surf 
to your allowed sites.
If I understand you correctly, you also want certain users to be able to surf 
without restrictions? Correct?

rgrds,

  Bart

[EMAIL PROTECTED] wrote:

Thank a lot Bart.!  I'm   starting to understand.  Now I know how to allow one 
IP to access to one site.
But , really, I need that a list of users, do that, instead only one IP.  Can 
you tell me how to do this ?

Maybe the confusion was , because I have two different problems. I need too 
(in other hand) that one server (fix IP ) saling to the web , WITHOUT 
authentication and without restriction. I use NCSA authentication.

Sorry for my english., and very very much for your help.!

Miguel
 

Hello,

An example:

--
acl servers src IP/255.255.255.255
acl password proxy_auth REQUIRED
acl url url_regex -i "/usr/local/squid/etc/deny.txt"
http_access allow servers
http_access allow url password
http_access deny all
--
'IP' is the IP address of that one server.
"/usr/local/squid/etc/deny.txt" is a text file with the url's that you 
want to allow.

rgrds,
   Bart
[EMAIL PROTECTED] wrote:

   

Hi Henrik, thanks for your reply.  But i have one server that don't need acces
by authentication. He need access whithout any request.
In other hands I have a list of users, that must access by authentication, BUT
only access at one and only site.
I'm sure you have a lot of work, but any or whatever sample you can give me,
or url with an similar example, will be apreciated. Sincerely thanks.



 

On Sunday 03 August 2003 03.24, [EMAIL PROTECTED] wrote:
  

   

Hello friends:  I am new in the subject squid, I'd like  help  to
bypass authenticacion of squid for a local direction IP (that
server must leave to Internet without requesting me user and
password)
I d like to know too , how can leave user group access ONLY  to one
site (corporate) and nothing else. But the other people will must
access to the rest of internet.


 

You do this by allowing that server access to that site before where 
you require others to authenticate.

http_access is a ordered list of rules. The FIRST rule where all acl 
names listed matches the request determines if the request is to be 
allowed or denied.

When Squid encounters an ACL requiring a username (proxy_auth etc) it 
requires authentication from the user.

What this means is that you need to create two acls, one for matching 
the server and one for the site, then make a http_access rule 
allowing the combination of these two somewhere before your 
http_access rule which requires authentication.

  

   

Sorry for my english. I appreciate any help. Thanks to the
community.


 

Your english is fine. Most of us are not native english speaking.

Regards
Henrik
--
Donations welcome if you consider my Free Squid support helpful.
https://www.paypal.com/xclick/business=hno%40squid-cache.org
If you need commercial Squid support or cost effective Squid or
firewall appliances please refer to MARA Systems AB, Sweden
http://www.marasystems.com/, [EMAIL PROTECTED]
  

   



 

   



 





Re: [squid-users] squid authentication

2003-08-04 Thread Schelstraete Bart
Hello,

An example:

--
acl servers src IP/255.255.255.255
acl password proxy_auth REQUIRED
acl url url_regex -i "/usr/local/squid/etc/deny.txt"
http_access allow servers
http_access allow url password
http_access deny all
--
'IP' is the IP address of that one server.
"/usr/local/squid/etc/deny.txt" is a text file with the url's that you 
want to allow.

rgrds,
   Bart
[EMAIL PROTECTED] wrote:

Hi Henrik, thanks for your reply.  But i have one server that don't need acces 
by authentication. He need access whithout any request. 
In other hands I have a list of users, that must access by authentication, BUT 
only access at one and only site.
I'm sure you have a lot of work, but any or  whatever sample you can give me, 
or url with an similar example, will be apreciated. Sincerely thanks. 

 

On Sunday 03 August 2003 03.24, [EMAIL PROTECTED] wrote:
   

Hello friends:  I am new in the subject squid, I'd like  help  to
bypass authenticacion of squid for a local direction IP (that
server must leave to Internet without requesting me user and
password)
I d like to know too , how can leave user group access ONLY  to one
site (corporate) and nothing else. But the other people will must
access to the rest of internet.
 

You do this by allowing that server access to that site before where 
you require others to authenticate.

http_access is a ordered list of rules. The FIRST rule where all acl 
names listed matches the request determines if the request is to be 
allowed or denied.

When Squid encounters an ACL requiring a username (proxy_auth etc) it 
requires authentication from the user.

What this means is that you need to create two acls, one for matching 
the server and one for the site, then make a http_access rule 
allowing the combination of these two somewhere before your 
http_access rule which requires authentication.

   

Sorry for my english. I appreciate any help. Thanks to the
community.
 

Your english is fine. Most of us are not native english speaking.

Regards
Henrik
--
Donations welcome if you consider my Free Squid support helpful.
https://www.paypal.com/xclick/business=hno%40squid-cache.org
If you need commercial Squid support or cost effective Squid or
firewall appliances please refer to MARA Systems AB, Sweden
http://www.marasystems.com/, [EMAIL PROTECTED]
   



 





Re: [squid-users] typing "squid -z" doesn't start

2003-07-30 Thread Schelstraete Bart
Matt Babineau wrote:

Hi all!

For some reason my new install (Squid 2.5 Stable 3) doesn't start, it just gives me a 'cannot find this command' error in the shell! I am running Redhat 9, any suggestions on how to get this to work?

 

Matt,

I suppose the Squid binary is not in your path.
Go to the squid directory, go to the sbin directory and enter:
 ./squid -z
rgrds,
  Bart


[squid-users] squid-internal-static on Error pages.

2003-07-26 Thread Schelstraete Bart
Hello,

A little question,
It's possible to load the Squid icons with squid like this:
"http://my.proxy.host/squid-internal-static/icons/my_logo.gif";

Is there also a possibility to load the Error pages with the squid-internal-static?
(the ones that exist in share/errors/[Language])


rgrds,

    Bart

--
Schelstraete Bart
We're the beat in your feet! C-dance!
 List online: http://www.c-dance.biz/stream_low.asx


Re: [squid-users] Squid and Telnet over Networks

2003-07-24 Thread Schelstraete Bart
There is software available on the market which allows you to telnet 
over an HTTP proxy (Squid of course).
That software is only a Java applet, and nothing is needed on the server side.

rgrds,
  Bart
[EMAIL PROTECTED] wrote:
Hi,
IM using squid and have this configuration
Internet -> switch (192.168.0 network) -> server(running squid)
the server connects to next line-<
V
internal router(DHCP - 192.168.1 network) -> nodes
how do i go about haveing the 192.168.1 network telnet out to the internet?

the current configuration i have is this

#---#

acl ournetwork 192.168.0.0/255.255.255.0 192.168.1.0/255.255.255.0
acl telnet_ports port 23
acl telnet_target port 4004
http_access ournetwork telnet_ports
http_access ournetwork telnet_target
#--#

Any sugestions would be greatful,

Thanks in advance
PS- please email me with sugestions
David Walker
Mudsite Hosting
[EMAIL PROTECTED]
 





[squid-users] Stable4

2003-07-22 Thread Schelstraete Bart
Hello,

A few days ago Henrik talked about Squid 2.5 Stable4.
Does somebody know when it will be available?
Is it a matter of weeks or months?

(because I want to update, but if stable 4 arrives very soon, I'll wait for that
version)


rgrds,
  Bart

--
Schelstraete Bart
[EMAIL PROTECTED] - http://www.schelstraete.org
  http://langmixer.mozdev.org


[squid-users] Re: Squid but propably off topic?

2003-07-22 Thread Schelstraete Bart
Hello,

Just FYI:
This problem was caused by the caching itself.
When I disable caching for all intranet hosts, the speed is like it 
should be (+/- local LAN speed).
So it seems that my Squid disk performance is not what it should be.

rgrds,

  Bart

Schelstraete Bart wrote:

Hello,

I 'm facing a problem with my Squid proxy server, and I hope maybe of 
you can help me. (the problem is maybe not squid related)
I installed Squid 2.5 on a Linux server which is currently live (RH 
8.0). This machine is a 2 CPU machine, with 1 GIG RAM, 2 x 100 NIC. 
(in fact this are 1G bit cards, but it's currently connected on a 
100Mb switch).
And I also installed Squid 2.5 on a HP-UX machine , which I'm testing 
now. (HP-UX 11i). This machine has 3 CPU's, and 1x100 NIC. (also 1Gbit 
card)
(completely the same Squid configuration)

Everything works fine, and Squid itselfs works quite fast. Because we 
have a lot of servers in our lan, which doesn't need to use the 
'cache_peer', I configured Squid that he goes directly to those 
machine, so without the cache_peer.

Now yesterday I did some tests, and I downloaded a file via ftp 
through the proxy. (proxy will bypass cache_peer for lan, and I 
configured Squid).
If I used the Linux Squid proxy , I got a download speed of +/- 300 
Kb/s. But If I'm using the HP-UX Squid proxy I get a download speed of 
1000 / 1200 Kb/s..

I double-checked the configuration, and I checked the access.log file, 
and Squid is not using the cache_peer for that download.
So I don't really understand why the Linux machine is that slow 
compared with the HP-UX.
Does somebody have a clue what I can do to improve this kind of 
connection?



   Bart





Re: [squid-users] Compressing HTML pages before sending to client

2003-07-21 Thread Schelstraete Bart
Hello,

What about performance? If Squid needs to compress every file, that 
will decrease the performance a lot and it will use a lot of CPU.
If you're using one Squid proxy it will save diskspace, but that's it.
But if you have multiple Squids running, and those Squids can communicate 
with each other using compressed files, it will save a lot of bandwidth, and 
the speed between those 2 servers will increase... but like I said, 
you need more powerful machines.

rgrds,

  Bart
Robert Collins wrote:
Oops, seem to have deleted the first  post in the thread...

On Mon, 2003-07-21 at 13:03, Tony Melia (DMS) wrote:
 

Sounds good, but I think it would make more sense to compress the file
before committing it to disk at the caching level so as to compress the
cache.  

-Original Message-
From: Robert Mena [mailto:[EMAIL PROTECTED] 
Sent: Monday, 21 July 2003 12:07
To: [EMAIL PROTECTED]
Subject: Re: [squid-users] Compressing HTML pages before sending to client

Hi, 

I recently searched the archives and found one
post/reply where the ability to dinamically compress
the HTML before sending to the users was put as a
3.1/3.2 feature.
I do not subscribe to the devel list so should we
expect this 3.1 for this year ?
   

No. 3.0 is in the release process now, 3.1 at the earliest will be next
year.
 

Is there any other proxy (and that can me sent to me
directly) that do offer this feature ?
   

Not that I'm aware of, although apache mod_gzip + mod_proxy may do this.

 

It really can save a lot of bandwidth and time for
dial-up users so please consider adding this as soon
as possible.
   

It's been implemented before in the TE branch on devel.squid-cache.org
by Patrick McManus, and then enhanced by me to support proxy-proxy
compression as well. However, severe logic problems prevented
stabilisation of this feature in the 2.x codebase. Thats why it's slated
for a 3.x release.
If you will find it valuable, you might consider sponsoring (alone or as
a group of interested folk) a squid developer to implement it for 3.x.
I'd be happy to discuss this with you..
Cheers,
Rob
(Squid developer)
 





Re: [squid-users] Get Squid to log to MySql database?

2003-07-20 Thread Schelstraete Bart
Chris Wilcox wrote:

Morning all,

After a fair amount of searching I'm none-the-wiser on this one.  Is 
there any way (easy preferred but not essential!) I can get Squid to 
log directly to a MySql database or do I need to regularly run a 
script via cron to put the squid logfile into MySql?


Chris,

There was a project for this, but as far as I know that project doesn't 
exist anymore.
And on the other hand... if you write every access to a 
database, I think this won't be very good for the performance.

rgrds,

 Bart

--
Is that a radio in your pocket, or are you just happy to hear us? C-dance!


Re: [squid-users] peering

2003-07-19 Thread Schelstraete Bart
Chris,

Isn't it possible that your cache peer requires authentication, or that 
it doesn't allow your host?
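
A sketch of what the parent usually needs: an http_access (and icp_access) rule that matches the sibling's address - x.x.x.x as in your log - placed before the deny rules:

--
acl sibling_proxy src x.x.x.x/255.255.255.255
http_access allow sibling_proxy
icp_access allow sibling_proxy
--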

rgrds,

  Bart

Chris Knipe wrote:

Lo everyone,

I have setup two squid servers in a parent & sibling relation.  The peering
itself seems to be setup correctly, both proxies start, and I can see that
both proxies contact each other via the cache log.
On my parent proxy however, I get constant 403's when the sibling tries to
query it.  I suspect it is a acl that I am missing, but I'm not sure what...
1058645715.781  4 x.x.x TCP_DENIED/403 1469 GET
y.y.y:3128/squid-internal-dynamic/netdb - NONE/- text/html
x.x.x.x is my sibling proxy, plain and simply setup with:
cache_peer y.y.y.y parent 3128 3130 default
I have given x.x.x.x ICMP Query access (ACL), as well as http query
access.
What am I missing?



 





Re: [squid-users] Running squid -k reconfigure frequently

2003-07-19 Thread Schelstraete Bart
Steve,

We're also using LDAP for authentication, using the pam_ldap 
authenticator (using ldap group).
And I never restarted Squid. So it works perfect!



   Bart
Steve Cody wrote:
I have Squid running in a cybercafe environment.  Frequently, new users
will be added, and existing user's accounts will be disabled.  This is
to enable and disable Internet access for people.
I'm currently using access with a password file that gets modified
whenever there is a user change.  The changes don't take effect until I
restart squid, or do a squid -k reconfigure.
Someone else suggested that I use LDAP to avoid this issue, and I'm
looking into that option right now.
Steve

On Sat, 2003-07-19 at 02:30, Peter Koinange wrote:
 

Something wrong here, why would you really need to ran squid -k so often, I
believe you problem here is administration I find it impossible see why you
are making changes every 2 minutes. Come up with a admin policy on how often
changes are done and whn they should take effect
Peter
- Original Message -
From: "Steve Cody" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Saturday, July 19, 2003 7:31 AM
Subject: [squid-users] Running squid -k reconfigure frequently
   

Hello all,

I would like to know if there is a negative impact on a system, or on
 

Squid,
   

if the squid -k reconfigure command is ran at a frequent interval.  For
example, I would like to run it about every 2 minutes to make user changes
take effect rapidly.
Is that going to be a problem?  If so, is there a better interval to use,
 

or
   

a better way to make user changes take effect?  I'm using user
 

authentication
   

and the changes I'm talking about are user creation, enabling, disabling,
 

and
   

deleting.  The changes don't take effect until the squid -k reconfigure
command is ran.
Thanks in advance!
Steve Cody
--
Open WebMail Project (http://openwebmail.org)
 



 





Re: [squid-users] redirecting request...!!!!

2003-07-19 Thread Schelstraete Bart
"progs.proxy.com" is the one the will be used to downloads programs from.
"browse.proxy.com" is the one that will be use to browse.
You need to change this with your values off course.

rgrds,

 Bart

S ý è d F ú r q à n wrote:

Thanks for ur help bart..
but can u tell me where i define the another pc name in the code which 
u listed below..?

Thanks & b-regards

Furqan Abbas


From: Schelstraete Bart <[EMAIL PROTECTED]>
To: S ý è d F ú r q à n <[EMAIL PROTECTED]>
CC: [EMAIL PROTECTED]
Subject: Re: [squid-users] redirecting request...
Date: Sat, 19 Jul 2003 15:46:24 +0200
FILETIME=[6E12D090:01C34DFC]

S ý è d F ú r q à n wrote:

Hello members.

i m running squid in my small (60 computers) networks.  and i need 
to redirect the request ... means . i've 2 server... 1 for browsing 
... and the another for downloading is it possible.. when client 
send the request of downloading of any mp3 , exe then it will 
automatically redirect to another downloading server... .it is 
possible in squid..? if it is.. then kindly help me out...


Hello,

You can try this:

acl progs urlpath_regex -i \.exe$ \.com$ \.mp3$  (and other 
files of course)
cache_peer progs.proxy.com parent 3128 3130
cache_peer_access progs.proxy.com allow progs !all
cache_peer browse.proxy.com parent 3128 3130
cache_peer_access browse.proxy.com allow all !progs



That should work, but I'm not sure.

  Bart

--
Pls sign the following guestbook:   http://www.cable-dance.be   and 
ask to re-activate cable-dance!
Pls send this message to everyone ASAP.
Tnx, Bart









Re: [squid-users] redirecting request...!!!!

2003-07-19 Thread Schelstraete Bart
S ý è d F ú r q à n wrote:

Hello members.

i m running squid in my small (60 computers) networks.  and i need to 
redirect the request ... means . i've 2 server... 1 for browsing ... 
and the another for downloading is it possible.. when client send 
the request of downloading of any mp3 , exe then it will automatically 
redirect to another downloading server... .it is possible in squid..? 
if it is.. then kindly help me out...


Hello,

You can try this:

acl progs urlpath_regex -i \.exe$ \.com$ \.mp3$   (and other file types, of course)
cache_peer progs.proxy.com parent 3128 3130
cache_peer browse.proxy.com parent 3128 3130
# send "progs" requests to the download proxy only
cache_peer_access progs.proxy.com allow progs
cache_peer_access progs.proxy.com deny all
# send everything else to the browsing proxy
cache_peer_access browse.proxy.com deny progs
cache_peer_access browse.proxy.com allow all



That should work, but I'm not sure.

  Bart

--
Pls sign the following guestbook:   http://www.cable-dance.be   and ask 
to re-activate cable-dance!
Pls send this message to everyone ASAP.
Tnx, Bart





Re: [squid-users] restricting download of .exe files ....

2003-07-18 Thread Schelstraete Bart
Hello,

An example:

---

acl microsoft dstdomain .microsoft.com
http_access allow microsoft
acl progs urlpath_regex -i \.exe$
http_access deny progs
...
http_access deny all
---

In that order, every access to microsoft.com is allowed, even for .exe files.
On all other sites, .exe files are blocked.


rgrds,
Bart
--
Pls sign the following guestbook:   http://www.cable-dance.be   and ask 
to re-activate cable-dance!
Pls send the message to everyone ASAP.
Tnx, Bart

Mark A Lewis wrote:

This should be written up for the FAQ with a few examples. Has to be one of
the most common questions that passes this list.
-Original Message-
From: Schelstraete Bart [mailto:[EMAIL PROTECTED] 
Sent: Friday, July 18, 2003 4:16 PM
To: Kenn Murrah
Cc: 'squid'
Subject: Re: [squid-users] restricting download of .exe files 



 

Thanks for the help (I actually did try to find this answer but
couldn't ... and decided to ask the list for its help (thinking that 
was, in fact, the purpose of the list.))

Anyway, let me ask a followup question  what if I want to disallow
all .exe files EXCEPT those from microsoft.com?  is that possible?  
Feel free, as I know you will, to tell me to RTFM, but any help in 
locating the right place in the right FM would be greatly appreciated.
   

You can first create an acl to allow anything to microsoft.com, and 
then specify the acl to block .exe files.
That's what I'm using in the live environment.

rgrds,
 Bart
--
Pls sign the following guestbook:   http://www.cable-dance.be   and ask 
to re-activate cable-dance!
Pls send the message to everyone ASAP.
Tnx, Bart

**
This message was virus scanned at siliconjunkie.net and
any known viruses were removed. For a current virus list
see http://www.siliconjunkie.net/antivirus/list.html
 





Re: [squid-users] Refresh Cache

2003-07-18 Thread Schelstraete Bart
Richard Sumilang wrote:

Is there a way to configure squid to empty its cache every X
days?


Hello Richard,

I don't think you want to clear the complete squid cache after XX days;
why would you do that?
If you want to refresh pages after a certain amount of time, you can use the
'refresh_pattern' option in the squid.conf file.
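
For example, something like this (the values are only an illustration,
tune them to your own needs):

# refresh_pattern [-i] regex  min(minutes)  percent  max(minutes)
refresh_pattern -i \.gif$      1440  50%  10080
refresh_pattern .                 0  20%   4320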

rgrds,

  Bart
--
Pls sign the following guestbook:   http://www.cable-dance.be   and ask 
to re-activate cable-dance!
Pls send the message to everyone ASAP.
Tnx, Bart




Re: [squid-users] restricting download of .exe files ....

2003-07-18 Thread Schelstraete Bart

Thanks for the help (I actually did try to find this answer but 
couldn't ... and decided to ask the list for its help (thinking that 
was, in fact, the purpose of the list.))

Anyway, let me ask a followup question  what if I want to disallow 
all .exe files EXCEPT those from microsoft.com?  is that possible?  
Feel free, as I know you will, to tell me to RTFM, but any help in 
locating the right place in the right FM would be greatly appreciated.
You can first create an acl to allow anything to microsoft.com, and 
then specify the acl to block .exe files.
That's what I'm using in the live environment.

rgrds,
 Bart
--
Pls sign the following guestbook:   http://www.cable-dance.be   and ask 
to re-activate cable-dance!
Pls send the message to everyone ASAP.
Tnx, Bart




Re: [squid-users] Squid but propably off topic?

2003-07-18 Thread Schelstraete Bart
Henrik Nordstrom wrote:

Doublecheck the network connectivity of the Linux box. Maybe there is
a disagreement on half/full duplex etc?
 

Henrik,

That's always the first thing that I check, and those settings are correct.



   Bart



[squid-users] Squid but propably off topic?

2003-07-18 Thread Schelstraete Bart
Hello,

I'm facing a problem with my Squid proxy server, and I hope maybe one of 
you can help me. (The problem may not be Squid related.)
I installed Squid 2.5 on a Linux server which is currently live (RH 
8.0). This machine has 2 CPUs, 1 GB RAM and 2 x 100 Mbit NICs. (In 
fact these are 1 Gbit cards, but they are currently connected to a 100 Mbit switch.)
And I also installed Squid 2.5 on an HP-UX machine, which I'm testing 
now (HP-UX 11i). This machine has 3 CPUs and 1 x 100 Mbit NIC (also a 1 Gbit card),
with completely the same Squid configuration.

Everything works fine, and Squid itself works quite fast. Because we 
have a lot of servers in our LAN which don't need to use the 
'cache_peer', I configured Squid to go directly to those machines, 
so without the cache_peer.

Yesterday I did some tests and downloaded a file via ftp through 
the proxy (the proxy bypasses the cache_peer for the LAN, as I configured in Squid).
If I use the Linux Squid proxy, I get a download speed of +/- 300 
Kb/s. But if I use the HP-UX Squid proxy, I get a download speed of 
1000-1200 Kb/s.

I double-checked the configuration and the access.log file, and 
Squid is not using the cache_peer for that download.
So I don't really understand why the Linux machine is that slow 
compared with the HP-UX one.
Does somebody have a clue about what I can do to improve this kind of connection?



   Bart



Re: [squid-users] NCSA Authentication...help

2003-07-11 Thread Schelstraete Bart
David Jacobs wrote:

I realize this should be a simple thing, but I am new to squid.  I am
using the RPM that comes with redhat 9.  I setup NCSA authentication and
I am getting a login and password prompt from the browser when I hit the
proxy, but it does not authenticate (I did create a passwd file using
htpasswd). I thought it was configured correctly until I saw "Too few
Basic Authenicator processes are running" when I do tail -f
/var/log/messages while I am trying to log in.  Is this a clear
indication of a problem?
Hello David,

Increase this value in your squid.conf file:

authenticate_children

(for example: 5 or 10)
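
Depending on your Squid version the directive is either the old
authenticate_children or the newer auth_param form. A rough sketch for
Squid 2.5 (the paths are just examples, adjust them to your system):

auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic children 10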

rgrds,
  Bart


Re: [squid-users] Squid 2.5S2 - ACL problem

2003-07-09 Thread Schelstraete Bart
Hello,

Yahoo also uses other URLs than those you gave, and those are not allowed in your
setup (for example banners etc.).
You should increase your debug_options and check the cache.log file to see which page
is blocked.
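
For example (the section numbers can differ between Squid versions, so check
the documentation for yours):

debug_options ALL,1 28,3

Section 28 should be the access control code, so cache.log will then show
which ACL is denying the request.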


rgrds,

Bart

Quoting Balzi Andrea <[EMAIL PROTECTED]>:

> Hi!
> 
> In the "dominiautorizzati" ACL I have inserted some domains:
> 
> it.yahoo.com
> .yahoo.it
> it.rd.yahoo.com
> it.search.yahoo.com
> 
> When a user tries to make a search through Yahoo Italy, they receive a denied 
> access.
> For the problem with Yahoo Italy I have also tried to add yahoo.com, but 
> the problem remains.
> If the user uses one of the other Yahoo sites there are no problems.
> The other "dominiautorizzati" domains do not have the same problem.
> I have checked that the Yahoo domains do not appear in the other ACLs.
> 
> How I can resolve the problem?
> 
> We have the follow acl:
> 
> #Access Control List
> acl all src 0.0.0.0/0.0.0.0
> acl localhost src 127.0.0.1/255.255.255.255
> acl ipunico max_user_ip -s 1
> acl password proxy_auth_regex -i
> "/usr/share/squid/blacklists/interne/utenti"
> acl dominigruppo dstdomain "/etc/squid/blacklists/interne/dominigruppo"
> acl urlbloccate url_regex -i "/etc/squid/blacklists/interne/urlbloccate"
> acl domregexbloccati dstdom_regex -i 
> "/etc/squid/blacklists/interne/domregexbloccati"
> acl dominibloccati dstdomain "/etc/squid/blacklists/interne/dominibloccati"
> acl dominiautorizzati dstdomain 
> "/etc/squid/blacklists/interne/dominiautorizzati"
> acl urlnocache url_regex -i "/etc/squid/blacklists/interne/urlnocache"
> acl ads_domains dstdom_regex "/usr/share/squid/blacklists/ads/domains"
> acl aggressive_domains dstdom_regex 
> "/usr/share/squid/blacklists/aggressive/domains"
> acl audio-video_domains dstdom_regex 
> "/usr/share/squid/blacklists/audio-video/domains"
> acl drugs_domains dstdom_regex "/usr/share/squid/blacklists/drugs/domains"
> acl gambling_domains dstdom_regex 
> "/usr/share/squid/blacklists/gambling/domains"
> acl hacking_domains dstdom_regex
> "/usr/share/squid/blacklists/hacking/domains"
> acl mail_domains dstdom_regex "/usr/share/squid/blacklists/mail/domains"
> acl porn_domains dstdom_regex "/usr/share/squid/blacklists/porn/domains"
> acl proxy_domains dstdom_regex "/usr/share/squid/blacklists/proxy/domains"
> acl violence_domains dstdom_regex 
> "/usr/share/squid/blacklists/violence/domains"
> acl warez_domains dstdom_regex "/usr/share/squid/blacklists/warez/domains"
> acl QUERY urlpath_regex -i cgi-bin .cgi
> acl METHOD method CONNECT POST
> acl ssl proto HTTPS
> 
> #Regole
> http_access deny ipunico
> http_access allow dominigruppo
> http_access deny ads_domains
> http_access deny aggressive_domains
> http_access deny audio-video_domains
> http_access deny drugs_domains
> http_access deny gambling_domains
> http_access deny hacking_domains
> http_access deny mail_domains
> http_access deny porn_domains
> http_access deny proxy_domains
> http_access deny violence_domains
> http_access deny warez_domains
> http_access allow dominiautorizzati
> http_access deny urlbloccate
> http_access deny domregexbloccati
> http_access deny dominibloccati
> http_access allow password
> http_access allow localhost
> http_access deny all
> no_cache deny QUERY
> no_cache deny urlnocache
> no_cache deny METHOD
> no_cache deny ssl
> 
> 


Schelstraete Bart
[EMAIL PROTECTED] - http://www.schelstraete.org
  http://langmixer.mozdev.org


RE: [squid-users] NoProxy directive equivalent in squid

2003-07-07 Thread Schelstraete Bart
Citeren Chris Vaughan <[EMAIL PROTECTED]>:

> I am trying to get to an intranet site on a remote host, that our office has
> a point to point connection for. We are trying to tell the server not to use
> proxy for the specific internal domain names for this site.

Chris,

You can use, for example:
  acl internal dstdomain yahoo.com  (or something similar)
  always_direct allow internal
  never_direct deny internal


rgrds,
  Bart


Re: [squid-users] NoProxy directive equivalent in squid

2003-07-07 Thread Schelstraete Bart
Chris Vaughan wrote:

Hello,

I am trying to identify an equivalent in squid to the apache NoProxy
directive. When I previously asked about this, I was told the answer was
held in the FAQ. However, having looked at the FAQ, I am unable to find what
I am supposed to be looking for. Any help is appreciated.
 

Hello Chris,

What do you want to accomplish with 'NoProxy'?

 a)   Do not cache?  (no_cache option)
 b)   Do not forward to parent proxies? (always_direct, 
never_direct options)
 c)   Block the URL? (ACL)
   

Pls be more specific.
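
For reference, roughly what each of those looks like in squid.conf (only a
sketch, and example.com is just a placeholder for your own domain):

# a) do not cache objects from a site
acl nocache_site dstdomain .example.com
no_cache deny nocache_site

# b) go direct, never via a parent proxy
acl internal dstdomain .example.com
always_direct allow internal

# c) block the URL completely
acl blocked dstdomain .example.com
http_access deny blocked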

rgrds,
Bart

