Re: [squid-users] change in source code

2008-04-15 Thread Henrik Nordstrom
mån 2008-04-14 klockan 20:13 -0700 skrev Anil Saini:
 Actually, it says in the squid.conf file that the max limit is 32,
 but I increased the limit to 60 and the number of DNS processes increased, but I
 don't know whether it will affect Squid or not.

If the number of helpers increased then it worked.

The limit of 32 was in old versions of Squid, and it seems the comment
in the dns_children description is a leftover.

Another way is to build Squid without --disable-internal-dns. This will
remove this directive, as it isn't even relevant any more.

Regards
Henrik
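
To make the two routes concrete (a sketch only; the figure is the one from the thread): with external DNS helpers the limit is raised in squid.conf, while a build that omits --disable-internal-dns uses the internal DNS client, where the directive no longer exists.

```
# squid.conf -- external-DNS (dnsserver helper) builds only:
# number of dnsserver helper processes to start
dns_children 60
```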



[squid-users] Accessing cachemgr.cgi

2008-04-15 Thread hdkutz
Hello List,
pretty new to squid 3.0.
Tried to configure cachemgr.cgi.
Problem:
Squid is not listening on its standard port 3128;
it is configured to listen on port 80.
The Apache web server is configured to use port 3128.
If I try to access http://proxy:3128/cgi-bin/cachemgr.cgi I'll get
snip
connect 127.0.0.1:80: (111) Connection refused
snip

snipy
[EMAIL PROTECTED] etc]# grep manager squid.conf
acl manager proto cache_object
http_access allow manager localhost 
http_access deny manager
[EMAIL PROTECTED] etc]# grep localhost squid.conf
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
http_access allow manager localhost
http_access allow localhost
[EMAIL PROTECTED] etc]# grep 127.0.0.1 cachemgr.conf 
127.0.0.1
127.0.0.1:80
snipy

Am I missing something?
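
(A guess, offered tentatively: the refusal on 127.0.0.1:80 suggests Squid's http_port is not bound to the loopback interface. Each cachemgr.conf line must name an address:port Squid actually accepts connections on; the hostname below is a placeholder.)

```
# cachemgr.conf -- one proxy per line, as address:port
# use the address Squid's http_port is actually bound to
proxy.example.com:80
```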
-- 
Han Solo:
I love you.
Princess Leia:
I know.


[squid-users] How to check the cache_peer sibling is working?

2008-04-15 Thread John Lui
I use two Squid servers (A, B) as siblings; the
icp_port, query_icmp, icp_access and cache_peer have been set.

query_icmp on
icp_port 3130
cache_peer A(/B) sibling 80 3130 proxy-only
icp_access allow all


One object has been cached by server A and not by server B.
When I try to get the object from server B, the X-Cache header from server B
shows that the object was fetched from the origin server, not from
server A.
What is wrong with my setup?
Thanks, John
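
For reference, a tentative sketch of what each sibling's squid.conf needs (hostname is a placeholder; note that ACL names are case-sensitive, so the icp_access rule must reference a defined ACL, usually the lowercase all):

```
# on server B; server A mirrors this, pointing back at B
icp_port 3130
cache_peer serverA.example.com sibling 80 3130 proxy-only
icp_access allow all
```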


Re: [squid-users] Getting sibling caches to work in an accelerator setup

2008-04-15 Thread Amos Jeffries

Patrik Ellrén wrote:

We have a setup with a number of identical application servers running on Windows 2003 
Server. On each server there is an instance of Squid (2.6 stable 18) that runs as an 
accelerator. The accelerator mode seems to work fine when each Squid instance is only 
accelerating its own application server but we would like the Squids to run 
as siblings and we have not been able to get it to work.

Even though the objects are cached by Squid on one machine, calls from a sibling 
generate a combination of:

UDP_HIT/000
TCP_MISS/504

It looks like the ICP call indicates a hit, but when a Squid tries to retrieve 
the cached object it is not found in the cache. The max-age and Expires headers 
are set to allow caching for weeks (and it does work when each Squid is 
accelerating only its own origin server), Cache-Control is set to public and no 
other headers have been set.

If we add the allow-miss option to the cache_peer lines then the objects will be 
retrieved, but they will come from the sibling's origin server and not from the 
cache, so it looks like the communication works.

Does anyone have an idea what could cause this behaviour?


Random wild guesses:
 - The sibling has a '504 timeout' error object cached for that URL.

 - The sibling has the object, UDP gets through but TCP connection is 
firewalled.


 - The ICP query does not match the HTTP request (i.e. the HTTP request is 
looking for a specific object, but the ICP query doesn't carry the right Vary 
information to the sibling.)


 - You have an old cache digest and the sibling has expired the object 
to make new space.


(don't ask me why; it's all a wild guess, remember).

Amos
--
Please use Squid 2.6.STABLE19 or 3.0.STABLE4


Re: [squid-users] squid as reverse proxy, serving large files

2008-04-15 Thread Lin Jui-Nan Eric
Hi All,

Found that it is a FreeBSD deadlock bug:

http://www.freebsd.org/cgi/query-pr.cgi?pr=106317


Re: [squid-users] Reverse proxy for Primary and then Secondary

2008-04-15 Thread Amos Jeffries

Indunil Jayasooriya wrote:

On Thu, Apr 10, 2008 at 7:48 PM, Amos Jeffries [EMAIL PROTECTED] wrote:

Indunil Jayasooriya wrote:


Hi all,

I have 2 web servers. One is primary and the other is secondary.

Please assume
the IP of the primary is 1.2.3.4
the IP of the secondary is 2.3.4.5

I want the Squid reverse proxy to forward traffic to the primary server.
When the primary goes offline, it should forward to the secondary web
server.

How can I achieve this task?

I am going to keep Squid as a reverse proxy in front of them.

Please assume the IP of the reverse proxy is 5.6.7.8.

How can I write rules in squid.conf?

Please see the rules below.


http_port 80 accel defaultsite=your.main.website

cache_peer ip.of.primarywebserver parent 80 0 no-query originserver
cache_peer ip.of.secondarywebserver parent 80 0 no-query originserver

acl our_sites dstdomain your.main.website
http_access allow our_sites


 Add:
  cache_peer_access ip.of.primarywebserver allow our_sites
  cache_peer_access ip.of.secondarywebserver allow our_sites
  never_direct allow our_sites


Hi Amos,

Then the complete rule set will be as below. Please let me know.


 http_port 80 accel defaultsite=your.main.website

 cache_peer ip.of.primarywebserver parent 80 0 no-query  originserver

 cache_peer ip.of.secondarywebserver parent 80 0 no-query  originserver

 acl our_sites dstdomain your.main.website

http_access allow our_sites

cache_peer_access ip.of.primarywebserver allow our_sites

cache_peer_access ip.of.secondarywebserver allow our_sites
never_direct allow our_sites



Looks good.
If you have multiple websites hosted you may need both the accel and vhost 
options on the http_port.
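
That is, something like (the site name is a placeholder):

```
http_port 80 accel defaultsite=your.main.website vhost
```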







 Squid follows that behavior by default.

 FYI, there are some additional monitor* options to fine-tune recovery.


What are they?



http://www.squid-cache.org/Versions/v2/2.6/cfgman/cache_peer.html
http://www.squid-cache.org/Versions/v3/3.0/cfgman/cache_peer.html


Amos
--
Please use Squid 2.6.STABLE19 or 3.0.STABLE4


Fwd: [squid-users] Reverse proxy for Primary and then Secondary

2008-04-15 Thread Indunil Jayasooriya
 Looks good.
 If you have multiple websites hosted you may need both the accel and vhost
options on the http_port.

Noted, thanks.




-- 
Thank you
Indunil Jayasooriya


[squid-users] Adding Header file

2008-04-15 Thread Paras Fadte
Hi,

I was trying to add the header file stack.h to one of Squid's source
files so that I could define stack data types, but it doesn't
seem to work. Can anybody help me out with this, please?

Thank you.

-plf


[squid-users] Parent selection mechanism?

2008-04-15 Thread Janis

Hi!

Is there some way to tell a secondary proxy which parent  
(cache_peer) to connect to, depending on the received request?


For example, if the secondary receives a request for a page from address  
xxx.xxx.xxx.xxx, it selects parent1; if from yyy.yyy.yyy.yyy, parent2; etc.


The reason: parent1 serves external zone1 better (it has no traffic  
limits, for example), while parent2 serves zone2 (which has some limits, but  
for a certain range of requests these limits play no role).


Janis


This message was sent using IMP, the Internet Messaging Program.




Re: [squid-users] Parent selection mechanism?

2008-04-15 Thread Amos Jeffries

Janis wrote:

Hi!

Is there some way to tell a secondary proxy which parent 
(cache_peer) to connect to, depending on the received request?


For example, if the secondary receives a request for a page from address 
xxx.xxx.xxx.xxx, it selects parent1; if from yyy.yyy.yyy.yyy, parent2; etc.


The reason: parent1 serves external zone1 better (it has no traffic 
limits, for example), while parent2 serves zone2 (which has some limits, but 
for a certain range of requests these limits play no role).


cache_peer_access.

The ACL part can contain any combination of request ACLs.

http://www.squid-cache.org/Versions/v2/2.6/cfgman/cache_peer_access.html
http://www.squid-cache.org/Versions/v3/3.0/cfgman/cache_peer_access.html
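
For illustration, a minimal sketch of the kind of setup described; the hostnames and domains here are invented placeholders:

```
# two parents, selected per-request by destination domain
cache_peer parent1.example.net parent 3128 3130
cache_peer parent2.example.net parent 3128 3130

acl zone1 dstdomain .zone1.example
acl zone2 dstdomain .zone2.example

cache_peer_access parent1.example.net allow zone1
cache_peer_access parent1.example.net deny all
cache_peer_access parent2.example.net allow zone2
cache_peer_access parent2.example.net deny all
```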

Amos
--
Please use Squid 2.6.STABLE19 or 3.0.STABLE4


Re: [squid-users] Adding Header file

2008-04-15 Thread Amos Jeffries

Paras Fadte wrote:

Hi,

I was trying to add the header file stack.h to one of Squid's source
files so that I could define stack data types, but it doesn't
seem to work. Can anybody help me out with this, please?



Code discussions in squid-dev please.

You need to:
 - add the files to the right sub-directory in the source,
 - add to the matching Makefile.am in all the right places,
 - run bootstrap.sh,
 - run configure,
 - run make check,
 - ... and fix any of the errors that come up about your files.
 - test it works, and then submit a patch and request to have it used.
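
As a rough outline of those steps (the directory and Makefile.am details are guesses and vary by subdirectory; squid-dev is the place for specifics):

```
# from the top of the Squid source tree
#   1. put stack.h in the right sub-directory, e.g. src/
#   2. list it in that directory's Makefile.am (e.g. a *_SOURCES list)
./bootstrap.sh
./configure
make check
```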

NP: there may be other errors in existing code. If any occur, please mention 
them on squid-dev so we can check and fix them.


Amos
--
Please use Squid 2.6.STABLE19 or 3.0.STABLE4


[squid-users] Squid NTLM Auth Failing on Long Passwords

2008-04-15 Thread andrew . lathrop
I appear to have run into an issue with Squid failing to authenticate 
users with long passwords.  I have had a few users that always get a 
username/password prompt box which re-appears even if the correct info is 
entered.  The AD server logs each of the attempts as a bad password.  Squid 
appears to log it as "Empty LM password supplied for user ... 
No-Auth" (only verified for some users).  The only thing I can find in 
common between these users is passwords that are over 14 characters 
in length.  Is this a possible source of the errors/constant password 
prompt?  From doing some reading it appears that the LanMan hash value 
becomes NULL after more than 14 characters are entered as a password.  I'm at 
a loss for a solution short of telling my users that they need to use shorter 
passwords.  Any thoughts are appreciated.  Thanks,

Andrew


RE: [squid-users] Squid2-only plugin from Secure Computing

2008-04-15 Thread Alex Rousskov

On Thu, 2008-03-20 at 17:46 +1100, Adam Carter wrote:
  I would be happy to try to resolve this issue with Secure Computing.
  However, I need more information:
 
  - What exactly is the Secure Computing plugin that supports Squid2 and
  does not support Squid3? Does it have a name and a version number?
 
 I think SmartFilter patches the Squid source, so it is tied to specific
 versions. It certainly adds another option to the configure script.
 You can download it for free from Secure Computing's website and have
 a look. Sorry I can't be more helpful, but I'm not a developer.
 
 SmartFilter 4.2.1 works with Squid 2.6-17.
 
 http://www.securecomputing.com/index.cfm?skey=1326

FYI: We have started talking to Secure Computing regarding Squid3
compatibility of the SmartFilter plugin. I will keep you updated.

Thank you,

Alex.




[squid-users] Site filtering issue

2008-04-15 Thread Sheldon Carvalho
Site filtering issue

I am having issues with filtering of websites. I have set up Squid
2.6.STABLE17 on a Fedora 8 machine. Below is my squid.conf file.
Squid seems to log all sites that are accessed from other stations
but does not filter any of the sites. They all go through.
My denied_domains.acl has
.youtube.com
.hotmail.com
.live.com
But these sites don't seem to get blocked.  I had also issued these
commands, thinking that it had to do with iptables:
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j DNAT --to
192.168.1.1:3128
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT
--to-port 3128

Initially Squid wouldn't work; everything would be blocked, so I
disabled the firewall, which allowed access. So I put in a custom allow for
port 3128, which opened it up, but to all sites.

--
squid.conf
--
visible_hostname vanderpolgroup

http_port 3128

maximum_object_size 32768 KB
maximum_object_size_in_memory 128 KB

cache_mem 256 MB
cache_dir ufs /var/spool/squid 7 32 512

cache_access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl our_network src 192.168.10.0/24
acl to_localhost dst 127.0.0.0/8

acl SSL_ports port 443  # SSL
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 563 70
acl CONNECT method CONNECT


acl custom_allowed_domains dstdomain /etc/squid/allowed_domains.acl
acl custom_denied_domains dstdomain /etc/squid/denied_domains.acl

acl ads_blacklist dstdom_regex /etc/squid/blacklist/ads/domains
acl aggressive_blacklist dstdom_regex /etc/squid/blacklist/aggressive/domains
acl audio-video_blacklist dstdom_regex
/etc/squid/blacklist/audio-video/domains
acl drugs_blacklist dstdom_regex /etc/squid/blacklist/drugs/domains
acl gambling_blacklist dstdom_regex /etc/squid/blacklist/gambling/domains
acl hacking_blacklist dstdom_regex /etc/squid/blacklist/hacking/domains
acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
acl porn_blacklist dstdom_regex /etc/squid/blacklist/porn/domains
acl proxy_blacklist dstdom_regex /etc/squid/blacklist/proxy/domains
acl redirector_blacklist dstdom_regex /etc/squid/blacklist/redirector/domains
acl spyware_blacklist dstdom_regex /etc/squid/blacklist/spyware/domains
acl suspect_blacklist dstdom_regex /etc/squid/blacklist/suspect/domains
acl violence_blacklist dstdom_regex /etc/squid/blacklist/violence/domains
acl warez_blacklist dstdom_regex /etc/squid/blacklist/warez/domains
acl networking_blacklist dstdom_regex /etc/squid/blacklist/networking/domains

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow our_network
http_access deny all
icp_access allow all
#miss_access allow all

http_access allow custom_allowed_domains
http_access deny custom_denied_domains

http_access deny ads_blacklist
http_access deny aggressive_blacklist
http_access deny audio-video_blacklist
http_access deny drugs_blacklist
http_access deny gambling_blacklist
http_access deny hacking_blacklist
http_access deny mail_blacklist
http_access deny porn_blacklist
http_access deny proxy_blacklist
http_access deny redirector_blacklist
http_access deny spyware_blacklist
http_access deny suspect_blacklist
http_access deny violence_blacklist
http_access deny warez_blacklist
http_access deny networking_blacklist

cache_mgr [EMAIL PROTECTED]


Thanks
Sheldon


Re: [squid-users] change in source code

2008-04-15 Thread Guy Helmer

Henrik Nordstrom wrote:

mån 2008-04-14 klockan 20:13 -0700 skrev Anil Saini:
  

Actually, it says in the squid.conf file that the max limit is 32,
but I increased the limit to 60 and the number of DNS processes increased, but I
don't know whether it will affect Squid or not.



If the number of helpers increased then it worked.

The limit of 32 was in old versions of Squid, and it seems the comment
in the dns_children description is a leftover.

Another way is to build Squid without --disable-internal-dns. This will
remove this directive, as it isn't even relevant any more.

Regards
Henrik
  
Is this true for 2.6?  I had to enable dns_defnames in response to a 
customer requirement, and found that I had to rebuild 2.6.18 with 
--disable-internal-dns so dns_defnames would work.


Thanks,
Guy

--
Guy Helmer, Ph.D.
Chief System Architect
Palisade Systems, Inc.



Re: [squid-users] change in source code

2008-04-15 Thread Adrian Chadd
On Tue, Apr 15, 2008, Guy Helmer wrote:

 Is this true for 2.6?  I had to enable dns_defnames in response to a 
 customer requirement, and found that I had to rebuild 2.6.18 with 
 --disable-internal-dns so dns_defnames would work.

Doesn't it pick that stuff up from /etc/resolv.conf w/ the external DNS
code?




Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -



[squid-users] High availability based on Squid process

2008-04-15 Thread Nick Duda
I might need to take this elsewhere, but curious if anyone is doing this 
already.

I need to have a failover Squid proxy server in the event the primary goes 
down...when I say down, I mean Squid is not working. Is there any linux high 
availability (fault tolerance) software solutions that would failover if the 
squid process is not running?

- Nick


Re: [squid-users] change in source code

2008-04-15 Thread Guy Helmer

Adrian Chadd wrote:

On Tue, Apr 15, 2008, Guy Helmer wrote:

  
Is this true for 2.6?  I had to enable dns_defnames in response to a 
customer requirement, and found that I had to rebuild 2.6.18 with 
--disable-internal-dns so dns_defnames would work.



Doesn't it pick that stuff up from /etc/resolv.conf w/ the external DNS
code?

Adrian
  
I expected it to do so, but browsing to unqualified local domain names 
didn't work on 2.6 until I rebuilt with --disable-internal-dns.


I think it worked as I expected in 3.0 (without having to rebuild with 
--disable-internal-dns), which is where I initially enabled dns_defnames 
for my own testing.


Guy

--
Guy Helmer, Ph.D.
Chief System Architect
Palisade Systems, Inc.



Re: [squid-users] Squid NTLM Auth Failing on Long Passwords

2008-04-15 Thread Guido Serassio

Hi,

At 17:38 on 15/04/2008, [EMAIL PROTECTED] wrote:

I appear to have run into an issue with Squid failing to authenticate
users with long passwords.  I have had a few users that always get a
username/password prompt box which re-appears even if the correct info is
entered.  The AD server logs each of the attempts as a bad password.  Squid
appears to log it as "Empty LM password supplied for user ...
No-Auth" (only verified for some users).  The only thing I can find in
common between these users is passwords that are over 14 characters
in length.  Is this a possible source of the errors/constant password
prompt?  From doing some reading it appears that the LanMan hash value
becomes NULL after more than 14 characters are entered as a password.  I'm at
a loss for a solution short of telling my users that they need to use shorter
passwords.  Any thoughts are appreciated.  Thanks,


Which NTLM helper?

LM-based helpers like the ntlm_auth provided with Squid are limited to 
14-character passwords.

This is a LM protocol limit.

Regards

Guido



-

Guido Serassio
Acme Consulting S.r.l. - Microsoft Certified Partner
Via Lucia Savarino, 1   10098 - Rivoli (TO) - ITALY
Tel. : +39.011.9530135  Fax. : +39.011.9781115
Email: [EMAIL PROTECTED]
WWW: http://www.acmeconsulting.it/



[squid-users] Clock sync accuracy importance?

2008-04-15 Thread Jon Drukman
I'm trying to run a Squid accelerator on a server in India, accelerating 
an origin host in the USA.  I don't have a ton of experience with ntpd, 
but I think I have it running properly on both sites.  For whatever 
reason, they are always 20 seconds out of sync.  Squid does not appear 
to cache items on the India server.  It is always contacting the origin 
server on every request, and I assume this is because of the clock 
discrepancy.


Is there any way to tell Squid that a minute or two of drift in either 
direction is OK?


Also, is there any way to find out exactly what decisions Squid is 
making, so I can tell for sure whether it's the clock issue or something 
else?  Maybe my headers aren't correct?


HTTP/1.1 200 OK
Date: Tue, 15 Apr 2008 17:32:23 GMT
Server: Apache/2.0.61 (Unix) PHP/4.4.7 mod_ssl/2.0.61 OpenSSL/0.9.7e 
mod_fastcgi/2.4.2 DAV/2 SVN/1.4.2

X-Powered-By: PHP/5.2.3
Cache-Control: max-age=300, stale-while-revalidate, stale-on-error
Vary: Accept-Encoding
Content-Type: text/html

Should I throw an Expires header in there?
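
(On seeing what Squid is deciding: one way, assuming the standard Squid debug sections and an arbitrary verbosity level, is to raise logging for the refresh-calculation code only.)

```
# squid.conf: default level 1 everywhere, verbose for section 22
# (refresh/freshness calculation); output goes to cache.log
debug_options ALL,1 22,3
```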

-jsd-



Re: [squid-users] High availability based on Squid process

2008-04-15 Thread BJ Tiemessen

-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

Look at Linux HA (www.linux-ha.org), very nice software.

BJ

Nick Duda wrote:
| I might need to take this elsewhere, but curious if anyone is doing
this already.
|
| I need to have a failover Squid proxy server in the event the primary
goes down...when I say down, I mean Squid is not working. Is there any
linux high availability (fault tolerance) software solutions that would
failover if the squid process is not running?
|
| - Nick

- --
BJ Tiemessen
eSoft Inc.
303-444-1600 x3357
[EMAIL PROTECTED]
www.eSoft.com
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.6 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFIBO9oxD4S8yzNNMMRAqTHAJ0fXLOxQgA1ney43aoNh19MjwBjegCfQ10I
l3AEH0WOEf7bhxRUJ+BxkKM=
=fXMa
-END PGP SIGNATURE-


Re: [squid-users] Site filtering issue

2008-04-15 Thread Felix Lazaro Carbonell Carbonell
Sheldon, maybe the line
http_access allow our_network
should go after, and not before,
http_access deny custom_denied_domains
(or maybe you don't need it at all).


Hope this helps.
I'm a beginner.
Regards,
Felix Lazaro Carbonell
 Site filtering issue

 I am having issues with filtering of my websites. I have setup squid
 2.6.STABLE17 over a Fedora 8 machine. Below is my squid.conf file.
 Squid seems to log all sites that are going out from other stations
 but does not filter and of the sites. They all go through.
 My denied_domains.acl has
 .youtube.com
 .hotmail.com
 .live.com
 But these sites don't seem to get blocked out.  I had also issues this
 command thinking that it was to do with Iptables
 iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j DNAT --to
 192.168.1.1:3128
 iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT
 --to-port 3128

 Initially squid wouldn't work; everything would be blocked so I
 disable the firewall which allowed access. SO I put a custom allow to
 port 3128 which opened it up but to all sites.

 --
 squid.conf
 --
 visible_hostname vanderpolgroup

 http_port 3128

 maximum_object_size 32768 KB
 maximum_object_size_in_memory 128 KB

 cache_mem 256 MB
 cache_dir ufs /var/spool/squid 7 32 512

 cache_access_log /var/log/squid/access.log
 cache_log /var/log/squid/cache.log

 acl all src 0.0.0.0/0.0.0.0
 acl manager proto cache_object
 acl localhost src 127.0.0.1/255.255.255.255
 acl our_network src 192.168.10.0/24
 acl to_localhost dst 127.0.0.0/8

 acl SSL_ports port 443  # SSL
 acl Safe_ports port 80  # http
 acl Safe_ports port 21  # ftp
 acl Safe_ports port 443 # https
 acl Safe_ports port 70  # gopher
 acl Safe_ports port 210 # wais
 acl Safe_ports port 1025-65535  # unregistered ports
 acl Safe_ports port 280 # http-mgmt
 acl Safe_ports port 488 # gss-http
 acl Safe_ports port 591 # filemaker
 acl Safe_ports port 777 # multiling http
 acl Safe_ports port 563 70
 acl CONNECT method CONNECT


 acl custom_allowed_domains dstdomain /etc/squid/allowed_domains.acl
 acl custom_denied_domains dstdomain /etc/squid/denied_domains.acl

 acl ads_blacklist dstdom_regex /etc/squid/blacklist/ads/domains
 acl aggressive_blacklist dstdom_regex
 /etc/squid/blacklist/aggressive/domains
 acl audio-video_blacklist dstdom_regex
 /etc/squid/blacklist/audio-video/domains
 acl drugs_blacklist dstdom_regex /etc/squid/blacklist/drugs/domains
 acl gambling_blacklist dstdom_regex
 /etc/squid/blacklist/gambling/domains
 acl hacking_blacklist dstdom_regex
 /etc/squid/blacklist/hacking/domains
 acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
 acl porn_blacklist dstdom_regex /etc/squid/blacklist/porn/domains
 acl proxy_blacklist dstdom_regex /etc/squid/blacklist/proxy/domains
 acl redirector_blacklist dstdom_regex
 /etc/squid/blacklist/redirector/domains
 acl spyware_blacklist dstdom_regex
 /etc/squid/blacklist/spyware/domains
 acl suspect_blacklist dstdom_regex
 /etc/squid/blacklist/suspect/domains
 acl violence_blacklist dstdom_regex
 /etc/squid/blacklist/violence/domains
 acl warez_blacklist dstdom_regex /etc/squid/blacklist/warez/domains
 acl networking_blacklist dstdom_regex
 /etc/squid/blacklist/networking/domains

 http_access allow manager localhost
 http_access deny manager
 http_access deny !Safe_ports
 http_access deny CONNECT !SSL_ports
 http_access allow our_network
 http_access deny all
 icp_access allow all
 #miss_access allow all

 http_access allow custom_allowed_domains
 http_access deny custom_denied_domains

 http_access deny ads_blacklist
 http_access deny aggressive_blacklist
 http_access deny audio-video_blacklist
 http_access deny drugs_blacklist
 http_access deny gambling_blacklist
 http_access deny hacking_blacklist
 http_access deny mail_blacklist
 http_access deny porn_blacklist
 http_access deny proxy_blacklist
 http_access deny redirector_blacklist
 http_access deny spyware_blacklist
 http_access deny suspect_blacklist
 http_access deny violence_blacklist
 http_access deny warez_blacklist
 http_access deny networking_blacklist

 cache_mgr [EMAIL PROTECTED]


 Thanks
 Sheldon







Fwd: Re: [squid-users] Site filtering issue

2008-04-15 Thread Felix Lazaro Carbonell Carbonell

Sheldon, maybe the line
http_access allow our_network
should go after, and not before,
http_access deny custom_denied_domains
(or maybe you don't need it at all).


Hope this helps.
I'm a beginner.
Regards,
Felix Lazaro Carbonell
 Site filtering issue

 I am having issues with filtering of my websites. I have setup squid
 2.6.STABLE17 over a Fedora 8 machine. Below is my squid.conf file.
 Squid seems to log all sites that are going out from other stations
 but does not filter and of the sites. They all go through.
 My denied_domains.acl has
 .youtube.com
 .hotmail.com
 .live.com
 But these sites don't seem to get blocked out.  I had also issues this
 command thinking that it was to do with Iptables
 iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j DNAT --to
 192.168.1.1:3128
 iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT
 --to-port 3128

 Initially squid wouldn't work; everything would be blocked so I
 disable the firewall which allowed access. SO I put a custom allow to
 port 3128 which opened it up but to all sites.

 --
 squid.conf
 --
 visible_hostname vanderpolgroup

 http_port 3128

 maximum_object_size 32768 KB
 maximum_object_size_in_memory 128 KB

 cache_mem 256 MB
 cache_dir ufs /var/spool/squid 7 32 512

 cache_access_log /var/log/squid/access.log
 cache_log /var/log/squid/cache.log

 acl all src 0.0.0.0/0.0.0.0
 acl manager proto cache_object
 acl localhost src 127.0.0.1/255.255.255.255
 acl our_network src 192.168.10.0/24
 acl to_localhost dst 127.0.0.0/8

 acl SSL_ports port 443  # SSL
 acl Safe_ports port 80  # http
 acl Safe_ports port 21  # ftp
 acl Safe_ports port 443 # https
 acl Safe_ports port 70  # gopher
 acl Safe_ports port 210 # wais
 acl Safe_ports port 1025-65535  # unregistered ports
 acl Safe_ports port 280 # http-mgmt
 acl Safe_ports port 488 # gss-http
 acl Safe_ports port 591 # filemaker
 acl Safe_ports port 777 # multiling http
 acl Safe_ports port 563 70
 acl CONNECT method CONNECT


 acl custom_allowed_domains dstdomain /etc/squid/allowed_domains.acl
 acl custom_denied_domains dstdomain /etc/squid/denied_domains.acl



 http_access allow custom_allowed_domains
 http_access deny custom_denied_domains




 Thanks
 Sheldon






[squid-users] cache_peer_domain help

2008-04-15 Thread Nick Duda
I have a transparent 2.6.STABLE19 setup. It's working for just one port-80 
redirect, but now I need it to redirect to a different internal server on port 
80 for a certain domain only.

Can someone help me with a basic config, here is what I need:

- Squid listening in transparent mode on port 80
- Any request to port 80 goes to a cache_peer (internal IIS server) at 
192.168.1.10
- Any request to port 80 for a certain domain, www.example.com, goes to a cache_peer 
(another internal IIS server) at 192.168.1.20

Here is what I suspect the config would look like, but it didn't work 
(example):

http_port 80 accel vhost

cache_peer 192.168.1.10 parent 80 no-query originserver
cache_peer 192.168.1.20 parent 80 no-query originserver

cache_peer_domain 192.168.1.20 example.com

What do I need to do for this to happen?

- Nick
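
A tentative corrected sketch: the name= labels below are invented for readability, and note that cache_peer takes an ICP port argument (0 here) between the HTTP port and the options.

```
http_port 80 accel vhost

cache_peer 192.168.1.10 parent 80 0 no-query originserver name=iis_default
cache_peer 192.168.1.20 parent 80 0 no-query originserver name=iis_example

# send www.example.com to the second peer, everything else to the first
cache_peer_domain iis_example www.example.com
cache_peer_domain iis_default !www.example.com
```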







Re: [squid-users] change in source code

2008-04-15 Thread Henrik Nordstrom
tis 2008-04-15 klockan 11:19 -0500 skrev Guy Helmer:

  Another way is to build Squid without --disable-internal-dns. This will
  remove this directive, as it isn't even relevant any more.

 Is this true for 2.6?

Yes.

 I had to enable dns_defnames in response to a customer requirement,
 and found that I had to rebuild 2.6.18 with --disable-internal-dns so
 dns_defnames would work.

dns_defnames should work with the default internal DNS client as well.
But you need to have the search path defined in resolv.conf using the
search directive.
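
That is, something along these lines (names and addresses are placeholders):

```
# /etc/resolv.conf
search internal.example.com
nameserver 192.0.2.53
```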

Regards
Henrik



Re: [squid-users] change in source code

2008-04-15 Thread Guy Helmer

Henrik Nordstrom wrote:

tis 2008-04-15 klockan 11:19 -0500 skrev Guy Helmer:

  

Another way is to build Squid without --disable-internal-dns. This will
remove this directive, as it isn't even relevant any more.
  
  

Is this true for 2.6?



Yes.

  

I had to enable dns_defnames in response to a customer requirement,
and found that I had to rebuild 2.6.18 with --disable-internal-dns so
dns_defnames would work.



dns_defnames should with with the default internal dns client as well.
But you need to have the search path defined in resolv.conf using the
search directive.

Regards
Henrik
  
Does this mean it ignores the domain directive in resolv.conf?  That's 
what I use on my systems instead of search.


Thanks,
Guy

--
Guy Helmer, Ph.D.
Chief System Architect
Palisade Systems, Inc.



Re: [squid-users] Site filtering issue.... Resolved

2008-04-15 Thread Sheldon Carvalho
Thanks for the reply Felix. I guess that must have helped. I did as
you said, but that seemed to block all the sites, which made me think
that the order of the commands makes a difference. So I followed the
default squid.conf file and put the commands in somewhat the same
order as they are in there. I also had to add in some other
directives.
Well, I have a working squid now. It was just the order that was
messing up everything.
Here is the working config.
I will try to setup SARG along with squid. Lets hope it goes well.

squid.conf
--

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl our_network src 192.168.10.0/24
acl to_localhost dst 127.0.0.0/8

acl SSL_ports port 443 # SSL
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

acl custom_allowed_domains dstdomain /etc/squid/allowed_domains.acl
acl custom_denied_domains dstdomain /etc/squid/denied_domains.acl

acl ads_blacklist dstdom_regex /etc/squid/blacklist/ads/domains
acl aggressive_blacklist dstdom_regex /etc/squid/blacklist/aggressive/domains
acl audio-video_blacklist dstdom_regex /etc/squid/blacklist/audio-video/domains
acl drugs_blacklist dstdom_regex /etc/squid/blacklist/drugs/domains
acl gambling_blacklist dstdom_regex /etc/squid/blacklist/gambling/domains
acl hacking_blacklist dstdom_regex /etc/squid/blacklist/hacking/domains
acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
acl porn_blacklist dstdom_regex /etc/squid/blacklist/porn/domains
acl proxy_blacklist dstdom_regex /etc/squid/blacklist/proxy/domains
acl redirector_blacklist dstdom_regex /etc/squid/blacklist/redirector/domains
acl spyware_blacklist dstdom_regex /etc/squid/blacklist/spyware/domains
acl suspect_blacklist dstdom_regex /etc/squid/blacklist/suspect/domains
acl violence_blacklist dstdom_regex /etc/squid/blacklist/violence/domains
acl warez_blacklist dstdom_regex /etc/squid/blacklist/warez/domains
acl networking_blacklist dstdom_regex /etc/squid/blacklist/networking/domains
acl torrent_blacklist dstdom_regex /etc/squid/blacklist/torrent/domains

http_access allow custom_allowed_domains
http_access deny custom_denied_domains

http_access deny ads_blacklist
http_access deny aggressive_blacklist
http_access deny audio-video_blacklist
http_access deny drugs_blacklist
http_access deny gambling_blacklist
http_access deny hacking_blacklist
http_access deny mail_blacklist
http_access deny porn_blacklist
http_access deny proxy_blacklist
http_access deny redirector_blacklist
http_access deny spyware_blacklist
http_access deny suspect_blacklist
http_access deny violence_blacklist
http_access deny warez_blacklist
http_access deny networking_blacklist
http_access deny torrent_blacklist

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow our_network
http_access deny all
icp_access allow all

http_port 3128

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache_mem 256 MB
cache_dir ufs /var/spool/squid 7 32 512
cache_access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log
cache deny QUERY
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
coredump_dir /var/spool/squid

maximum_object_size 32768 KB
maximum_object_size_in_memory 128 KB

cache_mgr [EMAIL PROTECTED]

Thanks
Sheldon
---

On Tue, Apr 15, 2008 at 12:49 PM, Felix Lazaro Carbonell Carbonell
[EMAIL PROTECTED] wrote:


 Shelton, may be the tag
 http_access allow our_network
 should go after and not before (or may be you don't need it at all)
 http_access denied custom_denied_domains dst etc/squid/denied_domains.acl


 hope to be helpful.
 i'm a beginner.
 Regards,
 Felix Lazaro Carbonell
  Site filtering issue

  I am having issues with filtering of my websites. I have setup squid
  2.6.STABLE17 over a Fedora 8 machine. Below is my squid.conf file.
  Squid seems to log all sites that are going out from other stations
  but does not filter any of the sites. They all go through.
  My denied_domains.acl has
  .youtube.com
  .hotmail.com
  .live.com
  But these sites don't seem to get blocked out.  I had also issued this
  command thinking that it was to do with iptables
  iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j DNAT --to
  192.168.1.1:3128
  iptables -t nat 

Re: [squid-users] change in source code

2008-04-15 Thread Henrik Nordstrom
tis 2008-04-15 klockan 15:44 -0500 skrev Guy Helmer:
 Does this mean it ignores the domain directive in resolv.conf?  That's 
 what I use on my systems instead of search.

Yes, it seems so (from reading the source).

Regards
Henrik



Re: [squid-users] Site filtering issue.... Resolved

2008-04-15 Thread Sheldon Carvalho
Thanks for the reply Felix. I guess that must have helped. I did as
you said, but that seemed to block all the sites, which made me think
that the order of the commands makes a difference. So I followed the
default squid.conf file and put the commands in somewhat the same
order as they are in there. I also had to add in some other
directives.
Well, I have a working squid now. It was just the order that was
messing up everything.
Here is the working config.
I will try to setup SARG along with squid. Lets hope it goes well.

squid.conf
--

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl our_network src 192.168.10.0/24
acl to_localhost dst 127.0.0.0/8

acl SSL_ports port 443 # SSL
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

acl custom_allowed_domains dstdomain /etc/squid/allowed_domains.acl
acl custom_denied_domains dstdomain /etc/squid/denied_domains.acl

acl ads_blacklist dstdom_regex /etc/squid/blacklist/ads/domains
acl aggressive_blacklist dstdom_regex /etc/squid/blacklist/aggressive/domains
acl audio-video_blacklist dstdom_regex /etc/squid/blacklist/audio-video/domains
acl drugs_blacklist dstdom_regex /etc/squid/blacklist/drugs/domains
acl gambling_blacklist dstdom_regex /etc/squid/blacklist/gambling/domains
acl hacking_blacklist dstdom_regex /etc/squid/blacklist/hacking/domains
acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
acl torrent_blacklist dstdom_regex /etc/squid/blacklist/torrent/domains

http_access allow custom_allowed_domains
http_access deny custom_denied_domains

http_access deny ads_blacklist
http_access deny gambling_blacklist
http_access deny hacking_blacklist
http_access deny mail_blacklist
http_access deny torrent_blacklist

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow our_network
http_access deny all
icp_access allow all

http_port 3128

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache_mem 256 MB
cache_dir ufs /var/spool/squid 7 32 512
cache_access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log
cache deny QUERY
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
coredump_dir /var/spool/squid

maximum_object_size 32768 KB
maximum_object_size_in_memory 128 KB

cache_mgr [EMAIL PROTECTED]

Thanks
Sheldon
---

On Tue, Apr 15, 2008 at 12:49 PM, Felix Lazaro Carbonell Carbonell
[EMAIL PROTECTED] wrote:


 Shelton, may be the tag
 http_access allow our_network
 should go after and not before (or may be you don't need it at all)
 http_access denied custom_denied_domains dst etc/squid/denied_domains.acl


 hope to be helpful.
 i'm a beginner.
 Regards,
 Felix Lazaro Carbonell
  Site filtering issue

  I am having issues with filtering of my websites. I have setup squid
  2.6.STABLE17 over a Fedora 8 machine. Below is my squid.conf file.
  Squid seems to log all sites that are going out from other stations
  but does not filter any of the sites. They all go through.
  My denied_domains.acl has
  .youtube.com
  .hotmail.com
  .live.com
  But these sites don't seem to get blocked out.  I had also issued this
  command thinking that it was to do with iptables
  iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j DNAT --to
  192.168.1.1:3128
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT
  --to-port 3128

  Initially squid wouldn't work; everything would be blocked, so I
  disabled the firewall, which allowed access. So I put a custom allow on
  port 3128, which opened it up, but to all sites.

  --
  squid.conf
  --
  visible_hostname vanderpolgroup

  http_port 3128

  maximum_object_size 32768 KB
  maximum_object_size_in_memory 128 KB

  cache_mem 256 MB
  cache_dir ufs /var/spool/squid 7 32 512

  cache_access_log /var/log/squid/access.log
  cache_log /var/log/squid/cache.log

  acl all src 0.0.0.0/0.0.0.0
  acl manager proto cache_object
  acl localhost src 127.0.0.1/255.255.255.255
  acl our_network src 192.168.10.0/24
  acl to_localhost dst 127.0.0.0/8

  acl SSL_ports port 443  # SSL
  acl Safe_ports port 80  # http
  acl Safe_ports port 21  # ftp
  acl Safe_ports port 443 # https
  acl Safe_ports port 70  # 

Re: [squid-users] Configuring cache_peer to use ssl

2008-04-15 Thread Chris Robertson

Janis wrote:

Quoting Chris Robertson [EMAIL PROTECTED]:


So the child Squid is trying to negotiate an SSL connection with a port
on the Parent that's not set up to accept it.  See
http://www.squid-cache.org/Versions/v3/3.0/cfgman/https_port.html for
the proper directive to terminate an SSL connection.


so, on the parent should be the line(s?):

http_port IP:PORT1

for non-ssl connections and

https_port IP:PORT2 cert=self_s_cert.pem key=key.pem 
sslflags=NO_DEFAULT_CA NO_SESSION_REUSE


for ssl connections


That looks reasonable to me.



and on secondary proxy - as was written before?


Just be sure on the secondary proxy to set the cache_peer line to use 
PORT2 on the peer if you want to use SSL connections.  Also be aware 
that if you try to use two ports on the same peer, you are going to have 
to use the name directive on each cache_peer line like...


cache_peer parent.my.domain parent 3128 3130 proxy-only name=port-3128
cache_peer parent.my.domain parent 3129 3130 proxy-only ssl 
sslcert=[blah, blah] name=port-3129
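For completeness, a hedged sketch of both halves of that setup (hostname, ports and certificate paths are placeholders, untested):

```
# Parent: terminate SSL on a second port
http_port 3128
https_port 3129 cert=/etc/squid/cert.pem key=/etc/squid/key.pem

# Child: one cache_peer per parent port, distinguished by name=
cache_peer parent.my.domain parent 3128 3130 proxy-only name=port-3128
cache_peer parent.my.domain parent 3129 3130 proxy-only ssl name=port-3129
```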




Janis


Chris


Re: [squid-users] Accessing cachemgr.cgi

2008-04-15 Thread Chris Robertson

hdkutz wrote:

Hello List,
pretty new to squid 3.0.
Tried to configure cachemgr.cgi.
Problem:
Squid is not listening to his standard port 3128.
It is configured to Listen on port 80.
Apache Webserver is configured to use port 3128.
If I try to access http://proxy:3128/cgi-bin/cachemgr.cgi I'll get
snip
connect 127.0.0.1:80: (111) Connection refused
snip

snipy
[EMAIL PROTECTED] etc]# grep manager squid.conf
acl manager proto cache_object
http_access allow manager localhost 
http_access deny manager

[EMAIL PROTECTED] etc]# grep localhost squid.conf
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
http_access allow manager localhost
http_access allow localhost
[EMAIL PROTECTED] etc]# grep 127.0.0.1 cachemgr.conf 
127.0.0.1

127.0.0.1:80
snipy

Am I missing something?
  


My guess would be that either you have specified an IP address on the 
port line of your squid.conf, which forces Squid to only bind to the 
interface where that IP is assigned, or something is preventing local 
communication (be it SELinux, firewall rules...).
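To illustrate both possibilities (the IP and the swapped 80/3128 ports here are purely example values, following the poster's setup):

```
# squid.conf -- binding to one IP means Squid answers only on that interface:
http_port 192.0.2.1:80    # not reachable via 127.0.0.1
http_port 80              # binds all interfaces, including loopback

# cachemgr.conf -- the entry must match a port Squid actually listens on:
127.0.0.1:80
```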


Chris


Re: [squid-users] cache_peer_domain help

2008-04-15 Thread Chris Robertson

Nick Duda wrote:

I have a transparent 2.6 stable 19 setup. Its working for just one port 80 
redirect but now I need it to redirect to a different internal server on port 
80 for a certain domain only.

Can someone help me with a basic config, here is what I need:

- Squid listening in transparent mode on port 80
- Any request to port 80 goes to a cache_peer (internal IIS server) at 
192.168.1.10
- Any request to port 80 for a certain url www.example.com goes to a cache_peer 
(another internal IIS server) at 192.168.1.20

Here is what I suspect the config would have looked like, but didn't work 
(example):

http_port 80 accel vhost

cache_peer 192.168.1.10 parent 80 no-query originserver
cache_peer 192.168.1.20 parent 80 no-query originserver

cache_peer_domain 192.168.1.20 example.com

What do I need to do for this to happen?
  


You would need two cache_peer_domain lines...

cache_peer_domain 192.168.1.20 .example.com
cache_peer_domain 192.168.1.10 !.example.com

...which would steer all example.com requests (note the leading period 
in the cache_peer_domain line) to 192.168.1.20 and everything else at 
192.168.1.10.
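Put together with the original posting, the whole accelerator config would look something like this (untested sketch):

```
http_port 80 accel vhost

cache_peer 192.168.1.10 parent 80 no-query originserver
cache_peer 192.168.1.20 parent 80 no-query originserver

# the leading dot matches www.example.com etc., not only example.com
cache_peer_domain 192.168.1.20 .example.com
cache_peer_domain 192.168.1.10 !.example.com
```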



- Nick
  


Chris


Re: [squid-users] Site filtering issue.... Resolved

2008-04-15 Thread Chris Robertson

Sheldon Carvalho wrote:

Thanks for the reply Felix. I guess that must have helped. I did as
you said but that seem to block all the sites. Which made me think
that the order of the commands make a difference. Which is why, I
followed the default squid.conf file and put the commands in some what
the same order as it is on there. I also had to add in some other
syntax's
Well, I have a working squid now. It was just the order that was
messing up everything.
Here is the working config.
I will try to setup SARG along with squid. Lets hope it goes well.

squid.conf
--

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl our_network src 192.168.10.0/24
acl to_localhost dst 127.0.0.0/8

  

SNIP

acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
acl torrent_blacklist dstdom_regex /etc/squid/blacklist/torrent/domains

http_access allow custom_allowed_domains
  


This should probably be...

http_access allow our_network custom_allowed_domains

...so you don't end up being an open proxy for anything in your 
custom_allowed_domains file.  Have a look at the FAQ 
(http://wiki.squid-cache.org/SquidFaq/SquidAcl) for more details.
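In context, the relevant part of the posted config would become something like:

```
# only your own clients get the whitelist; outsiders fall through to deny all
http_access allow our_network custom_allowed_domains
http_access deny custom_denied_domains
# ... blacklist denies and the manager/Safe_ports rules as before ...
http_access allow localhost
http_access allow our_network
http_access deny all
```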


Chris


Re: [squid-users] How to check the cache_peer sibling is working?

2008-04-15 Thread Henrik Nordstrom
tis 2008-04-15 klockan 16:23 +0800 skrev John Lui:
 One object has been cached by  server A and not cached by server B.
 when i want to get the object from server B,the X-Cache from server B
 showed that the object was read from the original server not the
 server A.
 What is wrong with my operation?

Try if setting icp_query_timeout 500 makes a difference
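A sketch of the sibling setup on server B with that timeout added (the hostname is a placeholder; icp_query_timeout is in milliseconds):

```
icp_port 3130
icp_access allow all
cache_peer A.example.com sibling 80 3130 proxy-only
# wait up to 500 ms for ICP replies instead of Squid's auto-tuned value
icp_query_timeout 500
```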

Regards
Henrik



Re: [squid-users] Squid NTLM Auth Failing on Long Passwords

2008-04-15 Thread Henrik Nordstrom
tis 2008-04-15 klockan 11:38 -0400 skrev [EMAIL PROTECTED]:

 appears to log it as Empty LM password supplied for user ... 

Which ntlm helper are you using?

The ntlm_auth helper from Samba is recommended. Avoid the helpers
shipped with Squid, those are not very good and only supports now
obsolete LM hashes...
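A typical stanza using Samba's helper looks something like this (the helper path varies by distribution):

```
auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 10

acl authenticated proxy_auth REQUIRED
http_access allow authenticated
```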

Regards
Henrik



Re: [squid-users] cache_peer_domain help

2008-04-15 Thread Henrik Nordstrom
tis 2008-04-15 klockan 15:51 -0400 skrev Nick Duda:
 cache_peer 192.168.1.10 parent 80 no-query originserver
 cache_peer 192.168.1.20 parent 80 no-query originserver
 
 cache_peer_domain 192.168.1.20 example.com

The above should be

cache_peer_domain 192.168.1.20 .example.com

and you also need

cache_peer_domain 192.168.1.10 !.example.com


The . is to match the whole domain; without it, it only matches example.com,
not www.example.com.

Regards
Henrik



Re: [squid-users] change in source code

2008-04-15 Thread Amos Jeffries
 Henrik Nordstrom wrote:
 mån 2008-04-14 klockan 20:13 -0700 skrev Anil Saini:

 actually its is thr in squid.conf fine that max limit is 32
 but i increased the limit to 60..and no of dns processes
 increases...but i
 dont know it will effect the squid or not


 If the number of helpers increased then it worked.

 The limit of 32 was in old versions of Squid, and it seems the comment
 in the dns_children description is a leftover.

 Another way is to build Squid without --disable-internal-dns. This will
 remove this directive as it isn't even relevant any more..

 Regards
 Henrik

 Is this true for 2.6?  I had to enable dns_defnames in response to a
 customer requirement, and found that I had to rebuild 2.6.18 with
 --disable-internal-dns so dns_defnames would work.

Yes. That option is available for the internal-DNS engine. You just need
to configure the OS searchpath properly. Squid will append the domain
suffixes and test the resulting FQDNs in DNS.

In Linux/BSDs the /etc/resolv.conf file has a search option taking the
search paths in the sequence to be tested. First-match is used.
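As an illustration of that first-match behaviour (domains are placeholders):

```
# /etc/resolv.conf
search branch.example.com example.com

# a request for the unqualified name "intranet" is tried as:
#   intranet.branch.example.com   <- first match wins
#   intranet.example.com
```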

Amos




Re: [squid-users] change in source code

2008-04-15 Thread Amos Jeffries
 Adrian Chadd wrote:
 On Tue, Apr 15, 2008, Guy Helmer wrote:


 Is this true for 2.6?  I had to enable dns_defnames in response to a
 customer requirement, and found that I had to rebuild 2.6.18 with
 --disable-internal-dns so dns_defnames would work.


 Doesn't it pick that stuff up from /etc/resolv.conf w/ the external DNS
 code?

 Adrian

 I expected it to do so, but browsing to unqualified local domain names
 didn't work on 2.6 until I rebuilt with --disable-internal-dns.

 I think it worked as I expected in 3.0 (without having to rebuild with
 --disable-internal-dns), which is where I initially enabled dns_defnames
 for my own testing.


Interesting, the code there is identical for both 2.6 and 3.x.
A run of internal DNS with debug options ALL,1 78,6 should show what the
problem is in 2.6.
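A sketch of that debug setup (the squid.conf directive is debug_options):

```
# squid.conf: default verbosity 1, DNS internals (section 78) at level 6
debug_options ALL,1 78,6
```

Then reload with squid -k reconfigure; the output lands in cache.log.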

Amos




RE: [squid-users] Squid2-only plugin from Secure Computing

2008-04-15 Thread Adam Carter
  I think SmartFilter patches the squid source, so is tied to specific
  versions. It certainly adds another option to the configure script.
  You can download it for free from SecureComputing's website and have
  look. Sorry I cant be more helpful but I'm not a developer.
 
  Smartfilter 4.2.1 works with squid 2.6-17.
 
  http://www.securecomputing.com/index.cfm?skey=1326

 FYI: We have started talking to Secure Computing regarding Squid3
 compatibility of the SmartFilter plugin. I will keep you updated.

Thanks Alex, good to hear. Hopefully you can come up with a model that will 
allow us to apply squid bugfixes without compromising Secure Computing support.


Re: [squid-users] Squid2-only plugin from Secure Computing

2008-04-15 Thread Adrian Chadd
On Wed, Apr 16, 2008, Adam Carter wrote:

   I think SmartFilter patches the squid source, so is tied to specific
   versions. It certainly adds another option to the configure script.
   You can download it for free from SecureComputing's website and have
   look. Sorry I cant be more helpful but I'm not a developer.
  
   Smartfilter 4.2.1 works with squid 2.6-17.
  
   http://www.securecomputing.com/index.cfm?skey=1326
 
  FYI: We have started talking to Secure Computing regarding Squid3
  compatibility of the SmartFilter plugin. I will keep you updated.
 
 Thanks Alex, good to hear. Hopefully you can come up with a model that will 
 allow us to apply squid bugfixes without compromising Secure Computing support.

Well, we could also talk to them about rolling their existing patches
into the Squid-2 codebase.




Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


Re: [squid-users] Site filtering issue.... Resolved

2008-04-15 Thread Amos Jeffries
 Thanks for the reply Felix. I guess that must have helped. I did as
 you said but that seem to block all the sites. Which made me think
 that the order of the commands make a difference. Which is why, I
 followed the default squid.conf file and put the commands in some what
 the same order as it is on there. I also had to add in some other
 syntax's
 Well, I have a working squid now. It was just the order that was
 messing up everything.
 Here is the working config.
 I will try to setup SARG along with squid. Lets hope it goes well.

There is one more point to fix.
From your example, all those 'dstdom_regex' ACLs should be just 'dstdomain'.
Much faster and more scalable for long lists.
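For example, the ads list would then look like this (the domains shown are placeholders; the leading dot covers subdomains):

```
# /etc/squid/blacklist/ads/domains -- one plain domain per line
.ads.example.net
.banners.example.org

# squid.conf
acl ads_blacklist dstdomain "/etc/squid/blacklist/ads/domains"
http_access deny ads_blacklist
```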


 acl ads_blacklist dstdom_regex /etc/squid/blacklist/ads/domains
 acl aggressive_blacklist dstdom_regex
 /etc/squid/blacklist/aggressive/domains
 acl audio-video_blacklist dstdom_regex
 /etc/squid/blacklist/audio-video/domains
 acl drugs_blacklist dstdom_regex /etc/squid/blacklist/drugs/domains
 acl gambling_blacklist dstdom_regex
 /etc/squid/blacklist/gambling/domains
 acl hacking_blacklist dstdom_regex /etc/squid/blacklist/hacking/domains
 acl mail_blacklist dstdom_regex /etc/squid/blacklist/mail/domains
 acl porn_blacklist dstdom_regex /etc/squid/blacklist/porn/domains
 acl proxy_blacklist dstdom_regex /etc/squid/blacklist/proxy/domains
 acl redirector_blacklist dstdom_regex
 /etc/squid/blacklist/redirector/domains
 acl spyware_blacklist dstdom_regex /etc/squid/blacklist/spyware/domains
 acl suspect_blacklist dstdom_regex /etc/squid/blacklist/suspect/domains
 acl violence_blacklist dstdom_regex
 /etc/squid/blacklist/violence/domains
 acl warez_blacklist dstdom_regex /etc/squid/blacklist/warez/domains
 acl networking_blacklist dstdom_regex
 /etc/squid/blacklist/networking/domains
 acl torrent_blacklist dstdom_regex /etc/squid/blacklist/torrent/domains


Amos




[squid-users] squid-2 fork - cacheboy (again)

2008-04-15 Thread Adrian Chadd
G'day,

For those of you who don't remember, about two years ago I became annoyed at
trying to maintain a local Squid-2.5 installation with half a dozen different
patches which everyone used to 'fix' Squid-2.5 to perform well.

I called this derivative cacheboy and after a few weeks (and a few users!)
the resulting patchsets were pulled out from my repository and fed into what
became Squid-2.6.

(The original email: 
http://www.squid-cache.org/mail-archive/squid-users/200604/0615.html)

Well, I'm at that point again, where I'd like to do some large-scale work to
the Squid-2 tree to fix a whole range of longstanding performance and codebase
issues to help the codebase move forward. Unfortunately this clashes with the
general Squid project direction of developing Squid-3.

The Squid-2 codebase has been in maintenance mode for a number of years now,
and in the interval between Squid-3's announcement and today a large number
of replacement open-source HTTP proxy/cache/routing projects have appeared,
all of which perform much better than Squid in one particular area or another.

So I've decided to fork the Squid-2 project again into another cacheboy
derivative, separate from the Squid project. I'm going to pursue a different
set of short-term and medium-term goals while focusing on maintaining the
relative maturity of the Squid-2 codebase.

I'm going to continue offering commercial Squid support and development for
the foreseeable future.

You can find the current project details at http://code.google.com/p/cacheboy .

I'll set up a proper project homepage to coincide with the first stable release,
which I hope will be in a couple of weeks. This will be based on Squid-2.HEAD
with the initial code reorganisation I've been working on. It should be just
as stable as Squid-2.HEAD but the reorganisation should lead to further
short-term improvements.

I wish everyone working on Squid-3 the best of luck for the future.



Adrian

-- 
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -