[squid-users] Basic PAM authentication problem with Squid on Mandrake 9.0
I have an understanding problem with squid-2.4 on Red Hat 9.0. My settings are:

For squid authentication:

authenticate_program /usr/lib/squid/pam_auth
authenticate_children 25
authenticate_ttl 1 hour
authenticate_ip_ttl 7200 seconds
authenticate_ip_ttl_is_strict on
acl localnet1 src 192.168.20.0/255.255.255.0
acl abonnes proxy_auth REQUIRED
http_access allow localnet1 abonnes
http_access deny all

For PAM (/etc/pam.d/squid):

auth required /lib/security/pam_stack.so service=system-auth
account required /lib/security/pam_stack.so service=system-auth

But authentication succeeds for some Unix accounts and fails for others. Moreover, if I run the test on the server with /usr/lib/squid/pam_auth directly, the problem is the same. Does this version (squid-2.4.STABLE7-2mdk) have a problem with Mandrake 9.0? What can I do? Please HELP.

--
Happy are those conscious of their spiritual poverty. - Mat 5:3 (Holy Scriptures - New World Translation)
This message was sent from the mail server of the Université de Lomé.
Université de Lomé, BP 1515, Lomé, TOGO.
Re: [squid-users] Multiple instances of Squid necessary for multiple IP's?
Adrian Chadd wrote:
> On Sat, Nov 03, 2007, Reid wrote:
>> I'm running Squid on a server that has 3 IP addresses, and clients can connect to the proxy using any of the 3 IPs. Currently the outgoing IP always appears as a single IP, but I want the outgoing IP to appear as the IP that the client connected to. Is that possible without running 3 different instances of Squid on the server?
>
> I think you can use ACLs to match the outgoing IP selection based on the incoming IP.
> http://www.squid-cache.org/Versions/v2/2.6/cfgman/tcp_outgoing_address.html
>
> So you could do something like:
>
> http_port 1.1.1.1:3128
> http_port 2.2.2.2:3128
> http_port 3.3.3.3:3128
>
> acl dstip1 myip 1.1.1.1
> acl dstip2 myip 2.2.2.2
> acl dstip3 myip 3.3.3.3
>
> tcp_outgoing_address 1.1.1.1 dstip1
> tcp_outgoing_address 2.2.2.2 dstip2
> tcp_outgoing_address 3.3.3.3 dstip3
>
> Maybe? I haven't tested it.

Yes, that is the way to do it.

Amos
Re: [squid-users] Basic PAM authentication problem with Squid on Mandrake 9.0
Edjé wrote:
> I have an understanding problem with squid-2.4 on Red Hat 9.0. [...]
> But authentication succeeds for some Unix accounts and fails for others. Moreover, if I run the test on the server with /usr/lib/squid/pam_auth directly, the problem is the same. Does this version (squid-2.4.STABLE7-2mdk) have a problem with Mandrake 9.0? What can I do?

Try a recent version of Squid. 2.6 has been the current stable release for a few years, and 3.0 is almost out the door. If the problem is still around after that, ask again.

Amos
Re: [squid-users] Basic PAM authentication problem with Squid on Mandrake 9.0
OK, I'll try it.

Selon Amos Jeffries [EMAIL PROTECTED]:
> Try a recent version of Squid. 2.6 has been the current stable release for a few years, and 3.0 is almost out the door. If the problem is still around after that, ask again.
>
> Amos
[squid-users] Squid cluster - flat or hierarchical
Hi,

I have 4 Squid 2.6 reverse proxy servers sitting behind an LVS load balancer with 1 public IP address. In order to improve the hit rate, all 4 servers peer with each other using ICP:

squid1 - sibling squid{2,3,4}
squid2 - sibling squid{1,3,4}
squid3 - sibling squid{1,2,4}
squid4 - sibling squid{1,2,3}

This works fine, apart from lots of warnings about forwarding loops in cache.log. I would like to ensure that the configs are optimized for an upcoming big traffic event. Can I disregard these forwarding loops and keep my squids in a flat structure, or should I break them up into parent/sibling relationships? Will the forwarding loop errors I am experiencing cause issues during a quick surge in traffic?

Thanks,
John
Re: [squid-users] Trying to troubleshoot a squid redirector error
On Fri, 2007-11-02 at 12:01 -0500, ying lcs wrote:
> I am trying to set up a Squid redirector on squid 2.6 STABLE 16 based on content type, i.e. if Squid sees content type == text/plain, it redirects to 'http://127.0.0.1/dummy.txt'. A kind person helped me with this configuration for my needs:
>
> acl plain_content rep_mime_type -i text/plain
> redirect_program /usr/local/bin/myscript
> redirector_access allow plain_content
> #http_reply_access allow all
> icp_access allow all
>
> But I am still not able to get that to work. For example, when I request 127.0.0.1/plain.txt via Squid, I expect to get the content of 127.0.0.1/dummy.txt (since Squid redirects it based on content type), but I am getting the content of 127.0.0.1/plain.txt. Here is my access.log:
>
> 1193971388.409 15 127.0.0.1 TCP_MISS/200 357 GET http://127.0.0.1/plain.txt - DIRECT/127.0.0.1 text/plain

Redirector programs work on HTTP requests. The reply content type is an HTTP response property. By the time Squid knows the type, it is too late to redirect anything.

Alex.
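A redirector therefore has to decide from request-time data only. Below is a minimal sketch of a redirector helper for the classic Squid protocol (one "URL client-ip/fqdn ident method" line per request on stdin, one rewritten-or-unchanged URL per line on stdout). Since the real content type is unknowable at request time, it approximates text/plain by the .txt extension; the script and matching rule are illustrative assumptions, not the poster's actual setup:

```python
#!/usr/bin/env python
# Hypothetical Squid redirector sketch. Squid writes one request per line
# to stdin ("URL client-ip/fqdn ident method") and reads the answer line
# back: the rewritten URL, or the original URL to leave it untouched.
import sys

def rewrite(url):
    # Assumption: stand in for "content type text/plain" with a .txt
    # extension check, because the reply headers do not exist yet.
    if url.endswith(".txt"):
        return "http://127.0.0.1/dummy.txt"
    return url

def main():
    for line in sys.stdin:
        parts = line.split()
        if not parts:
            continue
        sys.stdout.write(rewrite(parts[0]) + "\n")
        sys.stdout.flush()  # Squid waits for each answer; do not buffer

if __name__ == "__main__":
    main()
```

With redirect_program pointed at a script like this, the rewrite happens before the fetch, which is the only point a redirector can act.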
Re: [squid-users] squid3 WindowsUpdate failed
On Sun, 2007-11-04 at 19:30 +1300, Amos Jeffries wrote:

I have just had the opportunity to do WU on a customer's box and managed to reproduce one of the possible WU failures. This one was using WinXP and the old WindowsUpdate (NOT MicrosoftUpdate, that remains untested). With squid configured to permit client access to:

# Windows Update / Microsoft Update
redir.metaservices.microsoft.com
images.metaservices.microsoft.com
c.microsoft.com
windowsupdate.microsoft.com
# WinXP / Win2k
.update.microsoft.com
download.windowsupdate.com
# Win Vista
.download.windowsupdate.com
# Win98
wustat.windows.com
crl.microsoft.com

AND also CONNECT access to www.update.microsoft.com:443

PROBLEM: The client box detects a needed update, then during the Download Updates phase it says "...failed!" and stops.

CAUSE: This was caused by a bug in squid reading the ACL:

download.windowsupdate.com
...
.download.windowsupdate.com

- Squid would detect that download.windowsupdate.com was a subdomain of .download.windowsupdate.com, and .download.windowsupdate.com would be culled off the ACL as unneeded.
- That culled bit held the wildcard letting v4.download.* and www.download.* be retrieved later in the process.
- BUT, specifying JUST .download.windowsupdate.com would cause download.windowsupdate.com/fubar to FAIL under the same circumstances.

During the WU process, requests for the application at www.download.windowsupdate.com/fubar and K/Q updates at v(3|4|5).download.windowsupdate.com/fubar2 would result in a 403 and thus the FAIL.

SOLUTION: Changing the wildcard match to an explicit form fixes this, and WU succeeds again. OR, changing the wildcard to .windowsupdate.com also fixes the problem for this test.

Can other folks experiencing Windows Update troubles with Squid3 confirm that their setup does not have the same ACL problem?
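A sketch of the second workaround described above — collapsing the overlapping explicit name and wildcard into one broader wildcard, so the culling bug has nothing to cull (the ACL name and layout here are illustrative, not the exact tested config):

```
# Avoid listing both an explicit host and a wildcard that covers it
# (download.windowsupdate.com AND .download.windowsupdate.com).
# One wildcard covering the whole zone sidesteps the culling bug:
acl windowsupdate dstdomain .windowsupdate.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain windowsupdate.microsoft.com c.microsoft.com
acl windowsupdate dstdomain wustat.windows.com crl.microsoft.com
http_access allow windowsupdate
```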
In general, if we do not find a way to get more information about the Windows Update problem, we would have to assume it does not exist in most environments and release Squid3 STABLE as is. If you want the problem fixed before the stable Squid3 release, please help us reproduce or debug the problem. Thank you, Alex.
[squid-users] Quick question about a cache.log issue
In my cache.log I am getting:

2007/11/05 09:23:42| The request GET http://cp.slalom.com/ is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for GET http://cp.slalom.com/ is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request GET http://cp.slalom.com/ is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for GET http://cp.slalom.com/ is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request GET http://cp.slalom.com/ is ALLOWED, because it matched 'ProxyUsers'
2007/11/05 09:23:42| The reply for GET http://cp.slalom.com/ is ALLOWED, because it matched 'all'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'ProxyUsers'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'ProxyUsers'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is DENIED, because it matched 'password'
2007/11/05 09:23:42| The reply for CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'password'
2007/11/05 09:23:42| The request CONNECT cp.slalom.com:443 is ALLOWED, because it matched 'ProxyUsers'

In my conf I only see two lines that have password in them:

# Use domain authentication (-G for domain global group)
external_acl_type win_domain_group ttl=120 %LOGIN e:/squid/libexec/mswin_check_lm_group.exe -G
# Users must be in the ProxyUsers group in AD (individual users no groups)
acl ProxyUsers external win_domain_group ProxyAccess
acl NoProxyUsers external win_domain_group NoProxyAccess
# Require password for user account
acl password proxy_auth REQUIRED
http_access allow password ProxyUsers

Do I have conflicting lines in my conf that would cause this behavior, or are these normal entries for cache.log?

Eric Young
Senior Network Engineer
Tully's Coffee Corporation
206.695.6504
[squid-users] Full domain block
Alas, it was all so perfectly planned. Grab some blacklists from Shalla - http://www.shallalist.de/ - and hook the domain lists into squid using dstdomain. Unfortunately, it seems squid's interpretation of domain names is incredibly literal, so rather than youtube.com blocking *.youtube.com, we in fact find that while youtube.com is blocked, www.youtube.com is just hunky dory, because squid literally is blocking nothing but youtube.com. Since I'm running squidNT, getting squidguard to run is a bit of a pain, since I'll need to get cygwin up and running and then it all feels like a bit of a hack. Is squidguard my only route here, or is there a way to tell squid to be rather more expansive in its domain name interpretation? Ideally this is something I need to get in place quickly.

Paul
IT Systems Admin

TNT Post is the trading name for TNT Post UK Ltd (company number: 04417047), TNT Post (Doordrop Media) Ltd (00613278), TNT Post Scotland Ltd (05695897), TNT Post North Ltd (05701709) and TNT Post South West Ltd (05983401). Emma's Diary and Lifecycle are trading names for Lifecycle Marketing (Mother and Baby) Ltd (02556692). All companies are registered in England and Wales; registered address: 1 Globeside Business Park, Fieldhouse Lane, Marlow, Buckinghamshire, SL7 1HY.
Re: [squid-users] Full domain block
Paul Cocker wrote:
> Unfortunately, it seems squid's interpretation of domain names is incredibly literal, so rather than youtube.com blocking *.youtube.com, we in fact find that while youtube.com is blocked, www.youtube.com is just hunky dory [...] Is squidguard my only route here, or is there a way to tell squid to be rather more expansive in its domain name interpretation?

You could probably pre-process the domain blacklist file by doing some variant of the following (untested sketch):

while read -r domain; do
    echo "$domain"
    echo ".$domain"   # squid dstdomain uses a leading dot, not '*.', for subdomain matches
done < blacklist.txt > newblacklist.txt

and use the output as your new domain blacklist.

Cheers,
/Jason
[squid-users] Optimal maximum cache size
Is there such a thing as too much disk cache? Presumably squid has to have some way of checking this cache, and at some point it takes longer to look for a cached page than to serve it direct. At what point do you hit that sort of problem, or is it so large no human mind should worry? :)

Paul
IT Systems Admin
RE: [squid-users] Full domain block
You'll have to modify each domain entry for squid dstdomain. The line containing youtube.com has to be .youtube.com in order for squid to block the entire domain.

Thomas J. Raef
e-Based Security, LLC
www.ebasedsecurity.com
1-866-838-6108
"You're either hardened, or you're hacked!"

-Original Message-
From: Paul Cocker [mailto:[EMAIL PROTECTED]
Sent: Monday, November 05, 2007 12:36 PM
To: squid-users@squid-cache.org
Subject: [squid-users] Full domain block

> It seems squid's interpretation of domain names is incredibly literal, so rather than youtube.com blocking *.youtube.com, we in fact find that while youtube.com is blocked, www.youtube.com is just hunky dory [...] is there a way to tell squid to be rather more expansive in its domain name interpretation?
[squid-users] squidGuard 1.3.0 released
We are pleased to announce the availability of release 1.3.0 of squidGuard. squidGuard-1.3.0 is based on the original squidGuard-1.2.0 codebase, but has many new publicly available enhancements and features which have been developed over the six years since squidGuard-1.2.0 was released; these have now been rolled into this formal squidGuard-1.3.0 release. This version also adds native Windows support using the MSYS+MinGW build environment.

The new release can be downloaded from the squidGuard SourceForge project:
http://sourceforge.net/project/showfiles.php?group_id=184120

The most important new additions in this squidGuard-1.3.0 release are:

* Imported squidguard-sed.patch from the K12LTSP project. This allows squidGuard to rewrite the Google URL with the safe=active tag
* Updated the redirector protocol to the Squid 2.6 version
* Imported netdirect-squidGuard-full.patch based on the work of Chris Frey and Adam Gorski
* Native Windows port using the MSYS+MinGW environment

We openly welcome and encourage bug reports should you run into any issues with the new release. Bug reports can be entered into the squidGuard Bug Tracker at:
http://sourceforge.net/tracker/?group_id=184120&atid=907981

This squidGuard-1.3.0 software was brought to you by Guido Serassio and Norbert Szasz, and is mainly based on many third-party contributions made available over the years. Many thanks to all contributors who have submitted new features. This work is not related in any way to the so-called official squidGuard project at the new www.squidguard.org.

Note: If there is interest in becoming an official sponsor for the ongoing squidGuard maintenance or development efforts, please contact us via the project forum at http://sourceforge.net/forum/?group_id=184120

Best regards,
Guido Serassio
Norbert Szasz
[squid-users] FreeBSD, enable or not memory_pools
Hello!

Which is best for FreeBSD: enabled or disabled memory_pools? FreeBSD 6.2 amd64.

Regards!

--
Sds.
Alexandre J. Correa
Onda Internet / OPinguim.net
http://www.ondainternet.com.br
http://www.opinguim.net
Re: [squid-users] Quick question about a cache.log issue
> In my cache.log I am getting

Looks to me like:

> 2007/11/05 09:23:42| The request GET http://cp.slalom.com/ is DENIED, because it matched 'password'

Someone forgot their password, or the browser's first request for the item.

> 2007/11/05 09:23:42| The reply for GET http://cp.slalom.com/ is ALLOWED, because it matched 'password'

Remembered password, or the browser passed it on this time.

> [remainder of log lines and config snipped]
>
> Do I have conflicting lines in my conf that would cause this behavior or are these normal entries for cache.log?

Everything in cache.log is normal for cache.log. Whether they are normal entries under your configuration is a knottier question. Without knowing the rest of the squid.conf we can't answer whether any of the unknown lines are conflicting. The one access line you have listed could only cause:
- allowed because of 'ProxyUsers'
- denied because of 'ProxyUsers'

Give http://squid.treenet.co.nz/cf.check/ a go and post me the Ref:.

Amos
Re: [squid-users] Optimal maximum cache size
> Is there such a thing as too much disk cache? Presumably squid has to have some way of checking this cache, and at some point it takes longer to look for a cached page than to serve it direct. At what point do you hit that sort of problem, or is it so large no human mind should worry? :)
>
> Paul
> IT Systems Admin

Disk cache is limited by access time and, ironically, RAM. Squid holds an in-memory index costing roughly 10 MB of RAM per GB of disk cache. With large disk caches this can fill RAM pretty fast, particularly if the cache is full of small objects. Large objects use less index space and more disk. Some people with smaller systems hit the limit at 20-100 GB; others in cache farms reach TB scale.

As for the speed of lookup vs DIRECT: if anyone has stats, please let us know.

Amos
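The rule of thumb above turns into quick arithmetic when sizing a box. A sketch, assuming the ~10 MB of RAM per GB of disk figure (an approximation; real usage depends on mean object size):

```python
# Estimate Squid's in-memory index size from cache_dir capacity, using
# the rough 10 MB-RAM-per-GB-disk ratio mentioned above.
def index_ram_mb(disk_cache_gb, mb_per_gb=10):
    return disk_cache_gb * mb_per_gb

# A 100 GB cache_dir implies on the order of 1000 MB of RAM for the
# index alone, before cache_mem and OS buffers are counted.
print(index_ram_mb(100))
```

So a machine with, say, 2 GB of RAM comfortably indexes a few tens of GB of disk cache; much beyond that, the index starts competing with cache_mem and the OS page cache.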
RE: [squid-users] Quick question about a cache.log issue
Thanks Amos, the tool (http://squid.treenet.co.nz/cf.check/) was very ... enlightening. I think this squid newbie will work through some of the errors and warnings before I post back. :)

Thanks,
Eric Young

-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED]
Sent: Monday, November 05, 2007 3:33 PM
To: Eric Young
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Quick question about a cache.log issue

> Everything in cache.log is normal for cache.log. Whether they are normal entries under your configuration is a knottier question. [...] Give http://squid.treenet.co.nz/cf.check/ a go and post me the Ref:.
>
> Amos
RE: [squid-users] Quick question about a cache.log issue
> Thanks Amos, the tool (http://squid.treenet.co.nz/cf.check/) was very ... enlightening. I think this squid newbie will work through some of the errors and warnings before I post back. :)

It's new code and still undergoing some extensions. Some items listed as 'not present in version X' actually are present, but have not had checks added to the tool yet. Also, some ACL logic is beyond its ability to discern for now. So I still like to check the output manually, both to fix it and to give you slightly better feedback than it would.

Amos
Re: [squid-users] FreeBSD, enable or not memory_pools
Hi Alexandre,

Alexandre Correa wrote:
> Hello!
> Which is best for FreeBSD: enabled or disabled memory_pools? FreeBSD 6.2 amd64.

The default value seems to work fine for me. But you are free to experiment with it and report back your results!

--
With best regards and good wishes,

Yours sincerely,

Tek Bahadur Limbu
System Administrator (TAG/TDG Group)
Jwl Systems Department
Worldlink Communications Pvt. Ltd.
Jawalakhel, Nepal
http://www.wlink.com.np
http://teklimbu.wordpress.com
Re: [squid-users] FreeBSD, enable or not memory_pools
I'm using:

memory_pools on
memory_pools_limit 16 MB

Working fine. :)

On Nov 6, 2007 3:31 AM, Tek Bahadur Limbu [EMAIL PROTECTED] wrote:
> The default value seems to work fine for me. But you are free to experiment with it and report back your results!

--
Sds.
Alexandre J. Correa
Onda Internet / OPinguim.net
http://www.ondainternet.com.br
http://www.opinguim.net
Re: [squid-users] Squid cluster - flat or hierarchical
John Moylan wrote:
> I have 4 Squid 2.6 reverse proxy servers sitting behind an LVS loadbalancer with 1 public IP address. In order to improve the hit rate all 4 servers are peering with each other using ICP. [...] Can I disregard these forwarding loops and keep my squids in a flat structure or should I break them up into parent sibling relationships? Will the forwarding loop errors I am experiencing cause issues during a quick surge in traffic?

The CARP peering algorithm has been specially designed and added to cope efficiently with large arrays or clusters of squid. AFAIK it's as simple as adding the 'carp' option to your cache_peer lines in place of others such as round-robin.
http://www.squid-cache.org/Versions/v2/2.6/cfgman/cache_peer.html

Amos
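For illustration, a minimal sketch of what that could look like on a front-end box hashing requests across the array (hostnames and ports are placeholders, and CARP treats the others as parents rather than ICP siblings, so it changes the topology described above):

```
# Hypothetical squid.conf fragment: select a backend by CARP URL-hash
# instead of ICP sibling queries, which also removes the loop warnings.
cache_peer squid1.example.com parent 80 0 carp no-query
cache_peer squid2.example.com parent 80 0 carp no-query
cache_peer squid3.example.com parent 80 0 carp no-query
cache_peer squid4.example.com parent 80 0 carp no-query
never_direct allow all
```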
RE: [squid-users] Full domain block
Thanks, chaps. Should be easy enough, as there's a line break prior to each name, so a simple search and replace should nail them all.

Paul Cocker
IT Systems Administrator
TNT Post

-Original Message-
From: Thomas Raef [mailto:[EMAIL PROTECTED]
Sent: 05 November 2007 19:12
To: squid-users@squid-cache.org
Subject: RE: [squid-users] Full domain block

> You'll have to modify each domain entry for squid dstdomain. The line containing youtube.com has to be .youtube.com in order for squid to block the entire domain.
>
> Thomas J. Raef
> e-Based Security, LLC