Re: [squid-users] Newbie and can't figure ACL out
Dear Jess,

> I have restarted the Squid service several times but to no avail. For
> something that seems so simple, I'm just not getting it.

Your ACL definition may be the problem here. Instead of

  acl news1 dstdomain www.foxnews.com www.cnn.com www.espn.com

use the leading-dot form:

  acl news1 dstdomain .foxnews.com .cnn.com .espn.com
  http_access deny news1
  http_access allow all

Can you try now? It will work.

regards
-muthu
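On why the leading-dot form is suggested: in a dstdomain acl, a value beginning with a dot matches the domain and all of its subdomains, while a bare hostname matches only that exact host.

```
# ".cnn.com" matches cnn.com, www.cnn.com, money.cnn.com, ...
# "www.cnn.com" matches only the single host www.cnn.com
acl news1 dstdomain .foxnews.com .cnn.com .espn.com
http_access deny news1
```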
Re: [squid-users] Error-page
> Like this: for the client request http://www.kyk.gov.tr:7779 Squid prints
> ERR_ACCESS_DENIED, but when somebody tries https, e.g.
> https://www.tspakb.org.tr:8445, Squid prints only some lines of the
> ERR_ACCESS_DENIED page. I could not find the reason; if somebody knows, mail me.

Please post your squid.conf ACL and http_access settings, which you can extract with:

  grep -E '^[ ]*acl|^[ ]*http_access' squid.conf

Access is blocked by the http_access settings.

Regards
-Muthu
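The grep one-liner above can be tried against any squid.conf; here is a self-contained sketch with an inline sample config (the config contents are made up for illustration):

```shell
# Print only the acl and http_access lines of a config file.
conf=$(mktemp)
cat > "$conf" <<'EOF'
# comment line
acl all src 0.0.0.0/0.0.0.0
http_port 3128
http_access allow all
EOF
grep -E '^[ ]*acl|^[ ]*http_access' "$conf"
rm -f "$conf"
```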
Re: [squid-users] blocking some ip's for some websites.
> I am using 172.16.0.0 in my internal network and I want to block these IPs:
> 172.16.20.12, 172.16.20.14, 172.16.20.23, 172.16.20.120, 172.16.20.45,
> 172.16.20.67, 172.16.20.89

Make an ACL for them:

  acl blockip src 172.16.20.12/32 172.16.20.14/32 172.16.20.23/32 172.16.20.120/32 172.16.20.45/32 172.16.20.67/32 172.16.20.89/32

> ... from accessing mail.yahoo.com, hotmail.com

(A regex may be required for *.mail.yahoo.*, *.hotmail.* and *.messenger.msn.com.)

> ... and MSN Messenger.

Again, make a new ACL:

  acl blocksite dstdomain .mail.yahoo.com .hotmail.com .messenger.msn.com

> How should I proceed with this setup so that only these users are affected and
> all the rest are allowed to browse any website on the Internet?

Make an http_access rule:

  http_access deny blockip blocksite

Best Regards
-Muthukumar
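A note on how the deny line works: multiple acl names on one http_access line are ANDed, so only requests that match both the source list and the destination list are denied; everyone else falls through. Repeating an acl name on several lines appends to it, which keeps long address lists readable. The full section might look like:

```
acl blockip src 172.16.20.12/32 172.16.20.14/32 172.16.20.23/32 172.16.20.120/32
acl blockip src 172.16.20.45/32 172.16.20.67/32 172.16.20.89/32
acl blocksite dstdomain .mail.yahoo.com .hotmail.com .messenger.msn.com
http_access deny blockip blocksite
http_access allow all
```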
Re: [squid-users] FEATURE-Request - client ip and port or username
Use `diff -Nur oldfile newfile` to make a patch. Refer to http://www.squid-cache.org/Devel/guidelines.html for more.

Best Regards
-Muthukumar
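A minimal illustration of producing a unified patch with `diff -Nur` (the directory and file names are made-up examples; -N treats absent files as empty so newly added files also appear in the patch):

```shell
# Two hypothetical source trees: "old" and "new".
tmp=$(mktemp -d)
mkdir -p "$tmp/old" "$tmp/new"
printf 'int main(void) { return 0; }\n' > "$tmp/old/feature.c"
printf 'int main(void) { return 1; }\n' > "$tmp/new/feature.c"
# diff exits 1 when the trees differ, so don't treat that as an error.
(cd "$tmp" && diff -Nur old new > feature.patch || true)
cat "$tmp/feature.patch"
rm -rf "$tmp"
```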
Re: [squid-users] Number of users currently connected?
> Is there any way I can find out how many users are currently connected to
> Squid? It should give a dynamic status of connected users.

You can use the access.log file to get the username / IP address behind the web requests in a given time window, with a simple Perl or shell script, and automate its execution with cron.

> If Squid cannot do this, is there any open-source utility which does?

Try this, and make it available for everyone to use :-)

Best Regards
-Muthukumar
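A sketch of the access.log approach: Squid's native log format puts the client address in field 3, so counting distinct recent clients is a short pipeline. The log lines below are fabricated samples so the sketch runs anywhere; on a real box you would point it at Squid's access.log (and perhaps tail only the recent part).

```shell
log=$(mktemp)
cat > "$log" <<'EOF'
1101808153.584    120 172.16.20.12 TCP_MISS/200 4312 GET http://example.com/ - DIRECT/- text/html
1101808154.102     80 172.16.20.14 TCP_HIT/200 1293 GET http://example.com/a - NONE/- text/html
1101808155.220     95 172.16.20.12 TCP_MISS/200 2048 GET http://example.com/b - DIRECT/- text/html
EOF
# Distinct client addresses seen in the (recent part of the) log:
awk '{print $3}' "$log" | sort -u | wc -l
rm -f "$log"
```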
[squid-users] web-caching survey - business type
Dear All,

Wishes for the New Year 2005. We are analyzing a web-caching usage-pattern report based on business types: schools and colleges, small-to-medium businesses, enterprises and ISPs. I have tried to get survey data from the following links:

  http://workshop97.ircache.net/minutes.html
  http://www.avantisworld.com/02_cddvd_cds_faqs.asp

  Business Type              Cacheable request %
  Schools, Colleges          45% - 50%
  Small to Medium business   35% - 45%
  Enterprise                 25% - 35%
  ISPs                       20% - 30%

Is there any survey report on this?

thanks
Muthu
[squid-users] capacity planning
Hello All,

We are planning to build a capacity-planning tool for Squid.

Objective of the tool:
1. Based on the number of users, a hardware + Squid configuration will be suggested.
2. Based on a hardware configuration (HDD, RAM, CPU):
   1. the number of users that can be serviced by Squid;
   2. the hardware upgrades needed to reach a required user count.

Using the Polygraph benchmark tool, we are measuring the req/sec rate satisfied by the Squid cache server (2.5, 3.0), and we are preparing metrics for the designed hardware sets (CPU, HDD, RAM). Some of you may already have benchmarked squid-2.5/3.0 on different hardware setups, so we would welcome your measured req/sec rates together with the hardware configuration, Squid configuration tuning and Linux tuning details, to build the metrics. Based on the calculated metrics, we plan to build a GUI for capacity planning.

thanks
muthukumar.
[squid-users] squid-2.5 s7 polygraph benchmarking
Hello All,

When I tried to benchmark squid-2.5.STABLE7, I got a problem with TIME_WAIT sockets on the Polygraph server.

Setup: polyclt == squid == polysrv (client addresses 10.1.1.1-50, server addresses 10.1.129.1-250), with TIME_WAIT piling up on both sides.

Polygraph server:
  model name : Pentium III (Coppermine)
  cpu MHz    : 927.748

Polyclt and Squid running with configuration:
  model name : AMD Athlon(tm) XP 2400+
  cpu MHz    : 2001.015
  cache_dir 4096 MB
  cache_mem 40 MB

Questions:
1. What is the cause of "X-Squid-Error: ERR_CONNECT_FAIL 113" / "HTTP/1.0 503 Service Unavailable"?
2. Do we have to tune kernel parameters for benchmarking Squid?

Error report from polyclient (2.8.1):

  000.89| Xaction.cc:74: error: 16/16 (c19) unsupported HTTP status code
  1101808153.584234# obj: http://10.1.130.98:18256/w0b4af796.794153cd:021a/t04/_0001 flags: basic,GET, size: 0/-1 xact: 0b4af796.794153cd:044e
  HTTP/1.0 503 Service Unavailable
  Server: vicache/2.5.STABLE7
  Mime-Version: 1.0
  Date: Tue, 30 Nov 2004 09:48:22 GMT
  Content-Type: text/html
  Content-Length: 1293
  Expires: Tue, 30 Nov 2004 09:48:22 GMT
  X-Squid-Error: ERR_CONNECT_FAIL 113
  X-Cache: MISS from polyclt2
  Proxy-Connection: keep-alive
  [HTML body of the "ERROR: The requested URL could not be retrieved" (ERR_CONNECT_FAIL) page follows, truncated]

Please suggest a good, easy way of benchmarking Squid!

Thanks
-Muthu
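On the kernel-tuning question: TIME_WAIT exhaustion on load-generation hosts is commonly addressed with TCP sysctl tuning rather than in Squid itself. A sketch of settings often suggested for Linux benchmark hosts (values are illustrative assumptions; verify them against your kernel's ip-sysctl documentation before use):

```
# /etc/sysctl.conf additions (apply with `sysctl -p`)
net.ipv4.tcp_tw_reuse = 1                   # allow reuse of TIME_WAIT sockets for new outbound connections
net.ipv4.tcp_fin_timeout = 30               # shorten the FIN-WAIT-2 timeout
net.ipv4.ip_local_port_range = 1024 65535   # widen the ephemeral port range
```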
Re: Re[2]: [squid-users] squid3 pre3-20041102: WARNING: transparent proxying not supported (visolve not help)
> > Same was the problem when I first created my Squid box from scratch.
> > I had actually been missing:
> >   httpd_accel_host virtual
> >   httpd_accel_port 80
> >   httpd_accel_with_proxy on
> >   httpd_accel_uses_host_header on
> > enabled in my squid.conf

The above configuration applies to the squid-2.5 series; the ViSolve transparent-proxy white paper covers only squid-2.5.

> > or maybe some other problem with your particular config file.
>
> I'm using squid3. These directives make no sense there:
>
>   /var/log/squid# /etc/init.d/squid start
>   Starting proxy server: 2004/11/03 09:42:15| parseConfigFile: 'squid.conf' line 1 unrecognized: 'httpd_accel_host virtual'
>   2004/11/03 09:42:15| parseConfigFile: 'squid.conf' line 2 unrecognized: 'httpd_accel_port 80'
>   2004/11/03 09:42:15| parseConfigFile: 'squid.conf' line 3 unrecognized: 'httpd_accel_with_proxy on'
>   2004/11/03 09:42:15| parseConfigFile: 'squid.conf' line 4 unrecognized: 'httpd_accel_uses_host_header on'

Your squid.conf file contains squid-2.5 tags.

> I need to specify: http_port 10.2.254.1:3113 transparent

That is correct for squid-3.0. Are you redirecting all incoming port-80 requests to port 3113? Check the squid.conf file and which Squid version the /etc/init.d/squid startup script runs.

> The network acls are also good... that's why I don't understand the problem.

--Muthu

---
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.786 / Virus Database: 532 - Release Date: 10/29/2004
Re: [squid-users] squid + epoll polygraph test
Hai Gonzalo,

> I've been using squid3 with epoll support for a couple of months. In my case,
> squid with poll/select did consume up to 100% CPU. With epoll, CPU usage
> dropped to less than 10%.

It seems to be great. How many requests are being generated per second? Are you using squid-3.0-pre3 plus the latest epoll() patch? I am analyzing the requests satisfied per second by squid-3.0pre3 + epoll() here.

Compilation:
  ./configure --prefix=/home/muthu/squidepoll --enable-epoll --with-aufs-threads=32 --with-descriptors=32768 --with-pthreads --enable-storeio=null,ufs,aufs --disable-poll --disable-select --disable-kqueue

Configuration:
  cache_mem 90 MB   (200 MB RAM)
  cache_dir null /dev/null
  cache_access_log none
  cache_store_log none

> Long term average max CPU usage:
> http://webs.uolsinectis.com.ar/garana/x/cpu.4.png
> With epoll, CPU usage over the last 24 hours:
> http://webs.uolsinectis.com.ar/garana/x/cpu.png

Thanks for the information.

Regards
--Muthu
[squid-users] squid + epoll polygraph test
Hello All,

I am preparing epoll() I/O-method benchmarking with Polygraph 2.5.5, with the following setup:

squid + epoll():
  Linux host 2.6.5-1.358 #1 i686 athlon i386 GNU/Linux
  Squid Cache: Version 3.0-PRE3
  configure options: '--prefix=/home/muthu/squidepoll' '--enable-epoll' '--with-aufs-threads=32' '--with-descriptors=32768' '--with-pthreads' '--enable-storeio=null,ufs,aufs' '--disable-poll' '--disable-select' '--disable-kqueue'

Polygraph server-1:
  Linux host 2.4.18-14 #1 i686 i686 i386 GNU/Linux, Polygraph 2.5.5

Polygraph client-1:
  Linux host 2.4.18-14 #1 i686 i686 i386 GNU/Linux, Polygraph 2.5.5

Has anyone benchmarked squid+epoll() with Polygraph? What request-satisfaction limit may I expect on the Linux 2.6.5-1.358 i686 athlon platform? During Polygraph testing, I am getting errors such as:

  004.03| ./Xaction.cc:79: error: 1/1 (267) unsupported HTTP status code
  004.03| ./Xaction.cc:79: error: 2/2 (267) unsupported HTTP status code

--- Polygraph configuration ---

Bench benchPolyMix3 = {
    peak_req_rate = undef();             // must be set
    client_addr_mask = '10.1.0.0';       // may be adjusted
    server_addr_mask = '10.1.0.0:18256'; // may be adjusted
    max_client_load = 800/sec;           // maximum load per Polygraph PC
    max_robot_load = 0.4/sec;            // maximum robot request rate
    client_host_count = undef();         // number of polyclts in the bench
};

ObjLifeCycle olcStatic = {
    birthday = now + const(-1year);      // born a year ago
    length = const(2year);               // two year cycle
    variance = 0%;                       // no variance
    with_lmt = 100%;                     // all responses have LMT
    expires = [nmt + const(0sec)];       // everything expires when modified
};

ObjLifeCycle olcHTML = {
    birthday = now + exp(-0.5year);      // born about half a year ago
    length = logn(7day, 1day);           // heavy tail, weekly updates
    variance = 33%;
    with_lmt = 100%;                     // all responses have LMT
    expires = [nmt + const(0sec)];       // everything expires when modified
};

ObjLifeCycle olcImage = {
    birthday = now + exp(-1year);        // born about a year ago
    length = logn(30day, 7day);          // heavy tail, monthly updates
    variance = 50%;
    with_lmt = 100%;                     // all responses have LMT
    expires = [nmt + const(0sec)];       // everything expires when modified
};

// object life cycle for Download content
ObjLifeCycle olcDownload = {
    birthday = now + exp(-1year);        // born about a year ago
    length = logn(0.5year, 30day);       // almost no updates
    variance = 33%;
    with_lmt = 100%;                     // all responses have LMT
    expires = [nmt + const(0sec)];       // everything expires when modified
};

// object life cycle for Other content
ObjLifeCycle olcOther = {
    birthday = now + exp(-1year);        // born about a year ago
    length = unif(1day, 1year);
    variance = 50%;
    with_lmt = 100%;                     // all responses have LMT
    expires = [nmt + const(0sec)];       // everything expires when modified
};

// PolyMix-1 content
Content cntPolyMix_1 = {
    kind = "polymix-1";                  // just a label
    mime = { type = undef(); extensions = []; };
    size = exp(13KB);
    obj_life_cycle = olcStatic;
    cachable = 80%;
};

Content cntImage = {
    kind = "image";
    mime = { type = undef(); extensions = [ ".gif", ".jpeg", ".png" ]; };
    obj_life_cycle = olcImage;
    size = exp(4.5KB);
    cachable = 80%;
};

Content cntHTML = {
    kind = "HTML";
    mime = { type = undef(); extensions = [ ".html" : 60%, ".htm" ]; };
    obj_life_cycle = olcHTML;
    size = exp(8.5KB);
    cachable = 90%;
    may_contain = [ cntImage ];
    embedded_obj_cnt = zipf(13);
};

Content cntDownload = {
    kind = "download";
    mime = { type = undef(); extensions = [ ".exe": 40%, ".zip", ".gz" ]; };
    obj_life_cycle = olcDownload;
    size = logn(300KB, 300KB);
    cachable = 95%;
};

Content cntOther = {
    kind = "other";
    obj_life_cycle = olcOther;
    size = logn(25KB, 10KB);
    cachable = 72%;
};

Phase phWait = { name = "wait"; goal.xactions = 1; log_stats = false; };
Phase phCool = { name = "cool"; goal.duration = 1min; load_factor_end = 0; log_stats = false; };

Bench TheBench = benchPolyMix3;          // start with the default settings
TheBench.peak_req_rate = 200/sec;
size ProxyCacheSize = 12GB;
rate FillRate = 90%*TheBench.peak_req_rate;
TheBench.client_host_count = clientHostCount(TheBench);

// robots and servers will bind to these addresses
addr[] rbt_ips = robotAddrs(TheBench);   // or ['127.0.0.1' ** 2 ];
addr[] srv_ips = serverAddrs(TheBench);  // or ['127.0.0.1:8080', '127.0.0.1:' ];

// popularity model for the robots
PopModel popModel = {
    pop_distr = pmUnif();
    hot_set_frac = 1%;                   // fraction of WSS (i.e., hot_set_size / WSS)
    hot_set_prob = 10%;                  // prob. of req. an object from the hot set
};

// describe PolyMix-3 server
Server S = {
    kind = "PolyMix-3-srv";
    contents = [ cntImage: 65%, cntHTML: 15%, cntDownload: 0.5%, cntOther ];
    direct_access = [ cntHTML, cntDownload, cntOther ];
    xact_think = norm(2.5sec, 1sec);
    pconn_use_lmt = zipf(16);
    idle_pconn_tout = 15sec;
Re: [squid-users] failed to make swap
> I try to reinstall Squid and I get this error:
>
>   /usr/local/squid/sbin/squid -z
>   FATAL: Failed to make swap directory /usr/local/squid/var/cache/00: (13) Permission denied
>   Squid Cache (Version 2.5.STABLE7): Terminated abnormally.

Are you trying to execute squid -z as the squid user? Change the ownership of the squid var/* directories to squid:squid. It will work.

--Muthu
Re: [squid-users] squid not starting
> linux-pour-lesnuls var]# /usr/local/squid/sbin/squid -z
> 2004/10/30 08:41:05| Creating Swap Directories
>
> I start squid: /usr/local/squid/sbin/squid

Start Squid in debug mode:

  /usr/local/squid/sbin/squid -NCd 10

What does it say?

--Muthu
Re: [squid-users] flat file parsing vs db filter rules parsing
Hello Henrik,

Thanks for your detailed explanations.

> > We are trying to attain good performance compared to DB filters; which
> > database would be appropriate to attain this? The selection list is:
> > MySQL, ... BDB ...
>
> As I said, most people needing performance in this kind of application select
> Berkeley DB.

We have two ways of processing filter rules from a DB:

1. strtokFile() reads the filter rules from a DB file
   (acl test urlpath_regex -i /etc/database.db).
   The processed filter rules are stored in system memory in splay-tree /
   linked structures, and client requests are processed against the rules held
   in memory. This requires automatic updates to the DB; reconfiguring Squid
   picks up the new changes.

2. strtokFile() reads the filter rules from a flat file (/etc/urlsites)
   (acl test urlpath_regex -i /etc/urlsites).
   The processed filter rules are then moved into a database in a contiguous
   manner (marshalling into BDB), read back from the DB into system memory
   (unmarshalling from BDB), and every client request is processed against
   them. This requires automatic updates to the flat files so the DB is
   modified accordingly; reconfiguring Squid picks up the new changes.

Which of these designs will give a performance difference?

Regards
Muthukumar.
Re: [squid-users] flat file parsing vs db filter rules parsing
> > Will Squid only parse 256 characters of filter rules in that file? What
> > happens when a pattern exceeds 256 characters?
>
> This is the line limit, not the limit of the file.

Yes, but what happens when a pattern line grows beyond 256 characters? Is there a default setting in the system that limits a line to 256 characters? Do we need to increase the line length when we use patterns longer than 256 characters? How did the developers choose 256 characters as the maximum line length?

We also need your guidance on selecting a database. Which would be good to use? We are currently working with MySQL.

> A local MySQL database may be fine, but most applications doing things like
> this select to use a Berkeley DB file.

We are trying to attain good performance compared to DB filters; which database would be appropriate to attain this? The selection list is: MySQL, ... BDB ... Can you recommend the fastest and most efficient DB?

Thanks for your help.

Regards
Muthukumar.
Re: [squid-users] Problem with ACL ???
> I want my users to download software on weekends only. I set an ACL as follows:
>
>   ACL download urlpath_regex \.exe$ \.EXE$

Are you really using uppercase "ACL"?! Try:

  acl download urlpath_regex -i \.exe$

-i makes the match case-insensitive.

> Http_access allow download weekend
> Http_access deny all

It is http_access, not Http_access. With those two rules, only weekend downloads are allowed and all other access is denied.

What is the definition of the weekend acl? Can you post your squid.conf (with comment (^#) and blank lines removed)?

-muthu
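Assuming the weekend acl is a time acl, a sketch of the whole section might look like this (the weekend definition is a hypothetical example; in Squid time acls A denotes Saturday and S denotes Sunday):

```
acl weekend time AS 00:00-23:59      # hypothetical: all day Saturday and Sunday
acl download urlpath_regex -i \.exe$
http_access allow download weekend   # downloads permitted on weekends
http_access deny all                 # everything else denied
```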
Re: [squid-users] flat file parsing vs db filter rules parsing
> If you are doing this inside Squid then whatever you do should fulfill the
> non-blocking property. You do not want Squid to stop processing requests only
> because it is waiting for a response from the DB system.

We have started analyzing how to make flat-file filter rules DB-based. Our objective is to go from

  acl aclname acltype FLAT FILE

to

  acl aclname acltype DB

We plan to use MySQL as the DB because of its good API support for coding. How are the flat-file filter rules parsed and stored in the linked-list structure? I tried starting Squid with debug_options ALL,9 to get useful information about flat-file parsing, but the flat-file parsing does not appear in the debug output:

  grep 'testsite' cache.log
  2004/10/28 09:34:29| aclMatchAcl: checking 'acl site dstdomain -i /usr/local/squidsauth/etc/testsite'
  2004/10/28 09:34:29| aclMatchAcl: checking 'acl site dstdomain -i /usr/local/squidsauth/etc/testsite'

Which source file in Squid parses the flat-file entries and stores them in system memory?

Thanks for your validation and input.

Regards
Muthukumar.
[squid-users] flat file parsing vs db filter rules parsing
Hello All,

Filter rules can be built by parsing flat files (Squid configuration-file parsing) or by database-oriented parsing (squidGuard). Is that correct? What is the difference between the two? Are there performance, timing or other differentiating factors between them? Is there any document covering configuration-file parsing for flat files and DBs?

Currently, we are working through the *cf* source files, so it would be good to understand the difference.

Thanks in advance for sharing.

Regards
Muthukumar.
Re: [squid-users] flat file parsing vs db filter rules parsing
Hello Henrik,

Thanks once again for your reply.

I have heard that performance and parsing time are better with a DB. Can we adopt DB-based filter-rule parsing on the squid-2.5 series without using any redirectors? How would an adaptation for squid-3.0 differ?

> The difference is the parsing time. The lookup time is the same. When using a
> db based filter, parsing is done when building the db; when using a flat file,
> parsing is done when reading the configuration. Lookup time is determined
> mainly by the type of acl, not how it is stored.

So the lookup time for a given type of ACL is the same, but the time to parse the regex patterns and other information strings belonging to the acl filters is the differentiating factor. Can you suggest how we can understand filter-rule parsing with flat files versus DBs conceptually? We are working through Squid's *cf* .c and .h source files, analyzing how to improve Squid's filter-rule parsing and filter adaptation.

> Squid only has the flat file approach to acl specifications, parsing the
> whole acl each time the configuration file is read, storing the parsed result
> in memory for optimal lookup time. It should be noted that when the acl is
> parsed it is no longer a flat file but uses other structures (depending on
> the acl type).

So Squid parses filters from flat files. Does the Squid development team have plans to deploy DB-based filter-pattern parsing, as squidGuard does?

> squidGuard has the option to select db based or flat file. As said earlier
> the lookup performance is identical, but the startup performance (parsing) is
> significantly different for very large lists.

We are analyzing a DB-based filter-parsing deployment for Squid. Is it good and efficient in terms of parsing-time performance? Has anyone compared DB filter parsing with flat-file parsing? Is the squid-3.0 series going to support DB-based filters like squidGuard, or to give more support to filter redirectors, whose performance varies for very large filter lists?

Regards
Muthukumar.
Re: [squid-users] flat file parsing vs db filter rules parsing
Hello Henrik,

Thanks for your great replies and detailed analysis. Our requirements are:
1. To analyse the performance difference between flat files and a DB.
2. Adaptability of DB configuration files to Squid.
3. Code coverage of filter-rule parsing.
4. To make a filter which works with DB files.

> Regex patterns do not allow for an efficient lookup. All regex based ACLs
> will by definition use a linear list where lookup time is linear to the
> number of entries in the acl. This applies to both Squid and SquidGuard.

So filter-rule lookups are linear in the number of pattern-rule entries.

> You need to use one of the more structured acls to benefit from structured
> lookups. In Squid dstdomain is a good example of a structured acl, as domains
> have a certain structure (hierarchical, alphabetically sortable) which makes
> it possible to use a very efficient lookup mechanism to determine if any
> given domain is matched by the acl or not.

The lookup mechanism will depend on how the information is stored: flat files, or a database with indexing, etc. If we are going to use really big filter rule sets, then the lookup performance will differ even more.

> In Squid splay trees are used in this kind of ACLs. I don't remember what
> SquidGuard is using for their equivalent acl type.

I read up on splay-tree concepts; I also tried to look at squidGuard, but the squidGuard domain is not reachable. Our objective is to make a filter driven by DB-based configuration information that performs efficiently when adapted to Squid, and we are analyzing what is needed to do this. Do we have to follow any RFC format to make such a DB-based filter?

Thanks for your review.

Regards
Muthukumar.
Re: [squid-users] Configuration Guide
> On http://www.squid-cache.org/Doc/ there is a link to a Configuration Guide
> that just takes me to the front of some site. I looked around and couldn't
> find a guide there. It sounds like they want to charge for the guide?

You can get a configuration guide for the squid-2.4 stable releases from:

  http://www.visolve.com/squid/configuration_manual.htm

Henrik: please change the Configuration Guide link on http://www.squid-cache.org/Doc/ from http://www.visolve.com/ to http://squid.visolve.com/squid/configuration_manual.htm or http://squid.visolve.com/squid/index.htm.

Regards
Muthukumar.
Re: [squid-users] Bypassing Squid for local address destination
> I want all requests from my (local) clients to be forwarded directly to the
> destination when the destination is local, in order to minimize Squid's load.
> Can anybody tell me how to configure Squid for this purpose?

To send all local requests directly to the corresponding host:

1. Browser side: configure "Don't use proxy for local addresses".
2. If you are sending ALL requests via Squid, without configuring the browsers to bypass it for local hosts, then at minimum you can apply no_cache to those requests (so they are not cached). Otherwise, use a redirector to send all incoming requests for local hosts directly, without using Squid.

Regards
Muthu.
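For option 2, a minimal squid-2.5-style sketch of the no_cache approach (the destination range is a made-up example; substitute your local network):

```
# Requests whose destination is on the local network are never cached.
acl localdst dst 172.16.0.0/16
no_cache deny localdst
```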
Re: [squid-users] store.log
> What is the function of the store.log file?

It records which objects, with their sizes, are stored in the Squid cache directory. See the cache_store_log tag in squid.conf.default for more.

> It has become too big. /var is getting short of space.

If you do not need the store.log information, it is fine to set cache_store_log none in squid.conf. If log-file size is the problem, then logrotate / compress the log files so you can still use them in future. If you use tar + bzip2, they will shrink considerably.

Regards
Muthu.
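To illustrate how well repetitive log text compresses, here is a self-contained sketch using gzip (the tar + bzip2 combination suggested above works the same way and typically compresses a bit further; the log lines are fabricated):

```shell
log=$(mktemp)
# Fabricate 1000 near-identical store.log-style lines.
i=0
while [ $i -lt 1000 ]; do
  echo "1101808153.584 RELEASE -1 FFFFFFFF 200 text/html 1293 GET http://example.com/" >> "$log"
  i=$((i+1))
done
gzip -c "$log" > "$log.gz"   # compress a copy, leaving the original in place
echo "original: $(wc -c < "$log") bytes, compressed: $(wc -c < "$log.gz") bytes"
rm -f "$log" "$log.gz"
```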
Re: [squid-users] Problems with Authenticator
> ... has authenticated himself. If the authentication fails, the authenticator
> returns ERR.

That is correct.

> My first try is this dummy authenticator. When I use this authenticator, I am
> prompted for a username and password.

Yes, it asks and works; I have tried this authentication type on my cache server and it works there.

> After authenticating with any username and password, the proxy says:
> Cache Access Denied. http://www.google.com/

It seems your http_access rules are not allowing cache access.

> auth_param basic program /usr/src/null_auth

Do you have the authenticator at that location, with execute permission? Try it on the command line:

  /usr/src/null_auth
  test test
  OK
  OK
  ..

> auth_param basic children 20
> auth_param basic realm Squid proxy-caching web server
> auth_param basic credentialsttl 1 minutes
> acl password proxy_auth REQUIRED
> http_access allow password

That is correct.

> Since I can find the OK in my cache.log, I assume that the authenticator is
> used by squid. Why does the authentication fail?

Are you using any more http_access rules? And did you reconfigure the running Squid after making changes? Try stopping Squid and using this configuration:

  auth_param basic program /usr/src/null_auth
  auth_param basic children 5
  auth_param basic realm Squid proxy-caching web server
  auth_param basic credentialsttl 2 hours
  auth_param basic casesensitive off
  acl password proxy_auth REQUIRED

and, just before the "# TAG: http_reply_access" section, the http rules:

  http_access allow password
  http_access deny all

Try now. Is it OK?

Regards
Muthukumar.
Re: [squid-users] disabling request header's log
> Hi there. Recently, my Squid server has been logging quite a number of "Max
> Request Header" messages in cache.log and /var/log/messages. I can't really
> remember all the logs as I'm not in front of the server while typing this
> email. But it seems that users are requesting HTTP headers/bodies larger than
> the default value, which is 10k. Is there any solution to disable this
> logging?

If you are having a problem, don't try to disable the log information. request_header_max_size defaults to 10 KB, and it is good to keep that limit.

> By the way, how can I troubleshoot what exactly the users are up to? Are they
> trying to attack the Squid server, or is it just some sort of stupid
> application trying to make use of Squid?

Yes, these may be viruses / malicious attacks against the Squid server. Try to find which requests are being made and block them. It is good to keep filter settings in place to avoid problematic web requests: collect filter blacklists, then block them with acls / http_access rules.

_muthu_
Re: [squid-users] Blocked Yahoo Messenger but want to open for some users in local group
> Could you please tell me how to allow some IPs before denying other IPs?
> Please check my ACL and give me directions:
>
>   acl yahoo url_regex -i ^http://shttp.msg.yahoo.com
>   acl yahoo url_regex -i ^http://pgq.yahoo.com
>   acl yahoo url_regex -i ^http://mtab.games.yahoo.com
>   acl yahoo url_regex -i ^http://insider.msg.yahoo.com
>   acl yahoo url_regex -i ^http://address.yahoo.com
>   acl yahoo url_regex -i ^http://insider.msg.yahoo.com/ycontent/?
>   acl yahoo url_regex -i ^http://us.i1.yimg.com
>   http_access deny yahoo

Collect all the source IP addresses to be allowed into a src acl, in any of these forms:

  # IP addresses to be allowed
  acl srcip src ip-address/netmask
  acl srcip src ip-address1-ip-addressN/netmask
  acl srcip src ip-address1/netmask ip-addressN/netmask ip-address5/netmask

Refer to http://squid.visolve.com/squid/squid24s1/access_controls.htm#acl for more.

Then define the rules:

  http_access deny !srcip yahoo
  http_reply_access deny yahoo

__Muthu__
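Putting the two pieces together: `!srcip` negates the source acl, so the deny line below matches only clients that are NOT in the allowed list and are requesting a Yahoo URL; allowed clients fall through. The client addresses are hypothetical placeholders:

```
acl srcip src 192.168.1.10/32 192.168.1.20/32   # example: clients allowed to use Yahoo Messenger
# ... the yahoo url_regex acls defined above ...
http_access deny !srcip yahoo
http_access allow all
```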
Re: [squid-users] NTLM Auth Problem.
> I turned on log_mime_hdrs as you asked, and here's the output:
>
> 1098069200.802 1 10.0.1.8 TCP_DENIED/407 1747 GET http://www.google.com/ - NONE/- text/html [Accept: image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, application/vnd.ms-powerpoint, application/vnd.ms-excel, application/msword, application/x-shockwave-flash, */*\r\nAccept-Language: en-au\r\nCookie: PREF=ID=17238ed846c9d38d:CR=1:TM=1096527005:LM=1096527005:S=kyLy_3fTUQxpLp2g \r\nUser-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.1.4322)\r\nHost: www.google.com\r\nProxy-Connection: Keep-Alive\r\n] [HTTP/1.0 407 Proxy Authentication Required\r\nServer: squid/2.5.STABLE6\r\nMime-Version: 1.0\r\nDate: Mon, 18 Oct 2004 03:13:20 GMT\r\nContent-Type: text/html\r\nContent-Length: 1320\r\nExpires: Mon, 18 Oct 2004 03:13:20 GMT\r\nX-Squid-Error: ERR_CACHE_ACCESS_DENIED 0\r\nProxy-Authenticate: Basic realm=Pandora Squid Test Proxy blah blah blah\r\nProxy-Authenticate: NTLM\r\n\r]

With NTLM + Basic authentication via winbind on Samba, I would expect one TCP_DENIED entry in access.log when the browser first starts the exchange; after that, authentication should let requests through without failure. Can you post the subsequent access.log entries? You can turn log_mime_hdrs off again.

> The dummy username used was restricted and the password was "password". This
> user worked with Basic auth after the NTLM auth failed.

You can verify this by removing Basic authentication and using only NTLM authentication. It will produce one TCP_DENIED message, but the web requests will then be served in the browser. Check this out.

Regards
Muthu
Re: [squid-users] IP sequence question
acl aclname src IP-address-start-IP-end/netmask
acl TRAINING_PC src 192.168.193.3-9/24

Change the netmask to 32 (or 255.255.255.255), as:

acl TRAINING_PC src 192.168.193.3-9/32

A /24 would be representing the whole network there. Refer http://squid.visolve.com/squid/squid24s1/access_controls.htm#acl for examples.

Regards Muthu
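As a sketch of the corrected range acl; the allow rule is an assumed follow-up, not from the original mail:

```conf
# Seven individual hosts, so the range carries a /32 (host) mask;
# /24 would make each entry describe a whole network instead
acl TRAINING_PC src 192.168.193.3-192.168.193.9/32
http_access allow TRAINING_PC
```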
[squid-users] Fw: squid guard + BerkeleyDB problem
Hello ALL, I tried to install squid + squidGuard. I configured and installed /usr/local/BerkeleyDB.4.2 with encryption and /usr/local/BerkeleyDB.4.1 without encryption successfully, with default options (./configure, make, make install). When I try to install squidGuard-1.2.0 with

# ./configure --prefix=/usr/local/squidGuard --with-db=/usr/local/BerkeleyDB.4.2 (or 4.1)

.. snip...
checking for sigaction... yes
checking for signal... yes
** The Berkley DB library version 2.6.4 or newer is required. Get it from http://www.sleepycat.com use --with-db=DIR or --with-db-inc=DIR, --with-db-lib=DIR to specify its location (default is /usr/local/BerkeleyDB)

I have analysed the configure script but could not find the cause of the problem. A Google search on this problem turned up only the questions, not the replies. How can this be resolved? Thanks for your time. Regards, -Muthukumar.
Re: [squid-users] aclParseIpData: WARNING: Netmask masks away part of the specified IP in '172.16.4.1/12' ....Plz help
The ACL part of squid.conf is..

acl a src 172.16.4.1/12
acl b src 172.16.1.0/12
acl b src 172.16.3.0/12

Try:

acl a src 172.16.4.1/32

An individual IP address has to carry a /32 netmask.

acl leisure time 16:30-18:00
http_access allow a
http_access allow b
http_access allow c
http_access allow !a !b !c leisure

Execute ./squid -k parse to check the acl settings again.

Regards Muthu
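Putting the correction together, a hedged sketch. The original reply only corrects acl "a"; treating the two "b" entries as /24 subnets is my assumption:

```conf
acl a src 172.16.4.1/32             # single host -> /32 mask
acl b src 172.16.1.0/255.255.255.0  # assumed to be a /24 subnet
acl b src 172.16.3.0/255.255.255.0  # assumed to be a /24 subnet
acl leisure time 16:30-18:00
```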
[squid-users] squid - dansguardian testing
Hello All, I did a setup as

client browser --- NCSA authentication DANSguardian 8080 --- Squid --- Real web world

so that my web requests are resolved. I hit a problem when testing exceptionuserlist and filtergroups on DansGuardian. I tried:

/etc/dansguardian/banneduserlist --- test
/etc/dansguardian/exceptionuserlist --- test

Now when I make web requests they are blocked, without the exception lists overriding the blocking filters (bannedurllist vs. exceptionurllist, bannedsitelist vs. exceptionsitelist). How have you configured multiple groups using filter groups? Thanks for your help. Regards Muthukumar.
Re: [squid-users] multiple ACL
for example, my ip address is 152.118.37.35 and i'm accessing www.abc.com (using squid as proxy) and i have these ACLs in my squid:

acl smallClient src 152.118.37.35/255.255.255.255
acl netAdmins src 152.118.37.0/255.255.255.0

OK, fine.

acl bannedSite dstdomain www.abc.com

That will not work. Change it to:

acl bannedSite dstdomain .abc.com

The leading dot ( . ) must be there on the domain, so that it controls *.abc.com as well.

Regards Muthukumar
Re: [squid-users] mime types
how could i block all streaming audio on squid (like online radio)? could i do this with a mime acl or any other way? plz help me out, and if someone has any documents or links on mime types plz tell me

The mime.conf file contains the MIME type details; search mime.conf for the "audio" keyword. It is better to block MIME types with a DansGuardian setup, to make the filtering effective. Reply MIME types are only known once the response comes back from the internet, so they cannot be controlled via rep_mime_type in http_access; use http_reply_access to control these.

Regards Muthukumar
[squid-users] access.log redirection to mysql
Hai Squid-techies, what will happen if you make squid write the access logs directly into a MySQL database instead of a flat file? Will it decrease performance? And how can we write the access logs directly to a MySQL database?

Regards -Muthu.
Re: [squid-users] Maxconn problem
I am using squid 2.5 STABLE6. I want to limit the number of connections per user.

We can set the per-user connection limit with:

acl limit maxconn number

Set the user information with an IP address:

acl user-ip src ip-address (or ip-address-range/netmask)

Set the http_access rule:

http_access deny user-ip limit

I looked at the FAQ and did what is written there, but it did not work. If someone knows how I can do it, please mail me.

See the squid.conf.default file, TAG: acl aclname maxconn.

Regards -Muthu
Re: [squid-users] Pages
I have a problem with some pages that use some kind of authentication: all requests come from squid.

Then your squid is doing its job efficiently :-) Is there a way to tell squid not to do that? What are you expecting from squid, other than getting the cached requests from it?!

Regards -Muthu
Re: [squid-users] how to block sites
I want one specific user to be able to access only 2 sites. Please tell me what I should do and how.

Define the specific user and the sites with acls:

# Acl settings
acl user src IP/netmask
acl sites url_regex ^http://www.yahoo.com ^http://www.hotmail.com
(or)
acl sites dstdomain .yahoo.com .hotmail.com

# http_access rule
http_access deny user !sites

-Muthu
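A complete sketch of the "one user, two sites" setup; the 10.0.0.5 address and the trailing allow rule are illustrative assumptions, not from the original mail:

```conf
acl user src 10.0.0.5/32                     # the specific client (hypothetical IP)
acl sites dstdomain .yahoo.com .hotmail.com  # the only two sites permitted
http_access deny user !sites                 # this client gets nothing else
http_access allow all                        # everyone else unrestricted
```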
Re: [squid-users] ACL is not working for specific IPs
I want to block msn messenger on some client machines and allow it at specific times. So I have defined these ACLs:

acl client_acl src 192.0.0.210 192.0.0.200
acl time_acl time 16:00-20:00

No day settings on the time acl? Try:

acl time_acl time M T W H F 16:00-20:00

Check more on the acl time settings.

acl msnmessenger url_regex -i gateway.dll
http_access deny msnmessenger !time_acl client_acl

Ok, but it's not working. Can anyone tell me if there is something wrong in the defined ACLs?

Try now.
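The corrected rule set as one sketch; MTWHF is the concatenated form of the weekday letters (Monday through Friday):

```conf
acl client_acl src 192.0.0.210 192.0.0.200
acl time_acl time MTWHF 16:00-20:00        # Monday-Friday, 16:00 to 20:00
acl msnmessenger url_regex -i gateway.dll
# block the messenger gateway for those clients outside the allowed window
http_access deny msnmessenger !time_acl client_acl
```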
Re: [squid-users] forwarded_for
Is it possible to apply the 'forwarded_for' directive per acl, like:

acl somewhere_someone dst ip-address/netmask
forwarded_for allow somewhere_someone

We can do this with the tcp_outgoing_address TAG.

- Muthu
Re: [squid-users] Bad Requests
The issue is that on certain sites, e.g. carsoup.com, Vikings.com, ebay.com, the users get a bad request or "page can't be displayed" when attempting to visit the site.

Is your Squid setup serving other pages or not? If it is, what are your squid.conf settings?

If I go around squid, the pages load fine. Anyone have an idea of a setting(s) I can check or modify to stop this?

If your squid works normally for other sites, then access may be blocked by those sites' admins. Post your consolidated squid.conf file (grep -v ^# squid.conf > squid.conf.mlist).

-Muthu
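The consolidation command can be tried on a throwaway file first; the /tmp/sample-squid.conf name here is an illustration, not anything from the original mail:

```shell
# Build a tiny sample squid.conf, then drop its comment lines --
# the same filtering the reply asks for before posting a config.
cat > /tmp/sample-squid.conf <<'EOF'
# comment lines start with '#' and are stripped
http_port 3128
# another comment
http_access allow all
EOF
grep -v '^#' /tmp/sample-squid.conf > /tmp/squid.conf.mlist
cat /tmp/squid.conf.mlist
```

Only the two non-comment directives survive in squid.conf.mlist.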
Re: [squid-users] blocking file extension
block.txt --
\.cab\?.*$ \.rm\?.*$ \.mp\?.*$ \.mpg\?.*$ \.mpeg\?.*$ \.mp3\?.*$ \.wmv\?.*$ \.wma\?.*$ \.mov\?.*$ \.avi\?.*$
--

squid.conf:
acl block urlpath_regex -i "/etc/squid/block.txt"
http_access deny block

It will block files with the extensions listed in block.txt for ALL users.

but some times when i try to sign in or sign off, it gives me an access denied error

What are you referring to as sign in and sign off?

Regards Muthu
Re: [squid-users] help on squid proxy
i am trying to use the squid proxy/cache server in my setup. I want to know whether the squid server can cache windows media files (wmv, asf), so that when a client makes an http request for a media file, it is served from the cache instead of the origin server.

Do you mean caching stored media files on the web servers, as opposed to live/streaming content? If the web server allows those objects to be cached, then we can cache them. Create your own refresh_pattern for wmv/asf files to get them cached.

i tried looking for info over the net but could not get much; i'd appreciate it if someone can provide me with some info on this.

And see this thread, http://www.squid-cache.org/mail-archive/squid-users/200202/0228.html — I got it via google only :)

Is it also possible to cache mms / stream requests, either on mms or rtsp protocols?

We cannot cache stream requests. I hope rtsp protocols can be cached...

Regards Muthu
Re: [squid-users] squid and outlook express problem
My configuration: eth0: 192.168.1.97 (My Public IP), eth1: 192.168.5.1 (My LAN IP). I have declared 25 and 110 as safe ports also, but still :(( Based on a thread I saw at linux solve, I tried to do

/sbin/iptables -t nat -A POSTROUTING -o eth1 -s 192.168.0.0/24 -j MASQUERADE

Your iptables setting is making the problem here. If you are using MASQUERADE in the nat table's POSTROUTING chain, match the mail ports explicitly. Set your iptables rules as:

# SMTP requests
/sbin/iptables -t nat -A POSTROUTING -o eth1 -p tcp --dport smtp -s 192.168.0.0/24 -j MASQUERADE
# POP3 requests
/sbin/iptables -t nat -A POSTROUTING -o eth1 -p tcp --dport 110 -s 192.168.0.0/24 -j MASQUERADE

Check the /etc/services file:

smtp 25/tcp # Simple Mail Transfer Protocol
pop3 110/tcp pop-3 # Post Office Protocol - Version 3

See more at http://squid.visolve.com/squid/trans_caching.htm and the iptables manpage (MASQUERADE part).

But this also has no effect. plz guide me to make my pop3 and smtp connections transparent. I am attaching my squid.conf file with the mail.

There is no squid.conf attachment on this mail :)

Regards Muthu
Re: [squid-users] Fw: time's quotas??????
IĀ“ve 30 users using squid proxy (squid-2.4STABLE7-4), in a LAN, but IĀ“ve only 64 Kbps of band width. I want to assign timeĀ“s quotas but not using acl time, I just want to give 100 hours/month for every user. Is it Possible? You can do it with squid2mysql. look at http://evc.fromru.com/squid2mysql/features.html - Muthu --- === It is a Virus Free Mail === Checked by AVG anti-virus system (http://www.grisoft.com). Version: 6.0.732 / Virus Database: 486 - Release Date: 7/29/2004
Re: [squid-users] caching specific sites
"We can do caching or non-caching of objects in a fixed manner"

Who is 'we'?

I did not get your (Marc's) question!?

Regards, Muthukumar.
Re: [squid-users] USERS WITH MAC
i want to make an ACL in which users access via their MAC address (not IP address) plus user name and password.

You have to compile squid with --enable-arp-acl. If squid compiles successfully, you then have the capability of doing ARP-based acl setup. Refer: http://www.squid-cache.org/Doc/FAQ/FAQ-10.html part 10.20. Limitation of ARP ACLs: they cannot be used across multiple subnets.

second, if they want, they can change their password also from their computer.

That depends on the authentication method. Use NCSA for this. Example:

acl arp1 arp 01:02:03:04:05:06
acl user1 proxy_auth test
# allow only the arp1 machine to use the test user
http_access allow arp1 user1

Regards, Muthukumar.
Re: [squid-users] squid ftp proxying clarification
I plan to switch over to an OpenBSD 3.5 proxy and firewall machine. So now I have OpenBSD 3.5 installed on a computer with the Squid proxy installed on it. I also managed to get Squid configured, and the LAN users can access websites on the internet through it. There is no problem with http access, and it is much faster, I think.

Squid caches http requests, not ftp requests as such; ftp can be used through squid as passive requests over http.

I am a bit confused about configuring the ftp proxy part of it. I got a bit confused by the documentation about transparent proxying and all. Could you please tell me what parameters I should change in squid.conf, and what values to give them, so that the users in the LAN can access FTP sites as before?

Are you on a squid-2.5.x version? If your firewall setup is not giving support for passive ftp requests, use ftp_passive off. There are a few more parameters with the keyword ftp.

Is it possible to restrict users and specify which users can access which sites?

We can give access based on users, domains, sites, etc. using acl and http_access for http requests. There are two more applications available, frox and wget, for ftp proxying: http://frox.sourceforge.net/ http://www.gnu.org/software/wget/ You can control which users may make ftp requests using frox or wget, driven from Squid acl types (refer TAG: external_acl_type).

Note: The Squid proxy is installed on an OpenBSD 3.5 computer with 2 NICs. One NIC has an internet static IP address and is connected to an ADSL router. The other NIC has an internal IP address and is connected to the LAN switch.

Redirect all local users' http or ftp requests to squid's internal IP address (the one connected to the LAN switch). Forward all redirected requests out through the IP connected to the ADSL router.
Could you also please refer me to some resource on the internet which explains what transparent proxying is, and what passive mode and active mode are?

Basic transparent proxy details and a Linux implementation are available here: http://squid.visolve.com/squid/trans_caching.htm See http://slacksite.com/other/ftp.html for active vs passive mode ftp requests.

If I enable packet filtering in OpenBSD, are there specific issues that I should be careful about while using the Squid proxy?

I am not familiar with OpenBSD.

Regards, Muthukumar.
Re: [squid-users] caching specific sites
Is it possible to ask squid to cache specific sites for a specific amount of time, no matter what the site contents are?

We can enable or disable caching of objects in a fixed manner, not per-site as you require. And we can control access to sites with time acls. To do this: use two squid.conf files, squid.conf.cache and squid.conf.nocache; write a script to start squid with the appropriate file at the caching and non-caching times (stop the other squid process and start ./squid -f squid.conf.cache); and schedule it with a cron job.

Regards, Muthukumar.
Re: [squid-users] Access to a range of IP addresses
I tried: acl test4login url_regex ^http://192.168.0.

We cannot use the url*_regex acl types for IP address matching; you have to use the dst, dstdomain, or dstdom_regex acl types for this.

With this I thought the users could access any address that starts with 192.168.0.x, but I get an access denied.

We have to list all the IP addresses to be matched separately. Write all the IP addresses into a file and use that file in the acl:

acl test dst "/path/to/ip-file"

--- ip file --
192.168.0.1
..

That will do it.

Regards, Muthukumar.
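A sketch of the dst-based alternative; the /24 mask and the allow rule are my assumptions for covering "anything that starts with 192.168.0.":

```conf
# dst matches the destination IP of the request, which url_regex cannot do
acl internal_dst dst 192.168.0.0/255.255.255.0
http_access allow internal_dst
```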
Re: [squid-users] proxy bypass for one client
I have a squid proxy server with DSL on Mandrake 10, with eth0 as gateway and eth1 for the LAN. I want one client to get a direct connection bypassing the proxy.

If squid is running in proxy mode, configure that client's browser not to use the proxy settings (proxy server / http port).

How can the client receive internet access then?

Squid is used for caching internet content, not for providing internet access. If you browse without squid, the content is simply not cached for further use. If you don't have net access at all, there is no work for Squid.

Regards, Muthukumar.
Re: [squid-users] Porn help
acl porn url_regex /usr/local/squid/etc/porn1
http_access deny porn all

What are the contents of the /usr/local/squid/etc/porn1 file? Are you using ( ^ ) before the full urls, like:

^http://www.porn.com/
^http://www.testporn.com/

Regards, Muthukumar.
RE: [squid-users] my squid can not start
my squid got into trouble and i can not restart squid. but after i reboot, i can start the squid server. the message is below:

Jun 25 02:00:03 squid01 squid[29411]: Squid Parent: child process 29413 exited due to signal 6
Jun 25 01:59:59 squid01 squid[29411]: Squid Parent: child process 29441 started
Jun 25 02:00:00 squid01 (squid): msgget failed

Which version of squid, and which platform, are you using? It may be a squid bug. Signal 6 (SIGABRT) aborts the execution of the squid process. Use the http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.19 section to debug the squid problem.

Regards, Muthukumar.
Re: [squid-users] squid -2.5 stable5 automatic boot
Thanks for a good reply; now I have the solution to my problem. What is -Hsn? Can you please explain it?

The ulimit command is used to set or get limits on the system resources available to the current shell and its descendants.

ulimit -HSn (not ulimit -Hsn)

n - sets or gets the maximum number of open file descriptors
H - hard limit
S - soft limit

Check http://bama.ua.edu/cgi-bin/man-cgi?ulimit+1

Regards, Muthukumar.
Re: [squid-users] Allow connect specify https website
[acl section]
acl SSL_Ports 443
acl host_allowed src /squid/etc/host_allowed.txt
acl passwd proxy_auth REQUIRED
acl special_domain dstdomain /squid/etc/special_domain.txt
[http access section]
http_access allow SSL_Ports special_domain
http_access allow host_allow passwd
File [special_domain.txt]: tac.co.th
File [host_allow.txt]: all intranet IP addresses
===

Change the special_domain.txt file contents to .tac.co.th — the leading dot ( . ) is important to cover the whole domain:

--- special_domain.txt --
.tac.co.th

Are you using IP addresses with netmasks in the /squid/etc/host_allowed.txt file?

# acl settings
acl SSL_ports port 443   # https
other acl rules

# http access rules
http_access deny SSL_ports !special_domain   # deny all https requests except those to the *.tac.co.th domain
http_access allow host_allow passwd          # require authentication for all clients

After configuring, run squid -k parse to check, then squid -k reconfigure or restart squid. Test squid with https://*.tac.co.th/

Regards, Muthukumar.
RE: [squid-users] Allow connect specify http website without ncsa authentication
acl domains dstdomain .oracle.com
acl user_dl proxy_auth download

Do you want only the "download" user to be queried for authentication? Otherwise use REQUIRED to authenticate all users.

http_access allow domains

That allows *.oracle.com for everyone.

http_access allow user_dl !domains

That allows authenticated users to access all sites other than the *.oracle.com domain urls.

Regards, Muthukumar.
RE: [squid-users] Blocking msn file sharing
Is it possible to block msn messenger file sharing in Squid? MSN and sending/receiving messages should be allowed, file sharing not.

Some testing and analysis is needed for this one. Turn log_mime_hdrs on, use MSN through the proxy, and collect the access.log results. The *_mime_* acls are there to control MIME types; use mime.conf.default for the MIME regexps. It is best to proceed from the testing results.

Regards, Muthukumar
Re: [squid-users] Problem with NCSA authentification
Actually, squid does not ask for re-authentication for these sites. How can we configure squid so that it realises this?

The client (browser) caches the visited pages. They are kept in the browser cache, and if you test with the same open browser (client), it won't ask for authentication again.

Regards, Muthukumar.
Re: [squid-users] Problem with NCSA authentification
so it's impossible to have a second authentication on a website after proxy authentication?

Authentication stays valid up to credentialsttl on that particular client. It will ask for authentication again after that time, if the client is active.

can I configure my browser without any cache?

That depends on the client (browser) cache options. In IE: Tools -- Internet Options -- General tab -- Temporary internet files -- Settings -- Never.

Regards, Muthukumar.
Re: [squid-users] Winbind authentication cannot work on squid
Here is the log:

[2004/06/22 13:00:01, 1] utils/ntlm_auth.c:manage_squid_request(1592) fgets() failed! dying. errno=0 (Success)
(the same line repeated five times)
2004/06/22 13:00:06| Starting Squid Cache version 2.5.STABLE5 for i586-pc-linux-gnu...

Did you reconfigure or restart squid? The messages above are warnings caused by that action. Check this discussion here: http://www.mail-archive.com/[EMAIL PROTECTED]/msg01950.html

Regards, Muthukumar.
Re: [squid-users] acl help
acl banned-clients src -i /usr/local/squid/etc/banned-clients.txt

banned-clients.txt ::
146.141.59.230
146.141.59.231
146.141.59.232
146.141.59.233
146.141.59.234
146.141.59.235
146.141.59.236

With the src acl type you have to use a subnet mask with each IP address; for individual hosts, define them as 146.141.59.230/32. You can also very well use an address range, like 146.141.59.230-146.141.59.236/32.

acl files url_regex -i /usr/local/squid/etc/banned-sites.txt

banned-sites.txt ::
www.highveld.co.za/planet947/streaming.asp
209.245.59.17
streamload.com

For a url_regex acl you have to anchor the urls with ( ^ ), as ^http://www.highveld.co.za/planet947/streaming.asp. We cannot use IP-address-based urls in the url_regex acl type; use the dst acl type for the IP address. The plain domain entry streamload.com cannot be parsed by url_regex either; use dstdom_regex for that:

acl domain dstdom_regex .streamload.com

where the leading dot ( . ) is important.

acl files url_regex -i /usr/local/squid/etc/banned-files.txt

banned-files.txt ::
\.mpeg$ \.mpg$ \.avi$ \.wmv$ \.mp3$ \.rm$ \.asf$

If you want to match a regexp against the url path, use the urlpath_regex acl.

working like a charm!

If you want to know more about acl settings and http_access rules, refer to FAQ 10.

Regards, Muthukumar.
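Pulling the corrections together into one hedged sketch; the banned-domains.txt file name is an invention for the dstdom_regex entries, and the deny rules are assumed usage:

```conf
# one acl per matching style, each reading its own list file
acl banned-clients src "/usr/local/squid/etc/banned-clients.txt"          # e.g. 146.141.59.230-146.141.59.236/32
acl banned-sites url_regex -i "/usr/local/squid/etc/banned-sites.txt"     # anchored entries: ^http://www.highveld.co.za/...
acl banned-domains dstdom_regex "/usr/local/squid/etc/banned-domains.txt" # e.g. .streamload.com
acl banned-files urlpath_regex -i "/usr/local/squid/etc/banned-files.txt" # e.g. \.mp3$
http_access deny banned-clients
http_access deny banned-sites
http_access deny banned-domains
http_access deny banned-files
```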
Re: [squid-users] Allow connect specify https website
[acl section]
acl SSL_Ports 443
acl host_allowed src /squid/etc/host_allowed.txt
acl passwd proxy_auth REQUIRED
acl special_domain dstdomain /squid/etc/special_domain.txt
[http access section]
http_access allow SSL_Ports special_domain
http_access allow host_allow passwd
File [special_domain.txt]: tac.co.th
File [host_allow.txt]: all intranet IP addresses
==

By default Squid also lists the snews port (563) in SSL_ports. Do you want to allow HTTPS, for the Secure Socket Layer connection, only to *.tac.co.th? Then change the http_access settings slightly:

http_access deny SSL_Ports !special_domain
http_access allow host_allow passwd
http_access deny all

Regards, Muthukumar.
Re: [squid-users] safe ports range for a single external host
My squid.conf has been tightened to only allow connections to a small range of Safe_ports (http, https, and a few others). I have users in my network that need to connect to a single, unique external host on a range of ports (2048 to 3048). I'd like to open this range for all my clients *BUT* only to this one external IP.

Yes, you can. Set an acl for the external IP and one for the port range, then make the arrangements on the access rules.

Regards, Muthukumar.
Re: [squid-users] SSL site filterting
I wish to allow our internal network only specific SSL sites (443). In squid.conf I am allowing the CONNECT method for port 443 only; now I need to allow access to a specific site only. I would have to use dstdom_regex or dstdomain, right? Because the url is not yet known...

You can use the dst acl for this when you are dealing with IP-based urls. The dstdomain and dstdom_regex acls will do a reverse lookup to resolve IP-address-based urls. You can use urlpath_regex if you do not know the exact url.

Regards, Muthukumar.
Re: [squid-users] Mulitiple ACL
Can I use multiple ACLs for groups?

Multiple acls for groups? It would help to know your requirement, with some example(s).

Regards, Muthukumar.
Re: [squid-users] [PATCH] Raw URL path ACL
The attached patch against squid-2.5.STABLE5 adds a new ACL type called urlpath_raw_regex.

The creation of a patch is a good one.

It works in exactly the same way as urlpath_regex, except that no unescaping of the URI is done first, which makes it possible to filter specific attacks that escape some characters in the URI without blocking legitimate requests.

If you use the uri_whitespace option with strip mode, it will behave like that.

I.e. you can filter URIs containing %2easp (the signature of some attacks) without blocking legitimate requests for .asp

We can use allow or encode mode there.

Regards, Muthukumar.
Re: [squid-users] only one user per login
is there any way to tell squid to ONLY allow one user connected per login? I am using NTLM auth, and I want squid to restrict http access to ONLY one user per login; I mean, if user_01 is logged in and surfing, another user using the same login should be rejected.

See the max_user_ip acl type. It limits the number of source IP addresses a login may be used from concurrently.

Regards, Muthukumar.
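A sketch of the max_user_ip approach; the -s strict flag and the deny rule are my reading of the squid.conf.default description, not from the original reply:

```conf
# match a login as soon as it is seen from more than one source IP;
# -s makes squid deny it rather than just ignore the extra IP
acl one_ip max_user_ip -s 1
http_access deny one_ip
```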
Re: [squid-users] Winbind authentication cannot work on squid
You need to use Netscape version 2.0 or greater, or Microsoft Internet Explorer 3.0, or an HTTP/1.1 compliant browser for this to work. Please contact the cache administrator if you have difficulties authenticating yourself, or change your default password.

You tried to check the authentication with the IE 6.0 browser. Did you check it with HTTP 1.1 enabled? Also check authentication against some more general links, because google is not a cacheable one (its cache control is private). Your command-line test was successful — fine. Are there any authentication-related messages in the cache.log entries?

Regards, Muthukumar.
Re: [squid-users] website access list
squid and the users are on Windows desktop machines; our network uses DHCP for IP assignment. What I was wondering is, is there a way that...

So you do not need to limit web access by IP or user; you only want to keep track of the users' web access.

...I could track website access from the IP that is logged by Squid, by asking the DHCP or WINS server who had (or has) this IP at a particular time.

- Are your systems using the default hostnames?
- Can we get the client hostname information from DHCP for the machine holding a particular IP right now?
- We can get the IP address lease time from DHCP.

If so, use the time values in access.log: make a script that extracts a particular client's usage during the lease time and builds a report from it.

I know it is much easier by just using SARG and usernames, but what management does not want is to burden users with another password and username to remember, and to have to type it in every time they request a webpage.

Ok.

Regards, Muthukumar.
Re: [squid-users] website access list
> Our company has just made it policy for everyone to have internet access, but they need to know who goes to what sites. I am running squid and users are using Windows desktop machines, our network uses DHCP for IP assigning. What I was wondering is, is there a way that I could track website access from the IP that is logged by Squid asking the DHCP or WINS server who had or has this IP at a particular time. I know it is much easier by just using SARG and usernames, but what management does not want is to burden users with another password and username to remember and also have to type it in every time they request a webpage.

You can use the MAC address of every machine to grant them access, but you have to recompile Squid with the --enable-arp-acl option. Check more at http://www.squid-cache.org/Doc/FAQ/FAQ-10.html#ss10.20

Regards, Muthukumar.
Re: [squid-users] Group based ACL
> I am trying to work out the syntax for an acl, using wbinfo_group.pl, to only allow a specified Active Directory group access to the Internet. I have looked at squid.conf.default and the Squid FAQ, but I could not work it out from these. I am running samba 3.0.4 and Squid 2.5 STABLE 5.

Use external_acl_type to define the external program settings. Set the acl with an "acl aclname external ..." line, then use http_access rules to grant access to the groups.

Regards, Muthukumar.
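As a sketch of the three steps above (the helper path and the group name "InternetUsers" are assumptions; adjust them to your install), the squid.conf fragment might look like this:

```
# Define the external helper; %LOGIN passes the authenticated user name to it
external_acl_type nt_group %LOGIN /usr/local/squid/libexec/wbinfo_group.pl

# Map an acl to membership of an Active Directory group via the helper
acl InetAccess external nt_group InternetUsers

# Grant access based on group membership
http_access allow InetAccess
http_access deny all
```

The http_access rules are evaluated in order, so the group-based allow must appear before the final deny.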
Re: [squid-users] Allow connect specify https website without ncsa authentication
> [authen section]
> auth_param basic program /squid/bin/ncsa_auth /squid/etc/passwd
> [acl section]
> acl SSL_Ports port 443
> acl host_allowed src /squid/etc/host_allowed.txt
> acl passwd proxy_auth REQUIRED
> acl special_domain dstdomain /squid/etc/special_domain.txt
> [http access section]
> http_access allow yy
> .
> http_access allow x
> http_access allow host_all passwd
> File [special_domain.txt]
> .tac.co.th
> File host_allow = all Intranet ip address
>
> The above details the configuration in squid.conf. My requirement is that everyone can connect to the SSL website https://sonic2.tac.co.th/bp3/bin/Index without authentication. The default users must authenticate via the last http_access rule. That last rule, http_access allow host_allowed passwd, means a user name is required.

I hope your authentication method is working fine; otherwise check it with the command-line procedure. Change your squid.conf as follows:

# http_access settings
. other settings, without affecting the following settings
# Allow all to access https sites
http_access allow SSL_Ports
# Authentication for Intranet ip addresses with the default users
http_access allow host_allowed passwd
# Deny access by default to end up
http_access deny all

Regards, Muthukumar.
Re: [squid-users] Allow connect specify https website without ncsa authentication
> I would allow https for special_domain only. For other domains, the user must authenticate. But the above config allows all domains for the SSL service.

Change the http_access allow SSL_Ports line as:

# allow ssl connections to the special domain only
http_access allow SSL_Ports special_domain
# Authentication for Intranet ip addresses with the default users
http_access allow host_allowed passwd
# Deny access by default to end up
http_access deny all

Regards, Muthukumar.
Re: [squid-users] Enabling WCCP Version 2 On Squid 2.5Stable5
> I would like to know which value I should assign to the wccp_version parameter in order to enable WCCP version 2 on Squid 2.5STABLE5? In order to enable WCCP version 1 I have assigned the value 4.

Keep wccp_version at 4 unless you are running Cisco IOS 11.2, which reportedly only supports version 3. (Squid 2.5 itself only speaks WCCP version 1; the wccp_version directive selects a protocol revision within WCCPv1, not WCCPv2.)

Regards, Muthukumar.
Re: [squid-users] can not get data from cgi function
> In addition, this problem appears on my squid: it cannot get data from the cgi function of a web page, i.e. http://google.com/search?blablablabla

There is a problem accessing this page through the proxy. Check the page's cacheability at http://www.web-caching.com/cacheability.html by giving it the above url.

> How do I make squid restart automatically when the parent proxy is assumed dead?

We can detect the timeout of the parent proxy using the peer_connect_timeout tag and the connect-timeout option on cache_peer. What do you mean by auto restart?

Regards, Muthukumar.
Re: [squid-users] About transparent mode
> We want to use a transparent proxy that does not substitute the client's real IP with the external IP of the proxy.

Use the setting redirect_rewrites_host_header off. It will do what you require.

Regards, Muthukumar.
Re: [squid-users] assertion failed: errorpage.c:292: mem->inmem_hi == 0 - squid exiting on signal 6
> Squid 2.5STABLE5 on FreeBSD 4.6.2 (working on upgrading this). From cache.log:
> 2004/06/14 11:49:17| assertion failed: errorpage.c:292: mem->inmem_hi == 0
> From messages:
> Jun 14 11:49:17 s067sqd01 /kernel: pid 31939 (squid), uid 65534: exited on signal 6
> Jun 14 11:49:17 s067sqd01 squid[22769]: Squid Parent: child process 31939 exited due to signal 6
> Squid is compiled with snmp. Any ideas on where to start?

There is a bug and a fix for this problem. Check here: http://www.squid-cache.org/Versions/v2/2.5/bugs/#squid-2.5.STABLE5-post_assert

> I read something about incompatibilities with a Symantec virus scanner... we have the Squid box forwarding all requests to a Trend Micro AV proxy server. Could this be causing the problem?

Apply that patch to the Squid source, recompile, and then check Squid again.

Regards, Muthukumar.
Re: [squid-users] ERROR thrown by squid while browsing
> The requested URL could not be retrieved. While trying to retrieve the URL: http://www.winehq.org/site/forums (it works fine with my normal proxy setting), the following error was encountered:
> * Connection Failed
> The system returned: (101) Network is unreachable
> The remote host or network may be down. Please try the request again.

The problem may be because of the TIMEOUTS tags. Change the timeouts connect_timeout and request_timeout.

Regards, Muthukumar.
RE: [squid-users] Allow connect specify https website without ncsa authentication
> File [squid.conf]
> acl SSL_Ports port 443 563
> acl special_domain dstdomain /usr/local/squid/etc/special_domain.txt
> http_access allow x
> http_access allow host_allowed passwd
> File [special_domain.txt]
> .tac.co.th

The acl settings of SSL_Ports and special_domain are good. What are the definitions of the host_allowed and passwd acls? It would be good to know those acls in order to create the http_access rules for your new requirement. What do you mean by "without ncsa authentication (default user must authen)"?

Regards, Muthukumar Kandasamy.
Re: [squid-users] Delay pools RH7
> From all these docs, it seems as if delay pools is by default DISABLED on the squid that gets installed on the system (Red Hat 7.3, squid-2.4.STABLE6). Is this assumption correct? Or how can I check whether delay pools is enabled or disabled?

Yes. To check, go to the squid binary location and execute ./squid -v. You will get the configure information for your squid. If it shows the option --enable-delay-pools, then you have it. Otherwise you have to reconfigure squid with that option along with the others.

Regards, Muthukumar.
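The check above can be run in one line (the binary path is an assumption; use wherever your squid is installed):

```
# Print the options squid was built with and look for delay pools support
/usr/local/squid/sbin/squid -v | grep -- --enable-delay-pools
```

If grep prints nothing, the binary was built without delay pools and needs to be reconfigured and recompiled.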
Re: [squid-users] TCP_MISS/503 Errors.
> While trying to retrieve the URL: http://www.genetech.co.za/ the following error was encountered:
> * Connection Failed
> The system returned: (111) Connection refused
> The remote host or network may be down. Please try the request again. Your cache administrator is [EMAIL PROTECTED]
> and in the /var/log/squid/access.log file I see the following:
> 1086939227.215 5936 10.240.1.208 TCP_MISS/503 1023 GET http://www.genetech.co.za/ marc NONE/- -

The problem may be with the network service; a 503 code indicates Service Unavailable. If you get that message, remove the proxy setting in the browser and check again. If you can then access the net, the problem may be in the proxy; otherwise the problem is with the service itself.

Regards, Muthukumar.
Re: [squid-users] IE search from address bar + Squid
> I recently had a user complain that when they would type a search term (like dell instead of www.dell.com) in their address bar, squid would return an error page stating that it was an invalid URL instead of IE going to the MSN search page. Upon further investigation I found that IE will only return the search page if the proxy server sends a 502 Bad Gateway error, but not a 503 Service Unavailable error (the default squid behavior).

Yes. It is not only for dell; if you type yahoo or hotmail, it will try to resolve http://hotmail/ or http://yahoo/. It expects the .com keyword at the end.

> To correct this I modified forward.c and changed the following:
> err = errorCon(ERR_DNS_FAIL, HTTP_SERVICE_UNAVAILABLE);
> to
> err = errorCon(ERR_DNS_FAIL, HTTP_BAD_GATEWAY);

Understanding the squid code and changing it to your requirement is a good thing, but there is an option in squid.conf called append_domain. Use that option as:

# Used to resolve urls without the .com keyword
append_domain .com

Now it will try to resolve dell with the appended domain of .com, as http://dell.com. The append_domain option will solve your problem. It can also be used if you directly type the url as http://dell.com.

> IE will now go to the search page if a user enters an invalid hostname in the URL (and has the search functionality enabled). I'm not sure if this change will ever make it into the squid stable branch, but it's worth noting that most commercial proxy servers return a 502 error in this situation.

For changing the code in squid, you can subscribe to the [EMAIL PROTECTED] list (and/or cc to [EMAIL PROTECTED] Henrik Nordström).

Regards, Muthukumar.
Re: [squid-users] I only get TCP_MISS/200
> I have recently installed squid as a transparent proxy. I see every http request that goes through squid but it doesn't cache anything. All I see in the access.log is TCP_MISS/200. Any suggestions?

Are you using the transparent squid settings (httpd_accel_host virtual, httpd_accel_port 80) together with the following?

httpd_accel_with_proxy on
httpd_accel_uses_host_header on

If you did not use these, then that is the problem.

Regards, Muthukumar.
Re: [squid-users] Log that user download or upload
> Is it possible (using squid 2.5 stable 5) to log what a user uploads or downloads?

We can use the req_mime_type and rep_mime_type acls to detect uploads or downloads of files.

> A user visits www.hotmail.com and creates a new message with an attachment. I would like the name of the attachment to be written into the log file. Does someone have an idea?

Use the setting log_mime_hdrs on and check again.

Regards, Muthukumar.
Re: [squid-users] superfluous DNS queries
> I set up SQUID to forward all HTTP traffic through a parent proxy (bound to internet) except when the URL matches certain suffix domains (intranet). Upon receiving an internet URL like www.thepurists.com, SQUID however queries the DNS servers for www.squid-cache.org, www.squid-cache.org.sub.my.org, ...

What are your dns_testnames settings in Squid? Did you see these queries at the start-up of squid or after some requests?

> Indeed, I don't know anyone who would type in a browser http://www.squid-cache.org. instead of http://www.squid-cache.org. So I believe it would be nice if SQUID processed URLs having at least one dot as if they were fully qualified.
>
> # cat squid.conf (excerpt)
> acl DIRECT dstdomain /usr/local/squid/etc/acl/direct.dstdom
> # cat /usr/local/squid/etc/acl/direct.dstdom
> my.org
> intranet.my

The problem may be here. For a dstdomain acl you have to include a dot (.) before the domains, like:

# cat /usr/local/squid/etc/acl/direct.dstdom
.my.org
.intranet.my

> cache_peer outproxy.my.org parent 8080 0 no-query proxy-only
> always_direct allow DIRECT
> never_direct allow all
> dns_nameservers 10.1.1.1 10.5.1.1
> visible_hostname intraproxy.sub.my.org
>
> # tcpdump -vs0 dst port 53
> local.29297 10.1.1.1.domain: [udp sum ok] 62439+ A? www.squid-cache.org. [|domain] (DF) (ttl 255, id 43955, len 64)
> 10.1.1.1.domain local.29297: [udp sum ok] 62439 NXDomain* 0/1/0 (99) (ttl 29, id 8065, len 127)
> local.29298 10.1.1.1.domain: [udp sum ok] 62440+ A? www.squid-cache.org.sub.my.org. [|domain] (DF) (ttl 255, id 43956, len 76)
> 10.0.1.1.domain local.29298: [udp sum ok] 62440 NXDomain* 0/1/0 (108) (ttl 29, id 8070, len 136)

The requests are suffixed with the first proxy's domain, .sub.my.org, taken from the visible hostname.

> local.29299 10.0.1.1.domain: [udp sum ok] 62441+ A? .squid-cache.org.my.org. [|domain] (DF) (ttl 255, id 43957, len 71)
> 10.0.1.1.domain local.29299: [udp sum ok] 62441 NXDomain* 0/1/0 (103) (ttl 29, id 8074, len 131)

Now the requests are suffixed with the outer proxy's domain from its visible hostname. Check the proxy with the modified acl settings and dns_testnames.

Regards, Muthukumar.
Re: [squid-users] how ccan i block aplication and media file downloading in squid
> snip
> acl mylan src 10.1.1.1-10.1.1.150/255.255.255.255
> acl dlb urlpath_regex -i \.exe$ \.mp3$ \.mov$ \.mpg$ \.mp?$ \.avi$ \.rm$ \.wma$ \.mpeg$
> snip

It is a normal setting except for the two acls dlb and mylan.

> http_access allow mylan
> http_access deny dlb

This will allow all requests from the mylan (10.1.1.1-10.1.1.150/255.255.255.255) clients before the deny is ever checked, so those users can access everything, including the download files. Change those lines to:

# Restrict the mylan users from downloading dlb files
http_access deny mylan dlb
http_reply_access allow all

It will work now.

Regards, Muthukumar.
Re: [squid-users] various problems
> I have delay pools setup like this:
> delay_pools 2
> delay_class 1 2
> delay_class 2 2

You have 2 delay pools, both of class 2 (per-host limits for a class C, 255.255.255.0 family of ip addresses).

> delay_parameters 1 -1/-1 7000/8000

Delay pool number 1 is a class 2 delay pool restricting each host to a limit of 7-8 kbytes.

> delay_parameters 2 -1/-1 3000/4000
>
> This produces some strange behavior: the statement delay_parameters 2 -1/-1 3000/4000 does not seem to work correctly. Instead of shaping the download to 3-4k/s it behaves just like the first delay pool, shaping the download to 7-8k. Maybe a bug in squid 2.5stable3?

You have already given the 7-8 kbyte limit to all hosts, so the second pool won't take effect. You have to change the delay_parameters settings. It would help to know:
- How much total load do you have?
- What are all the acl settings?

> delay_access 1 deny nolimit
> delay_access 1 allow files
> delay_access 1 allow clienti-64k
> delay_access 2 deny nolimit
> delay_access 2 allow files
> delay_access 2 allow clienti-32k
> delay_access 2 allow clienti-limit

Normally, don't use more than two rules per delay pool. Change the rules for pool 1 to:

delay_access 1 allow clienti-64k files !nolimit
delay_access 1 deny all

Also note: you are using the files acl in both delay pools!

> The second problem that I have is when the clients behind the proxy try to login to some forum, sourceforge.net or any other php forum: immediately after login they are presented with the message "invalid session", or the login screen appears again.

I am not sure about that problem. It would be good to redo the complete setting and check again. Send your requirement with details.

Regards, Muthukumar.
Re: [squid-users] hotmail
> Why can't my users access hotmail or any https site, since I have already declared and allowed the acl for port 443?

Are you using squid in proxy mode? What are your squid.conf settings (with the comments removed)?

Regards, Muthukumar.
Re: [squid-users] Re:squid authentication password and access list
> I changed squid.conf to use ncsa auth and PASSWORD so users are using username and password to connect to squid.

Is your ncsa auth method working correctly? Check the ncsa auth method on the command line.

> but i am having problems getting my previous access list to a specific time.
> acl WORKING time MTWHF 07:30-15:30
> acl special_url url_regex
> and
> http_access deny special_url WORKING

That will deny access to special_url during WORKING. Are you using any other access rules?

> need help to get this running with acl password proxy_auth REQUIRED

If you have configured the authentication method and tested it successfully, it will ask all users to authenticate. If you want to specifically allow certain users (user1, user2, ...):

- squid.conf settings -
acl spl_users proxy_auth user1 user2
http_access allow spl_users special_url WORKING

If you want to block all users from accessing the urls at working time, then:

- squid.conf settings -
snip
# Authentication method settings
...
# acl rules
acl WORKING time MTWHF 07:30-15:30
acl special_url url_regex
# acl to retrieve the users through the ncsa auth method
acl password proxy_auth REQUIRED
http_access deny special_url WORKING
http_access allow password
snip

Do that.

Regards, Muthukumar.
Re: [squid-users] Where are the DNS entries stored?
> Not transparent = good = keep it that way.
>
> Why?

Transparent proxying has some disadvantages. Check them at http://squid.visolve.com/squid/trans_caching.htm. Also, we cannot use an authentication method with it.

Regards, Muthukumar.
Re: [squid-users] how ccan i block aplication and media file downloading in squid
> My acl is for blocking downloading:
> acl dlb urlpath_regex -i \.exe$ \.mp3$ \.mov$ \.mpg$ \.mp?$ \.avi$ \.rm$ \.wma$ \.mpeg$
> http_access deny dlb
> but it is not working.

Are you using any more http_access rules with this? You have to be very careful with the order when setting the http_access rules.

> How can I block downloading in a time range?

Check out the time acl type. If you are going to give squid more acl settings, then be careful at the step of setting the http_access rules.

Regards, Muthukumar.
Re: [squid-users] Denying specific browser clients from using Squid
> What I would like to do is ban certain browser clients (specifically Internet Explorer) from using the squid proxy.

We can do it by using the browser acl type, which matches the User-Agent header to identify browser types. For a Mozilla client the keyword is Mozilla; I am not sure of the exact keyword for Internet Explorer, perhaps MSIE.

squid.conf setting:
acl client_type browser ...
http_access deny client_type

> A secondary aim would be to redirect them to a page telling them to use a different client.

There is a directory called errors which contains some example error messages. You can write your own message there, e.g. "Use a different client such as ...", and point to it with the deny_info tag.

squid.conf setting:
deny_info ERROR_MESSAGE client_type

> Can someone direct me to some good documentation on how to do this?

I hope http://www.clavister.com/support/kb/10026/ is good for this requirement.

Regards, Muthukumar.
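A minimal sketch of the two settings above, assuming the Internet Explorer User-Agent keyword is MSIE and using a hypothetical error page name:

```
# Match clients whose User-Agent header contains "MSIE" (case-insensitive)
acl ie_clients browser -i MSIE

# Serve a custom page from the errors directory instead of the default denial
deny_info ERR_USE_ANOTHER_BROWSER ie_clients
http_access deny ie_clients
```

The file ERR_USE_ANOTHER_BROWSER would have to be created in squid's errors directory alongside the stock templates.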
Re: [squid-users] Connection Refused in a Very Basic Set Up -- Now What?
> acl our_networks src 192.168.0.0/24
> http_access allow our_networks
> # And finally deny all other access to this proxy
> http_access deny all

You are setting an acl that allows only the 192.168.0.* machines to access the internet.

> I set the http_port to 8080

Squid is running on port 8080. What is the ip-address of the machine Squid runs on?

> and then configured firefox as follows: HTTP Proxy: 127.0.0.1 Port: 8080, SSL Proxy: 127.0.0.1 Port: 8080, and there is no proxy for localhost, 127.0.0.1

That will not work. The client tries to connect to its own localhost, not to the localhost of the machine running squid! So put the actual ip-address of the proxy machine in the browser setting. It will work then.

> Squid is running fine. When I try to open a web page, I now get a message that says "The connection was refused when attempting to contact the proxy server you have configured. Please check your proxy settings and try again."

That is because of the incorrect proxy server ip-address setting. Are you trying to run squid as a normal proxy or as a transparent proxy? With a transparent proxy there is no need for browser settings at all; if squid runs on the gateway for the 192.168.0.* machines, a transparent proxy is simple and effective.

> I am also running a firewall script as follows, could this be part of the problem?

Did you enable the httpd_accel_* options? Check this document for more about transparent proxying: http://squid.visolve.com/squid/trans_caching.htm
> #!/bin/bash
> #
> # Basic script to keep the nasties out of slack-lap
> # First we make the default policy to drop everything
> iptables -P INPUT DROP
> iptables -P FORWARD DROP
> # Allow established connections and programs that use loopback
> iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
> iptables -A INPUT -s 127.0.0.0/8 -d 127.0.0.0/8 -i lo -j ACCEPT
> # Lets allow ssh to connect
> iptables -A INPUT -p tcp --dport 22 -i ppp0 -j ACCEPT
> #end scr

A PREROUTING rule with the REDIRECT target is needed to redirect all http requests to squid, and a POSTROUTING rule with the MASQUERADE target is useful for ftp and other connections to the Internet.

Regards, Muthukumar.
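As a sketch (the interface names eth0/ppp0 and the squid port 8080 are assumptions taken from this thread; adjust to your network), the two nat rules mentioned could be added like this:

```
# Redirect HTTP arriving from the LAN interface to squid's port
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080

# Masquerade other outbound traffic (ftp, etc.) on the Internet link
iptables -t nat -A POSTROUTING -o ppp0 -j MASQUERADE
```

Note these go in the nat table, separate from the INPUT/FORWARD filter rules in the script above.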
Re: [squid-users] I want to block some page of sex?
> I want to block some web sites with sex content. For example, if any user wants to load www.sexy.com, squid should not let them see this web. I want to use a regular-expression filter.

Do you have a fixed set of web urls of that specific kind? If so, the task is easy. If you don't want to allow users to access any url which contains the word sex, set an acl as:

acl wordsex urlpath_regex sex
acl blockuser src ip-address/netmask
http_access deny blockuser wordsex

If you want to block a specific url for those users:

acl sexurl url_regex ^http://www.sexy.com
or
acl sexurl dstdomain .sexy.com
http_access deny blockuser sexurl

It would be good to have the set of urls you want to block, and to know whether the blocking applies to specific users or to all.

Regards, Muthukumar.
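If the set of blocked sites grows, it is common to keep the patterns in a file rather than inline (the path and file name here are assumptions):

```
# /usr/local/squid/etc/blocked_sites.txt holds one entry per line, e.g. .sexy.com
acl blocked_sites dstdomain "/usr/local/squid/etc/blocked_sites.txt"
http_access deny blocked_sites
```

The file can then be edited and reloaded with squid -k reconfigure without touching squid.conf.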
Re: [squid-users] Help to Configuring Squid
> I desire to restrict some users to have only MSN and no access to the Web, and the others only Web access and no MSN.

Define two sets of acls to collect the MSN and Web users:

acl msnuser src ip-address/netmask
or
acl msnuser src ip-address-ip-address/netmask
acl webusers src ip-address-ip-address/netmask

MSN messenger is using the ip-address range 64.4.13.0/24:

acl msnip dst 64.4.13.0/24

Set the http_access rules as:

# Web users: block MSN only
http_access deny webusers msnip
# MSN users: block everything except MSN
http_access deny msnuser !msnip

> In addition, I have the following problem: the users who use Squid become disconnected from MSN constantly.

Did you get any errors in the cache.log? I am not sure about this problem.

Regards, Muthukumar.
Re: [squid-users] squid acl
> Hello, good morning. I am setting up squid with the ncsa_auth plugin.

What is your authentication program setting in the squid.conf file?

Regards, Muthukumar.
Re: [squid-users] squid acl
> after i add in my webmin's authentication plugin /usr/lib/squid/ncsa_auth /etc/squid/usersUsers

Change the permission of the /etc/squid/usersUsers file so it is readable by the cache_effective_user. You have to put in a line:

auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/usersUsers

Then your "acl Users proxy_auth REQUIRED" refers to the authentication program, or to be clear:

acl ncsa proxy_auth REQUIRED

Now you have to create the users who will be your clients for authentication. Check it on the command line as:

/usr/lib/squid/ncsa_auth /etc/squid/usersUsers
user1 password1
(where user1 is the username and password1 the password for user1)
OK or ERR

If you get OK, the user exists in the system and your ncsa_auth method is good to plug into Squid.

Regards, Muthukumar.
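Putting the pieces above together, a minimal squid.conf fragment might read (paths as in the original message):

```
# NCSA basic authentication helper and its password file
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/usersUsers

# Require a valid login for every request
acl ncsa proxy_auth REQUIRED
http_access allow ncsa
http_access deny all
```

With this in place, unauthenticated requests get a 407 challenge and only users present in the password file are allowed through.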
Re: [squid-users] ACL to protect squid box
> Hi, I'm getting a lot of requests for this url.

What are the requests? Can you specify them? I think these requests come from some virus. You can scan every http request using the viralator tool: http://viralator.loddington.com/

> I want to block these kinds of wrong urls and implement more security on my squid box.

You can use squidGuard for more security: http://www.squidguard.org/

Regards, Muthukumar.
Re: [squid-users] file descriptor problem
> For some reason I am unable to mail replies to the list; it complains about my MIME version. However, I can send my first mail. Strange(??)

It is a problem with your Outlook Express. Go to Tools > Options > Send, change the mail sending format to Plain Text, then open the Plain Text settings and change from MIME to Uuencode.

> I have compiled Squid 2.5 stable 5 on a Redhat 9 server, Linux version 2.4.20-31.9smp ([EMAIL PROTECTED]) (gcc version 3.2.2 20030222 (Red Hat Linux 3.2.2-5)) #1 SMP Tue Apr 13 17:40:10 EDT 2004, but I get a msg saying

What does your default ulimit -HSn say?

> checking for strerror... (cached) yes
> checking Default FD_SETSIZE value... 256
> checking Maximum number of filedescriptors we can open... 256
> WARNING: 256 may not be enough filedescriptors if your

If so, stop the compilation, raise the limit with ulimit -HSn, and then configure again with the modifications. Can you now see the change in the FD_SETSIZE value? Are you configuring and compiling squid with root permission?

> grep FD_SETSIZE typesizes.h
> #define __FD_SETSIZE 1024
> #define __FD_SETSIZE 157272

What are your /proc/sys/net/ipv4/ip_local_port_range settings? Why are you going for this 157272 limit? It is enough to have 32768 or 32768*2 as the maximum __FD_SETSIZE.

Regards, Muthukumar.
Re: [squid-users] file descriptor problem
> > What does your default ulimit -HSn say?
> how do i check for the default?

Execute in a shell: ulimit -HSn. It will give the value of the file-descriptor limit.

> did this previously but it has not made a difference. and yes, all installs are done as root.

Do this:

ulimit -HSn 32768

Can you then see 32768 when you execute ulimit -HSn?

> [EMAIL PROTECTED] fs]# more /proc/sys/net/ipv4/ip_local_port_range
> 32768 61000

That is good. Do the above on both kernels. If you get the modified FD limit, then configure squid. Otherwise there is a further problem to look into.

Regards, Muthukumar.
Re: [squid-users] acl of type dstdomain and CONNECT not working together with dstdomain?
> acl ports port 443
> acl domains dstdomain .foo.com
> acl CONNECT method CONNECT
> http_access allow CONNECT ports domain
> http_access deny all
>
> When I try to connect to www.foo.com I get a denied access.

For the dstdomain acl type, a reverse lookup is done for ip-based urls. If the lookup fails, none is returned. You can check that request in access.log in the request method field (6th field). It did not succeed at that point, so you are getting access denied.

> When on the other hand I do (1.2.3.4 is www.foo.com's address):
> acl ports port 443
> acl hosts dst 1.2.3.4
> acl CONNECT method CONNECT
> http_access allow CONNECT ports hosts
> http_access deny all
> I do get access.

The dst acl type resolves the destination address directly, so you do not have the problem when accessing it. Compare the two requests in access.log; it will show you the difference.

Regards, Muthukumar.