RE: [squid-users] Authentication related query
Hi,

Thanks for the response. I am pasting the squid.conf below for your perusal.

http_port 3128
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
acl all src 0/0
no_cache deny all
cache_dir null /usr/local/squid/var/
# The below options are for redirect
#debug_options ALL,1 61,9 33,5
redirect_program /usr/local/ContentFilter/filter
redirect_children 5
auth_param basic program /usr/local/squid/sbin/pam_auth
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 10 minutes
authenticate_ttl 10 minutes
authenticate_ip_ttl 10 minutes
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl authenticated proxy_auth REQUIRED
acl servers dst 10.10.10.47
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow servers
http_access allow authenticated
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_reply_access allow all
icp_access allow all
coredump_dir /usr/local/squid/var/cache
redirector_bypass off

Kindly tell me if there is any problem in the conf.

Regards and TIA,
Deepa

--- Adam Aube <[EMAIL PROTECTED]> wrote:
> > No, the browser is configured to use the proxy.
>
> Then the problem could be with your browser. Squid requires the
> browser to send the authentication credentials for every request made.
> It is the browser alone that controls when and how to get the
> authentication info from the user.
>
> Post your squid.conf so we can make sure nothing's wrong there - be
> sure to remove any blank lines or comments.
>
> Adam
RE: [squid-users] setting up a blacklist
> A few problems here:
>
> 1) The first porn acl should be url_regex, not dstdom_regex (guessing
> from the file name) - dstdom_regex won't match anything after the
> hostname
> 2) The 3rd porn acl is missing the acl type (suggest url_regex or
> urlpath_regex)
> 3) Since you're referencing files, you might have to make those 3 porn
> acls porn1, porn2, and porn3. (You definitely will if they're not the
> same acl type)

Ok ... I can see that.

> 4) The "http_access deny porn" is after you've already allowed your
> local network, so it won't have any effect

Oops :-)

> I don't see anything that would give the symptoms you report
> (excessive CPU utilization on startup and shutdown).

Check my top output ... it was a memory bog, not CPU.

> Having too many patterns in the files can cause high CPU utilization,
> but I would expect that to be fairly constant. Maybe someone else has
> more insight.

I'm now in the process of setting up squidGuard based on the suggestion
from Gareth. Thanks for your suggestions too.

Bill

---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.518 / Virus Database: 316 - Release Date: 9/11/2003
Re: [squid-users] setting up a blacklist
>> Can you post your squid.conf (without comments or blank lines)?

> acl homenet src 192.168.212.0/24
> http_access allow homenet
> http_access allow localhost
> http_access deny all
> acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
> acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
> acl porn "/usr/share/squid/blacklists/porn/expressions"
> deny_info ERR_NO_PORNO porn
> http_access deny porn

A few problems here:

1) The first porn acl should be url_regex, not dstdom_regex (guessing
from the file name) - dstdom_regex won't match anything after the
hostname
2) The 3rd porn acl is missing the acl type (suggest url_regex or
urlpath_regex)
3) Since you're referencing files, you might have to make those 3 porn
acls porn1, porn2, and porn3. (You definitely will if they're not the
same acl type)
4) The "http_access deny porn" is after you've already allowed your
local network, so it won't have any effect

I don't see anything that would give the symptoms you report (excessive
CPU utilization on startup and shutdown). Having too many patterns in
the files can cause high CPU utilization, but I would expect that to be
fairly constant. Maybe someone else has more insight.

Adam
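Putting the four points together, the corrected block might look roughly like this (an untested sketch; the acl names porn1-3 and the use of url_regex for the urls and expressions files are assumptions drawn from the advice above, not a verified configuration):

```
acl porn1 url_regex "/usr/share/squid/blacklists/porn/urls"
acl porn2 dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn3 url_regex "/usr/share/squid/blacklists/porn/expressions"
deny_info ERR_NO_PORNO porn1
deny_info ERR_NO_PORNO porn2
deny_info ERR_NO_PORNO porn3
http_access deny porn1
http_access deny porn2
http_access deny porn3
http_access allow homenet
http_access allow localhost
http_access deny all
```

The key change is that the deny rules now come before "http_access allow homenet", so local clients are actually checked against the blacklists.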
Re: [squid-users] setting up a blacklist
>> Can you post your squid.conf (without comments or blank lines)? > Here ya go ... I think you're missing a few things - like cache_dir and cache_mem. What are those lines in your squid.conf? Adam
[squid-users] help:url not retrieved
Hello. After configuring a browser to connect to squid at localhost, I receive an "URL could not be retrieved" error page because forwarding was denied. It says the cache will not forward the request because it is trying to enforce a sibling relationship. Do you have any hints to resolve this? Could you point me to the part of the configuration guide I should look at? Thanks in advance
RE: [squid-users] setting up a blacklist
Why don't you save yourself the headache and use squidGuard or DansGuardian?

www.squidGuard.org
http://dansguardian.org/
Re: [squid-users] setting up a blacklist
Bill,

--
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
--

As far as I know this is not correct. Other Squid users: please correct
me if I'm wrong.

rgrds,
Bart

Bill McCormick wrote:

Squid brings my dual Xeon Dell to its knees on startup and shutdown.

Can you post your squid.conf (without comments or blank lines)?

Adam

Here ya go ...

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl homenet src 192.168.212.0/24
http_access allow homenet
http_access allow localhost
http_access deny all
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
deny_info ERR_NO_PORNO porn
http_access deny porn
http_reply_access allow all
icp_access allow all
visible_hostname billinux
coredump_dir /var/spool/squid
RE: [squid-users] setting up a blacklist
> > > Squid brings my dual Xeon Dell to its knees on startup and shutdown.
> >
> > Can you post your squid.conf (without comments or blank lines)?
> >
> > Adam
>
> Here ya go ...

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl homenet src 192.168.212.0/24
http_access allow homenet
http_access allow localhost
http_access deny all
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
deny_info ERR_NO_PORNO porn
http_access deny porn
http_reply_access allow all
icp_access allow all
visible_hostname billinux
coredump_dir /var/spool/squid
RE: [squid-users] setting up a blacklist
> Squid brings my dual Xeon Dell to its knees on startup and shutdown.

Can you post your squid.conf (without comments or blank lines)?

Adam
[squid-users] setting up a blacklist
Hello all,

I'm a squid newbie trying to use a blacklist and am having some
problems: Squid brings my dual Xeon Dell to its knees on startup and
shutdown. One thing that might help is if my system had more RAM (only
128 meg) but that'll have to wait. I think I've got some squid.conf
configuration issues, but poring over the on-line docs has not revealed
the solution.

Here are the relevant squid.conf items:

acl porn dstdom_regex "/usr/share/squid/blacklists/porn/urls"
acl porn dstdom_regex "/usr/share/squid/blacklists/porn/domains"
acl porn "/usr/share/squid/blacklists/porn/expressions"
deny_info ERR_NO_PORNO porn
http_access deny porn

and the files:

-rw-rw-r-- 1 bill bill 807446 Sep 3 19:16 domains
-rw-rw-r-- 1 bill bill 802 Jun 17 2002 expressions
-rw-rw-r-- 1 bill bill 746410 Sep 3 19:28 urls

Here's a top (well .. the top part of it):

14:25:33 up 1:08, 3 users, load average: 9.09, 6.38, 2.98
81 processes: 79 sleeping, 2 running, 0 zombie, 0 stopped
CPU0 states: 0.7% user 10.115% system 0.0% nice 0.0% iowait 89.133% idle
CPU1 states: 0.10% user 11.29% system 0.0% nice 0.0% iowait 88.216% idle
CPU2 states: 0.3% user 10.118% system 0.0% nice 0.0% iowait 89.134% idle
CPU3 states: 0.11% user 10.244% system 0.0% nice 0.0% iowait 89.0% idle
Mem: 125396k av, 123252k used, 2144k free, 0k shrd, 1316k buff
107368k actv, 880k in_d, 1440k in_c
Swap: 257000k av, 256996k used, 4k free, 2576k cached

PID  USER PRI NI SIZE RSS SHARE STAT %CPU %MEM TIME CPU COMMAND
1165 root 15  0  146M 99M 144   D    13.3 81.4 0:17 3   squid
[squid-users] xmalloc and out of memory errors in messages log
Hi,

I am currently seeing these messages in my /var/log/messages:

Sep 20 00:09:58 blar out of memory [28008
Sep 20 00:09:59 blar squid[29539]: Squid Parent: child process 28008 exited due to signal 6
Sep 20 01:39:21 blar (squid): xmalloc: Unable to allocate 87380 bytes!
Sep 20 01:39:21 blar squid[29539]: Squid Parent: child process 28435 exited due to signal 6

During peak hours, I'm getting about 8 xmalloc errors per hour. The
out-of-memory errors are less frequent, at about twice a day.

I've read the FAQ. It states 2 possible reasons for the OS to give an
xmalloc error:

1) out of swap
2) data segment size reached

I am sure that the machine never ran out of swap, using monitoring
tools. And the output below shows that I've set the data segment size
to unlimited.

[EMAIL PROTECTED] squid]# ulimit -a
core file size      (blocks, -c)    0
data seg size       (kbytes, -d)    unlimited
file size           (blocks, -f)    unlimited
max locked memory   (kbytes, -l)    unlimited
max memory size     (kbytes, -m)    unlimited
open files          (-n)            1024
pipe size           (512 bytes, -p) 8
stack size          (kbytes, -s)    8192
cpu time            (seconds, -t)   unlimited
max user processes  (-u)            7168
virtual memory      (kbytes, -v)    unlimited

What else can I do about the xmalloc and out of memory errors?

My caching disk partitions (reiserfs):

/dev/sdb1 34G 22G 12G 64% /cdata1
/dev/sdc1 34G 22G 12G 64% /cdata2
/dev/sdd1 34G 22G 12G 64% /cdata3

linux kernel 2.4.20-19.8
DELL 2.4 Xeon SP, 2G RAM, 2G swap, 2X34G mirror for OS, 3X34G volume for cache
About 120 req/s (non-peak) and 250 req/s (peak).
Squid Cache: Version 2.5.STABLE3 (patched deny_info err)
configure options: --prefix=/usr/local/squid --enable-dlmalloc
--enable-removal-policies=heap --enable-storeio=aufs
--enable-underscores --enable-async-io=50 --enable-snmp

squid.conf (important sections)
==
cache_mem 64 MB
cache_swap_low 92
# cache_swap_high 95
maximum_object_size 1024 KB
maximum_object_size_in_memory 100 KB
cache_replacement_policy heap GDSF
memory_replacement_policy heap GDSF
cache_dir aufs /cdata1 23800 56 256
cache_dir aufs /cdata2 23800 56 256
cache_dir aufs /cdata3 23800 56 256
quick_abort_min -1 KB
half_closed_clients off
memory_pools off

One more thing: I've another squid (identical in everything except
cache space) with 16G X 3 of caching space and it exhibits the same
frequency of errors.

Pls help. Let me know if I've left out anything. Thanks.

--
Wolf
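For what it's worth, here is a back-of-the-envelope check of the expected memory footprint for this configuration, using the Squid FAQ's rough rule of thumb of about 10 MB of index RAM per GB of disk cache (the exact per-object overhead varies by version, mean object size, and platform, so treat the numbers as illustrative only):

```python
# Rough steady-state memory estimate for the configuration above.
# Assumes ~10 MB of index metadata per GB of cache_dir (Squid FAQ rule
# of thumb); real overhead depends on mean object size and platform.
cache_dir_mb = 3 * 23800                 # three aufs dirs, 23800 MB each
index_ram_mb = cache_dir_mb / 1024 * 10  # metadata for on-disk objects
cache_mem_mb = 64                        # hot-object memory from squid.conf

total_mb = index_ram_mb + cache_mem_mb
print(round(total_mb))                   # on the order of 760 MB
```

If that estimate is in the right ballpark, the index plus cache_mem alone should fit comfortably in 2 GB of RAM, which suggests the xmalloc failures come from something other than sheer cache size.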
Re: [squid-users] IE SSL Timeout
Hmm ... I tried it with Apache/2.0.43 (Win32) and it works fine ... I am really puzzled :-/

- claus
[squid-users] R: [squid-users] Block mp3 mpg ecc.....
Ok, it works! Thanks.

Andrea Soccal
IT Sistema Ufficio
[EMAIL PROTECTED]

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
Sent: Friday, 19 September 2003 17:22
To: Soccal Andrea
Cc: '[EMAIL PROTECTED]'
Subject: Re: [squid-users] Block mp3 mpg ecc.

On Fri, 19 Sep 2003, Soccal Andrea wrote:

> It's true ?
> Acl download urlpath_regex \.mp3 \.avi .. ?

Yes.. have you tried?

Regards
Henrik
Re: [squid-users] IE SSL Timeout
Claus wrote:

> The problem does not happen when:
> Using XP/IE - WinGate - UBS
> Using XP/IE - ISA - SQUID - UBS

Sorry, I did not remember my present config ... it works using XP/IE - ISA - UBS. Only ISA, not a cascade with SQUID ... Sorry!

- claus
Re: [squid-users] Debug Settings
On Fri, 19 Sep 2003, Kent, Mr. John wrote: > What would be the recommended debug_options settings to see the IP# Squid is > refusing to > make a connection for? None really needed. This you see in access.log. > The background: trying to use cachemger.cgi. When I call it, after the login > password page > get > Cache Manager Error > connect: (111) Connection refused Then you are not specifying the correct hostname and/or port where cachemgr should connect. This request never reached your Squid at all. Regards Henrik
Re: [squid-users] Block mp3 mpg ecc.....
On Fri, 19 Sep 2003, Soccal Andrea wrote: > It's true ? > Acl download urlpath_regex \.mp3 \.avi .. ? Yes.. have you tried? Regards Henrik
Re: [squid-users] consequent SIGSEGVs
On Fri, 19 Sep 2003, oleg-s wrote: > squid dies very often at the random daytime. Then please open a bug report and include a backtrace of the error. Note: The backtraces printed in your cache.log is not valid backtraces as they do not contain any symbol information. Is your Squid binary stripped? Regards Henrik
RE: [squid-users] squid acl help needed
> The time acl is an exception that you can only list a single time per > line, but you can still list multiple lines. That I did not know - thanks for the correction. Adam
Re: [squid-users] R: [squid-users] Block site with a specific word in the entire url
On Fri, 19 Sep 2003, Soccal Andrea wrote: > Ok > But if i have a word (ex) similar at the word (sex) > With > Acl rule1 url_regex -i sex Don't do it like this. The above is not a regex for the word sex, it is a regex for the letter 's' followed by 'e' followed by 'x', and will match any URL where this sequence of letters occurs. See previous response. Regards Henrik
Re: [squid-users] Block site with a specific word in the entire url
On Fri, 19 Sep 2003, Soccal Andrea wrote: > I want block the site with a SPECIFIC words in the url ...for example: You can use url_regex for this, but you must be careful to tell it that it is words you want to block. If using GNU regex or Linux (or any regex library derived from the GNU regex or compatible library) then you can use url_regex patterns like \bfuck\b to match the word "fuck". (\b matches a boundary between what looks like two words) > With -i option ??what's the mean of -i option ?? case-insensitive, A == a Regards Henrik
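Henrik's point about \b can be illustrated with Python's re module, whose word-boundary escape behaves like the GNU regex one he describes (a sketch only; the URLs are made up for illustration):

```python
import re

# \bsex\b matches "sex" only as a whole word, delimited by non-word
# characters such as "/" or "." on either side.
word = re.compile(r"\bsex\b")

assert word.search("http://example.com/sex/index.html")         # whole word: matches
assert word.search("http://www.sex.example/")                   # delimited by dots
assert not word.search("http://example.com/msexcel/page.html")  # inside another word

# The bare pattern matches the letter sequence anywhere:
assert re.search("sex", "http://example.com/msexcel/page.html")
```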
Re: [squid-users] Cluster of caches
On Fri, 19 Sep 2003, Wilhelm Farrugia wrote:

> How would the performance in peering of caches effect the users - latency
> consideration ?

Usually an improvement, but it depends a bit on your situation.

> If I get page from a peered cache, this info will be saved on the cache
> making the request ?

You can select this via the cache_peer directive. The default is to save copies.

> Squid make checks with peering caches and directly to the relevant server
> when it has a miss ? Will serve page from the first who replies ?

Usually cache revalidation goes to the origin server. It is not very
reliable to make cache revalidation via neighbor caches, as it can
easily create a ping-pong effect where none of the peers realise that
the content they ping-pong to each other is stale.

Regards
Henrik
Re: [squid-users] squid acl help needed
On Fri, 19 Sep 2003, Payal Rathod wrote: > Wowww! I thought that three acls by the same name might create a > problem. Not as long as you always stuff the same type of content into the acl. For most ACLs you can list as many things as you want to match on the same line, or on multiple lines. The time acl is an exception that you can only list a single time per line, but you can still list multiple lines. Regards Henrik
Re: [squid-users] Authentication related query
On Fri, 19 Sep 2003, Deepa D wrote: > No,the browser is configured to use the proxy. Then I do not know what the cause may be... have you tried with another browser? Regards Henrik
[squid-users] Debug Settings
Greetings,

Love Squid! It helped serve forecasts and photos during hurricane
Isabel. Trying to eke out better performance, I want to use
cachemgr.cgi. So far no luck.

I think my question is: What would be the recommended debug_options
settings to see the IP# Squid is refusing to make a connection for?

The background: trying to use cachemgr.cgi. When I call it, after the
login password page I get:

Cache Manager Error
connect: (111) Connection refused
Generated Fri, 19 Sep 2003 14:51:00 GMT, by cachemgr.cgi/[EMAIL PROTECTED]

I have set my config file in accordance with the FAQ:

acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl example src 199.9.2.136/255.255.255.255
acl example src 199.9.2.137/255.255.255.255
acl all src 0.0.0.0/0.0.0.0
http_access allow manager localhost
http_access allow manager example
http_access deny manager
http_access allow all

Thank you for your help,

John Kent
Webmaster
Naval Research Laboratory
Monterey, CA
http://www.nrlmry.navy.mil/tc_pages/tc_home.html
Re: [squid-users] Block mp3 mpg ecc.....
Should work. It would be interesting to change them to \.avi$ \.mp3$
... $ means 'at the end'. And, of course, don't forget to use some
http_access deny with your ACL!!! Simply creating the acl will do
nothing; you have to use it in some http_access rule.

Sincerely,
Leonardo Rodrigues

- Original Message -
From: "Soccal Andrea" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 11:44 AM
Subject: [squid-users] Block mp3 mpg ecc.

> It's true ?
> Acl download urlpath_regex \.mp3 \.avi .. ?
>
> thanks
RE: [squid-users] Block mp3 mpg ecc.....
> It's true ?
> Acl download urlpath_regex \.mp3 \.avi .. ?

Yes. You may want to use -i (makes the match case-insensitive) and end
each extension pattern with a $ (makes it match at the end of the URL).

Adam
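A quick sketch of what -i and a trailing $ buy you, using Python regexes as a stand-in for Squid's urlpath_regex matching (an illustration under that assumption, not Squid's actual code; the paths are made up):

```python
import re

# -i corresponds to case-insensitive matching; $ anchors the match at
# the end of the URL path, so ".mp3" buried mid-URL no longer matches.
anchored = re.compile(r"\.(mp3|avi)$", re.IGNORECASE)

assert anchored.search("/music/song.MP3")           # case-insensitive hit
assert anchored.search("/videos/clip.avi")
assert not anchored.search("/docs/guide.mp3.html")  # extension not at the end

# Without the anchor, the same page would be blocked by mistake:
assert re.search(r"\.mp3", "/docs/guide.mp3.html")
```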
[squid-users] Block mp3 mpg ecc.....
It's true ?
Acl download urlpath_regex \.mp3 \.avi .. ?

thanks

Andrea Soccal
IT Sistema Ufficio
[EMAIL PROTECTED]
[squid-users] IE SSL Timeout
Hello all,

I have a problem with a frame timing out when using XP Pro with the
latest hotfix, IE 6.0.280.1106 and SQUID 2.5 Stable 3 (I also tried a
3 devel version, and Mozilla - I do not remember the version) on the
following link:

https://telebank1.ubs.com/classic?initiate

The link points to the tele banking system of UBS. There is a "Demo"
mode that I tried, on which the problem happens as well. Could anyone
with a perfectly running Squid and IE on XP try the following and tell
me if it works?

Click Demo (change language if you need to, on the upper nav bar)
Click Account (in English; Konto in German; Compte in French on the left nav frame)
Click Last transactions (the first entry below Account on the left nav frame)

This link refreshes the parent. I get the "last transactions" on the
right hand frame and a blank frame on the left hand side. When I stop
Squid the frame appears ...

The problem does not happen when:
Using XP/IE - WinGate - UBS
Using XP/IE - ISA - SQUID - UBS

Very strange. I fiddled around with some Squid parameters (like
ssl_unclean_shutdown) that could have an impact, but with no success.
Sorry for being so intrusive, but it is really the only place where I
could reproduce it.

Thanks and regards,
Claus
[squid-users] consequent SIGSEGVs
hello. squid dies very often at random times during the day.

--
2003/09/19 14:37:36| Starting Squid Cache version 2.5.STABLE4 for i586-pc-linux-gnu...
2003/09/19 14:37:36| Process ID 24805
2003/09/19 14:37:36| With 1024 file descriptors available
2003/09/19 14:37:36| DNS Socket created at 0.0.0.0, port 35288, FD 4
2003/09/19 14:37:36| Adding nameserver 217.107.132.22 from squid.conf
2003/09/19 14:37:36| helperOpenServers: Starting 10 'squidGuard' processes
2003/09/19 14:37:40| helperOpenServers: Starting 7 'auth_md5_wo_ip' processes
2003/09/19 14:37:41| helperOpenServers: Starting 8 'ip_acl' processes
2003/09/19 14:37:43| Unlinkd pipe opened on FD 35
2003/09/19 14:37:43| Swap maxSize 2682880 KB, estimated 206375 objects
2003/09/19 14:37:43| Target number of buckets: 10318
2003/09/19 14:37:43| Using 16384 Store buckets
2003/09/19 14:37:43| Max Mem size: 49152 KB
2003/09/19 14:37:43| Max Swap size: 2682880 KB
2003/09/19 14:37:43| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2003/09/19 14:37:43| Store logging disabled
2003/09/19 14:37:43| Rebuilding storage in /cache (CLEAN)
2003/09/19 14:37:43| Using Least Load store dir selection
2003/09/19 14:37:43| Set Current Directory to /cache/
2003/09/19 14:37:43| Loaded Icons.
2003/09/19 14:37:51| Accepting HTTP connections at 192.168.7.1, port 3128, FD 36.
2003/09/19 14:37:51| Ready to serve requests.
2003/09/19 14:37:51| Store rebuilding is 2.0% complete
[0x80972b4]
/lib/libc.so.6(sigaction+0x268)[0x48c8dc68]
[0x80a523f]
[0x80746e4]
[0x8063671]
[0x80808ee]
/lib/libc.so.6(__libc_start_main+0xff)[0x48c879cb]
[0x804a9c1]
FATAL: Received Segment Violation...dying.
2003/09/19 14:37:53| Not currently OK to rewrite swap log.
2003/09/19 14:37:53| storeDirWriteCleanLogs: Operation aborted.
CPU Usage: 1.550 seconds = 1.040 user + 0.510 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 302
Memory usage for squid via mallinfo():
total space in arena: 4204 KB
Ordinary blocks: 4202 KB 1 blks
Small blocks: 0 KB 0 blks
Holding blocks: 524 KB 2 blks
Free Small blocks: 0 KB
Free Ordinary blocks: 1 KB
Total in use: 4726 KB 112%
Total free: 1 KB 0%
2003/09/19 14:37:56| Starting Squid Cache version 2.5.STABLE4 for i586-pc-linux-gnu...
2003/09/19 14:37:56| Process ID 9229
2003/09/19 14:37:56| With 1024 file descriptors available
2003/09/19 14:37:56| DNS Socket created at 0.0.0.0, port 35288, FD 4
2003/09/19 14:37:56| Adding nameserver 217.107.132.22 from squid.conf
2003/09/19 14:37:56| helperOpenServers: Starting 10 'squidGuard' processes
2003/09/19 14:37:57| helperOpenServers: Starting 7 'auth_md5_wo_ip' processes
2003/09/19 14:38:01| helperOpenServers: Starting 8 'ip_acl' processes
2003/09/19 14:38:05| Unlinkd pipe opened on FD 35
2003/09/19 14:38:05| Swap maxSize 2682880 KB, estimated 206375 objects
2003/09/19 14:38:05| Target number of buckets: 10318
2003/09/19 14:38:05| Using 16384 Store buckets
2003/09/19 14:38:05| Max Mem size: 49152 KB
2003/09/19 14:38:05| Max Swap size: 2682880 KB
2003/09/19 14:38:05| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2003/09/19 14:38:05| Store logging disabled
2003/09/19 14:38:05| Rebuilding storage in /cache (DIRTY)
2003/09/19 14:38:05| Using Least Load store dir selection
2003/09/19 14:38:05| Set Current Directory to /cache/
2003/09/19 14:38:05| Loaded Icons.
2003/09/19 14:38:06| Accepting HTTP connections at 192.168.7.1, port 3128, FD 36.
2003/09/19 14:38:06| Ready to serve requests.
2003/09/19 14:38:08| Store rebuilding is 2.0% complete
[0x80972b4]
/lib/libc.so.6(sigaction+0x268)[0x49eccc68]
[0x80a523f]
[0x80746e4]
[0x8063671]
[0x80808ee]
/lib/libc.so.6(__libc_start_main+0xff)[0x49ec69cb]
[0x804a9c1]
FATAL: Received Segment Violation...dying.
2003/09/19 14:38:08| Not currently OK to rewrite swap log.
2003/09/19 14:38:08| storeDirWriteCleanLogs: Operation aborted.
CPU Usage: 0.570 seconds = 0.390 user + 0.180 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 302
Memory usage for squid via mallinfo():
total space in arena: 2868 KB
Ordinary blocks: 2867 KB 1 blks
Small blocks: 0 KB 0 blks
Holding blocks: 524 KB 2 blks
Free Small blocks: 0 KB
Free Ordinary blocks: 0 KB
Total in use: 3391 KB 118%
Total free: 0 KB 0%
2003/09/19 14:38:11| Starting Squid Cache version 2.5.STABLE4 for i586-pc-linux-gnu...
2003/09/19 14:38:11| Process ID 7420
2003/09/19 14:38:11| With 1024 file descriptors available
2003/09/19 14:38:11| DNS Socket created at 0.0.0.0, port 35288, FD 4
2003/09/19 14:38:11| Adding nameserver 217.107.132.22 from squid.conf
2003/09/19 14:38:11| helperOpenServers: Starti
Re: [squid-users] R: [squid-users] Block site with a specific word in the entire url
If you define 'acl rule1 url_regex -i sex', this rule will deny
EVERYTHING that contains the expression 'sex', NO MATTER in what case
(sEx, seX, SEx, etc etc), in the WHOLE url (www.sex.com/this,
www.something.com/sex, www.microsoft.com/msexcel, etc etc).

Note the msexcel?? mSEXcel would be blocked too!!!

If you got only 'ex' in the url and it's not 'sex', this rule will NOT
block it.

Sincerely,
Leonardo Rodrigues

- Original Message -
From: "Soccal Andrea" <[EMAIL PROTECTED]>
To: "'Leonardo Rodrigues Magalhães'" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 10:47 AM
Subject: [squid-users] R: [squid-users] Block site with a specific word in the entire url

Ok
But if i have a word (ex) similar to the word (sex)
With

Acl rule1 url_regex -i sex
Http_access denied rule1

Does it block the access or not?

-Original Message-
From: Leonardo Rodrigues Magalhães [mailto:[EMAIL PROTECTED]
Sent: Friday, 19 September 2003 15:48
To: Soccal Andrea; [EMAIL PROTECTED]
Subject: Re: [squid-users] Block site with a specific word in the entire url

-i means NOT case sensitive.

acl rule1 url_regex sex
http_access deny rule1

will deny http://www.something.com/sex but will ALLOW
http://www.something.com/sEx because url_regex IS case sensitive and
you asked to block 'sex' and not 'sEx'.

Using

acl rule1 url_regex -i sex

(now we got a case INsensitive rule) instead will block all possible
case combinations of the word sex, like SEx, SEX, SeX, etc.

Hint: Use -i :)

- Original Message -
From: "Soccal Andrea" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 10:37 AM
Subject: [squid-users] Block site with a specific word in the entire url

> Hi squid boys !
> I want to block sites with a SPECIFIC word in the url ... for example:
>
> Www.domain.com/
> www.domain.com/ok/
>
> The command is
> Acl aclname url_regex ???
>
> With the -i option ?? What's the meaning of the -i option ??
>
> thanks
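The msexcel example is easy to verify with Python's re module (a hypothetical illustration of the same regex semantics, not Squid's own matcher):

```python
import re

# An unanchored, case-insensitive "sex" matches inside unrelated words.
rule = re.compile("sex", re.IGNORECASE)

assert rule.search("http://www.microsoft.com/msexcel")  # collateral damage
assert rule.search("http://www.something.com/sEx")      # any case mix
assert not rule.search("http://www.something.com/ex")   # "ex" alone is safe
```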
[squid-users] R: [squid-users] Block site with a specific word in the entire url
Ok
But if i have a word (ex) similar to the word (sex)
With

Acl rule1 url_regex -i sex
Http_access denied rule1

Does it block the access or not?

Andrea Soccal
IT Sistema Ufficio
[EMAIL PROTECTED]

-Original Message-
From: Leonardo Rodrigues Magalhães [mailto:[EMAIL PROTECTED]
Sent: Friday, 19 September 2003 15:48
To: Soccal Andrea; [EMAIL PROTECTED]
Subject: Re: [squid-users] Block site with a specific word in the entire url

-i means NOT case sensitive.

acl rule1 url_regex sex
http_access deny rule1

will deny http://www.something.com/sex but will ALLOW
http://www.something.com/sEx because url_regex IS case sensitive and
you asked to block 'sex' and not 'sEx'.

Using

acl rule1 url_regex -i sex

(now we got a case INsensitive rule) instead will block all possible
case combinations of the word sex, like SEx, SEX, SeX, etc.

Hint: Use -i :)

Sincerely,
Leonardo Rodrigues

- Original Message -
From: "Soccal Andrea" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 10:37 AM
Subject: [squid-users] Block site with a specific word in the entire url

> Hi squid boys !
> I want to block sites with a SPECIFIC word in the url ... for example:
>
> Www.domain.com/
> www.domain.com/ok/
>
> The command is
> Acl aclname url_regex ???
>
> With the -i option ?? What's the meaning of the -i option ??
>
> thanks
Re: [squid-users] Block site with a specific word in the entire url
-i means NOT case sensitive.

acl rule1 url_regex sex
http_access deny rule1

will deny http://www.something.com/sex but will ALLOW
http://www.something.com/sEx because url_regex IS case sensitive and
you asked to block 'sex' and not 'sEx'.

Using

acl rule1 url_regex -i sex

(now we got a case INsensitive rule) instead will block all possible
case combinations of the word sex, like SEx, SEX, SeX, etc.

Hint: Use -i :)

Sincerely,
Leonardo Rodrigues

- Original Message -
From: "Soccal Andrea" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 10:37 AM
Subject: [squid-users] Block site with a specific word in the entire url

> Hi squid boys !
> I want to block sites with a SPECIFIC word in the url ... for example:
>
> Www.domain.com/
> www.domain.com/ok/
>
> The command is
> Acl aclname url_regex ???
>
> With the -i option ?? What's the meaning of the -i option ??
>
> thanks
[squid-users] Block site with a specific word in the entire url
Hi squid boys !

I want to block sites with a SPECIFIC word in the url ... for example:

Www.domain.com/
www.domain.com/ok/

The command is
Acl aclname url_regex ???

With the -i option ?? What's the meaning of the -i option ??

thanks

Andrea Soccal
IT Sistema Ufficio
[EMAIL PROTECTED]
[squid-users] Block site with a specific word in the entire url
Hi squid boys !

I want to block sites with a SPECIFIC word in the url ... for example:

Www.domain.com/pussy
www.domain.com/ok/fuck

The command is
Acl aclname url_regex ???

With the -i option ?? What's the meaning of the -i option ??

thanks

Andrea Soccal
IT Sistema Ufficio
[EMAIL PROTECTED]
Re: [squid-users] Why is Squid restarting?
> > Statistics based on 10 minutes of access.log during peak usage gives a
> > quite good figure.
>
> How to get those statistics?

Grab 10 minutes of data from access.log and run your preferred log
statistics program on the extracted access.log.

Ok. I'll do that.

> Yes, it is the ntlm helper children. With 1 GB of RAM I think I can
> increase this number if needed.

Indeed. I would start with 20, since you are today using 5 and your
Squid complains a lot. Then use cachemgr to see if this number is sane
during peak load (the last helper should only be used occasionally).

I put 15. If it uses the last helper a lot, I'll increase this number.

> > A nice server, but total overkill for the job.
>
> Do you think I could work with only one processor?

Yes, of course it will. Squid can only use one processor to start with,
so unless you are also running other CPU intensive operations on the
same box the second CPU will be virtually unused.

I have another HP PROLIANT DL380 with just one processor. I'll use that
server. Thanks a lot.

Thank you, regards,
Joao
RE: [squid-users] Authentication related query
> No, the browser is configured to use the proxy.

Then the problem could be with your browser. Squid requires the browser to send the authentication credentials for every request made. It is the browser alone that controls when and how to get the authentication info from the user.

Post your squid.conf so we can make sure nothing's wrong there - be sure to remove any blank lines or comments.

Adam
[squid-users] Cluster of caches
Hello,

How would peering of caches affect users in terms of latency? If I get a page from a peered cache, will it also be saved on the cache making the request? Does Squid check with its peer caches and directly with the origin server when it has a miss, and serve the page from the first that replies?

Thank you,
Regards,

Wilhelm
RE: [squid-users] squid acl help needed
>> Then replace lunchbreak with the following
>>
>> acl coffeebreak time 09:00-10:00
>> acl coffeebreak time 13:00-14:00
>> acl coffeebreak time 18:00-19:00
>
> Wowww! I thought that three acls by the same name might
> create a problem.

No, all it does is combine them - just as if you did:

acl coffeebreak time 09:00-10:00 13:00-14:00 18:00-19:00

Both will work.

Adam
Re: [squid-users] Ftp THrough proxy
michel lodap wrote:

> Hi All,
>
> I am having trouble ftp-ing through squid.
> It works fine when I don't go through it or when I use an ftp client.
>
> For example I went to the HP web site and tried to download some
> drivers (there are no restrictions at all concerning downloads).
> This is what I am getting:
>
> ERROR
> The requested URL could not be retrieved
>
> The following URL could not be retrieved:
>
> ftp://ftp.hp.com/pub/softlib/software1/lj845/lj-1908-1/lj845en.exe
>
> Squid sent the following FTP command:
>
> RETR lj845en.exe
>
> and then received this reply:
>
> Can't build data connection: Connection timed out. This might be caused by
> an FTP URL with an absolute path (which does not comply with RFC 1738). If
> this is the cause, then the file can be found at
> ftp://ftp.hp.com/%2f/pub/softlib/software1/lj845/lj-1908-1/lj845en.exe.
>
> Your cache administrator is root
>
> Any advice?
> Many thanks in advance

Hm, the original url works for me, through squid 2.5S4 on redhat 6.2. Make sure that you are not confronted with firewalling issues. A good test, for instance, is to see what happens if the ftp is done manually with the ftp command from the squid box.

M.
Re: [squid-users] squid acl help needed
On Fri, Sep 19, 2003 at 02:14:49PM +0200, Henrik Nordstrom wrote:
> On Fri, 19 Sep 2003, Payal Rathod wrote:
>
> > What if I have to allow from time 09:00-10:00 and 6:00-07:00 too with
> > lunchbreak?
> >
> > I mean the users can access hotmail, yahoo in the above 3 hours only.
>
> Then replace lunchbreak with the following
>
> acl coffeebreak time 09:00-10:00
> acl coffeebreak time 13:00-14:00
> acl coffeebreak time 18:00-19:00

Wowww! I thought that three acls by the same name might create a problem.
-Payal

> Regards
> Henrik
>
> note: Squid-2.5.STABLE2 or later required, for earlier versions you need
> to create one ACL per time interval

--
For GNU/Linux Success Stories and Articles visit:
http://payal.staticky.com
[squid-users] Ftp THrough proxy
Hi All,

I am having trouble ftp-ing through squid. It works fine when I don't go through it or when I use an ftp client.

For example I went to the HP web site and tried to download some drivers (there are no restrictions at all concerning downloads). This is what I am getting:

ERROR
The requested URL could not be retrieved

The following URL could not be retrieved:

ftp://ftp.hp.com/pub/softlib/software1/lj845/lj-1908-1/lj845en.exe

Squid sent the following FTP command:

RETR lj845en.exe

and then received this reply:

Can't build data connection: Connection timed out.

This might be caused by an FTP URL with an absolute path (which does not comply with RFC 1738). If this is the cause, then the file can be found at ftp://ftp.hp.com/%2f/pub/softlib/software1/lj845/lj-1908-1/lj845en.exe.

Your cache administrator is root

Any advice? Many thanks in advance
Re: [squid-users] Authentication related query
Hi,

No, the browser is configured to use the proxy.

Regards and TIA,
Deepa

--- Henrik Nordstrom <[EMAIL PROTECTED]> wrote:

> On Fri, 19 Sep 2003, Deepa D wrote:
>
> > The squid is configured to use pam_auth as a basic
> > auth helper and the cache is disabled. For every url
> > request a popup window appears asking for user name
> > and password.
>
> Are you attempting to set up authentication in a
> transparently intercepting proxy? This is not possible
> due to the nature of intercepting port 80.
>
> To use authentication you MUST have the browser
> configured to use the proxy.
>
> Regards
> Henrik
Re: [squid-users] squid acl help needed
On Fri, 19 Sep 2003, Payal Rathod wrote:

> What if I have to allow from time 09:00-10:00 and 6:00-07:00 too with
> lunchbreak?
>
> I mean the users can access hotmail, yahoo in the above 3 hours only.

Then replace lunchbreak with the following

acl coffeebreak time 09:00-10:00
acl coffeebreak time 13:00-14:00
acl coffeebreak time 18:00-19:00

Regards
Henrik

note: Squid-2.5.STABLE2 or later required, for earlier versions you need to create one ACL per time interval
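Combining Henrik's time ACLs with the webmail and network ACLs quoted elsewhere in this thread, the whole setup would look roughly like this (the network and domain list are the ones from Payal's config; order of the http_access lines matters):

```
# Sketch: webmail reachable only during the three break windows.
acl my_network src 192.168.10.0/24
acl webmail dstdomain .yahoo.com .hotmail.com
acl coffeebreak time 09:00-10:00
acl coffeebreak time 13:00-14:00
acl coffeebreak time 18:00-19:00

# Deny webmail outside the breaks, then allow the local network.
http_access deny !coffeebreak webmail
http_access allow my_network
```

The three coffeebreak lines are OR-ed together, so "!coffeebreak" means "outside all three windows".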
Re: [squid-users] Password Authentication
On Fri, 19 Sep 2003, Frank Chibesakunda wrote:

> How do I configure squid so that my users enter a username and password
> before accessing the internet?

See the Squid FAQ.
http://www.squid-cache.org/Doc/FAQ/FAQ-23.html

Regards
Henrik
Re: [squid-users] squid acl help needed
On Thu, Sep 18, 2003 at 03:28:27PM +0200, Henrik Nordstrom wrote:

acl my_network src 192.168.10.0/24 ...
[...]

Thanks for the mail. It works beautifully. Just one small question below.

> acl webmail dstdomain .yahoo.com .hotmail.com
> acl lunchbreak time 13:00-14:00
> http_access deny !lunchbreak webmail
> http_access allow my_network

What if I have to allow from time 09:00-10:00 and 6:00-07:00 too with lunchbreak?

I mean the users can access hotmail, yahoo in the above 3 hours only.

Thanks a lot again and bye.
With warm regards,
-Payal

--
"Visit GNU/Linux Success Stories"
http://payal.staticky.com
Guest-Book Section Updated.
[squid-users] Password Authentication
list,

How do I configure squid so that my users enter a username and password before accessing the internet?

rgds
Frank
Re: [squid-users] assertion failed "mem->inmem_hi == 0"
Hi,

Managed to find this patch that seems to have solved the problem. Should we include this in the squid-icap links?

http://sourceforge.net/tracker/index.php?func=detail&aid=651877&group_id=47737&atid=450621

Rgds,
Wei Keong

On Fri, 19 Sep 2003, Wei Keong wrote:

> Hi,
>
> I encounter the same error while testing icap to anti-virus-scan-server on
> squid-2.5S4.
>
> assertion failed: errorpage.c:292: "mem->inmem_hi == 0"
>
> I notice that this error will only happen when a virus is found.
>
> Is this a squid bug or a problem with icap? Did you manage to resolve this
> problem?
>
> Thanks,
> Wei Keong
>
> > Mike Cudmore [EMAIL PROTECTED]
> > Fri, 24 Jan 2003 12:31:06 +
> >
> > I am testing squid working with icap. I am using the squid icap client
> > patch (1.2.1) from SourceForge. When I test the squid I get the following
> > error:
> >
> > 2003/01/23 15:04:35| assertion failed: errorpage.c:271: "mem->inmem_hi
> > == 0"
> >
> > in the cache.log file.
> >
> > After this squid executes a normal startup sequence.
> >
> > This is regardless of the url entered in the browser (e.g. www.google.com).
> > There are no entries in access.log nor anything other than the normal
> > startup sequence in cache.log.
> >
> > When I disable ICAP in squid.conf the error does not occur.
> >
> > OS is linux redhat 8.0
> > Squid is 2.5 stable 1
> > ICAP server is Finjan's released version 7.0 of their icap-enabled
> > surfingate server.
> >
> > I have also put this issue on the squid-icapclient mailing list.
> >
> > Anyone in the main squid arena able to offer any assistance?
> >
> > Regards
> > Mike Cudmore
> > GSI & Intranet Connectivity Team
Re: [squid-users] assertion failed "mem->inmem_hi == 0"
Hi,

I encounter the same error while testing icap to anti-virus-scan-server on squid-2.5S4.

assertion failed: errorpage.c:292: "mem->inmem_hi == 0"

I notice that this error will only happen when a virus is found.

Is this a squid bug or a problem with icap? Did you manage to resolve this problem?

Thanks,
Wei Keong

> Mike Cudmore [EMAIL PROTECTED]
> Fri, 24 Jan 2003 12:31:06 +
>
> I am testing squid working with icap. I am using the squid icap client
> patch (1.2.1) from SourceForge. When I test the squid I get the following
> error:
>
> 2003/01/23 15:04:35| assertion failed: errorpage.c:271: "mem->inmem_hi
> == 0"
>
> in the cache.log file.
>
> After this squid executes a normal startup sequence.
>
> This is regardless of the url entered in the browser (e.g. www.google.com).
> There are no entries in access.log nor anything other than the normal
> startup sequence in cache.log.
>
> When I disable ICAP in squid.conf the error does not occur.
>
> OS is linux redhat 8.0
> Squid is 2.5 stable 1
> ICAP server is Finjan's released version 7.0 of their icap-enabled
> surfingate server.
>
> I have also put this issue on the squid-icapclient mailing list.
>
> Anyone in the main squid arena able to offer any assistance?
>
> Regards
> Mike Cudmore
> GSI & Intranet Connectivity Team
Re: [squid-users] non-permanent internet-access, idnsSendQuery
On Fri, 19 Sep 2003, michael mueller wrote:

> hi all,
>
> I'm using squid Version 2.3.STABLE4-hno.CVS

Yuck.. a very old version of Squid, using a completely untested patchset of mine. You did read my notes before installing this one?

> I have a non-permanent internet-access, and when it's closed and somebody
> tries to open a web-page squid says:
> squid[10162]: idnsSendQuery: FD 2: sendto: (101) Network is unreachable
> squid[10162]: comm_udp_sendto: FD 2, 194.25.2.129, port 53: (101) Network is
> unreachable
>
> the message is clear, but I don't want to get it for 5 minutes.
> where can I set the timeout?

I do not quite get the question. In any event, upgrading to a supported Squid version may be in order if there is no dns_timeout parameter in your version. Even if there is, I would recommend upgrading.

Regards
Henrik
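On a Squid version that has the dns_timeout directive Henrik mentions, the wait for unanswered DNS queries can be shortened in squid.conf. A sketch, with an illustrative value:

```
# Hypothetical fragment: give up on DNS lookups after 30 seconds
# instead of waiting several minutes, so failed lookups on a downed
# link produce an error page quickly. (Value is illustrative.)
dns_timeout 30 seconds
```

The directives michael tried (negative_dns_ttl, connect_timeout, read_timeout, request_timeout) control caching and connection behaviour, not how long Squid waits for a DNS reply, which is why they had no effect here.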
Re: [squid-users] Authentication related query
On Fri, 19 Sep 2003, Deepa D wrote:

> The squid is configured to use pam_auth as a basic
> auth helper and the cache is disabled. For every url
> request a popup window appears asking for user name
> and password.

Are you attempting to set up authentication in a transparently intercepting proxy? This is not possible due to the nature of intercepting port 80.

To use authentication you MUST have the browser configured to use the proxy.

Regards
Henrik
Re: [squid-users] Raid Hard Disk requirments
On Fri, 19 Sep 2003, Wilhelm Farrugia wrote:

> Would there be any increase in performance in using two IDE hard disks
> with hardware raid for the cache storage instead of two separate hard
> disks managed by squid? (Seek time should remain the same as one hard
> disk, however throughput would be increased.)

Generally seek time is what you want to optimize for Squid, or more precisely the total number of seeks/s your system can sustain.

RAID generally decreases the number of seeks/s your system can sustain significantly. Best performance usually comes from having separate drives.

Regards
Henrik
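Henrik's "separate drives" advice translates to giving each physical disk its own cache_dir line rather than striping them into one RAID volume. A sketch with illustrative mount points and sizes (cache_dir sizes are in MB; the L1/L2 values 16 and 256 are the common defaults):

```
# Hypothetical fragment: one cache_dir per physical disk, so Squid
# can spread object I/O (and therefore seeks) across both spindles.
cache_dir ufs /cache1 100000 16 256
cache_dir ufs /cache2 100000 16 256
```

Squid load-balances new objects across the configured cache_dirs itself, which is how two independent disks can sustain more seeks/s than the same disks behind RAID.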
Re: [squid-users] Raid Hard Disk requirments
1 - I think the more important thing is that the two hard disks should be connected to separate buses.
2 - How did you calculate this?

Regards,
Ilker G.

On Fri, 2003-09-19 at 11:13, Wilhelm Farrugia wrote:

> Hello,
>
> Would there be any increase in performance in using two IDE hard disks
> with hardware raid for the cache storage instead of two separate hard
> disks managed by squid? (Seek time should remain the same as one hard
> disk, however throughput would be increased.)
>
> Size of IDE is 120G each, which requires 3.2G of RAM for indexing; is
> this correct?
>
> Thank you,
> Regards,
>
> Wil
[squid-users] Raid Hard Disk requirments
Hello,

Would there be any increase in performance in using two IDE hard disks with hardware raid for the cache storage instead of two separate hard disks managed by squid? (Seek time should remain the same as one hard disk, however throughput would be increased.)

Size of IDE is 120G each, which requires 3.2G of RAM for indexing; is this correct?

Thank you,
Regards,

Wil
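On the RAM question: the common rule of thumb from the Squid FAQ is roughly 10 MB of index RAM per 1 GB of disk cache (the exact ratio depends on mean object size). Under that assumption, a quick check of the arithmetic for two 120 GB disks:

```python
# Rule-of-thumb figure from the Squid FAQ: ~10 MB of in-memory
# index metadata per 1 GB of on-disk cache storage.
MB_PER_GB_OF_CACHE = 10

def index_ram_mb(cache_gb):
    """Estimated Squid index RAM in MB for a given cache size in GB."""
    return cache_gb * MB_PER_GB_OF_CACHE

# Two 120 GB disks dedicated to cache storage.
total_cache_gb = 2 * 120
print(index_ram_mb(total_cache_gb))  # 2400, i.e. about 2.4 GB
```

So the 3.2G figure in the question is in the right ballpark but on the high side; it would only be needed if the mean cached object were smaller than the FAQ's assumed average. Note this is index RAM on top of cache_mem and normal process overhead.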
Re: [squid-users] Patches causing problem
Understood! Thanks Henrik. I will try the latest version and do more investigation of my current problem.

Thx & Rgds,
Awie

- Original Message -
From: "Henrik Nordstrom" <[EMAIL PROTECTED]>
To: "Awie" <[EMAIL PROTECTED]>
Cc: "Squid-users" <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 2:50 PM
Subject: Re: [squid-users] Patches causing problem

> On Fri, 19 Sep 2003, Awie wrote:
>
> > squid-2.4.STABLE7-url_escape.patch
> > squid-2.4.STABLE7-url_port.patch
> >
> > Are there any experiences using such patches?
>
> Why are you patching a Squid-2.4 release and not upgrading to Squid-2.5?
>
> I cannot comment on these patches, as neither I nor any other Squid
> developer has looked at the obsolete Squid-2.4 release for more than a
> year now, and there will never be any developer looking into problems
> relating to Squid-2.4 unless as part of a paid contract with the
> developer. But it does not look like these patches should cause any
> problems.
>
> Try a "make distclean; make install". Or better yet upgrade to
> Squid-2.5.STABLE4.
>
> Regards
> Henrik
[squid-users] non-permanent internet-access, idnsSendQuery
hi all,

I'm using squid Version 2.3.STABLE4-hno.CVS on kernel version 2.4.4.

I have a non-permanent internet-access, and when it's closed and somebody tries to open a web-page squid says:

squid[10162]: idnsSendQuery: FD 2: sendto: (101) Network is unreachable
squid[10162]: comm_udp_sendto: FD 2, 194.25.2.129, port 53: (101) Network is unreachable

The message is clear, but I don't want to get it for 5 minutes. Where can I set the timeout? I've tried:

negative_dns_ttl 3 minutes
negative_ttl 1 minutes
connect_timeout 120 seconds
read_timeout 2 minutes
request_timeout 30 seconds

Thanks in advance
[squid-users] Authentication related query
Hi All,

The squid is configured to use pam_auth as a basic auth helper and the cache is disabled. For every url request a popup window appears asking for user name and password. Is there a way to avoid this happening for every request? The requirement is as follows:

The first time a request comes from a particular IP, the login page should be popped up. But for some defined time interval, any requests coming from the same IP should not be asked to login again.

I tried setting the credentialsttl option in the squid.conf file but it doesn't serve the purpose. I think we need the IP and the time of the first successful authentication from that IP to be mapped and saved in the user cache. Each time a request comes in, this mapping would be compared against the allowed time period, and only if the time has been exceeded should the user be prompted to relogin.

Kindly tell me how to do this.

Regards and TIA,
Deepa
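For reference, these are the caching directives under discussion, as they appear in the squid.conf posted later in this thread (values illustrative). Note they control how long Squid caches *validated credentials* before re-running the auth helper; they do not suppress the browser's per-request Proxy-Authorization header or create a per-IP login session:

```
# Fragment of the settings being discussed (illustrative values).
# credentialsttl: how long a validated username/password pair is
# trusted before pam_auth is consulted again.
auth_param basic credentialsttl 10 minutes
authenticate_ttl 10 minutes
authenticate_ip_ttl 10 minutes
```

As the replies in this thread explain, the repeated popup is browser behaviour, not a TTL problem: the browser must send credentials with every request, and a correctly behaving browser only prompts the user once.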
Re: [squid-users] Problem with Filedescriptors
Add the following to your /etc/system file to increase your maximum file descriptors per process:

set rlim_fd_max = 4096

Next you should re-run the configure script in the top directory so that it finds the new value. If it does not find the new limit, then you might try editing include/autoconf.h and setting #define DEFAULT_FD_SETSIZE by hand. Note that include/autoconf.h is created from autoconf.h.in every time you run configure. Thus, if you edit it by hand, you might lose your changes later on.

If you have a very old version of Squid (1.1.X), and you want to use more than 1024 descriptors, then you must edit src/Makefile and enable $(USE_POLL_OPT). Then recompile squid.

--
Best Regs,
Masood Ahmad Shah
System Administrator
Fibre Net (Pvt) Ltd. Lahore, Pakistan
Tel: +92-42-6677024 | Mobile: +92-300-4277367
http://www.fibre.net.pk

Unix is very simple, but it takes a genius to understand the simplicity. (Dennis Ritchie)

- Original Message -
From: "Gustavo" <[EMAIL PROTECTED]>
To: "Squid Users" <[EMAIL PROTECTED]>
Sent: Friday, September 19, 2003 1:52 AM
Subject: [squid-users] Problem with Filedescriptors

> I've installed the Squid Cache: Version 2.5.STABLE2-20030411 on a
> Solaris 8 server and I have this error message in the cache.log:
>
> 2003/09/18 15:15:09| WARNING! Your cache is running out of filedescriptors
>
> This error stops the squid service and I must restart it again... any
> idea about the solution?
>
> thanks
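Before and after applying the /etc/system change (which on Solaris takes effect after a reboot), the per-process descriptor limit can be sanity-checked from the shell; Squid inherits these limits at startup:

```shell
# Show the current soft limit on open file descriptors for this
# shell and, with -H, the hard limit a process may raise itself to.
ulimit -n
ulimit -Hn
```

Squid also logs the file-descriptor count it was built with at startup, so grepping cache.log for "file descriptors" confirms whether the recompile picked up the new limit.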