[squid-users] Re: authentication problems
Awesome link, Amos. Thanks, it is just what I'm looking for. I'm going to try to work that into my squid.conf file and I'll report back here with any problems.
-- View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/authentication-problems-tp3072735p3080564.html
Sent from the Squid - Users mailing list archive at Nabble.com.
[squid-users] Re: authentication problems
AWESOME, it is working mostly flawlessly!! I notice that the whitelist file (/etc/squid3/whitelist1.sites) doesn't accept comments, duplicates, or redundant entries, like .ftp.debian.org when there is already a .debian.org; it errors out and doesn't work. But once I got past that it seems to be working nicely. As long as you surf the whitelist you aren't prompted for a password, but if you go off the whitelist you are!!

Is it possible to direct browsers that fail to authenticate to a website? I could direct them to the internal web server with instructions on how to get valid credentials.

Here is my current squid.conf file:

http_port 3128
#cache_mem 512 MB  # May need to set lower if I run low on RAM
redirect_rewrites_host_header off
cache_replacement_policy lru
auth_param basic program /usr/lib/squid3/ncsa_auth /etc/squid/passwd
auth_param basic children 5
auth_param basic realm blocker
auth_param basic credentialsttl 12 hours
auth_param basic casesensitive off
acl whitelist dstdomain /etc/squid3/whitelist1.sites
acl ncsa_users proxy_auth REQUIRED
acl localnet src 192.168.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 81 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
acl AUTH_users proxy_auth ant2ne xbox mandi
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow whitelist
http_access allow ncsa_users
http_access allow AUTH_users
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
visible_hostname BLOCKER

-- View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/authentication-problems-tp3072735p3080682.html
[squid-users] Re: authentication problems
I upgraded to Squid 3.0, and the slow prompt problem went away. Problem 1 solved.

Problem 2: I would like anyone who fails to authenticate to be assigned default credentials (default-user). How would I do this? "No reasonably secure browser sends credentials by default. Anyone who fails to authenticate is requested to send credentials."

Let me address problem 2 a different way. Suppose my external firewall bounces all traffic that does not originate from my proxy, and suppose there are automated applications that need to access the internet and can't supply credentials. Could I tweak Squid's ACLs to not require authentication for devices trying to access those locations? For example, I might need my antivirus to get files from http://myantivirus.com, but it doesn't open a browser to fetch these updates.

-- View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/authentication-problems-tp3072735p3075367.html
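One way this is commonly handled (a sketch, not from the thread; the ACL name is illustrative, and it assumes the localnet ACL from the configs shown in this thread): because http_access rules are evaluated top-down, destinations allowed before the first proxy_auth rule never trigger an authentication challenge.

```
# Hypothetical list of update hosts that cannot answer an auth challenge
acl noauth_sites dstdomain .myantivirus.com
# Allow them for the local network before any authentication rule fires
http_access allow localnet noauth_sites
# Everything else still requires credentials
acl ncsa_users proxy_auth REQUIRED
http_access allow ncsa_users
http_access deny all
```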
[squid-users] authentication problems
I want to use Squid and DansGuardian to filter by groups. It is working, sort of.

Problem 1: after launching the web browser it takes a very long time (a minute or two) before the authentication dialog pops up. This needs to be instant. What am I doing wrong? Once it does finally pop up, I can authenticate, and DansGuardian assigns the proper filtering groups.

Problem 2: I would like anyone who fails to authenticate to be assigned default credentials (default-user). How would I do this?

Problem 3: Can I edit the text of the authentication dialog box?

Below this point is my squid.conf file:

http_port 3128
# acl QUERY urlpath_regex cgi-bin \?  # Removed by Amos, suggested to speed up web sites using media
#cache_mem 512 MB  # May need to set lower if I run low on RAM
#maximum_object_size_in_memory 4096 KB  # Increased by Amos, suggested to speed up web sites using media
#maximum_object_size 1 GB
#cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
#auth_param basic program /usr/lib/squid/getpwnam_auth /etc/passwd
# above may require this at the end - /etc/passwd
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic children 5
auth_param basic realm blocker
auth_param basic credentialsttl 12 hours
auth_param basic casesensitive off
#auth_param basic max_challenge_lifetime 2 minutes
# above line fails
acl ncsa_users proxy_auth REQUIRED
acl all src all
acl localnet src 192.168.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 81 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl AUTH_users proxy_auth ant2ne xbox mandi
#acl internalSite1 dstdomain eaplus.altonschools.org
#acl internalSite2 dstdomain reports.altonschools.org
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow ncsa_users
http_access allow AUTH_users
#http_access allow reports_Printing
#http_access allow internalSite1
#http_access allow internalSite2
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
#access_log /var/log/squid/access.log squid
visible_hostname BLOCKER

-- View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/authentication-problems-tp3072735p3072735.html
[squid-users] Re: authentication problems
Also, is it possible to permit certain sites (automatic updates and antivirus) without any authentication?
-- View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/authentication-problems-tp3072735p3072745.html
[squid-users] squid crashing
The Squid web cache proxy is crashing. The symptom is that all client browsers just time out after several minutes. This server was working fine until last week, and I can't think of anything that changed last week. Since moving into production, this is the first problem I've had with this server. I'm eager for advice on troubleshooting. Examining /var/log/squid/access.log, cache.log, and store.log isn't showing me anything obvious.

Current squid.conf:

http_port 3128
# acl QUERY urlpath_regex cgi-bin \?  # Removed by Amos, suggested to speed up web sites using media
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 4096 KB  # Increased by Amos, suggested to speed up web sites using media
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_effective_user proxy
cache_replacement_policy lru
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 81 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl internalSite1 dstdomain eaplus.altonschools.org
acl internalSite2 dstdomain reports.altonschools.org
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow reports_Printing
http_access allow internalSite1
http_access allow internalSite2
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
access_log /var/log/squid/access.log squid
visible_hostname AMSPX01

-- View this message in context: http://old.nabble.com/squid-crashing-tp27244484p27244484.html
[squid-users] can squid redirect the browser?
Can Squid do redirection? For example, instead of Microsoft's run-once website loading (http://runonce.msn.com/runonce2.aspx), I'd like to redirect the browser to our internal home page. How would I accomplish this?
-- View this message in context: http://www.nabble.com/can-squid-redirect-the-browser--tp26119294p26119294.html
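A sketch of one common approach (the internal URL here is a placeholder, not from the thread): deny the unwanted site, and attach a deny_info URL to the same ACL so the denial is answered with a redirect to another page instead of an error page.

```
# Match the site to intercept (the example from the question)
acl runonce dstdomain runonce.msn.com
# When a request is denied because of this ACL, redirect the browser
# to a placeholder internal page instead of showing an error
deny_info http://intranet.example.local/ runonce
http_access deny runonce
```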
Re: [squid-users] how do I pass through the proxy for all data within the intranet
The plot thickens... it seems that the site that gets "access denied" is reports.altonschools.org:81/blahblah.pdf. How do I permit port 81?
-- View this message in context: http://www.nabble.com/how-do-I-pass-through-the-proxy-for-all-data-within-the-intranet-tp25995121p26027833.html
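A sketch of the usual fix, following the Safe_ports pattern already used in this thread's configs: add 81 to the Safe_ports ACL so the "deny !Safe_ports" rule no longer blocks it.

```
# Port 81 added to the list of ports the proxy may fetch from
acl Safe_ports port 80 81 443 210 119 70 21 1025-65535
http_access deny !Safe_ports
```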
Re: [squid-users] how do I pass through the proxy for all data within the intranet
I can do it by name ;-) I've added the following two lines to my squid.conf:

acl internalSite dstdomain eaplus.altonschools.org
http_access allow internalSite

Is that correct? So the new squid.conf would look like:

http_port 3128
# acl QUERY urlpath_regex cgi-bin \?  # Removed by Amos, suggested to speed up web sites using media
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 4096 KB  # Increased by Amos, suggested to speed up web sites using media
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl internalSite dstdomain eaplus.altonschools.org
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow internalSite
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
access_log /var/log/squid/access.log squid
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/how-do-I-pass-through-the-proxy-for-all-data-within-the-intranet-tp25995121p26016211.html
Re: [squid-users] Squid not caching some sites
Amos, thanks for the assistance. I made the changes as suggested, but the teacher is still complaining about choppiness and slow performance. I made the suggested changes yesterday and then tried to prime the proxy by visiting the site and playing the video after hours; it played just fine after hours. I feel that if I can get the proxy to grab these videos, it should improve performance in the classroom.

My current squid.conf:

http_port 3128
#acl QUERY urlpath_regex cgi-bin \?
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 4096 KB  # May need to set lower if I run low on RAM
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl all src all
acl localnet src 10.80.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60 #ignore-reload
refresh_pattern \.gif$ 3600 50% 60 #ignore-reload
refresh_pattern \.css$ 3600 50% 60 #ignore-reload
refresh_pattern \.js$ 3600 50% 60 #ignore-reload
refresh_pattern \.html$ 300 50% 10 #ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
access_log /var/log/squid/access.log squid
visible_hostname AMSPX01

-- View this message in context: http://www.nabble.com/Squid-not-caching-some-sites-tp25962650p25993328.html
[squid-users] how do I pass through the proxy for all data within the intranet
I'm not sure what to call what I'm trying to describe. This web cache proxy is not used for any security whatsoever; we have other filtering devices. This proxy is only designed to cache websites, and for the most part it is working well. But we have some users that try to access intranet sites via a web console, and they get "access denied" from Squid. I'm thinking these intranet sites probably open a port that is restricted by Squid in some way. I want to pass through all traffic on all ports for all client computers accessing an IP address in the 10.0.0.0 network; these sites should just pass through the proxy without their data being cached.

Here is my current squid.conf:

http_port 3128
# acl QUERY urlpath_regex cgi-bin \?  # Removed by Amos, suggested to speed up web sites using media
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 4096 KB  # Increased by Amos, suggested to speed up web sites using media
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
access_log /var/log/squid/access.log squid
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/how-do-I-pass-through-the-proxy-for-all-data-within-the-intranet-tp25995121p25995121.html
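A sketch of what this could look like (the ACL name is illustrative, not from the thread): match intranet destinations by IP, allow them before the Safe_ports check so port restrictions don't apply, and exempt them from caching.

```
# Hypothetical ACL matching all intranet destinations
acl intranet dst 10.0.0.0/8
# Let intranet traffic through regardless of the port restrictions
# (must come before the "deny !Safe_ports" rule)
http_access allow intranet
# Do not store intranet responses in the cache
cache deny intranet
```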
[squid-users] Squid not caching some sites
My Squid web cache proxy server is not caching sites such as:

http://www.netsmartz.org/resources/reallife.htm
http://www.netsmartz.org/stories/canttake.htm
http://www.nsteens.org/videos/social-networking/

These sites contain videos that, when played, are choppy and cut out. I'm certain these videos aren't getting cached, and this is kind of the point of the whole web cache project. I need teachers to be able to cache these kinds of things, so when the students try to access them they play quicker and more smoothly. How do I convince Squid to cache these?

Here is my current squid.conf:

http_port 3128
acl QUERY urlpath_regex cgi-bin \?
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 2048 KB  # May need to set lower if I run low on RAM
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl all src all
acl localnet src 10.80.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60
refresh_pattern \.gif$ 3600 50% 60
refresh_pattern \.css$ 3600 50% 60
refresh_pattern \.js$ 3600 50% 60
refresh_pattern \.html$ 300 50% 10
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
access_log /var/log/squid/access.log squid
visible_hostname AMSPX01

-- View this message in context: http://www.nabble.com/Squid-not-caching-some-sites-tp25962650p25962650.html
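Embedded video is often served with cache-busting headers or query-string URLs, so it passes straight through with defaults like the above. A sketch of directives that nudge Squid toward keeping such objects (the extensions and times are illustrative assumptions; ignore-reload deliberately violates HTTP semantics and may require a Squid built with http-violations support, and none of this can force caching of truly dynamic streams):

```
# Keep common video/flash containers for up to a week (illustrative values)
refresh_pattern -i \.(flv|swf|wmv|mov)$ 10080 90% 43200 ignore-reload
# Large media must also fit under the object-size ceiling
maximum_object_size 1 GB
```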
[squid-users] My sarg broke
I like Webmin and Sarg. Something recently has broken it. I don't care if I lose all the old logs, but I need to generate new ones. Here is the error I get:

Now generating Sarg report from Squid log file /var/log/squid/access.log and all rotated versions ..
sarg -l /var/log/squid/access.log.1 -d 05/10/2009-06/10/2009
SARG: No records found
SARG: End
sarg -l /var/log/squid/access.log.2.gz -d 05/10/2009-06/10/2009
SARG: Decompressing log file: /var/log/squid/access.log.2.gz /tmp/sarg-file.in (zcat)
SARG: No records found
SARG: End
..
Sarg finished, but no report was generated. See the output above for details.

Here is some back story, along with my current squid.conf: http://www.nabble.com/not-caching-enough-td25530445.html#a25553196

-- View this message in context: http://www.nabble.com/My-sarg-broke-tp25775972p25775972.html
Re: [squid-users] not caching enough
Squid version 2.6; this is the apt-get version for Ubuntu 8.04. I think you are right about the ignore-reload. Here is my squid.conf that I will put into production at 3pm today:

http_port 3128
acl QUERY urlpath_regex cgi-bin \?
cache_mem 512 MB  # May need to set lower if I run low on RAM
maximum_object_size_in_memory 2048 KB  # May need to set lower if I run low on RAM
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_Ports
http_access allow localnet
http_access allow localhost
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60 #ignore-reload
refresh_pattern \.gif$ 3600 50% 60 #ignore-reload
refresh_pattern \.css$ 3600 50% 60 #ignore-reload
refresh_pattern \.js$ 3600 50% 60 #ignore-reload
refresh_pattern \.html$ 300 50% 10 #ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25752421.html
Re: [squid-users] not caching enough
This is great; the proxy is caching about a gig a day. Below is the final, fine-tuned squid.conf that I will put into production after school lets out today.

administra...@ahspx01:~$ cat /etc/squid/squid.conf
http_port 3128
acl QUERY urlpath_regex cgi-bin \?
#no_cache deny QUERY
cache_mem 512 MB
maximum_object_size_in_memory 2048 KB
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
#acl QUERY urlpath_regex cgi-bin \?
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl SSL_Ports port 443
acl CONNECT method CONNECT
http_access allow localnet
http_access allow localhost
http_access deny !Safe_ports
http_access allow localnet
http_access allow localhost
http_access deny CONNECT
http_access deny CONNECT !Safe_Ports
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60 ignore-reload
refresh_pattern \.gif$ 3600 50% 60 ignore-reload
refresh_pattern \.css$ 3600 50% 60 ignore-reload
refresh_pattern \.js$ 3600 50% 60 ignore-reload
refresh_pattern \.html$ 300 50% 10 ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
#refresh_pattern . 60 50% 10 ignore-reload
refresh_pattern . 0 20% 4320
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25704652.html
Re: [squid-users] not caching enough
Thanks for the continued support! You say "The CONNECT rule does need to be deny CONNECT !SSL_Ports," but I don't see a !SSL_Ports. You say "I'd shift that one http pattern above up above the ftp pattern," but I don't see http as a refresh pattern; I do see html. Is this what you mean?

My current squid.conf:

http_port 3128
acl QUERY urlpath_regex cgi-bin \?
#no_cache deny QUERY
cache_mem 512 MB
maximum_object_size_in_memory 2048 KB
maximum_object_size 1 GB
cache_dir aufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
#acl QUERY urlpath_regex cgi-bin \?
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl CONNECT method CONNECT
http_access allow localnet
http_access allow localhost
http_access deny !Safe_ports
http_access allow localnet
http_access allow localhost
http_access deny CONNECT
http_access deny CONNECT !Safe_Ports
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60 ignore-reload
refresh_pattern \.gif$ 3600 50% 60 ignore-reload
refresh_pattern \.css$ 3600 50% 60 ignore-reload
refresh_pattern \.js$ 3600 50% 60 ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern \.html$ 300 50% 10 ignore-reload
refresh_pattern . 60 50% 10 ignore-reload
refresh_pattern . 0 20% 4320
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25681166.html
Re: [squid-users] not caching enough
Thanks for all of the great replies; there is lots of information to digest, and I appreciate all of the suggestions. But before I got any of these replies, I went ahead and modified my squid.conf to match an example I found on the internet. Here is my current running squid.conf:

http_port 3128
icp_port 0
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 16 MB
cache_dir ufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1/255.255.255.255
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl CONNECT method CONNECT
acl all src 0.0.0.0/0.0.0.0
http_access allow localnet
http_access allow localhost
http_access deny !Safe_ports
http_access deny CONNECT
http_access deny all
log_icp_queries off

This one seems to be caching: I can refresh the Webmin system info every few hours and see that /cache is growing in space used, although very slowly.

Amos Jeffries: I've taken the working squid.conf (above) and applied your suggestions to it (below). Please review this squid.conf (below) and make suggestions before I put it into production.

http_port 3128
icp_port 0
no_cache deny QUERY
cache_mem 512 MB
maximum_object_size_in_memory 2048 KB
maximum_object_size 1 GB
cache_dir ufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl QUERY urlpath_regex cgi-bin \?
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl CONNECT method CONNECT
http_access allow localnet
http_access allow localhost
http_access deny !Safe_ports
http_access deny CONNECT
http_access deny all
icp_access allow our_networks
icp_access allow localhost
icp_access deny all
refresh_pattern \.jpg$ 3600 50% 60 ignore-reload
refresh_pattern \.gif$ 3600 50% 60 ignore-reload
refresh_pattern \.css$ 3600 50% 60 ignore-reload
refresh_pattern \.js$ 3600 50% 60 ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern \.html$ 300 50% 10 ignore-reload

THANKS!!

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25668625.html
Re: [squid-users] not caching enough
THANKS!!! With those changes I'm looking at:

http_port 3128
no_cache deny QUERY
cache_mem 512 MB
maximum_object_size_in_memory 2048 KB
maximum_object_size 1 GB
cache_dir ufs /cache 50 256 256
redirect_rewrites_host_header off
cache_replacement_policy lru
acl QUERY urlpath_regex cgi-bin \?
acl all src all
acl localnet src 10.60.0.0/255.255.0.0
acl localhost src 127.0.0.1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/8
acl Safe_ports port 80 443 210 119 70 21 1025-65535
acl CONNECT method CONNECT
http_access allow localnet
http_access allow localhost
http_access deny !Safe_ports
http_access deny CONNECT
http_access deny CONNECT !Safe_Ports
http_access deny all
icp_port 0
refresh_pattern \.jpg$ 3600 50% 60 ignore-reload
refresh_pattern \.gif$ 3600 50% 60 ignore-reload
refresh_pattern \.css$ 3600 50% 60 ignore-reload
refresh_pattern \.js$ 3600 50% 60 ignore-reload
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern \.html$ 300 50% 10 ignore-reload
refresh_pattern . 60 50% 10 ignore-reload
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25669996.html
Re: [squid-users] not caching enough
Oops, I need "acl QUERY urlpath_regex cgi-bin \?" before "no_cache deny QUERY".
-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25670155.html
Re: [squid-users] not caching enough
OK, it has come to my attention that /cache only grows when I run a report using Sarg. So it may be that my proxy server is working, but only as a proxy and not as a web cache proxy. Below is my squid.conf file with the comment lines grepped out. Please review it and tell me what I need to change to turn this proxy server into a web cache server.

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl purge method PURGE
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access deny to_localhost
acl our_networks src 10.60.140.0/24
http_access allow our_networks
http_access allow localhost
http_access allow all
http_access deny all
icp_access allow all
http_port 3128
hierarchy_stoplist cgi-bin ?
cache_dir ufs /cache 50 256 256
maximum_object_size 32768 KB
access_log /var/log/squid/access.log squid
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
extension_methods REPORT MERGE MKACTIVITY CHECKOUT
hosts_file /etc/hosts
coredump_dir /var/spool/squid
visible_hostname AHSPX01

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25645183.html
Re: [squid-users] not caching enough
You said, "In which case do a RAM check and see how much is free before trying to cache any more." I'm currently using 205 MB of the 5 GB of RAM.

You said, "This change requires a full stop of Squid. Remove the cache directory and rebuild it with squid -z, then restart Squid." That would explain it. But how do I remove the cache directory? 'rm -R /cache'?

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25611950.html
Re: [squid-users] not caching enough
Hey, thanks for the input. This web cache proxy server is a dedicated machine running the Ubuntu 64-bit OS (no GUI) with 5 GB of RAM and a 1 TB drive dedicated to the cache, only using 500 GB currently. (The OS is on a different 80 GB drive.) According to the math of 10 MB of RAM per 1 GB of disk space, that would put me right at 5 GB of RAM. Yesterday I increased the maximum_object_size value to 32 MB from the default of 4 MB and haven't noticed any difference in the cache size; I think I'll push it up to around 50 MB. I tried to change my L1 and L2 values from the default 16 256, but it didn't like the change: the Squid service wouldn't restart until I changed it back. I don't think I tried 256 256, so I'll try that next. I will need to read up on and experiment with refresh_pattern.

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25578014.html
[squid-users] not caching enough
I got this 1 TB drive and mounted it as /cache. I want to cache everything and anything and keep it until it is outdated. Webmin | Servers | Squid | Cache has the cache directory set to /cache and Size (MB) set to 50 (the rest set to default).

administra...@ahspx01:~$ df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/sda1              72G  1.2G   67G   2% /
varrun                2.5G  184K  2.5G   1% /var/run
varlock               2.5G     0  2.5G   0% /var/lock
udev                  2.5G   40K  2.5G   1% /dev
devshm                2.5G     0  2.5G   0% /dev/shm
/dev/sdb1             917G  1.1G  870G   1% /cache

administra...@ahspx01:~$ ls -l /cache
total 5220
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 00
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 01
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 02
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 03
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 04
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 05
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 06
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 07
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 08
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 09
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0A
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0B
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0C
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0D
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0E
drwxr-x--- 258 proxy proxy    4096 2009-07-15 14:11 0F
drwxr-x---   2 proxy proxy   16384 2009-07-15 11:01 lost+found
-rw-r-       1 proxy proxy 5247120 2009-09-21 14:32 swap.state
-rw-r-       1 proxy proxy       0 2009-09-21 06:30 swap.state.last-clean

There are currently about 100 computers using this cache proxy, and I intend to add another 200. I can use Sarg and get reports showing that websites are being cached, so it is working, but it just doesn't seem to be caching enough. Are there file types that are not getting cached that I can turn on? Why won't this cache fill up?

-- View this message in context: http://www.nabble.com/not-caching-enough-tp25530445p25530445.html
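One detail worth noting about the configuration described above: in the cache_dir directive, the number after the path is the cache size in megabytes, so Size (MB) set to 50 caps the store at 50 MB regardless of how big the drive is. A sketch sized for most of a 1 TB drive (the exact size and L1/L2 values are illustrative):

```
# cache_dir <type> <path> <Mbytes> <L1-dirs> <L2-dirs>
# 50 would mean a 50 MB cache; ~900000 MB uses most of a 1 TB drive
cache_dir ufs /cache 900000 16 256
```

Note that after changing these values, the cache directory has to be reinitialized with squid -z while Squid is stopped.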
[squid-users] unintended computers are using the proxy
Squid is up and running great. I want to push out proxy settings to the Windows XP computers via domain-level Group Policy, so that some computers use the proxy server (group A) and some do not (group B). This is accomplished by configuring gmc.msc | User Configuration | Windows Settings | Internet Explorer Maintenance | Connection/Proxy Settings (and including a loopback). For group A this is working great. But for some reason group B is getting configured in their browsers as well. I don't understand what is going on; how are computers in group B getting configured like computers in group A? Both are in separate OUs, and only OU A has the policy linked to it. Do web browsers have a way of auto-discovering Squid and configuring themselves? If so, how do I turn that feature off?

-- View this message in context: http://www.nabble.com/inintended-computers-are-using-the-proxy-tp25230790p25230790.html
Re: [squid-users] Squid Web Cache Proxy Server and automated Page Pre Caching
Could you suggest one of these third-party tools? Something local on the proxy would be great (Ubuntu 8.04 server, no GUI), and something Webmin-configurable would be awesome! But I'd settle for a cron-controlled script of some sort.

Matus UHLAR - fantomas wrote:
On 29.07.09 14:17, ant2ne wrote:
I've set up 2 Squid web cache proxies and they are working great! I'm preparing to move the proxy servers into production. I'm using Webmin for configuration as much as possible, and I'm using Sarg for report running. One thing I'd like to be able to do is to tell Squid to be certain that it has up-to-date (checked daily) caches for given web pages: either automatically refresh the cache for those pages once a day, and/or refuse to let those pages be overwritten (prioritized) by other pages being cached. I'm sure that Squid can do this, but I'm not sure where to begin, or even what what I'm describing is technically called. And is there a way to configure this all through Webmin?

No, Squid can't do that. But you can use a third-party tool to fetch those pages through Squid so that it caches them. But first check whether they will really get cached; it's quite useless to pre-fetch uncacheable content.
-- Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
Windows 2000: 640 MB ought to be enough for anybody

-- View this message in context: http://www.nabble.com/Squid-Web-Cache-Proxy-Server-and-automated-Page-Pre-Caching-tp24727762p24780347.html
[squid-users] Squid Web Cache Proxy Server and automated Page Pre Caching
I've set up 2 Squid web cache proxies and they are working great! I'm preparing to move the proxy servers into production. I'm using Webmin for configuration as much as possible, and I'm using Sarg for report running. One thing I'd like to be able to do is to tell Squid to be certain that it has up-to-date (checked daily) caches for given web pages: either automatically refresh the cache for those pages once a day, and/or refuse to let those pages be overwritten (prioritized) by other pages being cached. I'm sure that Squid can do this, but I'm not sure where to begin, or even what what I'm describing is technically called. And is there a way to configure this all through Webmin?

-- View this message in context: http://www.nabble.com/Squid-Web-Cache-Proxy-Server-and-automated-Page-Pre-Caching-tp24727762p24727762.html