[squid-users] Re: Squid3 Reverse Proxy based on url path
In a few words, what I'm trying to do is to publish over the internet just a reverse proxy that answers to only one DNS record, e.g. webapps.domain.com. Behind it I would like to place my intranet web servers, which host different services like webmail, CMS, etc., simply by checking the URL path after webapps.domain.com. There's no load balancing; every server hosts a different service. If it's not possible, can you suggest a similar way to reach this goal? Are there any config examples regarding it? Thank you very much.
Re: [squid-users] Re: Squid3 Reverse Proxy based on url path
rainolf wrote: in a few words, what I'm trying to do is to publish over the internet just a reverse proxy that answers to only one DNS record, e.g. webapps.domain.com; behind it I would like to place my intranet web servers that host different services like webmail, CMS, etc., simply by checking the URL after webapps.domain.com. There's no load balancing; every server hosts a different service. If it's not possible, can you suggest a similar way to reach this goal? Forget the use of imaginary URLs. If the web app can't handle what the client is asking for, things go pear-shaped quickly and easily. Are there any config examples regarding it? The wiki example you already know, for multiple backends, with cache_peer_access based on the actual URLs those apps handle. Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.3
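For the archive, a sketch of the kind of setup Amos points at: one public http_port, one cache_peer per intranet backend, and cache_peer_access tied to the URL paths each app really serves. All hostnames, addresses and paths below are made up for illustration; this only works if each backend actually serves its content under that path (the "imaginary URLs" caveat in the reply).

```
# Public side: one name, one port
http_port 80 accel defaultsite=webapps.domain.com vhost

# One parent per intranet service (example addresses)
cache_peer 192.168.0.10 parent 80 0 no-query originserver name=webmail
cache_peer 192.168.0.11 parent 80 0 no-query originserver name=cms

acl our_site dstdomain webapps.domain.com
acl webmail_path urlpath_regex ^/webmail/
acl cms_path urlpath_regex ^/cms/

# Route by the real URL path, not by imaginary hostnames
cache_peer_access webmail allow our_site webmail_path
cache_peer_access webmail deny all
cache_peer_access cms allow our_site cms_path
cache_peer_access cms deny all

http_access allow our_site
http_access deny all
```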
[squid-users] Settings per User?
Hi there, I would like to run squid with the ability to change the settings per user. Is there any way to use different settings based e.g. on the source IP? That means the user with IP 10.6.0.2 uses Privoxy to filter ads and 10.6.0.6 does not. Is this possible? I know I can do so with different instances of squid, but I would like to run only one instance as it is much easier to maintain. Thanks in advance. regards
[squid-users] how do I ensure that my request is passing through squid proxy
Hi, I have a doubt. I am using a proxy-enabled network (which is provided by my institution). My browser is configured with the corresponding proxy in order to access web pages. I want to use a squid proxy in order to serve some of my needs. I can't configure the browser with squid, as I need my institution proxy to access the web. So all my requests should first go to the squid proxy and then to the institution proxy. I did this by reading the FAQ ("How do I configure Squid to forward all requests to another proxy?"). But now my doubt is: how would I ensure that my request is first going to squid and then to my institution proxy (as my browser is not configured with squid)? Thank you. Regards jyoti
Re: [squid-users] how do I ensure that my request is passing through squid proxy
jyothi wrote: Hi, I have a doubt. I am using a proxy-enabled network (which is provided by my institution). My browser is configured with the corresponding proxy in order to access web pages. I want to use a squid proxy in order to serve some of my needs. I can't configure the browser with squid, as I need my institution proxy to access the web. So all my requests should first go to the squid proxy and then to the institution proxy. I did this by reading the FAQ ("How do I configure Squid to forward all requests to another proxy?"). But now my doubt is: how would I ensure that my request is first going to squid and then to my institution proxy (as my browser is not configured with squid)? Since that Squid is using the institution proxy, you should configure your browser to use the Squid. So you end up with this: Browser -> Squid -> Institution Proxy -> Internet Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.3
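The FAQ setup Amos refers to amounts to two lines in squid.conf; the parent's hostname and port here are placeholders for the institution proxy:

```
# Forward everything to the upstream (institution) proxy
cache_peer proxy.example.edu parent 8080 0 no-query default
# Never try to reach the internet directly
never_direct allow all
```

With that in place, entries appearing in Squid's access.log are themselves the proof that requests pass through Squid before reaching the upstream.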
Re: [squid-users] how do I ensure that my request is passing through squid proxy
Thanks for the quick reply. I am using Firefox, and there I can only specify one proxy (which is presently the institution proxy). Maybe I am missing something; could you please elaborate? On Fri, 14 May 2010 23:26:25 +1200, Amos Jeffries wrote > jyothi wrote: > > Hi, > > > > I have a doubt, I am using a proxy enabled network (which is provided by my institution). > > My browser is configured with the corresponding proxy in order to access > > the web pages. > > I want to use squid proxy in order to serve some of my needs. I can't > > configure browser > > with the squid as I need to my institution proxy to access the web. So all > > my requests > > first should go to squid proxy then to the institution proxy. I did this by > > reading the > > faq (How do I configure Squid forward all requests to another proxy?). But > > now my doubt > > is how would I ensure that first my request is going to squid and then to > > my insti proxy > > (as my browser is not configured with the squid)? > > > > Since that Squid is using the Institution proxy you should configure > your browser to use the Squid. > > So you end up with this: >Browser -> Squid -> Institution Proxy -> Internet > > Amos > -- > Please be using >Current Stable Squid 2.7.STABLE9 or 3.1.3
[squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube
Dear All, I require your help and guidance regarding dynamic content caching. Following are the queries. 1. I am running squid in multiple-instances mode (for cache disk failure protection). I don't think that it has any effect on internet object caching? I am confused about whether the CONNECT methods are to be duplicated on both of the instances, or whether I have configured it right, especially from the perspective of Windows Update. 2. As rewrite_url is not supported in new versions (version 3 and above) of squid, is it still possible for squid to cache facebook/youtube videos? Have I configured it correctly? As I have seen no TCP_HIT for mp3, mp4 etc., I think caching is not done. 3. Can you please check my configuration for Windows Update? Is there anything else which I have missed there? How could I ensure that Windows Update is being cached properly? Through studying online tutorials, mailing list archives, and to the best of my understanding, I have configured squid as follows. Please peruse and guide.

--
Squid Cache Instance:

visible_hostname squidlhr.v.local
unique_hostname squidcacheinstance
pid_filename /var/run/squidcache.pid
cache_dir aufs /cachedisk1/var/spool/squid 5 128 256
coredump_dir /cachedisk1/var/spool/squid
cache_swap_low 75
cache_mem 1000 MB
range_offset_limit -1
maximum_object_size 4096 MB
minimum_object_size 10 KB
quick_abort_min -1
cache_replacement_policy heap
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
#specific for youtube below one
refresh_pattern (get_video\?|videoplayback\?|videodownload\?) 5259487 % 5259487
# For any dynamic content caching.
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

--
Squid Main Instance:

visible_hostname squidlhr
unique_hostname squidmain
cache_peer 127.0.0.1 parent 3128 0 default no-digest no-query
prefer_direct off
cache_dir aufs /var/spool/squid 1 16 256
coredump_dir /var/spool/squid
cache_swap_low 75
cache_replacement_policy lru
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320

#Defining & allowing ports section
acl SSL_ports port 443 # https
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
# Deny request to unknown ports
http_access deny !Safe_ports
# Deny request to other than SSL ports
http_access deny CONNECT !SSL_ports
#Allow access from localhost
http_access allow localhost

# Windows Update Section...
acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com
http_access allow CONNECT wuCONNECT all
http_access allow windowsupdate all

regards & thanks
Bilal
_
Hotmail: Free, trusted and rich email service. https://signup.live.com/signup.aspx?id=60969
RE: [squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube
All, I am really sorry; I was looking at the access.log file of the squid instance that is user-facing, not the instance that is doing the fetching/caching, and there I can see mp4 files being cached. However, I am not very confident about my settings, so please read my queries and the configuration file and advise. I would be really thankful. > From: gi...@msn.com > To: squid-users@squid-cache.org > Date: Fri, 14 May 2010 12:00:46 + > Subject: [squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube > > > > Dear All, > > > I require your help and guidance regarding dynamic content caching. Following > are the quries. > > > 1. I am running squid in multiple instances mode (For Cache Disk Failure > Protection). I dont think that it has any effect on internet object caching? > I am confused that if connect methods are to be duplicate on both of the > instances or i have configured it right specially in perspective of windows > update. > > > 2. As rewrite_url is not exported in new versions(version 3 and above) of > squid is it still possible for squid to cache facebook/youtube videos? Have i > configured it correctly? As i have seen no TCP_HIT for mp3,mp4 etc so i think > caching is not done. > > > 3. Please can u please check my configuration for windows updates? is there > anything else which i have missed there? How could i assure that windows > update is being cached properly? > > > > > > > > > Through studying online tutorials mailarchive support and best of my > understanding i have configured squid as follows. Please peruse and guide. 
> > -- > Squid Cache Instance: > > visible_hostname squidlhr.v.local > unique_hostname squidcacheinstance > pid_filename /var/run/squidcache.pid > > > cache_dir aufs /cachedisk1/var/spool/squid 5 128 256 > coredump_dir /cachedisk1/var/spool/squid > > cache_swap_low 75 > cache_mem 1000 MB > range_offset_limit -1 > maximum_object_size 4096 MB > minimum_object_size 10 KB > quick_abort_min -1 > cache_replacement_policy heap > > refresh_pattern ^ftp: 1440 20% 10080 > refresh_pattern ^gopher: 1440 0% 1440 > refresh_pattern . 0 20% 4320 > > #specific for youtube belowone > refresh_pattern (get_video\?|videoplayback\?|videodownload\?) 5259487 > % 5259487 > > # For any dynamic content caching. > refresh_pattern -i (/cgi-bin/|\?) 0 0% 0 > > -- > Squid Main Instance: > visible_hostname squidlhr > unique_hostname squidmain > cache_peer 127.0.0.1 parent 3128 0 default no-digest no-query > prefer_direct off > > cache_dir aufs /var/spool/squid 1 16 256 > coredump_dir /var/spool/squid > cache_swap_low 75 > cache_replacement_policy lru > > refresh_pattern ^ftp: 1440 20% 10080 > refresh_pattern ^gopher: 1440 0% 1440 > refresh_pattern . 0 20% 4320 > > > #Defining & allowing ports section > acl SSL_ports port 443 # https > acl Safe_ports port 80 # http > acl Safe_ports port 21 # ftp > acl Safe_ports port 443 # https > acl Safe_ports port 70 # gopher > acl Safe_ports port 210 # wais > acl Safe_ports port 1025-65535 # unregistered ports > acl Safe_ports port 280 # http-mgmt > acl Safe_ports port 488 # gss-http > acl Safe_ports port 591 # filemaker > acl Safe_ports port 777 # multiling http > acl CONNECT method CONNECT > > # Only allow cachemgr access from localhost > http_access allow manager localhost > http_access deny manager > > # Deny request to unknown ports > http_access deny !Safe_ports > > # Deny request to other than SSL ports > http_access deny CONNECT !SSL_ports > > #Allow access from localhost > http_access allow localhost > > > # Windows Update Section... 
> acl windowsupdate dstdomain windowsupdate.microsoft.com > acl windowsupdate dstdomain .update.microsoft.com > acl windowsupdate dstdomain download.windowsupdate.com > acl windowsupdate dstdomain redir.metaservices.microsoft.com > acl windowsupdate dstdomain images.metaservices.microsoft.com > acl windowsupdate dstdomain c.microsoft.com > acl windowsupdate dstdomain www.download.windowsupdate.com > acl windowsupdate dstdomain wustat.windows.com > acl windowsupdate dstdomain crl.microsoft.com > acl windowsupdate dstdomain sls.microsoft.com > acl windowsupdate dstdomain productactivation.one.microsoft.com > acl windowsupdate dstdomain ntservicepack.microsoft.com > acl wuCONNECT dstdomain www.update.microsoft.com > acl wuCONNECT dstdomain sls.microsoft.com > http_access allow CONNECT wuCONNECT all > http_access allow windowsupdate all > > > regards & thanks > > Bilal
Re: [squid-users] Settings per User?
2010/5/14 Joe : > Hi there, > > I would like to run squid with the ability to change the settings for the > user. Is there any way use different settings based e. g. on the source ip, > that means User with IP 10.6.0.2 uses privoxy to filter ads and 10.6.0.6 > does not. Is this possible? > Does the src ACL have any help to you? http://www.squid-cache.org/Doc/config/acl/ -- Tech support agency in China http://duxieweb.com/
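For the routing half of the question, a src ACL combined with cache_peer_access can at least send one IP through Privoxy while the other goes direct. A sketch, assuming Privoxy is running locally on its default port 8118:

```
acl adfiltered src 10.6.0.2
cache_peer 127.0.0.1 parent 8118 0 no-query name=privoxy
cache_peer_access privoxy allow adfiltered
cache_peer_access privoxy deny all
# Force 10.6.0.2 through the peer; everyone else (e.g. 10.6.0.6) goes direct
never_direct allow adfiltered
always_direct allow !adfiltered
```

This covers only the routing part; per-user header rewriting is a separate question.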
Re: [squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube
2010/5/14 GIGO . : > > > Dear All, > > > I require your help and guidance regarding dynamic content caching. Following > are the quries. > > > 1. I am running squid in multiple instances mode (For Cache Disk Failure > Protection). I dont think that it has any effect on internet object caching? > I am confused that if connect methods are to be duplicate on both of the > instances or i have configured it right specially in perspective of windows > update. > For reference: http://wiki.squid-cache.org/ConfigExamples/DynamicContent -- Tech support agency in China http://duxieweb.com/
[squid-users] Access.log
Hi all, Can anybody please explain to me what this error means and why it occurs? It happened while I was testing youtube/facebook caching. TCP_NEGATIVE_HIT/204 Does this suggest that some object in the cache has been corrupted? If so, how do I rectify the error? Does this error only mean that the user has aborted the transfer, or can it come up for some other reason as well? TCP_MISS/000 thanks & regards, Bilal
[squid-users] Cannot connect to squid port intermittently
Hi, I am having performance issues with squid; sometimes users get "page cannot be displayed". During the troubleshooting I found that doing a telnet to port 3128 from my squid box itself times out, or it connects only after some time. I need help desperately on this. Warm Regards Tej
Re: [squid-users] Settings per User?
On 14.05.2010 14:31, Peng, Jeff wrote: Does the src ACL have any help to you? http://www.squid-cache.org/Doc/config/acl/ Not really, as it does not explain. Maybe my explanation was not so good. I want to give users the possibility to change their user-agent or the contents that should be filtered, read these settings from a MySQL database and apply them to the user: 1. User 10.6.0.2 header_replace User-Agent "xyz Browser" deny xxx, ad, warez content 2. User 10.6.0.6 request_header_access User-Agent deny all allow all content I found out that it is possible to use redirect_program to redirect the user based on the src IP. But I can only send back the redirect URL; I want to edit other settings too. And by the way, is it possible to edit the User-Agent, I mean, to keep the original and just add something to it?
Re: [squid-users] Cannot connect to squid port intermittently
Are the file descriptors too few? Or is disk I/O congested? You may take a look at cache.log for details. 2010/5/14 Tejpal Amin : > Hi, > > I am performance issues with squid, sometimes users get page cannot be > displayed. > During the troubleshootign I found that doing a telnet to port 3128 > from my squid box itself time out or it connects after some time. > > I need help desperatley on this . > > Warm Regards > Tej > -- Tech support agency in China http://duxieweb.com/
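If cache.log confirms the file descriptor theory (the message looks like "WARNING! Your cache is running out of filedescriptors"), Squid 3.x lets you raise the ceiling from squid.conf; older 2.x builds instead take the limit from the ulimit/compile-time value at startup. The number below is only an example, and the OS ulimit must already allow it:

```
# squid.conf (Squid 3.x): raise the file descriptor ceiling
max_filedescriptors 8192
```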
Re: [squid-users] a keepalive problem about NTLM authentication pass through
Hi, Henrik > The first condition "orig_request->flags.must_keepalive" should have > triggered when seeing NTLM/Negotiate/Kerberos passthru. I think so, too ! Could you fix it ? Sincerely, -- Mikio Kishi 2010/5/13 Henrik Nordström : > ons 2010-05-12 klockan 18:54 +0900 skrev Mikio Kishi: > >> > http.c:HttpStateData::sendRequest() >> > >> > 1998 /* >> > 1999 * Is keep-alive okay for all request methods? >> > 2000 */ >> > 2001 if (orig_request->flags.must_keepalive) >> > 2002 flags.keepalive = 1; >> > 2003 else if (!Config.onoff.server_pconns) >> > 2004 flags.keepalive = 0; > >> I think that we have to insert the following code >> >> else if (_peer->connection_auth == 1) >> flags.keepalive = 1; >> >> What do you think ? > > The first condition "orig_request->flags.must_keepalive" should have > triggered when seeing NTLM/Negotiate/Kerberos passthru. > > What is needed is to investigate why this did not get set. > > Regards > Henrik > >
[squid-users] http CONNECT method with fwd proxy to content server on same subnet
Hi, I have a new need for deploying squid in my environment and I have been trying to set it up, but it is not working as expected. Please see my requirements below; I have tried this with both 2.7-stable9 and 3.1.3 on CentOS 4.6 64-bit. I have a remote server sending an HTTP CONNECT to my server, but my server can't handle an HTTP CONNECT. So I wanted to use squid to handle the CONNECT method and then send the https requests to my local server to handle the request. I know that a transparent proxy doesn't know how to handle the SSL requests because it is not operating as a normal proxy. So I have been using squid as a forward proxy, but it keeps sending the HTTP CONNECT method to my end server, which is causing issues. So I am asking for ideas on what I need to do to accomplish this. I have tried various iptables rules and cache_peers but nothing seems to work. I am using pretty much the default config except for my local network IPs and an ACL to allow the traffic. I would appreciate any ideas. Thanks, Guin
[squid-users] accelerator mode localhost issues
Hello! I'm having some trouble when I try to browse my website through a browser. I have the domain name dev-zone.org.ua, which points to my apache box. The apache box is listening on 127.0.0.1:80, and the squid config is:

visible_hostname box
http_port 192.168.1.2:80 accel vhost
cache_peer 127.0.0.1 parent 80 0 no-query no-digest
acl web dstdomain dev-zone.org.ua
http_access allow web
refresh_pattern . 100 50% 1

Everything works except the cache. Please help me understand how to set up a minimal working config for caching. P.S. dev-zone.org.ua has the address 192.168.1.2 in the hosts file, so all is correct and I get content from squid, not from apache itself. -- With best regards, Yaroslav. Tel: 0 (44) 232-64-70 (Kiev, Ukraine) icq: 335110462 web: http://awesome-developer.org.ua/
Re: [squid-users] accelerator mode localhost issues
2010/5/14 Ярослав Матейко : > Hello! > > i'm issuing some troubles when i try to browse my website throw browser. > i have domain name dev-zone.org.ua thich points to my apache box. > > apache box is listening on 127.0.0.1:80 > > and squid config is: > > visible_hostname box > http_port 192.168.1.2:80 accel vhost > cache_peer 127.0.0.1 parent 80 0 no-query no-digest > acl web dstdomain dev-zone.org.ua > http_access allow web > refresh_pattern . 100 50% 1 > > all works, except cache. > It seems no problem with the config. Or you may take a look at my module for setting up a reverse proxy quickly: http://search.cpan.org/~pangj/Net-Squid-ReverseProxy-0.04/lib/Net/Squid/ReverseProxy.pm -- Tech support agency in China http://duxieweb.com/
Re: [squid-users] accelerator mode localhost issues
2010/5/14 Ярослав Матейко : > and when i try for example this tool: > http://webtools.live2support.com/header.php > With the same cache box? So in the first case you don't get pages cached, have you set up Apache correctly with expire or max-age output headers? -- Tech support agency in China http://duxieweb.com/
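Since refresh_pattern only supplies heuristics, the first thing to check is what freshness headers Apache actually sends. Against the live site that would be `curl -sI http://dev-zone.org.ua/`; the snippet below demonstrates the filtering step on a canned response so it runs anywhere:

```shell
# Filter the cache-relevant headers out of an HTTP response.
# (In practice, replace the printf with: curl -sI http://dev-zone.org.ua/)
printf 'HTTP/1.1 200 OK\r\nDate: Fri, 14 May 2010 12:00:00 GMT\r\nCache-Control: max-age=300\r\nContent-Type: text/html\r\n\r\n' |
  grep -iE 'cache-control|expires|last-modified|etag'
```

If none of those headers appear, Squid falls back to refresh_pattern, and note that the posted pattern `refresh_pattern . 100 50% 1` has a max field of only 1 minute.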
[squid-users] removing Proxy
Hello all, I was wondering if anyone knows how to remove a proxy from a Linux system. I haven't added the proxy to any files, but every time I try, for instance, to connect, it keeps searching for the proxy. I initially added the proxy with export http://proxy:port Now when I do unset http_proxy I can connect to the internet and do apt-get and use synaptic without a problem, but if I reboot the machine it comes back again, unless I unset the proxy manually every time. Does anyone know where I can remove it from an Ubuntu Hardy based system, please? I decided to stop using squid; it has caused me a lot of problems since I installed it, non-stop, one problem after another. I came to realise that squid doesn't like me lol so I am going to do without it. Too many things didn't work while using it; now that I stopped the proxy they are all back to normal. If anyone knows how to remove it please let me know. Best of luck everyone KR Adam
Re: [squid-users] a keepalive problem about NTLM authentication pass through
fre 2010-05-14 klockan 23:01 +0900 skrev Mikio Kishi: > > The first condition "orig_request->flags.must_keepalive" should have > > triggered when seeing NTLM/Negotiate/Kerberos passthru. > > I think so, too ! > Could you fix it ? If I had a bit of free time yes. You are welcome to give it a stab if you want. The logics setting this flags is in client_side_reply.cc Regards Henrik
Re: [squid-users] how do I ensure that my request is passing through squid proxy
jyothi wrote: Thanks for the quick reply. I am using firefox, there I can only specify one proxy (that is presently institution proxy), may be I don't know, could you please elaborate? Perhaps I misunderstood you. You said you already read the FAQ titled "How do I configure Squid forward all requests to another proxy?" That FAQ explains how to configure Squid to use the institution proxy. Your browser only needs to be configured for the one single proxy it is talking to: Squid. On Fri, 14 May 2010 23:26:25 +1200, Amos Jeffries wrote jyothi wrote: Hi, I have a doubt, I am using a proxy enabled network (which is provided by my institution). My browser is configured with the corresponding proxy in order to access the web pages. I want to use squid proxy in order to serve some of my needs. I can't configure browser with the squid as I need to my institution proxy to access the web. So all my requests first should go to squid proxy then to the institution proxy. I did this by reading the faq (How do I configure Squid forward all requests to another proxy?). But now my doubt is how would I ensure that first my request is going to squid and then to my insti proxy (as my browser is not configured with the squid)? Since that Squid is using the Institution proxy you should configure your browser to use the Squid. So you end up with this: Browser -> Squid -> Institution Proxy -> Internet Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.3
Re: [squid-users] Settings per User?
Joe wrote: Am 14.05.2010 14:31, schrieb Peng, Jeff: Does the src ACL have any help to you? http://www.squid-cache.org/Doc/config/acl/ Not really as it does not explain. Maybe my explanation was not so good. I want to give users the possibility to change their user-agent or the See the user's web browser documentation for how to set its user agent. contents that should be filtered, read these settings from a mysql and apply them to the user: 1. User 10.6.0.2 header_replace User-Agent "xyz Browser" deny xxx, ad, warez content 2. User 10.6.0.6 request_header_access User-Agent deny all allow all content I found out, that is possible to use redirect_program to redirect the user based on the src ip. But I can only send back the redirect url. I want to edit other settings too. And btw. is it possible to edit the User-Agent. I mean to keep the original and just add something to it? The short answer is no. It's not possible to reconfigure Squid per request. Squid's configuration is concerned with how and where to pass requests around. All of it is configured per-action, some being globally applied actions, some being ACL controlled. Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.3
Re: [squid-users] removing Proxy
a...@gmail wrote: Hello all I was wondering if anyone knows how to remove a proxy from a linux system I haven't added the proy to any files, but everytime I try for instance to connect it keeps searching for the proxy I initially added the proxy with export http://proxy:port Not valid syntax for setting the system proxy variables. Now when I do unset http_proxy I can connect to the internet and do the apt-get and use synaptic without a problem but if I reboot the machine it comes back again, unless I unset the proxy manually everytime does anyone know where I can remove it from an Ubuntu hardy based system please? Find the place where you configured it and remove it. The menu entry "System" -> "Preferences" -> "Network Proxy" is the proper place to set it in Ubuntu with a GUI. /etc/profile is where system-wide exports are set on command-line boxes; ~/.profile is for per-user settings on command-line boxes. If you have not set it there, $DEITY only knows what you did. Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.3
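A quick way to hunt down a lingering http_proxy export from the command line, checking the files Amos names plus a couple of other common suspects (adjust paths as needed):

```shell
# Where is http_proxy being exported at login?
# -n prints line numbers, -s silences errors for files that do not exist.
grep -ns 'http_proxy' /etc/profile /etc/environment ~/.profile ~/.bashrc || true

# Clear it for the current shell only; to remove it for good,
# delete the matching line from whichever file grep found it in.
unset http_proxy
```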
Re: [squid-users] how do I ensure that my request is passing through squid proxy
Thanks! I got your point. I have one more doubt: my institution proxy needs authentication. When I configured my browser with the squid proxy, it was asking me for a username and password; when I enter my institution username and password it is not accepted. The FAQ shows how to authenticate to squid itself, but now I have to forward to the other proxy, so how can I resolve this? (I mean, how does my institution proxy handle the authentication when the request is coming from squid?) On Sat, 15 May 2010 18:14:30 +1200, Amos Jeffries wrote > jyothi wrote: > > Thanks for the quick reply. I am using firefox, there I can only specify > > one proxy (that > > is presently institution proxy), may be I don't know, could you please > > elaborate? > > > > Perhapse I misunderstood you. > You said you already read the FAQ titled "How do I configure Squid > forward all requests to another proxy?" > > That FAQ explains how to configure Squid to use the institution proxy. > > Your browser only needs to be configured for the one single proxy it is > talking to. Squid. > > > > > On Fri, 14 May 2010 23:26:25 +1200, Amos Jeffries wrote > >> jyothi wrote: > >>> Hi, > >>> > >>> I have a doubt, I am using a proxy enabled network (which is provided by > >>> my > > institution). > >>> My browser is configured with the corresponding proxy in order to access > >>> the web pages. > >>> I want to use squid proxy in order to serve some of my needs. I can't > >>> configure browser > >>> with the squid as I need to my institution proxy to access the web. So > >>> all my requests > >>> first should go to squid proxy then to the institution proxy. I did this > >>> by reading the > >>> faq (How do I configure Squid forward all requests to another proxy?). > >>> But now my doubt > >>> is how would I ensure that first my request is going to squid and then to > >>> my insti proxy > >>> (as my browser is not configured with the squid)? 
> >>> > >> Since that Squid is using the Institution proxy you should configure > >> your browser to use the Squid. > >> > >> So you end up with this: > >>Browser -> Squid -> Institution Proxy -> Internet > >> > > Amos > -- > Please be using >Current Stable Squid 2.7.STABLE9 or 3.1.3
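For the authentication question above: Squid can log in to the upstream proxy itself via the login= option on cache_peer. The hostname, port and credentials below are placeholders:

```
# Squid authenticates to the parent with fixed credentials
cache_peer proxy.example.edu parent 8080 0 no-query default login=myuser:mypassword
never_direct allow all
```

Alternatively, login=PASS forwards the browser's own proxy credentials through to the parent, so the institution proxy still sees each user's username and password.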
Re: [squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube
GIGO . wrote: All, I am really sorry; I was looking at the access.log file of the squid instance that is user-facing, not the instance that is doing the fetching/caching, and there I can see mp4 files being cached. However, I am not very confident about my settings, so please read my queries and the configuration file and advise. I would be really thankful. From: gi...@msn.com To: squid-users@squid-cache.org Date: Fri, 14 May 2010 12:00:46 + Subject: [squid-users] Dynamic Content Caching/Windowsupdate/Facebook/youtube Dear All, I require your help and guidance regarding dynamic content caching. Following are the queries. 1. I am running squid in multiple-instances mode (for cache disk failure protection). I don't think that it has any effect on internet object caching? I am confused about whether the CONNECT methods are to be duplicated on both of the instances, or whether I have configured it right, especially from the perspective of Windows Update. That depends on whether the port the cache instance is listening on is reachable by external people; if it is, then that Squid instance will definitely need the http_access security settings turned on as well. 2. As rewrite_url is not supported in new versions (version 3 and above) of squid, is it still possible for squid to cache facebook/youtube videos? Have I configured it correctly? As I have seen no TCP_HIT for mp3, mp4 etc., I think caching is not done. If you meant to write "storeurl_rewrite"? Then yes: that particular method of caching them is not possible yet in 3.x. YouTube will still cache using the low-efficiency duplicate-object way it does in most places. 3. Can you please check my configuration for Windows Update? Is there anything else which I have missed there? How could I ensure that Windows Update is being cached properly? You don't show any http_access rules for the cache instance. The default is to block all access through that instance. The main instance is okay. 
Through studying online tutorials, mailing list archives, and to the best of my understanding, I have configured squid as follows. Please peruse and guide.

--
Squid Cache Instance:

visible_hostname squidlhr.v.local
unique_hostname squidcacheinstance
pid_filename /var/run/squidcache.pid
cache_dir aufs /cachedisk1/var/spool/squid 5 128 256
coredump_dir /cachedisk1/var/spool/squid
cache_swap_low 75
cache_mem 1000 MB
range_offset_limit -1
maximum_object_size 4096 MB
minimum_object_size 10 KB
quick_abort_min -1
cache_replacement_policy heap
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
#specific for youtube below one
refresh_pattern (get_video\?|videoplayback\?|videodownload\?) 5259487 % 5259487

The youtube pattern and all other custom refresh_patterns need to be configured above the default set (ftp:, gopher:, cgi-bin, and . ).

# For any dynamic content caching.
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

This dynamic content pattern needs to be between the refresh_pattern ^gopher: and the refresh_pattern . patterns.

--
Squid Main Instance:

visible_hostname squidlhr
unique_hostname squidmain
cache_peer 127.0.0.1 parent 3128 0 default no-digest no-query
prefer_direct off
cache_dir aufs /var/spool/squid 1 16 256
coredump_dir /var/spool/squid
cache_swap_low 75
cache_replacement_policy lru
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

(This should be set on all squids caching or handling dynamic objects, even in memory-only mode.)

refresh_pattern . 0 20% 4320

#Defining & allowing ports section
acl SSL_ports port 443 # https
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
# Deny request to unknown ports
http_access deny !Safe_ports
# Deny request to other than SSL ports
http_access deny CONNECT !SSL_ports
#Allow access from localhost
http_access allow localhost

# Windows Update Section...
acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsu
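Pulling the ordering advice in the reply together, the refresh_pattern block of the cache instance would be arranged like this: site-specific patterns above everything else, the dynamic-content rule just before the catch-all, and the catch-all last. Values are as posted; the numeric fields on the youtube line were mangled in the original post, so they are elided here.

```
# 1. Site-specific patterns first (youtube etc.)
refresh_pattern (get_video\?|videoplayback\?|videodownload\?) ...
# 2. Protocol defaults
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
# 3. Dynamic content, before the catch-all
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
# 4. Catch-all last
refresh_pattern . 0 20% 4320
```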