Re: [squid-users] Squid+ADS - Multiple Group Based Authentication (ISA to SQUID Migration)
Any help is really appreciated!!!

> Try being case-sensitive in the group names. The ones you configured Squid with do not match the ones you detailed as examples. Assuming both were correct, they may be mismatched because 'S' is not 's', etc.

That was my mistake in the mail; the group names are all lowercase, both in AD and in squid.conf.

> Try also with this as the first of the auth ACLs:
>   acl AuthorizedUsers proxy_auth REQUIRED
>   http_access deny !AuthorizedUsers
> It will force a login if none is supplied.

I tried this too, but no hope. Once again, the following is my environment:

Win 2k3 (with ADS) --- Squid proxy (squid-3.0.STABLE13-1.el5) on CentOS 5.3 (Samba, Winbind, Kerberos, Squid configured)

These are my entries:

auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 5
#auth_param ntlm max_challenge_reuses 0
#auth_param ntlm max_challenge_lifetime 2 minutes
auth_param basic program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-basic
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

acl AuthorizedUsers proxy_auth REQUIRED
http_access deny !AuthorizedUsers

external_acl_type unix_group %LOGIN /usr/lib/squid/squid_unix_group
acl senior_acl external unix_group senior
acl engineer_acl external unix_group engineer
acl restricted_acl external unix_group restricted
acl guestgroup_acl external unix_group guestgroup
acl parttime_acl external unix_group parttime

# some restrictions like bad sites, proxy sites, game sites, time-based restrictions, etc. here...

http_access allow senior_acl
http_access allow engineer_acl
http_access allow restricted_acl
http_access allow guestgroup_acl
http_access allow parttime_acl
http_access deny all

If I put "http_access allow AuthorizedUsers" just below the "http_access deny !AuthorizedUsers", it again starts allowing all authenticated users and restricting the others. But the group-based access is not working for me.

Amos, kindly advise...
[squid-users] delay_access line
Hi,

Is this a valid config line?

delay_access 6 allow lan-students magic_words url_words

Or do I need one for each acl?

Best regards
Dayo Adewunmi
Re: [squid-users] Squid+ADS - Multiple Group Based Authentication (ISA to SQUID Migration)
Truth Seeker wrote:
> [...]
> I tried this too, but no hope. Once again, the following is my environment:
> Win 2k3 (with ADS) --- Squid proxy (squid-3.0.STABLE13-1.el5) on CentOS 5.3 (Samba, Winbind, Kerberos, Squid configured)
> [...]
> external_acl_type unix_group %LOGIN /usr/lib/squid/squid_unix_group

Oh, hang on. UNIX groups are not the same as AD groups. I think that helper is probably not able to test AD groups. Try the winbind group helper, wbinfo_group.pl, instead.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13
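For reference, a minimal sketch of swapping in the winbind group helper (the helper path varies by distro, and the ACL/group names here simply mirror the ones in the post; treat both as assumptions to adapt):

```
# hypothetical squid.conf fragment using the winbind group helper
external_acl_type ad_group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl senior_acl external ad_group senior
acl engineer_acl external ad_group engineer
http_access allow senior_acl
http_access allow engineer_acl
http_access deny all
```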
Re: [squid-users] delay_access line
Dayo Adewunmi wrote:
> Hi, is this a valid config line?
> delay_access 6 allow lan-students magic_words url_words

Maybe. Are lan-students, magic_words and url_words the names of defined ACLs?

> Or do I need one for each acl?

You imply that they are, which makes the answer to the first question yes. As for the second question: maybe yes, maybe no. Answering it would require us to be psychic and understand both what you intend to do with that single line and what the rest of your configuration looks like. There is no way we can give a better answer than that.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13
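For context on why the single line may or may not be what is wanted: ACLs listed on one Squid access line are ANDed together, while separate lines are ORed. A sketch, assuming lan-students, magic_words and url_words are all defined ACLs:

```
# one line: matches only when ALL three ACLs match (AND)
delay_access 6 allow lan-students magic_words url_words

# separate lines: matches when EITHER line matches (OR between lines)
#delay_access 6 allow lan-students magic_words
#delay_access 6 allow lan-students url_words
```

So whether one line or several is correct depends entirely on whether the intent is "all of these" or "any of these".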
[squid-users] download problem!
Hi,

We installed Squid on our gateway. Everything is fine except that we have a problem with downloading. When we start a download, lines like the ones below appear in the browser:

Downloaded 775131 bytes from 65950444 of data
Downloaded 1450491 bytes from 65950444 of data
Downloaded 2221991 bytes from 65950444 of data
Downloaded 3004551 bytes from 65950444 of data
Downloaded 3803191 bytes from 65950444 of data
Downloaded 4565651 bytes from 65950444 of data
Downloaded 5360067 bytes from 65950444 of data
Downloaded 6161387 bytes from 65950444 of data
Downloaded 6962707 bytes from 65950444 of data
Downloaded 7762687 bytes from 65950444 of data
...

What is the problem? I think we made a mistake in our config file. Could anyone help me?

Thanks
Hamid Reza Hasani
Ya Ali
--
View this message in context: http://www.nabble.com/download-problem%21-tp24884971p24884971.html
Sent from the Squid - Users mailing list archive at Nabble.com.
Re: [squid-users] download problem!
hrhasani wrote:
> We installed Squid on our gateway. Everything is fine except that we have a problem with downloading. When we start a download, lines like the ones below appear in the browser:
> Downloaded 775131 bytes from 65950444 of data
> Downloaded 1450491 bytes from 65950444 of data
> [...]
> What is the problem? I think we made a mistake in our config file. Could anyone help me?

Squid does not generate anything like this. Your browser may be broken and displaying status updates in text-only mode, or the website is trying to be fancy and getting it wrong.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13
Re: [squid-users] Add a prefix/suffix if a domain is not resolved?
Hmm, it seems there is a misunderstanding here. I do not want to always add www. in front of the domain name; I want it tried if it is not already there. Here is the scenario:

I type in "domain". This does not resolve to anything, so I want Squid to try with .com appended, that is, to try "domain.com". Only .com is to be tried, not all the possible suffixes. If domain.com resolves in DNS, then it stops there and connects to it. If domain.com did not resolve to anything in DNS either, then try again with www. prefixed this time, giving "www.domain.com". If that does not resolve either, then stop there.

Now, if I type in "domain.eu", then I don't want it to try with .com appended, because there already is an extension. However, if domain.eu does not get DNS resolution, then I want the www. prefix part to apply.

And finally, if I type "sometld.domain.ext", I don't want it to try anything, even if it does not resolve.

I hope this makes it clearer.

Regards
Olivier
Re: [squid-users] Script Check
Don't do that. As someone who did this 10+ years ago, I suggest you do the following:

* Do some hackery to find out how your freeradius server stores the currently logged-in users. It may be in a mysql database, it may be in a disk file, etc.
* Have your redirector query *that* directly, rather than running radwho. When I did this 10 years ago, the radius server kept a wtmp-style file with current logins, which worked OK-ish for a few dozen users, then sucked for a few hundred users. I ended up replacing it with a Berkeley DB hash table to make searching for users faster.
* Then, in the helper, cache the IP results for a short period (say, 5 to 10 seconds) so frequent page accesses don't result in a flurry of requests to the backend.
* Keep the number of helpers low; you're doing it wrong if you need more than 5 or 6 helpers for this.

Adrian

2009/8/8 mic...@casa.co.cu:
> Hello
>
> Using squid 2.6 at my work, I have a group of users who connect by dial-up to a NAS, with a freeradius server to authenticate them. Each time they log in, my users are assigned a dynamic IP address, making it impossible to grant permissions by IP address without authentication. Right now, access levels to sites are assigned by authenticating against an Active Directory, but I want to change that.
>
> I want a script so that when Squid gets a request from that block of IP addresses, it reads the username and IP address from the freeradius server (the radwho tool shows connected users plus their IP addresses, or the same can be obtained from mysql) and compares against a text file; if the user is listed, they get access without authentication of any kind. Is it possible to do this?
>
> Sorry for my English, it is very poor.
>
> Thanks
> Michel
>
> --
> Webmail, servicio de correo electronico
> Casa de las Americas - La Habana, Cuba.
Re: [squid-users] ban brute force attacks in squid through ncsa
What is the best one to use? What runs the auth backend in Squid on Linux? Would something like fail2ban work, or does that only handle SSH? What about mysql auth logging?

--
From: Kinkie gkin...@gmail.com
Sent: Saturday, August 08, 2009 2:56 PM
To: J Webster webster_j...@hotmail.com; squid-users@squid-cache.org
Subject: Re: [squid-users] ban brute force attacks in squid through ncsa

Hello,
this kind of functionality does not really belong in Squid but in the authentication backend. NCSA passwd check scripts are quite naive and usually do not provide that kind of protection.

On 8/8/09, J Webster webster_j...@hotmail.com wrote:
> Is there anything in squid to ban brute force attacks on usernames and passwords via ncsa authentication?

--
/kinkie
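For what it's worth, fail2ban is not limited to SSH; it can watch any log file. A hypothetical filter for repeated Squid auth failures might look like the sketch below. The filter name, regex and log path are assumptions: it relies on the default access.log format placing the client IP in the third field and logging failed-auth requests as TCP_DENIED/407, so adjust to your own log format before using it.

```
# /etc/fail2ban/filter.d/squid-auth.conf  (hypothetical)
[Definition]
failregex = ^\S+\s+\d+\s+<HOST>\s+TCP_DENIED/407

# addition to /etc/fail2ban/jail.local
[squid-auth]
enabled  = true
filter   = squid-auth
logpath  = /var/log/squid/access.log
maxretry = 10
bantime  = 600
```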
Re: [squid-users] Add a prefix/suffix if a domain is not resolved?
Olivier Sannier wrote:
> Hmm, it seems there is a misunderstanding here.

Yes. You seem to misunderstand how domain names work...

> I do not want to always add www. in front of the domain name; I want it tried if it's not already there. Here is the scenario: I type in "domain". This does not resolve to anything, so I want Squid to try with .com appended, that is, try "domain.com". Only .com is to be tried, not all the possible suffixes. If domain.com resolves in DNS, then it stops there and connects to it. If domain.com did not resolve to anything in DNS, then try again with www. prefixed this time, thus giving "www.domain.com". If that does not resolve either, then stop there. Now, if I type in "domain.eu", then I don't want it to try with .com appended because there already is an extension. However, if that domain.eu does not get DNS resolution, then I want the www. prefix part to apply. And finally, if I type "sometld.domain.ext", I don't want it to try anything, even if it does not resolve.

Three seconds of additional loading time later, you realize that the French (Secretary General?) site (http://domain.eu/) is having a little DNS trouble, and Squid is sending you to the domain squatter at http://domain.eu.com/ for the next three months. (See for yourself: visit both those websites.)

The only three rules about domain names are that they contain certain readable characters, separated by a dot at each ownership level, with no more than 63 letters between each dot (and 255 for the whole name).

* By "ownership level" I mean: my domain name (treenet.co.nz) is sold by the NZ government (nz) to DomainNZ (co.nz), who sub-leased it to me (treenet.co.nz), who runs a website (http://treenet.co.nz/).

Let's put it this way, since you like examples...
Here are a couple of very real domain names:

http://hk/         -- Hong Kong .hk domain registry
http://www.com/    -- Verisign .com registry
http://www.com.au/ -- Optus Networks
http://com.au/     -- Optus Networks
http://www.co.nz/  -- a local bank

Luckily OptusNet owns both domains, so it does not matter if you add www to them. The bank is a different story; co.nz is owned by another business, but without a website there so far. That might change at any time; they have been talking about it.

Here is a case where that has actually happened already: three _different_ websites with very different owners around the planet.

http://tm/        -- a Turkmenistan community help network
http://tm.com/    -- a North American advertising co.
http://tm.com.my/ -- a Malaysian ISP

I have just downloaded a web page from every URL I mention above, so they are all currently operating. There used to be a cool domain at just http://com/ as well, but it seems to be dead now.

One of my most popular customers has 7 dots following the www, like this: www.a.b.c.d.e.f.g ... but www works on only some of their sites; on others it won't. Some need it to be there, most don't. The domain itself ends in .net, but often contains .com mid-way down.

So to repeat: Squid will _already_ try the exact domain you gave it, then try each of a set of possible right-hand sides. Squid cannot determine automatically, when a left-hand side is missing, what should or should not be there as part of the domain. Squid will _already_ add the http:// part under certain circumstances if it's missing. We will not be making Squid do it any other way, because:

* There are 280 possible right-hand sides for a domain; resolv.conf offers a nice clean way to solve that problem.
* There are 987 possible left-hand sides, even when cut down to just the registered services like www, ftp, http, _http, _http._tcp, etc.
* Both those numbers are growing.
* A single failed DNS lookup can slow Squid by half a second. Now multiply: 280 * 987 * 0.5 seconds = 138,180 seconds (about 38 hours) of possible web page loading time.
Please hold the line... Amos -- Please be using Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18 Current Beta Squid 3.1.0.13
[squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

I set up Squid on my Ubuntu server and it works fine from Internet Explorer on my Windows machine when I change my Internet Options to use the proxy server. But I need it to be fully transparent. On Linux this can be done by forwarding traffic to the proxy with iptables, but how can I do this on Windows? Is there a way?

Thank you,
Andrej
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Andrej van der Zee wrote: Hi, I setup Squid on my Ubuntu server and it does work fine from Internet Explorer on my Windows machine when I change my Internet Options to use the proxy server. But, I need it to be fully transparent. On Linux this can be done by forwarding traffic to the proxy with 'iptables', but how can I do this on Windows? Is there a way? Thank you, Andrej http://wiki.squid-cache.org/SquidFaq/ConfiguringBrowsers Amos -- Please be using Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18 Current Beta Squid 3.1.0.13
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

Thanks for your email.

> http://wiki.squid-cache.org/SquidFaq/ConfiguringBrowsers

Actually I was looking for a way NOT to configure my browser to use a proxy, but to somehow do this at a lower level. The problem is that Internet Explorer sends an HTTP header "Proxy-Connection: Keep-Alive" along with the request. Maybe I am mistaken, but I can't see how any of the methods in the manual can do this for me. Or am I mistaken?

Thank you,
Andrej
Aw: Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
- Original Message -
From: Andrej van der Zee andrejvander...@gmail.com

> Actually I was looking for a way NOT to configure my browser to use a proxy,

You need a transparent proxy driven by iptables, which is most probably available in your Ubuntu. See: http://tldp.org/HOWTO/TransparentProxy.html

--
Jeff Pang
eMail: pa...@arcor.de
Y! M: yonghua_peng
Re: Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

Thanks for your email.

> You need a transparent proxy driven by iptables, which is most probably available in your Ubuntu. See: http://tldp.org/HOWTO/TransparentProxy.html

The problem is that Squid is running on Ubuntu, but my browser must (unfortunately) be Internet Explorer on Windows. So how can I do iptables on Windows?

Thanks,
Andrej
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Andrej van der Zee wrote:
> Actually I was looking for a way NOT to configure my browser to use a proxy, but to somehow do this at a lower level. The problem is that Internet Explorer sends an HTTP header "Proxy-Connection: Keep-Alive" along with the request. Maybe I am mistaken, but I can't see how any of the methods in the manual can do this for me. Or am I mistaken?

Not possible. There are a lot of other things than that header going on when the browser knows about the proxy (via the control panel or WPAD/PAC configuration). Certain special types of request are sent for FTP and HTTPS, different types of URL are sent, other headers go along with the one you noticed, no DNS lookups happen in the whole process, and different types of replies are expected back. With IE, sometimes whole different versions of the HTTP protocol itself are used.

Why is that single, very commonly seen header a problem?

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Andrej van der Zee wrote:
> The problem is that Squid is running on Ubuntu, but my browser must (unfortunately) be Internet Explorer on Windows. So how can I do iptables on Windows?

There is no such thing. NAT does not happen on the user's PC. It happens on the centralized router/gateway box; in your case, the Ubuntu box running Squid, which intercepts traffic flowing through it from anywhere on the network and redirects it to Squid as it passes.

So far, in only two posts, you have asked how to do three very different things.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE18
  Current Beta Squid 3.1.0.13
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

Thanks for your email.

> So far, in only two posts, you have asked how to do three very different things.

Maybe I should explain the problem I have, then. I live in Japan but want to be able to watch Dutch football over the Internet. There is such a service, but it only allows connections from within Holland. So I set up a Squid proxy server in Holland and configured Internet Explorer in Japan to use the proxy server. But somehow the football service still knows I am not in Holland. I found in their FAQ that accessing the service through a proxy configured in Internet Explorer does not work, even if you are in Holland. So somehow I want to circumvent using the browser settings. Is there any way?

Thank you,
Andrej
[squid-users] Problem with Squid + Tproxy and Rapdishare
Hi, this is my first post here. I have a problem, but first let me describe the scenario:

- Clients with public IPs
- Mikrotik router redirecting traffic to Squid
- Squid 3.1 with support for TPROXY
- iptables 1.4.4 with support for TPROXY
- Debian Lenny / kernel 2.6.28 with support for TPROXY

The proxy works well, and when I test with pages like whatismyip, they show the CLIENT's IP. However, I cannot get clients with public IP addresses to download simultaneously from RapidShare/Megaupload etc. The error shown by these pages is the typical "you are already downloading from that IP", so RapidShare is really seeing the SQUID IP and not the client's. How do I fix this?

The squid configuration for the port is:

http_port 81 tproxy

iptables:

iptables -t mangle -N DIVERT
iptables -t mangle -A PREROUTING -p tcp -m socket -j DIVERT
iptables -t mangle -A DIVERT -j MARK --set-mark 1
iptables -t mangle -A DIVERT -j ACCEPT
iptables -t mangle -A PREROUTING -p tcp --dport 3128 -j TPROXY --tproxy-mark 0x1/0x1 --on-port 81
ip rule add fwmark 1 lookup 100
ip route add local 0.0.0.0/0 dev lo table 100
echo 1 > /proc/sys/net/ipv4/ip_forward

Mikrotik: there is a rule in the firewall redirecting all port-80 traffic to the Squid box's IP, port 3128. All clients create PPPoE sessions on the Mikrotik router.

Can anyone help? Regards
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
I mean, is there really nothing on Windows that you can instruct to forward all outgoing TCP/UDP traffic on ports 80/443 to a port on a remote machine? I just need the remote proxy server to be accessed transparently, without using the browser settings. Is that really impossible on Windows? That is hard to believe; there must be a way.

Cheers,
Andrej
Re: [squid-users] Script Check
Hello

My freeradius server currently stores users in a file on disk, but if it is better for what I want, I can configure it to store them in mysql. What I need is for someone to help me, explain, and guide me in how I can achieve my goal.

Thanks
Michel

Adrian Chadd adr...@squid-cache.org ha escrito:
> Don't do that. As someone who did this 10+ years ago, I suggest you do the following:
> * Do some hackery to find out how your freeradius server stores the currently logged-in users. It may be in a mysql database, it may be in a disk file, etc.
> * Have your redirector query *that* directly, rather than running radwho.
> * Then, in the helper, cache the IP results for a short period (say, 5 to 10 seconds) so frequent page accesses don't result in a flurry of requests to the backend.
> * Keep the number of helpers low; you're doing it wrong if you need more than 5 or 6 helpers for this.
> [...]

--
Webmail, servicio de correo electronico
Casa de las Americas - La Habana, Cuba.
Re: [squid-users] External_acl_type and cache_peer
Fri 2009-08-07 at 11:18 -0800, Chris Robertson wrote:
> ...which indicates to me that if the external_acl_type returns the keywords user and/or password, those will be substituted for the real credentials supplied by the client. I have to assume the original poster interpreted this documentation in the same manner.

Those are meant to be used when there are no credentials in the request, not as a substitute.

Regards
Henrik
Re: [squid-users] Script Check
Fri 2009-08-07 at 21:34 -0400, mic...@casa.co.cu wrote:
> Using squid 2.6 at my work, I have a group of users who connect by dial-up to a NAS, with a freeradius server to authenticate them. Each time they log in, my users are assigned a dynamic IP address, making it impossible to grant permissions by IP address without authentication.

Ok.

> I want a script so that when Squid gets a request from that block of IP addresses, it reads the username and IP address from the freeradius server (the radwho tool shows connected users plus their IP addresses, or the same can be obtained from mysql).

The user= result interface of external acls is intended for exactly this purpose. What you need is a small script which reads IP addresses on stdin (one at a time) and prints the following on stdout:

OK user=radiususername

if the user is authenticated via radius, or

ERR

if the user is not, and should fall back on other authentication methods.

You can then plug this into Squid using external_acl_type, and bind an acl to that external acl type. Remember to set ttl=nnn and negative_ttl=nnn as suitable for your purpose.

Regards
Henrik
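A minimal sketch of such a helper in Python. The radius lookup is stubbed out as a hypothetical lookup_radius_user function with example data; replace it with a query against your actual freeradius session store (mysql, a wtmp-style file, etc.):

```python
#!/usr/bin/env python
import sys

def lookup_radius_user(ip):
    """Hypothetical stand-in: return the username currently logged in
    from this IP, or None. Replace with a real query against the
    freeradius accounting store (mysql, disk file, ...)."""
    sessions = {"10.0.0.5": "michel"}   # example data only
    return sessions.get(ip)

def handle(line):
    """One request per line: IP in, 'OK user=name' or 'ERR' out,
    per the external_acl_type helper protocol."""
    user = lookup_radius_user(line.strip())
    return "OK user=%s" % user if user else "ERR"

if __name__ == "__main__":
    for line in sys.stdin:
        sys.stdout.write(handle(line) + "\n")
        sys.stdout.flush()   # Squid expects one reply per request line
```

It could then be wired into squid.conf with something like `external_acl_type radius_ip ttl=10 negative_ttl=10 %SRC /usr/local/bin/radius_ip_helper.py` (path and acl name are placeholders); %SRC hands the helper the client IP, and the ttl options provide the short-period caching Adrian suggested.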
Re: [squid-users] Add a prefix/suffix if a domain is not resolved?
Sat 2009-08-08 at 15:54 +0200, Kinkie wrote:
> Many browsers already do this. With Squid you can use a redirector script, but you'll have to write your own custom script; such functionality is not bundled with Squid.

Indeed.. and didn't I write such a script some years ago? Or was it someone else who posted one.. I don't remember.

A better alternative is to use a PAC script in the browser, forwarding only known domains to the proxy. Domains which do not resolve are then handled by the browser just as they would be without a proxy (as the proxy is not used for those domains...).

A drawback of the PAC approach is longer initial page loads, as the clients also need to do a DNS lookup of each visited domain before forwarding the request to the proxy, which then also performs a DNS lookup. If both share the same caching DNS resolver and the clients have reasonable connectivity to that resolver, it should not be too bad. But if clients have limited connectivity, then performing DNS lookups in the clients is not something I would recommend unless absolutely needed.

Regards
Henrik
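A sketch of that PAC idea. The domain list and proxy address are placeholders: requests to known domains go through the proxy, everything else goes DIRECT so the browser handles any non-resolving names itself.

```javascript
// Hypothetical PAC file: proxy only known-good domains.
// A plain string suffix check is used instead of the PAC built-in
// dnsDomainIs() so the logic also runs outside a browser.
function isInDomain(host, domain) {
    return host === domain ||
           host.indexOf("." + domain, host.length - domain.length - 1) !== -1;
}

function FindProxyForURL(url, host) {
    // domains we know resolve and want proxied (example list)
    if (isInDomain(host, "example.com") || isInDomain(host, "example.org")) {
        return "PROXY proxy.example.net:3128";
    }
    // anything else is resolved and fetched directly by the browser
    return "DIRECT";
}
```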
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

On Sun, 09 Aug 2009, Andrej van der Zee wrote:
> I mean, is there really nothing on Windows that you can instruct to forward all outgoing TCP/UDP traffic on ports 80/443 to a port on a remote machine? I just need the remote proxy server to be accessed transparently, without using the browser settings.

It sounds like what you're looking for is for the proxy to be transparent to the web server, not to the web browser. If you're in a position to set up squid on the remote server, a VPN might be the best approach. You can do that in a very simple way with ssh: http://fermiparadox.wordpress.com/2008/06/12/vpn-with-openssh/

Though you can probably also configure squid to not forward any headers that reveal it.

Gavin
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
On Sun, 9 Aug 2009 22:26:27 +0900, Andrej van der Zee andrejvander...@gmail.com wrote:
> Maybe I should explain the problem I have, then. I live in Japan but want to be able to watch Dutch football over the Internet. There is such a service, but it only allows connections from within Holland. So I set up a Squid proxy server in Holland and configured Internet Explorer in Japan to use the proxy server. But somehow the football service still knows I am not in Holland. I found in their FAQ that accessing the service through a proxy configured in Internet Explorer does not work, even if you are in Holland. So somehow I want to circumvent using the browser settings. Is there any way?

Aha, that is yet another thing altogether. You seem to have been confused by two or more of the meanings of "transparent" regarding proxies. All your questions so far have been about various ways of transparent _interception_. 'Transparent', in all its meanings, involves making the proxy itself invisible and letting the web server see the real client details. That is the exact opposite of what you need here.

So forget about iptables and the other transparent-proxy stuff. What you require is more commonly called anonymization:

via off
forwarded-for off
header_access X-Forwarded-For deny all

(NP: requires Squid built with --enable-http-violations)

Squid will take care of removing the stuff only relevant between the browser and itself (i.e. Proxy-Connection), but it is normally required to add the Via and X-Forwarded-For headers. Anonymization, so that the web server only sees Squid as the source, involves removing them and all other tracing information.

If the above still does not work, we will need to look deeper at how the TV station is detecting the proxy, and possibly at how the streaming happens. There are other things that might be tried, but they degrade web access a little.

Amos
Re: [squid-users] Problem with Squid + Tproxy and Rapdishare
On Sun, 9 Aug 2009 10:58:23 -0300, Carlos Botejara cbotej...@gmail.com wrote:
> [scenario: clients with public IPs behind a Mikrotik router redirecting traffic to Squid 3.1 with TPROXY; iptables 1.4.4; Debian Lenny, kernel 2.6.28]
>
> The proxy works well, and whatismyip-style test pages show the CLIENT's IP. However, RapidShare/Megaupload etc. report "already downloading from that IP", as if they see the SQUID IP and not the client's.
>
> http_port 81 tproxy
>
> iptables:
> iptables -t mangle -N DIVERT
> iptables -t mangle -A PREROUTING -p tcp -m socket -j DIVERT
> iptables -t mangle -A DIVERT -j MARK --set-mark 1
> iptables -t mangle -A DIVERT -j ACCEPT
> iptables -t mangle -A PREROUTING -p tcp --dport 3128 -j TPROXY --tproxy-mark 0x1/0x1 --on-port 81

You have this rule ass-backwards. TPROXY is intended to intercept port 80 traffic, not port 3128 traffic, for clients that are NOT configured to use the proxy. The HTTP request formats are noticeably different; it's trivially easy to detect those differences, and that is probably what RapidShare is doing.

Please go back and use the documentation and configuration example at http://wiki.squid-cache.org/Features/Tproxy4

> ip rule add fwmark 1 lookup 100
> ip route add local 0.0.0.0/0 dev lo table 100
> echo 1 > /proc/sys/net/ipv4/ip_forward
>
> Mikrotik: there is a firewall rule redirecting all port-80 traffic to the Squid box's IP, port 3128. All clients create PPPoE sessions on the Mikrotik router.

Amos
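In other words, the interception rule should match the traffic's real destination port (80), while --on-port points at the tproxy-flagged http_port. A sketch, assuming the standard layout from the Tproxy4 wiki page and the poster's port 81:

```
# intercept real port-80 traffic, not traffic aimed at the proxy port
iptables -t mangle -A PREROUTING -p tcp --dport 80 \
    -j TPROXY --tproxy-mark 0x1/0x1 --on-port 81
```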
Re: [squid-users] Howto run Internet Explorer without proxy setting in Internet Options
Hi,

Thank you for your email.

> So far, in only two posts, you have asked how to do three very different things.

Sorry, probably because of my misunderstanding of Squid. Anyway, I configured squid-3.0.STABLE18 with:

./configure --with-pthreads --enable-http-violations

and added the options (I guess you meant 'request_header_access' instead of 'header_access'). When I go to www.whatismyip.com it no longer mentions that I am possibly behind Squid (it did before). This looks much better! Unfortunately I cannot test my football site until next weekend, because there are no matches until then. I will continue my anonymization then.

Anyway, I really appreciate your help!

Thank you,
Andrej
[squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email)
Hi, Can anyone give me a hint on how to block port 443 and let some other secured sites (e.g. yahoo.com email) be excluded from the block? TIA __ Information from ESET NOD32 Antivirus, version of virus signature database 4295 (20090731) __ The message was checked by ESET NOD32 Antivirus. http://www.eset.com --- This message is solely intended to the person(s) indicated on the header and has been scanned for viruses and dangerous content by MailScanner. If any malware detected on this transmission, please email the postmaster at ad...@sscrmnl.edu.ph. Providing Quality Catholic Education for the Masses for more info visit us at http://www.sscrmnl.edu.ph
Re: [squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email)
On Mon, 10 Aug 2009 10:24:04 +0800, SSCR Internet Admin ad...@sscrmnl.edu.ph wrote: Hi, Can anyone give me a hint on how to block port 443 and let some other secured sites be excluded from the block? Depends on what you want to block there... I assume you actually mean you want to block HTTPS traffic except to certain sites. Squid's default controls include ACLs called SSL_ports and CONNECT, used in this configuration line: http_access deny CONNECT !SSL_ports To restrict further and only allow certain websites to use port 443/HTTPS, create an ACL listing their domain names and change the access line like so: acl httpsSites dstdomain .example.com http_access deny CONNECT !SSL_ports !httpsSites Amos
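Put together, a minimal squid.conf fragment for this might read as follows. The domain names are placeholders, and SSL_ports and CONNECT are assumed to be the stock ACLs shipped in the default squid.conf:

```
# Allow CONNECT (HTTPS tunnels) only to safe ports, and only to listed domains
acl SSL_ports port 443
acl CONNECT method CONNECT
acl httpsSites dstdomain .mail.yahoo.com .example.com

http_access deny CONNECT !SSL_ports
http_access deny CONNECT !httpsSites
```

Order matters: both deny lines must appear before any broader http_access allow rule for the local network.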
RE: [squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email)
Thanks Amos, hope this could at least partially stop UltraSurf... crossing fingers. -----Original Message----- From: Amos Jeffries [mailto:squ...@treenet.co.nz] Sent: Monday, August 10, 2009 10:35 AM To: SSCR Internet Admin Cc: squid-users@squid-cache.org Subject: Re: [squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email) On Mon, 10 Aug 2009 10:24:04 +0800, SSCR Internet Admin ad...@sscrmnl.edu.ph wrote: Hi, Can anyone give me a hint on how to block port 443 and let some other secured sites be excluded from the block? Depends on what you want to block there... I assume you actually mean you want to block HTTPS traffic except to certain sites. Squid's default controls include ACLs called SSL_ports and CONNECT, used in this configuration line: http_access deny CONNECT !SSL_ports To restrict further and only allow certain websites to use port 443/HTTPS, create an ACL listing their domain names and change the access line like so: acl httpsSites dstdomain .example.com http_access deny CONNECT !SSL_ports !httpsSites Amos
[squid-users] When will SSLBump be included on the production version?
Hello, I would like to ask when SSLBump will be included in the mainstream stable version of Squid? It seems that SSLBump could help us admins, especially in schools where UltraSurf is used in some laboratories (run from USB memory sticks), mostly by WiFi users. Regards
Re: [squid-users] refresh_pattern configuration
On Sun, 2009-08-09 at 14:19 +1200, Amos Jeffries wrote: Muhammad Sharfuddin wrote: Squid Cache: Version 2.5.STABLE12 and Squid Cache: Version 2.7.STABLE5 I am using the following refresh_patterns and have never encountered any problem. E.g. once I visit a website, on the next visit Squid usually serves it from cache, and TCP_HIT, TCP_MEM_HIT, TCP_REFRESH_HIT etc. are common in '/var/log/squid/access'. But a person (who I believe is a Linux/Squid guru) criticized the refresh_patterns I am using in Squid. (One of my posts, or someone else?). So please pass your comments and corrections on the following configs: #Suggested default: refresh_pattern ^ftp: 1440 20% 10080 refresh_pattern ^gopher: 1440 0% 1440 refresh_pattern . 0 20% 4320 refresh_pattern -i \.ico$ 43200 100% 43200 override-lastmod override-expire ignore-reload The problem with these commonly used patterns is that websites are now obfuscating the URL with query strings more and more often, not always intentionally. Example: the above pattern will not match any website with http://example.com/some.ico?sid=user-session-idtrack=fukn-cookie-id. Changing the hard $ to the softer (\?.*)?$ catches all of those websites and keeps Squid doing what you meant to configure. Other than that, the only thing to draw real criticism is the use of non-compliant override options. It's not nice netizen behaviour, ... but ... everyone else does it. [warning: rant ahead (not your fault, I know)] Personally, as a webmaster I set realistic expiry info on every website I touch in order to maximize speed and cacheability, particularly since getting to know Squid. It really annoys me that admins like yourself are forced to do this by a horribly large number of clueless websites and CMS software developers. Such rules will in fact _decrease_ the cacheability times and benefits for many of the websites that I and other clued-up people set up. We are forced to cope by changing filenames and sometimes URL links on every single edit, no matter how trivial.
I'm sick of people complaining "why can Y see their user icon in forum X but I can't?" ... what?! "can't fix it till next month" just because I live in country/ISP X? It's always the webmaster to blame, never the browser author or the transparent proxy admin. /rant Amos So in other words, it's not a healthy practice to use refresh_patterns other than the defaults (the 'Suggested default' in squid.conf)? Regards --ms
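Amos's point about query strings can be checked directly. A quick sketch of the two regex variants (Python is used here only to exercise the patterns; the URLs are made up):

```python
import re

# The strict pattern, anchored hard at end-of-string as in the original refresh_pattern
strict = re.compile(r"\.ico$", re.IGNORECASE)
# The softer variant Amos suggests, tolerating a trailing query string
soft = re.compile(r"\.ico(\?.*)?$", re.IGNORECASE)

plain = "http://example.com/some.ico"
tracked = "http://example.com/some.ico?sid=user-session-id"

print(bool(strict.search(plain)))    # the bare URL matches
print(bool(strict.search(tracked)))  # fails once a query string is appended
print(bool(soft.search(plain)))      # the softer pattern still matches the bare URL
print(bool(soft.search(tracked)))    # and also matches the tracked URL
```

This is exactly why the hard $ silently stops matching once a site starts appending session IDs to static URLs.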