Re: [squid-users] MSN causing a breach.. help!
Honestly, the easiest technical fix is to deny access to the paid proxy site at the firewall or with a Squid acl. The best long-term fix is an enforced security policy (I may be too optimistic).

On Tue, Jan 12, 2010 at 6:56 AM, Roland Roland r_o_l_a_...@hotmail.com wrote:

> I have the following config set to allow MSN Messenger to connect through my Squid:
>
> acl msnport port 1863
> http_access allow connect msnport
> http_access allow msnport
>
> I have a security breach where one of the users may be using port 1863 to reach a paid proxy that he acquired. Is there a way to allow port 1863 to work only with MSN Messenger destinations? I've already denied access to that domain and warned the user, but I want a more permanent solution. The simplest approach would be an AND access rule with MSN's domains, but there is a vast list of domains that would need to be added and I don't have them all. Is there another way?
>
> PS: I'm using the Adium client to connect to MSN, so matching on MSN's MIME type is not working; not sure why.
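One way to approximate the "AND access rule" idea without a complete domain list is to allow CONNECT on port 1863 only to known messenger gateways and deny everything else on that port. A hedged squid.conf sketch; the dstdomain entries below are illustrative examples, not a verified list of MSN domains:

```
# Hedged sketch: restrict port 1863 CONNECT to messenger destinations.
# Verify the real MSN gateway domains before deploying.
acl msnport port 1863
acl msn_dst dstdomain .messenger.msn.com .messenger.hotmail.com
acl CONNECT method CONNECT   # usually already defined in squid.conf
http_access allow CONNECT msnport msn_dst
http_access deny msnport
```

Order matters: the allow for the known destinations must come before the blanket deny on the port.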
Re: [squid-users] Blocking webex
I think all the WebEx session traffic is still done over HTTPS, so blocking CONNECT to the WebEx domains (i.e. port 443 tunnels) should achieve the desired result; regular HTTP browsing of the site should still work as expected. Something along the lines of:

acl webex dstdomain .webex.com
# Add a custom error message to let the end customer know why they
# aren't able to get out to WebEx, and how to get out if they
# absolutely need to.
# Place the ERR_DENY_WEBEX page in /etc/squid/errors/ or /usr/share/squid/errors/English
# deny_info ERR_DENY_WEBEX webex
http_access deny CONNECT webex

On Jan 9, 2008 12:45 AM, Nadeem Semaan [EMAIL PROTECTED] wrote:

> Hello everyone, does anyone know a way of blocking WebEx without blocking the actual site? I mean, I still want users to read about it (even on the official website); I just don't want them to be able to use it without prior permission. Thanks, and Happy New Year.
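Putting the pieces above together, a hedged sketch of the complete squid.conf fragment (assumes the custom error page is a file named ERR_DENY_WEBEX placed in Squid's errors directory; deny_info takes the error page name first, then the acl that triggered the denial):

```
acl webex dstdomain .webex.com
acl CONNECT method CONNECT            # usually already defined in squid.conf
deny_info ERR_DENY_WEBEX webex        # custom page served when the deny fires
http_access deny CONNECT webex
```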
Re: [squid-users] Forbiden
Salute Dominique,

abcd.txt will be driven by url_regex; given the definition provided, lines like .gator.com should work. http://www.squid-cache.org/Doc/FAQ/FAQ.html#toc10.4 gives the basic overview.

Put the error page in /usr/local/squid/etc/errors (or wherever the errors directory is under squid/etc). ERR_NO_abcd: the file contents should be HTML; a simple <P> tag, as in the FAQ example, will do.

squid.conf additions:

acl porn url_regex /usr/local/squid/etc/abcd.txt
deny_info ERR_NO_abcd porn

Bill

On 5/26/06, Dominique Bagnato [EMAIL PROTECTED] wrote:

> Merci Bill, but how do I trigger Squid to answer those forbidden requests? How will Squid tell the difference between a legal request and a forbidden one? In the example:
>
> acl porn url_regex /usr/local/squid/etc/porno.txt
>
> what should I put in the file /usr/local/squid/etc/abcd.txt? Thank you.
>
> Bill Jacqmein wrote:
>> Dominique, http://www.squid-cache.org/Doc/FAQ/FAQ-10.html#ss10.24 is a FAQ section for customizing Squid error messages. Good luck, Bill
>>
>> On 5/26/06, Dominique Bagnato [EMAIL PROTECTED] wrote:
>>> Hi squid users, I have Squid running on Solaris 10 with Apache 2. It's working perfectly, but is it possible for a not-allowed proxy user to get a message saying "Forbidden to use this proxy"? Right now they don't have access at all, but they don't get any message; they just see "This page cannot be displayed". I guess it's just cosmetic, but if it's easy to do... thank you.
>>>
>>> --
>>> Dominique Bagnato - Head of the Technology Department.
>>> French International School - Bethesda, MD. USA
>>> Tel: 301 530 8260 Ext: 279 - http://www.rochambeau.org
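A hedged sketch of the whole configuration being described (assumes abcd.txt contains one regular expression per line, e.g. .gator.com, and that ERR_NO_abcd is an HTML file placed in Squid's errors directory):

```
# Match request URLs against the patterns in abcd.txt
acl porn url_regex "/usr/local/squid/etc/abcd.txt"
# Serve the custom error page when the deny below fires
deny_info ERR_NO_abcd porn
http_access deny porn
```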
Re: [squid-users] Forbiden
Dominique,

The outside is the Internet?

Bill

On 5/26/06, Dominique Bagnato [EMAIL PROTECTED] wrote:

> Thank you, but the forbidden users are from outside my network. They could come from whatever domain and try to use the proxy from outside.
>
> Bill Jacqmein wrote:
>> Salute Dominique, abcd.txt will be driven by url_regex; given the definition provided, lines like .gator.com should work. [...]
>
> --
> Dominique Bagnato - Head of the Technology Department.
> French International School - Bethesda, MD. USA
> Tel: 301 530 8260 Ext: 279 - http://www.rochambeau.org
Re: [squid-users] HTTPS Web SITE TIMEOUT
Any firewall rules in place upstream from the Squid proxy?

On 4/19/06, Rodrigo Barros [EMAIL PROTECTED] wrote:

> The web site is www.equifax.com.br, but the problem only happens after I authenticate on the site and try to access a specific URL (https://novoequifaxpessoal.equifax.com.br/PessoalPlusWeb/login.jsp). The result is always the same:
>
> novoequifaxpessoal.equifax.com.br:443 (60) Connection timed out
>
> Here's what is shown in the access.log file:
>
> 1145466458.378    445 XX.XXX.XX.XX TCP_DENIED/407 1901 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466459.524    591 XX.XXX.XX.XX TCP_DENIED/407 2089 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466465.724   6200 XX.XXX.XX.XX TCP_MISS/200 4441 CONNECT novoequifaxpessoal.equifax.com.br:443 XXX\barrosr DIRECT/200.142.202.182 -
> 1145466465.770      2 XX.XXX.XX.XX TCP_DENIED/407 1901 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466465.783      9 XX.XXX.XX.XX TCP_DENIED/407 2089 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466465.999    215 XX.XXX.XX.XX TCP_MISS/200 3576 CONNECT novoequifaxpessoal.equifax.com.br:443 XXX\barrosr DIRECT/200.142.202.182 -
> 1145466466.078     19 XX.XXX.XX.XX TCP_DENIED/407 1901 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466466.109     22 XX.XXX.XX.XX TCP_DENIED/407 2089 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466466.316    202 XX.XXX.XX.XX TCP_MISS/200 3587 CONNECT novoequifaxpessoal.equifax.com.br:443 XXX\barrosr DIRECT/200.142.202.182 -
> 1145466466.323      2 XX.XXX.XX.XX TCP_DENIED/407 1901 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466466.334      7 XX.XXX.XX.XX TCP_DENIED/407 2089 CONNECT novoequifaxpessoal.equifax.com.br:443 - NONE/- text/html
> 1145466526.011  59676 XX.XXX.XX.XX TCP_MISS/503 0 CONNECT novoequifaxpessoal.equifax.com.br:443 XXX\barrosr DIRECT/200.142.202.182 -
>
> After the last TCP_MISS/503 I get the (60) timeout message.
> Here's what is shown in cache.log:
>
> [2006/04/19 14:06:04, 3] libsmb/ntlmssp.c:ntlmssp_server_auth(606)
>   Got user=[barrosr] domain=[XXX] workstation=[XXX] len1=24 len2=24
> [2006/04/19 14:06:04, 3] libsmb/ntlmssp_sign.c:ntlmssp_sign_init(319)
>   NTLMSSP Sign/Seal - Initialising with flags:
> [2006/04/19 14:06:04, 3] libsmb/ntlmssp.c:debug_ntlmssp_flags(62)
>   Got NTLMSSP neg_flags=0x20088215
>
> Is there anything else I can provide?
>
> Thanks, Rodrigo
>
> -----Original Message-----
> From: Mark Elsen [mailto:[EMAIL PROTECTED]]
> Sent: Wednesday, April 19, 2006 1:32 AM
> To: Rodrigo Barros
> Cc: squid-users@squid-cache.org
> Subject: Re: [squid-users] HTTPS Web SITE TIMEOUT
>
>> Hi All, I've been searching Google for a while and couldn't find a solution to my problem, so if this has already been posted here, sorry. I'm running Squid 2.5.10 with NTLM authentication, and I have an SSL web site that does not connect. The only error message I get is "(60) Connection timed out". If I bypass the proxy and go straight to the web site, I can successfully access the resource. Any ideas?
>
> - What's the URL of the site?
> - access.log entry when this is tried?
> - Anything further in cache.log?
>
> M.
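One reading of the log above: with NTLM, Squid challenges each new TCP connection for credentials, so the repeated TCP_DENIED/407 entries are normally just the authentication handshake; the entry that actually matches the failure is the final TCP_MISS/503 after roughly 60 seconds. A small shell sketch for tallying result codes when eyeballing such a log (assumes the default native access.log format, where field 4 is the result code):

```shell
# Tally Squid result codes (e.g. TCP_DENIED/407) to separate routine
# NTLM challenges from real failures such as TCP_MISS/503.
# Prints "count code" pairs, most frequent first.
summarize_codes() {
    awk '{ n[$4]++ } END { for (c in n) print n[c], c }' "$1" | sort -rn
}
```

Usage: `summarize_codes /var/log/squid/access.log`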
Re: [squid-users] Multiple Destinations
Slightly off-topic, but can the same configuration be done with different ports on the same IP?

On 4/12/06, Sketch [EMAIL PROTECTED] wrote:

> On 4/11/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:
>> mån 2006-04-10 klockan 17:59 -0400 skrev Sketch:
>>> Not sure what host-header-based vhosts are, but it's just a single site on each.
>
> Gotcha. I use IP-based hosts, so from my research thus far the following is true:
>
> * set accel host to virtual, call a redirector (which is a separate program), and have it rewrite the URL.
>
> My question regarding this: will we see higher performance invoking a small Perl script for every request, rather than setting up a completely separate Squid instance? Has anyone else treaded on this ground? Your results? Thanks!
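The "redirector as a separate program" approach above follows the classic Squid 2.x helper protocol: Squid writes one request per line to the helper's stdin (the URL is the first whitespace-separated field) and reads back the rewritten URL, or a blank line to leave the request unchanged. A minimal sketch in shell; the example.com hostnames and the rewrite rule are hypothetical:

```shell
#!/bin/sh
# Minimal Squid redirector sketch. Hypothetical rule: rewrite
# old.example.com requests to new.example.com.
rewrite_url() {
    case "$1" in
        http://old.example.com/*)
            echo "http://new.example.com/${1#http://old.example.com/}" ;;
        *)
            echo "" ;;   # blank line: leave the request unchanged
    esac
}

# Main loop (commented out so the functions can be sourced for testing):
# Squid sends "URL client_ip/fqdn ident method" per line on stdin.
# while read url rest; do rewrite_url "$url"; done
```

On the performance question: Squid keeps its configured redirector children running and feeds them one line per request, so the cost is a pipe round-trip, not a process spawn per request; a small persistent script is usually much cheaper than running a second Squid instance.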
Re: [squid-users] squid wont let wget traffic thru
export http_proxy=http://ipaddress:port

Both wget and apt-get should pick it up from the environment.

On 3/22/06, Joey S. Eisma [EMAIL PROTECTED] wrote:

> hello! We have our proxy server running Squid (obviously). Just wondering why I cannot download anything using wget, but if I use a browser and put in the download address, the download simply goes through. I've already asked the admin if there is any setting that would disallow or block such downloads, but he said none. I have configured wgetrc to use the proxy, but to no avail. Is there anything in Squid that would block wget traffic? I also can't run apt-get, but I can download sources via the browser. Thanks!
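For reference, a sketch of the environment setup the reply above describes, using the proxy address from later in this thread (substitute your own host and port); wget reads the same settings from ~/.wgetrc:

```shell
# Point command-line tools (wget, apt-get, curl, ...) at the proxy
# via the standard environment variables.
export http_proxy="http://192.168.0.2:8088"
export ftp_proxy="http://192.168.0.2:8088"

# Equivalent ~/.wgetrc settings:
#   use_proxy = on
#   http_proxy = http://192.168.0.2:8088
```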
Re: [squid-users] squid wont let wget traffic thru
The client settings sound normal. A 403 Forbidden is normally an acl (or the lack of one) denying access. My guess is something similar to the User-Agent filtering described at http://gaugusch.at/squid.shtml. Pick an IE or Mozilla string matching whichever browser was working for you from http://www.zytrax.com/tech/web/browser_ids.htm#msie.

From the wget manual (http://www.gnu.org/software/wget/manual/wget.html):

> -U agent-string
> --user-agent=agent-string
>     Identify as agent-string to the HTTP server. The HTTP protocol allows the clients to identify themselves using a User-Agent header field. This enables distinguishing the WWW software, usually for statistical purposes or for tracing of protocol violations. Wget normally identifies as Wget/version, version being the current version number of Wget. However, some sites have been known to impose the policy of tailoring the output according to the User-Agent-supplied information. While this is not such a bad idea in theory, it has been abused by servers denying information to clients other than (historically) Netscape or, more frequently, Microsoft Internet Explorer. This option allows you to change the User-Agent line issued by Wget. Use of this option is discouraged, unless you really know what you are doing. Specifying empty user agent with --user-agent= instructs Wget not to send the User-Agent header in HTTP requests.

On 3/22/06, Joey S. Eisma [EMAIL PROTECTED] wrote:

> hi! When I run wget it says:
>
> connecting to 192.168.0.2:8088... connected.
> proxy request sent, awaiting response... 403 Forbidden
> 10:13:53 ERROR 403: Forbidden.
>
> I can't ask the admin yet to see what the logs say, but as of yet my client settings seem normal, eh? Thanks!
>
> Henrik Nordstrom wrote:
>> tor 2006-03-23 klockan 09:30 +0800 skrev Joey S. Eisma:
>>> declare -x ftp_proxy="http://192.168.0.2:8088/"
>>> declare -x http_proxy="http://192.168.0.2:8088"
>>> which is exactly my proxy settings.
>>
>> Looks fine.
>>
>>> what's this supposed to mean? i already have the correct setting but squid still won't let wget traffic through?
>>
>> Which server does wget say it's connecting to? The proxy, or the origin server? Is there anything in the Squid logs?
>>
>> Regards, Henrik
>
> --
> Joey S. Eisma
> Information Systems
> P.IMES Corporation
> Phase IV, CEPZA, Rosario Cavite, Philippines
> Tel: 63.46.4372401  Fax: 63.46.4372425
> http://www.pimes.com.ph
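If User-Agent filtering of the kind described above is in place, the squid.conf shape is roughly the following (a hedged sketch; the browser patterns are illustrative, and wget's default "Wget/version" agent would not match them):

```
# Only allow clients whose User-Agent matches a known browser string.
acl goodbrowsers browser MSIE Firefox Mozilla
http_access allow goodbrowsers
http_access deny all
```

With such a rule in effect, wget -U "Mozilla/5.0" http://example.com/ would get through while a bare wget would be denied with 403 Forbidden.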
Re: [squid-users] squid wont let wget traffic thru
One more assumption: the browser reported as working was coming from the same IP address that wget is being used from.

On 3/22/06, Bill Jacqmein [EMAIL PROTECTED] wrote:

> The client settings sound normal. A 403 Forbidden is normally an acl (or the lack of one) denying access. My guess is something similar to the User-Agent filtering described at http://gaugusch.at/squid.shtml. [...]
>
> On 3/22/06, Joey S. Eisma [EMAIL PROTECTED] wrote:
>> hi! When I run wget it says:
>>
>> connecting to 192.168.0.2:8088... connected.
>> proxy request sent, awaiting response... 403 Forbidden
>> 10:13:53 ERROR 403: Forbidden.
>>
>> [...]
Re: [squid-users] squid against malware and worms
Dave,

SquidGuard (http://www.squidguard.org/intro/) should be able to accomplish what you are looking to do.

Regards, Bill

On 3/19/06, Dave [EMAIL PROTECTED] wrote:

> Hello, can Squid offer any protection against malware such as 180solutions' Zango and other spyware, or worms such as Blackworm? I mention these two because one of my machines got each of them past my antivirus and antispyware programs. What I was wondering is whether Squid could scan for, and if needed eliminate, these items as they come in? Thanks. Dave.
Re: [squid-users] proxy.pac help
Raj,

The below should work, assuming isInNet() is working properly. I would leave the if statement out and just return the proxy list if possible; eliminate systems by simply not pointing them at the proxy.pac.

Regards, Bill

// Assign proxy based on the client's IP address
if (isInNet(myIpAddress(), "172.16.96.0", "255.255.240.0")) {
    return "PROXY proxy03.au.ap.abnamro.com:3128; PROXY proxy04.au.ap.abnamro.com:3128";
}

On 3/18/06, Raj [EMAIL PROTECTED] wrote:

> Hi All, I am running Squid 2.5.STABLE10. All the clients in our company use a proxy.pac file in their browser settings, and I need some help with it. At the moment I have the following configuration:
>
> // Assign Proxy based on IP Address of Client
> if (isInNet(myIpAddress(), "172.16.96.0", "255.255.240.0"))
>     return "PROXY proxy03.au.ap.abnamro.com:3128"; PROXY proxy04.au.ap.abnamro.com:3128;
>
> If the source IP address is in that range, it should go to proxy03 first, and if proxy03 is down it should go to proxy04. But that is not happening: if proxy03 is down, it does not fail over to proxy04. Is there a syntax error in the above config? What is the correct syntax in a proxy.pac file so that if proxy03 is down it will go to proxy04? Thanks.
Re: [squid-users] Question
It might be easier to set this up as a policy matter instead of a technology application: set up the AUP and have HR provide the muscle to get it acknowledged.

On 3/17/06, Richard J Palmer [EMAIL PROTECTED] wrote:

> I'm wondering if Squid can help in this situation. We have a setup where we want a range of PCs to use Squid for access to websites, etc. However, what we ideally want is for users, on their first web request to the Internet, to be greeted with a page where they have to accept an AUP. In reality all I want is for a page to appear; once they have viewed it they can access any other sites they want without further issues (at least for a set time, if that is easier). I guess this could be done as some form of authentication, but I would be grateful for any thoughts here (or pointers if it has been discussed; I can't see anything obvious). I'm open to thoughts.
>
> --
> Richard Palmer
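Beyond the policy angle, later Squid releases (2.6 and up) ship a squid_session external acl helper intended for exactly this splash-page pattern. A hedged sketch, with placeholder helper path and AUP URL:

```
# Show an AUP splash page once per client session (placeholders:
# helper path and AUP URL; tune the -t session lifetime in seconds).
external_acl_type session ttl=300 negative_ttl=0 %SRC /usr/lib/squid/squid_session -t 7200
acl aup_accepted external session
deny_info http://intranet.example.com/aup.html aup_accepted
http_access deny !aup_accepted
```

The helper remembers each client address for the configured lifetime; a client's first request is denied and redirected to the AUP page, after which subsequent requests pass until the session expires.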
Re: [squid-users] Number Of Users
The number of connections is probably the more important figure from a systems point of view. You should be able to parse the logs for how many times a particular IP visits, to get a better guesstimate of user connection volume from the people-management view.

On 3/4/06, Kinkie [EMAIL PROTECTED] wrote:

> On Sat, 2006-03-04 at 05:13 +0530, Jacob, Stanley (GE Consumer Finance, consultant) wrote:
>> this will give you a rough estimate:
>> netstat -an | grep 3128 | wc -l
>
> Maybe you mean mis-estimate... This is the number of TCP connections to Squid, also available in cachemgr. Unfortunately it has no real connection to the number of people accessing Squid: each person who is currently downloading some web page might have multiple streams open (up to four per window in Internet Explorer; in Mozilla Firefox up to 4 per process by default, but the number can be raised quite a lot). On the other hand, someone who is reading a page she has downloaded will have no active connections to Squid (except those connections which are kept alive, and you see how things get messy fast...). In other words, guesstimating the number of users accessing a proxy is even messier than trying to estimate the number of users accessing a website.
>
> Kinkie
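To complement the connection count, a sketch of the log-parsing idea above: counting distinct client IPs in access.log (assumes the default native log format, where field 3 is the client address; a distinct-IP count is itself only a rough proxy for "users", for the same reasons Kinkie gives):

```shell
# Count distinct client IP addresses seen in a Squid access.log.
count_clients() {
    awk '!($3 in seen) { seen[$3]; n++ } END { print n + 0 }' "$1"
}
```

Usage: `count_clients /var/log/squid/access.log`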