RE: [squid-users] Custom error page based on IP.
Hi,

> NP: the line above
>     deny_info ERR_EXTERNAL_IP not swan
> should be configured as:
>     deny_info ERR_EXTERNAL_IP swan

Can you explain why you would want to do that?

> Unless ERR_EXTERNAL_IP generates a redirect that includes some of Squid's
> error-page % codes, it can be replaced further with:
>     deny_info http://internal.server/errorpage.html swan

That's a far more elegant way of doing it; I shall implement that when I return after the holiday.

Thanks,
Jezz.

-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: 09 April 2009 05:18
To: Palmer J.D.F.
Cc: John Doe; squid-users@squid-cache.org
Subject: RE: [squid-users] Custom error page based on IP.

Sorry for the somewhat long delay in replying to you; I have been on long-term sick leave. However, I've just returned and have sussed this out.

Firstly I added the following rules to squid.conf:

    acl swan src 123.45.0.0/16          # the campus subnet, which was already defined in squid.conf
    deny_info ERR_EXTERNAL_IP not swan  # if the client's source IP is not in the swan subnet, serve this error page
    acl www dst 123.45.67.89            # campus www server holding the instruction page
    http_access allow www !swan         # allow access to the web server from IPs outside the swan subnet
    http_access deny !swan              # deny source IPs outside the swan subnet

Then I created a custom error file (ERR_EXTERNAL_IP) which contains a redirect to the page on the campus webserver. If you don't allow access to the campus web server, you get a recursive deny and it all gets a bit messy.

NP: the line above
    deny_info ERR_EXTERNAL_IP not swan
should be configured as:
    deny_info ERR_EXTERNAL_IP swan

Unless ERR_EXTERNAL_IP generates a redirect that includes some of Squid's error-page % codes, it can be replaced further with:
    deny_info http://internal.server/errorpage.html swan

Amos

Simples!
Cheers,
Jezz.

-Original Message-
From: John Doe [mailto:jd...@yahoo.com]
Sent: 13 February 2009 09:58
To: Palmer J.D.F.
Subject: Re: [squid-users] Custom error page based on IP.

From: Palmer J.D.F. j.d.f.pal...@swansea.ac.uk

> Is it possible to have a custom error page that is displayed only when a client machine tries to connect to our squid caches from outside our subnet? We have a lot of users and visitors who use their machines on site, but also off site on other networks; occasionally these users try to proxy via our cache from networks outside our subnet. We have ACLs in place that prevent remote proxying, but as it is they just get an Access Denied error. If possible I'd like to replace this error with an explanation and instructions on how to re-configure their browser. As far as I can tell the same Access Denied error (ERR_ACCESS_DENIED) is displayed for a multitude of reasons, so it isn't viable to just edit the existing error; is it possible to have a different error just for this scenario?

Maybe you could use URL rewrites to forward them to a specific web page that would explain why they cannot use the proxy from outside...

JD
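Pulling together the access rules and Amos's deny_info correction from the thread above, the whole arrangement might look like this in squid.conf. This is a sketch only, reusing the placeholder subnet, server IP and URL from the thread; note that deny_info is matched against the last ACL named on the http_access deny line, which is why the correct key is "swan" rather than "not swan":

```
# campus subnet (already defined in squid.conf)
acl swan src 123.45.0.0/16
# campus web server holding the instruction page
acl www dst 123.45.67.89

# off-campus clients may still fetch the instruction page itself;
# without this, the redirect in the error page is denied recursively
http_access allow www !swan
# deny everything else from outside the subnet
http_access deny !swan

# serve a redirect to the instruction page instead of ERR_ACCESS_DENIED;
# keyed on "swan" because deny_info matches the last ACL on the deny line
deny_info http://internal.server/errorpage.html swan
```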
RE: [squid-users] Custom error page based on IP.
Sorry for the somewhat long delay in replying to you; I have been on long-term sick leave. However, I've just returned and have sussed this out.

Firstly I added the following rules to squid.conf:

    acl swan src 123.45.0.0/16          # the campus subnet, which was already defined in squid.conf
    deny_info ERR_EXTERNAL_IP not swan  # if the client's source IP is not in the swan subnet, serve this error page
    acl www dst 123.45.67.89            # campus www server holding the instruction page
    http_access allow www !swan         # allow access to the web server from IPs outside the swan subnet
    http_access deny !swan              # deny source IPs outside the swan subnet

Then I created a custom error file (ERR_EXTERNAL_IP) which contains a redirect to the page on the campus webserver. If you don't allow access to the campus web server, you get a recursive deny and it all gets a bit messy.

Simples!
Cheers,
Jezz.

-Original Message-
From: John Doe [mailto:jd...@yahoo.com]
Sent: 13 February 2009 09:58
To: Palmer J.D.F.
Subject: Re: [squid-users] Custom error page based on IP.

From: Palmer J.D.F. j.d.f.pal...@swansea.ac.uk

> Is it possible to have a custom error page that is displayed only when a client machine tries to connect to our squid caches from outside our subnet? We have a lot of users and visitors who use their machines on site, but also off site on other networks; occasionally these users try to proxy via our cache from networks outside our subnet. We have ACLs in place that prevent remote proxying, but as it is they just get an Access Denied error. If possible I'd like to replace this error with an explanation and instructions on how to re-configure their browser. As far as I can tell the same Access Denied error (ERR_ACCESS_DENIED) is displayed for a multitude of reasons, so it isn't viable to just edit the existing error; is it possible to have a different error just for this scenario?
Maybe you could use URL rewrites to forward them to a specific web page that would explain why they cannot use the proxy from outside...

JD
[squid-users] Custom error page based on IP.
Hi,

Is it possible to have a custom error page that is displayed only when a client machine tries to connect to our squid caches from outside our subnet?

We have a lot of users and visitors who use their machines on site, but also off site on other networks; occasionally these users try to proxy via our cache from networks outside our subnet. We have ACLs in place that prevent remote proxying, but as it is they just get an Access Denied error. If possible I'd like to replace this error with an explanation and instructions on how to re-configure their browser.

As far as I can tell the same Access Denied error (ERR_ACCESS_DENIED) is displayed for a multitude of reasons, so it isn't viable to just edit the existing error; is it possible to have a different error just for this scenario?

Many thanks,
Jezz Palmer.
RE: [squid-users] Broadcom Chipset with squid
Hi,

I'm using a DL360 with the Broadcom chipset at 1000Mb/s as a squid server and have never had any problems; in fact it's been very stable since I moved the cache onto that box. Previously I was running an LVS two-squid-server cluster (and a separate balancer), and the single DL360 outperforms the cluster quite substantially and is far more stable.

That said, I have about ten DL360/DL380s doing various things, and a couple of Dell laptops with the Broadcom 57** gigabit NICs in them, and it isn't good news on all of them. We have had issues with poor Linux drivers, which an upgrade fixed, and have had issues with the laptops running Ethereal under Windows XP: they don't capture VLAN traffic properly. Strangely, under Linux Ethereal works fine.

Cheers,
Jezz Palmer.

-Original Message-
From: Jason Whiteaker
To: squid-users@squid-cache.org
Sent: 07/09/2005 17:55
Subject: RE: [squid-users] Broadcom Chipset with squid

With all due respect, I'm quickly coming to the conclusion that Broadcom Ethernet chipsets are the biggest pile of dung on the planet. Their software, in particular any NIC teaming configuration tools, is absolute garbage. I've wasted so much time just trying to get them to cooperate in a Windows 2003 environment. Never could get teaming to work right, so I've given up and accepted that we'll have failover only - no load sharing of the two NICs. Do yourself a favor and install some discrete Intel NICs and be done with it.

-Jason

-Original Message-
From: Joel Jaeggli [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, September 07, 2005 11:19 AM
To: [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Broadcom Chipset with squid

On Wed, 7 Sep 2005, [EMAIL PROTECTED] wrote:

> Hi, I am not sure if this is the correct forum to discuss this problem, but I would like to share my experience. I am using a DL380 Compaq server which comes with a Broadcom chipset for the Ethernet controller. Is anyone using a Compaq server for Squid? I am running Red Hat 9 on it.
> I noticed that when squid is passing high traffic, the Ethernet interface gets reset. Has anyone else faced this kind of problem? If yes, what is the solution?

I have several Broadcom-chipset Ethernet adapters in servers. The driver for this chipset has undergone substantial ferment in the last couple of years, so as the first order of business you should probably consider installing a newer version of the OS, preferably something built around the 2.6 kernel. Fedora Core 4 runs without any problems on a DL380...

joelja

> Thanks - LK
--
Joel Jaeggli    Unix Consulting    [EMAIL PROTECTED]
GPG Key Fingerprint: 5C6E 0104 BAF0 40B0 5BD3 C38B F000 35AB B67F 56B2
RE: [squid-users] WindowsUpdate Problems.
Hi,

Thanks for all of your suggestions. :-)

Someone from the list has kindly sent me a list of IPs for making holes in the FW, which was a quick fix. I have since set up WPAD auto-proxy configuration, as this seemed like the most transparent and resilient way of fixing it; WU does a WPAD discovery as it starts up.

WRT using SUS, we already use SUS (WUS now) for the majority of PCs on campus, but the problem I'm having at the minute is caused by student/personal machines connecting to our Wi-Fi service, which we can't really manage in this way. Getting students etc. to install reg fixes is an option, but it would be a lot easier not to have to rely on them to do it. I'd imagine there would also be an issue should they take their laptop to another network.

Unfortunately I discovered that WPAD doesn't work on VPNs using the M$ client (and possibly other VPN clients); in their infinite wisdom M$ have made it so VPNs can't have the DNS suffix set on them unless it is hardcoded on the client machine. That isn't really an option, so I am relying on FW holes to allow these machines to WU.

All in all, with a combination of .pac files, WPAD and FW holes I now have all clients able to WU, I hope.

Thanks for all of your help.

Cheers,
Jezz.

-Original Message-
From: Steve Palmer [mailto:[EMAIL PROTECTED]]
Sent: 19 January 2005 13:24
To: James Gray; squid-users@squid-cache.org
Subject: RE: [squid-users] WindowsUpdate Problems.

I would probably recommend setting up the free SUS server, which would have direct or regular proxy access to the official servers, and using group policies to direct all your clients to your in-house server.

http://www.microsoft.com/windowsserversystem/sus/default.mspx

Ooo, just noticed they're doing an update to it called WUS. Hope it's good! ;)

From: James Gray [mailto:[EMAIL PROTECTED]]
Sent: Tue 1/18/2005 9:13 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] WindowsUpdate Problems.

On Mon, 17 Jan 2005 11:18 pm, Palmer J.D.F.
wrote:

> Hello, I have just been made aware that some machines are not Windows updating on our campus network. I've done a fair bit of investigation and I 'think' I know what the problem is, and just wondered if anyone else had seen this, and if so how it was remedied. Initially I thought this was a Squid problem, but I'm now tending to think it's a Microsoft problem.
>
> On our campus we force certain IP ranges to go through our squid caches, which I guess you could call opaque: IE browsers/clients etc. have to be configured to go through the cache rather than it being transparent. These restricted clients are forced to use the cache by ACLs on core routers denying port 80 traffic from various IPs. It appears that the Windows Update v5 client (not sure about v4) tries to open a port 80 connection directly to Microsoft servers to check for and download updates; this obviously fails as the router ACLs drop the packets.

We had similar problems with WinXP clients trying to get updates both automatically and manually from Windows Update (v5, but we had intermittent problems with automatic updates on Win2k - v4.windowsupdate...). Turns out M$ can't figure out how to implement authenticated proxy requests from the client to a proxy for Windows Update. I found a M$ knowledge-base article about it, and the suggestion was to allow all requests to *windowsupdate.microsoft.com to be done without proxy authentication. The way you do this in squid is to put an ACL that allows requests to Windows Update BEFORE the ACL that requires authentication. I'm offline ATM, but I can send you the relevant bits from our squid.conf if you like.

Cheers,
James
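James's actual squid.conf wasn't posted, but the ordering trick he describes might look something like the following sketch. The domain list and the "authed" ACL name are illustrative assumptions, not his configuration:

```
# requests to Windows Update are allowed without authentication;
# this rule must come BEFORE the rule that requires proxy auth
acl windowsupdate dstdomain .windowsupdate.microsoft.com
http_access allow windowsupdate

# everyone else must authenticate
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```

Squid evaluates http_access rules top to bottom and stops at the first match, which is why the unauthenticated allow has to precede the authenticated one.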
[squid-users] WindowsUpdate Problems.
Hello,

I have just been made aware that some machines are not Windows updating on our campus network. I've done a fair bit of investigation and I 'think' I know what the problem is, and just wondered if anyone else had seen this, and if so how it was remedied. Initially I thought this was a Squid problem, but I'm now tending to think it's a Microsoft problem.

On our campus we force certain IP ranges to go through our squid caches, which I guess you could call opaque: IE browsers/clients etc. have to be configured to go through the cache rather than it being transparent. These restricted clients are forced to use the cache by ACLs on core routers denying port 80 traffic from various IPs. It appears that the Windows Update v5 client (not sure about v4) tries to open a port 80 connection directly to Microsoft servers to check for and download updates; this obviously fails as the router ACLs drop the packets.

The only way I've found to get this to work is to totally disable the Windows Update client so it makes no checks etc., then manually run WU from a browser, which isn't ideal. Even manual attempts fail if the WU client is running. Does anyone know of a list of IPs that the client uses, so holes can be made to allow port 80 traffic through to them, or is there a way to configure the WU client with the proxy settings? Or perhaps I am barking up the wrong tree altogether?

Many thanks,
Jezz Palmer.

Jezz Palmer.
Internet Systems Officer.
Library and Information Services
University of Wales, Swansea
Singleton Park
Swansea SA2 8PP
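One way to hand clients like these their proxy settings automatically - and the approach Jezz's later follow-up in this archive reports adopting - is a WPAD/PAC file, which is just a small piece of JavaScript the browser (and the WU v5 client, via WPAD discovery) evaluates per request. A minimal sketch; the proxy name, port and subnet are illustrative placeholders, and isInNet/myIpAddress are helper functions the browser supplies:

```javascript
// wpad.dat / proxy.pac -- a minimal sketch; proxy.example.ac.uk:3128 and
// the 123.45.0.0/16 subnet are placeholders, not the real campus values.
function FindProxyForURL(url, host) {
  // clients on the campus subnet are sent through the cache...
  if (isInNet(myIpAddress(), "123.45.0.0", "255.255.0.0")) {
    return "PROXY proxy.example.ac.uk:3128";
  }
  // ...anyone else (e.g. a laptop taken home) goes direct
  return "DIRECT";
}
```

The file is conventionally served as http://wpad.<domain>/wpad.dat with the MIME type application/x-ns-proxy-autoconfig so that WPAD discovery can find it without any client-side configuration.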
[squid-users] Corrupt Downloads.
Hi,

We have been running Squid version 2.4.STABLE7 for some time now without problems, though recently I've had a few reports of corrupt downloads. The files download okay and are the correct size (or appear to be), but are corrupt when they are unzipped or executed. I have tested the reported files myself by downloading them with and without the cache, and true enough the files are being corrupted by the cache as they pass through it. The file sizes vary between 1-30MB.

Does anyone have any idea as to why this is happening? Is it time perhaps that I upgraded?

Many thanks,
Jezz Palmer.

Jezz Palmer.
Internet Systems Officer.
Library and Information Services
University of Wales, Swansea
Singleton Park
Swansea SA2 8PP
RE: [squid-users] Corrupt Downloads.
> - Are you using any parents offering bad QoS (Quality of Service)?

No, we have no parents anymore; we've not had any for a couple of years now.

> > Is it time perhaps that I upgraded?
>
> - Certainly advisable: use the latest stable release and verify this issue again.

I don't know why I asked that really, it was a bit of a dumb question. :-) I'll get on the case now.

Cheers,
Jezz.

> > We have been running Squid version 2.4.STABLE7 for some time now without problems, though recently I've had a few reports of corrupt downloads. The files download okay and are the correct size (or appear to be), but are corrupt when they are unzipped or executed. I have tested the reported files myself by downloading them with and without the cache, and true enough the files are being corrupted by the cache as they pass through it. The file sizes vary between 1-30MB. Does anyone have any idea as to why this is happening?
>
> - Are you using any parents offering bad QoS (Quality of Service)?
>
> M.
[squid-users] Squid and Windows Update.
Hello,

I'm having a bit of an issue with Squid and Windows Update. In the last day or so we have noticed machines on campus failing to get their WUs. All goes well until I click the 'scan for updates' link, and then I get an error; the M$ error is the seemingly infamous '0x800a138F' error. Many pages from the search below blame the new hosting arrangements that M$ have with Akamai, stating that Akamai also host many ad banners so are often blocked by admins:

http://www.google.com/search?sourceid=navclient&ie=UTF-8&oe=UTF-8&q=0x800a138F

However we don't appear to have any rules in our squid.conf that block access to that site, and when I log the requests from my test machine it doesn't deny access to any of the requests. On further investigation I have retrieved another M$ error code from the WU log file on the client PC; this is '0x800C0002', which according to M$ is 'Invalid URL'. I only get this problem going through the squid boxes.

Another twist to this is that if I turn the cache settings off in IE, do a WU scan (which succeeds), and then turn the cache settings back on, it works fine thereafter. However it is not possible for us to turn the cache setting off on all the machines here; even if it were, we'd have to open up the firewall to allow port 80 access for all machines rather than just the WWW and a select few admin machines.

Is this a known bug with squid?

Many thanks,
Jezz Palmer.

Jezz Palmer.
Internet Systems Officer.
Library and Information Services
University of Wales, Swansea
Singleton Park
Swansea SA2 8PP
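The works-after-one-uncached-scan behaviour suggests a stale or mangled cached object. One workaround commonly suggested at the time for Windows Update problems through Squid was simply to stop Squid caching the update site, so every scan is fetched fresh from the origin. A hedged sketch only, not a verified fix; the domain ACL is illustrative (on later Squid releases the directive is spelled "cache deny" rather than "no_cache deny"):

```
acl windowsupdate dstdomain .windowsupdate.microsoft.com
# always fetch Windows Update objects fresh from the origin server
no_cache deny windowsupdate
```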