[squid-users] connection limit and X-Forwarded-For IP
Hi All,

Recently I configured Squid as a reverse proxy for a back-end Apache server running Drupal.

===
acl airarabia_web dstdomain www.airarabia.com
cache_peer 10.4.171.6 parent 80 0 no-query originserver name=airarabia_peer2 round-robin forceddomain=www.airarabia.com default
# cache_peer 10.4.171.7 parent 80 0 no-query originserver name=airarabia_peer1 round-robin forceddomain=www.airarabia.com default # not yet implemented
cache_peer_access airarabia_peer2 allow airarabia_web
cache_peer_access airarabia_peer2 deny all
===

Problem 1:
With Apache I had a connection limit of 20 per IP (mod_limitipconn.so). I need to achieve the same with the Squid reverse proxy. Please let me know if the configuration below is correct.

===
acl connectionLimit maxconn 20
acl airarabia_web dstdomain www.airarabia.com
cache_peer 10.4.171.6 parent 80 0 no-query originserver name=airarabia_peer2 round-robin forceddomain=www.airarabia.com default
cache_peer_access airarabia_peer2 allow airarabia_web connectionLimit
cache_peer_access airarabia_peer2 deny all
===

Problem 2:
After configuring the reverse proxy, the Apache back-end server gets the IP of the reverse proxy and not that of the actual clients.

squid.conf:
===
follow_x_forwarded_for allow airarabia_web
follow_x_forwarded_for deny all
acl_uses_indirect_client on
delay_pool_uses_indirect_client on
log_uses_indirect_client on
===

I will work on a HOWTO for mod_extract_forwarded, but in the meantime can someone verify whether the squid.conf above for problem 2 is correct?

//Remy

--
Disclaimer and Confidentiality

This material has been checked for computer viruses and although none has been found, we cannot guarantee that it is completely free from such problems and do not accept any liability for loss or damage which may be caused. Please therefore check any attachments for viruses before using them on your own equipment. If you do find a computer virus please inform us immediately so that we may take appropriate action.
This communication is intended solely for the addressee and is confidential. If you are not the intended recipient, any disclosure, copying, distribution or any action taken or omitted to be taken in reliance on it, is prohibited and may be unlawful. The views expressed in this message are those of the individual sender, and may not necessarily be that of ISA.
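For anyone comparing notes, a hedged sketch of how both problems are commonly handled in squid.conf (directive semantics as documented for Squid 2.x; the ACL name is illustrative):

===
# Problem 1: maxconn matches when one client IP holds MORE than N
# connections, so it is normally used on an http_access deny rule
# rather than as a condition on a cache_peer_access allow rule.
acl connlimit maxconn 20
http_access deny connlimit

# Problem 2: to give the back-end the real client IP, Squid only has
# to SEND X-Forwarded-For (forwarded_for is "on" by default). The
# follow_x_forwarded_for directives control whether Squid TRUSTS an
# XFF header it receives, which is the opposite direction. The
# back-end (e.g. mod_extract_forwarded) then extracts the client IP
# from the X-Forwarded-For header Squid adds.
forwarded_for on
===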
Re: [squid-users] how to log headers
Hi,

I want to log all types of headers. I have a similar rule on an i386 system with the same Squid version, and there it works fine.

//Remy

tookers wrote:

Hi there,

What particular headers are you trying to log? e.g. Via:, User-Agent:, etc.

Thanks,
tookers

Mario Remy Almeida wrote:

Hi All,

Squid Cache: Version 2.7.STABLE6

logformat headers %ts.%03tu %tg %a %rp [ %h ] %rm [ %h ]
access_log /var/log/squid/headers.log headers

but in headers.log I get

1255582968.512 15/Oct/2009:05:02:48 + 10.200.2.174 /xbe/css/BG1.jpg [ - ] GET [ - ]

and no headers are logged. Any reason why?

//Remy
Re: [squid-users] how to log headers
Hi,

Thanks for that, but what I want to know is all the headers that are set.

//Remy

tookers wrote:

Hmm, this works fine for me...

logformat custom ... %{User-Agent}h %h %a

I've tested on i686 and Sparc based servers and this works fine.

Cheers,
tookers
[squid-users] how to log headers
Hi All,

Squid Cache: Version 2.7.STABLE6

logformat headers %ts.%03tu %tg %a %rp [ %h ] %rm [ %h ]
access_log /var/log/squid/headers.log headers

but in headers.log I get

1255582968.512 15/Oct/2009:05:02:48 + 10.200.2.174 /xbe/css/BG1.jpg [ - ] GET [ - ]

and no headers are logged. Any reason why?

//Remy
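For reference, a hedged sketch of how header logging usually works with Squid 2.7's logformat: a bare %h is not a header code, which would explain the empty "[ - ]" fields; named headers use %{Name}>h (request side) or %{Name}<h (reply side), and log_mime_hdrs dumps all headers:

===
# Log one specific request header per entry:
logformat headers %ts.%03tu %>a %rm %ru [ %{User-Agent}>h ]
access_log /var/log/squid/headers.log headers

# Or append all request/reply headers to the standard log:
log_mime_hdrs on
===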
Re: [squid-users] POST NONE://
Hi All,

I was on leave for a few days. Thanks for all the support.

To put it another way: if I upgrade to 3.x, will my problem be solved?

//Remy

Amos Jeffries wrote:

On Mon, 05 Oct 2009 14:30:06 +0200, Henrik Nordstrom hen...@henriknordstrom.net wrote:

mån 2009-10-05 klockan 22:56 +1300 skrev Amos Jeffries:

I'm not sure if that applies to this situation, since it requires the intermediate proxies to upgrade as well.

Of course. For the record, chunked coding is in all current 3.x releases since 3.0.STABLE16.

That's just responses, right? This thread is about POST requests...

Regards
Henrik

Both. Request support arrived just in time to get it into 3.0 during the back-port:
http://www.squid-cache.org/Versions/v3/3.0/changesets/b9025.patch

Amos
Re: [squid-users] POST NONE://
Hi Amos,

Thanks for that, my problem is solved. Is there any way to bypass such problems? I mean, for a known source IP, if the HTTP headers are not set, can the request still be passed through?

//Remy

Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi Amos,

Thanks for your reply. You mean the length is less than what is required?

No, there is an HTTP header, Content-Length:, which is missing from the POST request.

When I pass the SOAP message below I get the error, but when it is passed directly to the JBoss application server the request is served correctly. What is wrong: the header settings in the SOAP message, or do I need to make some config changes in squid.conf?

The SOAP message below looks like data inside the body. The missing bit is in the wrapping HTTP headers. They are generated by the client software.

Amos

=== SOAP MESSAGE ===
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:env="http://schemas.xmlsoap.org/soap/envelop/"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:enc="http://schemas.xmlsoap.org/soap/encoding/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <soapenv:Header>
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <wsse:UsernameToken>
        <wsse:Username>WSUSER</wsse:Username>
        <wsse:Password>pass43</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </soapenv:Header>
  <soapenv:Body>
    <ns2:OTA_ReadRQ xmlns:ns2="http://www.opentravel.org/OTA/2003/05"
        EchoToken="WWW0909271406222" PrimaryLangID="en-us" SequenceNmbr="1"
        TimeStamp="2009-09-27T02:06:22" TransactionIdentifier="">
      <ns2:POS>
        <ns2:Source TerminalID="TestUser/Test Runner">
          <ns2:RequestorID ID="WSDIBUSERATM" Type="9"/>
          <ns2:BookingChannel Type="9"/>
        </ns2:Source>
      </ns2:POS>
      <ns2:ReadRequests>
        <ns2:ReadRequest>
          <ns2:UniqueID ID="18496815" Type="14"/>
        </ns2:ReadRequest>
        <ns2:AirReadRequest>
          <ns2:DepartureDate>2009-10-30T00:00:00</ns2:DepartureDate>
        </ns2:AirReadRequest>
      </ns2:ReadRequests>
    </ns2:OTA_ReadRQ>
    <ns1:AAReadRQExt xmlns:ns1="http://www.isaaviation.com/thinair/webservices/OTA/Extensions/2003/05">
      <ns1:AALoadDataOptions>
        <ns1:LoadTravelerInfo>true</ns1:LoadTravelerInfo>
        <ns1:LoadAirItinery>true</ns1:LoadAirItinery>
        <ns1:LoadPriceInfoTotals>true</ns1:LoadPriceInfoTotals>
        <ns1:LoadFullFilment>true</ns1:LoadFullFilment>
      </ns1:AALoadDataOptions>
    </ns1:AAReadRQExt>
  </soapenv:Body>
</soapenv:Envelope>
===

//Remy

Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi All,

I would like to know the reason I get NONE:// in the access.log file, as below:

1254046127.530 0 195.229.115.202 TCP_DENIED/411 1757 POST NONE:// - NONE/- text/html

My Squid proxy acts as a reverse proxy. A valid request is sent from the above IP.

The 411 status code is failure to pass a basic validity test. This one was a test for the Content-Length: header on POST requests.

Could someone help me in solving the problem?

My setup: Request from Internet -> Squid reverse proxy (A) -> Squid reverse proxy (B) -> JBoss application server.

The NONE:// means the request did not complete; it did not even get far enough to determine whether it was a HIT or MISS on the URL. This is due to the required header making Squid abort its processing immediately.

Amos
[squid-users] POST NONE://
Hi All,

I would like to know the reason I get NONE:// in the access.log file, as below:

1254046127.530 0 195.229.115.202 TCP_DENIED/411 1757 POST NONE:// - NONE/- text/html

My Squid proxy acts as a reverse proxy. A valid request is sent from the above IP. Could someone help me in solving the problem?

My setup: Request from Internet -> Squid reverse proxy (A) -> Squid reverse proxy (B) -> JBoss application server.
Re: [squid-users] POST NONE://
Hi Amos,

Thanks for your reply. You mean the length is less than what is required?

When I pass the SOAP message below I get the error, but when it is passed directly to the JBoss application server the request is served correctly. What is wrong: the header settings in the SOAP message, or do I need to make some config changes in squid.conf?

=== SOAP MESSAGE ===
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:env="http://schemas.xmlsoap.org/soap/envelop/"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:enc="http://schemas.xmlsoap.org/soap/encoding/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <soapenv:Header>
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <wsse:UsernameToken>
        <wsse:Username>WSUSER</wsse:Username>
        <wsse:Password>pass43</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </soapenv:Header>
  <soapenv:Body>
    <ns2:OTA_ReadRQ xmlns:ns2="http://www.opentravel.org/OTA/2003/05"
        EchoToken="WWW0909271406222" PrimaryLangID="en-us" SequenceNmbr="1"
        TimeStamp="2009-09-27T02:06:22" TransactionIdentifier="">
      <ns2:POS>
        <ns2:Source TerminalID="TestUser/Test Runner">
          <ns2:RequestorID ID="WSDIBUSERATM" Type="9"/>
          <ns2:BookingChannel Type="9"/>
        </ns2:Source>
      </ns2:POS>
      <ns2:ReadRequests>
        <ns2:ReadRequest>
          <ns2:UniqueID ID="18496815" Type="14"/>
        </ns2:ReadRequest>
        <ns2:AirReadRequest>
          <ns2:DepartureDate>2009-10-30T00:00:00</ns2:DepartureDate>
        </ns2:AirReadRequest>
      </ns2:ReadRequests>
    </ns2:OTA_ReadRQ>
    <ns1:AAReadRQExt xmlns:ns1="http://www.isaaviation.com/thinair/webservices/OTA/Extensions/2003/05">
      <ns1:AALoadDataOptions>
        <ns1:LoadTravelerInfo>true</ns1:LoadTravelerInfo>
        <ns1:LoadAirItinery>true</ns1:LoadAirItinery>
        <ns1:LoadPriceInfoTotals>true</ns1:LoadPriceInfoTotals>
        <ns1:LoadFullFilment>true</ns1:LoadFullFilment>
      </ns1:AALoadDataOptions>
    </ns1:AAReadRQExt>
  </soapenv:Body>
</soapenv:Envelope>
===

//Remy

Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi All,

I would like to know the reason I get NONE:// in the access.log file, as below:

1254046127.530 0 195.229.115.202 TCP_DENIED/411 1757 POST NONE:// - NONE/- text/html

My Squid proxy acts as a reverse proxy. A valid request is sent from the above IP.

The 411 status code is failure to pass a basic validity test. This one was a test for the Content-Length: header on POST requests.

Could someone help me in solving the problem?

My setup: Request from Internet -> Squid reverse proxy (A) -> Squid reverse proxy (B) -> JBoss application server.

The NONE:// means the request did not complete; it did not even get far enough to determine whether it was a HIT or MISS on the URL. This is due to the required header making Squid abort its processing immediately.

Amos
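To make Amos's point concrete: the 411 test is about the HTTP framing around the SOAP body, not the XML itself. A minimal well-formed POST carries an explicit Content-Length (the path, host, and length here are illustrative):

===
POST /ota/readrq HTTP/1.1
Host: www.example.com
Content-Type: text/xml; charset=utf-8
Content-Length: 1234

<soapenv:Envelope ...> ... </soapenv:Envelope>
===

A client that omits Content-Length (or uses chunked transfer encoding, which per this thread Squid 2.x does not accept on requests) trips the validity test and is denied with status 411 before the URL is even parsed, hence the NONE:// in access.log.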
Re: [squid-users] Disable file upload
Hi Maurizio,

Thanks for your reply. Unfortunately even that policy is not working.

//Remy

Maurizio Marini wrote:

On Tuesday 22 September 2009, Mario Remy Almeida wrote:

Hi All

Need to disable file upload with gmail. How can I do this?

acl fileupload req_mime_type -i ^multipart/form-data$
http_reply_access deny fileupload

-m
[squid-users] Disable file upload
Hi All

Need to disable file upload with gmail. How can I do this?

acl fileupload req_mime_type -i ^multipart/form-data$

# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access allow localhost PURGE
http_access deny manager
http_access deny fileupload

The above acl is not working. Could someone help me?

//Remy
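Two likely reasons the ACL never matches, worth noting alongside the thread: Gmail runs over HTTPS, so the proxy only sees a CONNECT tunnel and never the multipart/form-data headers inside it; req_mime_type can only catch plain-HTTP uploads. And the Content-Type of a form upload usually carries a boundary suffix ("multipart/form-data; boundary=..."), so anchoring the regex with $ prevents a match even on plain HTTP. A hedged sketch for the unencrypted case:

===
# No trailing $: Content-Type is typically
# "multipart/form-data; boundary=...."
acl fileupload req_mime_type -i ^multipart/form-data

# The deny must come before any allow that matches the same clients.
http_access deny fileupload
# ... followed by the usual allow rules
===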
[squid-users] Custom Error Page
Hi All,

acl ipA src 10.0.0.1
acl acTime time SM
http_access deny ipA acTime

For the above acl I need to have a custom ERR_ page.

deny_info ERR_TIME_DENIED ipA
deny_info ERR_TIME_DENIED acTime

The ERR_TIME_DENIED page is in the squid error directory. What is the correct deny_info parameter to get a custom ERR_ page?

//Remy
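For what it's worth, the deny_info rule as documented: Squid serves the error page registered for the LAST ACL on the http_access line that issued the denial. With "http_access deny ipA acTime" that last ACL is acTime, so a single mapping should be enough (sketch using the same ACL names):

===
acl ipA src 10.0.0.1
acl acTime time SM
http_access deny ipA acTime
deny_info ERR_TIME_DENIED acTime
===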
[squid-users] MTU problem
Hi All,

WebServer Config:
OS: CentOS 5.3 running on VMware, connected to a Nortel switch with MTU 1500
Application: JBoss 4.2.3
Network MTU set to 9000

Reverse Proxy Config:
OS: CentOS 5.3 on an IBM x3350 server, connected to a Cisco switch with MTU 1500
Application: Squid 2.7
Network MTU set to 9000

Squid cannot connect with MTU 9000; when the MTU is set to 1500 on the WebServer, all is fine.

//Remy
Re: [squid-users] MTU problem
Hi Amos,

But I can log in and browse the application server without any issue even with the MTU set to 9000.

//Remy

Amos Jeffries wrote:

Sounds like a typical MTU situation. Forcing 9000 bytes through a 1500-byte port on the switch will result in failure. The switch will be generating ICMP messages to signal the problem and cause automatic packet-size reduction to kick in. Unless of course you have a firewall that blocks ICMP messages, or have path MTU discovery turned off some other way, anywhere down the chain of software and devices.

Amos
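When chasing a mismatch like this, it can help to confirm what each hop actually carries. A small diagnostic sketch (the interface and peer address are placeholders; the ping probe needs the live network):

```shell
# Show the MTU configured on a local interface (loopback as a stand-in):
ip link show lo | head -n 1

# Probe the path with Don't-Fragment set: a 9000-byte frame means an
# 8972-byte ICMP payload (9000 minus 20 IP and 8 ICMP header bytes).
# If any hop only passes 1500, this reports "Message too long" or the
# probes silently vanish, matching the symptom in the thread.
# ping -c 3 -M do -s 8972 10.4.171.6
```

If the large probe fails while small pings succeed, path MTU discovery is being blocked somewhere (often a firewall dropping ICMP "fragmentation needed" messages), which fits Amos's explanation.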
[squid-users] wild card ssl certificate
Hi All

I followed the steps mentioned in the URL below:
http://wiki.squid-cache.org/ConfigExamples/Reverse/SslWithWildcardCertifiate

When the command below is executed

openssl req -x509 -newkey rsa -out cacert.pem -outform PEM -days 1000

I get the message below, which means some options are missing. Can someone tell me what I am missing? Is it rsa:1024 instead of rsa?

req [options] <infile >outfile
where options are
 -inform arg      input format - DER or PEM
 -outform arg     output format - DER or PEM
 -in arg          input file
 -out arg         output file
 -text            text form of request
 -pubkey          output public key
 -noout           do not output REQ
 -verify          verify signature on REQ
 -modulus         RSA modulus
 -nodes           don't encrypt the output key
 -engine e        use engine e, possibly a hardware device
 -subject         output the request's subject
 -passin          private key password source
 -key file        use the private key contained in file
 -keyform arg     key file format
 -keyout arg      file to send the key to
 -rand file:file:...
                  load the file (or the files in the directory) into the random number generator
 -newkey rsa:bits generate a new RSA key of 'bits' in size
 -newkey dsa:file generate a new DSA key, parameters taken from CA in 'file'
 -[digest]        Digest to sign with (md5, sha1, md2, mdc2, md4)
 -config file     request template file
 -subj arg        set or modify request subject
 -multivalue-rdn  enable support for multivalued RDNs
 -new             new request
 -batch           do not ask anything during request generation
 -x509            output a x509 structure instead of a cert. req.
 -days            number of days a certificate generated by -x509 is valid for
 -set_serial      serial number to use for a certificate generated by -x509
 -newhdr          output "NEW" in the header lines
 -asn1-kludge     Output the 'request' in a format that is wrong but some CA's have been reported as requiring
 -extensions ..   specify certificate extension section (override value in config file)
 -reqexts ..      specify request extension section (override value in config file)
 -utf8            input characters are UTF8 (default ASCII)
 -nameopt arg     various certificate name options
 -reqopt arg      various request text options

//Remy
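The usage dump is openssl's way of rejecting the incomplete -newkey argument; as suspected, rsa needs a bit length. A sketch of the corrected command (the /tmp paths, subject, and -nodes are illustrative; for a CA cert, 2048 bits or more is advisable):

```shell
# Self-signed CA certificate plus key in one step; rsa:2048 supplies
# the missing bit-length. -nodes and -subj only make the example run
# unattended (no key passphrase, no interactive prompts).
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout /tmp/cakey.pem -out /tmp/cacert.pem \
    -outform PEM -days 1000 -subj "/CN=Example Test CA"
```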
Re: [squid-users] wild card ssl certificate
Hi Amos,

Tried with the changes; it worked very well, no issues.

One small change in the wiki: in openssl.cnf it is mentioned as
dir = /usr/newrprgate/CertAuth
but
mkdir newprpgate; cd newrprgate
should be
mkdir newrprgate
If possible, please correct it in the wiki.

//Remy

On Mon, 2009-07-06 at 10:45 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
Hi All,

I followed the steps mentioned in the below URL:
http://wiki.squid-cache.org/ConfigExamples/Reverse/SslWithWildcardCertifiate

When the below command is executed:
openssl req -x509 -newkey rsa -out cacert.pem -outform PEM -days 1000
I get the below message, which means some options are missing. Can someone tell me what I am missing? Is it rsa:1024 instead of rsa?

Yes, it needs the bit-length. Though for the CA cert it is advised to use a stronger/longer bit length than normal. 2048 bits is mentioned in the wiki for now.

Thanks for reporting that. Wiki updated.

Amos

req [options] <infile> <outfile>
where options are
 -inform arg      input format - DER or PEM
 -outform arg     output format - DER or PEM
 -in arg          input file
 -out arg         output file
 -text            text form of request
 -pubkey          output public key
 -noout           do not output REQ
 -verify          verify signature on REQ
 -modulus         RSA modulus
 -nodes           don't encrypt the output key
 -engine e        use engine e, possibly a hardware device
 -subject         output the request's subject
 -passin          private key password source
 -key file        use the private key contained in file
 -keyform arg     key file format
 -keyout arg      file to send the key to
 -rand file:file:...
                  load the file (or the files in the directory) into the random number generator
 -newkey rsa:bits generate a new RSA key of 'bits' in size
 -newkey dsa:file generate a new DSA key, parameters taken from CA in 'file'
 -[digest]        Digest to sign with (md5, sha1, md2, mdc2, md4)
 -config file     request template file.
 -subj arg        set or modify request subject
 -multivalue-rdn  enable support for multivalued RDNs
 -new             new request.
 -batch           do not ask anything during request generation
 -x509            output a x509 structure instead of a cert. req.
 -days            number of days a certificate generated by -x509 is valid for.
 -set_serial      serial number to use for a certificate generated by -x509.
 -newhdr          output "NEW" in the header lines
 -asn1-kludge     Output the 'request' in a format that is wrong but some CA's have been reported as requiring
 -extensions ..   specify certificate extension section (override value in config file)
 -reqexts ..      specify request extension section (override value in config file)
 -utf8            input characters are UTF8 (default ASCII)
 -nameopt arg     various certificate name options
 -reqopt arg      various request text options

//Remy
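Amos's fix can be sketched as a complete command. This is a sketch only: the `-nodes` and `-subj` arguments are added here purely so the example runs non-interactively; drop them to be prompted for a passphrase and subject the way the wiki walkthrough does, and the CN value is a made-up placeholder.

```shell
# Generate a self-signed CA certificate with an explicit key length,
# as Amos advises (rsa:2048 rather than the bare "rsa" that fails).
# -nodes/-subj only make this sketch non-interactive.
openssl req -x509 -newkey rsa:2048 -keyout cakey.pem -out cacert.pem \
    -outform PEM -days 1000 -nodes -subj "/CN=Example Test CA"
```

The same command with `rsa` (no bit length) aborts with the usage listing above, which is exactly the symptom reported.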
Re: [squid-users] wild card ssl certificate
Hi Amos,

Everything is correct except the spelling of newrprgate. In openssl.cnf it is correct; the mkdir command creates the directory as newprpgate, and then cd goes to the directory newrprgate, which does not exist. So newprpgate should be newrprgate in the mkdir command.

//Remy

On Mon, 2009-07-06 at 17:37 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
[wiki correction quoted above]

Do you mean:
dir = /usr/newrprgate/CertAuth
becomes
dir = /usr/CertAuth
and

=== Setup a certificate Signing Authority (if needed) ===
cd /usr
mkdir newprpgate; cd newrprgate
mkdir CertAuth; cd CertAuth
mkdir certs; mkdir private
chmod 700 private
echo '01' > serial
touch index.txt

becomes:

=== Setup a certificate Signing Authority (if needed) ===
cd /usr
mkdir newprpgate; mkdir CertAuth; cd CertAuth
mkdir certs; mkdir private
chmod 700 private
echo '01' > serial
touch index.txt

??

IIRC the only funky thing I found when following those myself a long while ago was a missing cd .. somewhere.

Amos
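Putting Remy's correction together, the consistent sequence would read as below. This is a sketch run relative to the current directory rather than /usr, and 'newrprgate' is simply the example name the wiki uses; the point is that mkdir and cd must use the same spelling.

```shell
# Set up the certificate-authority directory tree with one
# consistent spelling (newrprgate) for both mkdir and cd.
mkdir newrprgate; cd newrprgate
mkdir CertAuth; cd CertAuth
mkdir certs private
chmod 700 private        # private keys readable by owner only
echo '01' > serial       # next serial number for issued certs
touch index.txt          # empty issued-certificate database
```

openssl.cnf would then point `dir` at .../newrprgate/CertAuth so the config and the directory tree agree.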
[squid-users] info on reverse proxy for multiple https sites
Hi All,

Would like to know if it is possible to set up a reverse proxy for multiple HTTPS sites with just one IP for Squid, meaning Squid will listen on one IP and reverse-proxy multiple domains with multiple certificates (a certificate per domain).

//Remy
RE: [squid-users] How to setup squid proxy to run in fail-over mode
Hi Sagar,

Just a question: how can a DNS server determine that the primary server is down and that it should resolve the secondary server's IP?

//Remy

On Mon, 2009-06-15 at 11:21 +0530, Sagar Navalkar wrote:
Hi Abdul,

Please try to enter 2 different IPs in the DNS:
10.xxx.yyy.zz1 (proxyA) as primary (the proxyA name should be the same on both servers)
10.xxx.yyy.zz2 (proxyA) as secondary

Start squid services on both the servers (primary & secondary). If the primary server fails, the DNS will resolve the secondary IP for proxyA and the squid on the second server will kick in automatically. Hope I am able to explain it properly.

Regards,
Sagar Navalkar

-----Original Message-----
From: abdul sami [mailto:sami.me...@gmail.com]
Sent: Monday, June 15, 2009 11:17 AM
To: squid-users@squid-cache.org
Subject: [squid-users] How to setup squid proxy to run in fail-over mode

Dear all,

Now that I have set up a proxy server, as a next step I want to run it in fail-over high-availability mode, so that if one proxy is down for any reason, the second proxy should automatically come up and start serving requests. Any help in the shape of articles/steps would be highly appreciated.

Thanks and regards,
A Sami
[squid-users] Load Balancing Query
Hi All,

Want to know if load balancing is possible with Squid while maintaining sessions. The health check should be on TCP ports, e.g.:

Server A - active port 8080
Server B - active port 8080
Client -> Squid -> Server A and/or B

Request 1 comes from 'Client A': Squid forwards the request to 'Server A'.
Request 2 comes from 'Client A': Squid forwards the request to 'Server A'.
...and so on; any further request from 'Client A' Squid should forward only to 'Server A' while the session is the same.

If Request 1 comes from 'Client B': Squid forwards the request to 'Server B'.
Request 2 comes from 'Client B': Squid forwards the request to 'Server B'.

If 'Server A' fails, Squid should forward all the requests to 'Server B'.

//Remy
RE: [squid-users] How to setup squid proxy to run in fail-over mode
That is what I am saying. Since you say "If Primary server fails, the DNS will resolve secondary IP for proxyA".

//Remy

On Mon, 2009-06-15 at 14:39 +0530, Sagar Navalkar wrote:
Hey Remy,

The DNS server does not determine which server is down; however, if it is unable to resolve the 1st entry, it will automatically go down to the 2nd entry.

Regards,
Sagar Navalkar
Team Leader

-----Original Message-----
From: Mario Remy Almeida [mailto:malme...@isaaviation.ae]
Sent: Monday, June 15, 2009 1:36 PM
To: Sagar Navalkar
Cc: squid-users@squid-cache.org; 'abdul sami'
Subject: RE: [squid-users] How to setup squid proxy to run in fail-over mode

Hi Sagar,

Just a question: how can a DNS server determine that the primary server is down and that it should resolve the secondary server's IP?

//Remy

[earlier messages quoted above]
Re: [squid-users] Load Balancing Query
Hi Amos,

Thanks for that. So I need to use carp or sourcehash to do load balancing, right? But where do I specify in Squid to monitor the ports? I mean, if port 8080 is down on 'Server A', how will Squid know that it should send the request to 'Server B' on port 8080?

//Remy

On Mon, 2009-06-15 at 23:05 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
[load-balancing question quoted above]

HTTP is stateless. It contains no such thing as sessions; that is a browser feature. What you are looking for is something like the CARP or sourcehash peering algorithms. They keep all requests for certain URLs going to the same place (CARP), or all requests from the same IP going to the same place (sourcehash).

See http://www.squid-cache.org/Doc/config/cache_peer

Amos
Re: [squid-users] Load Balancing Query
Thanks Amos for the help.

On Tue, 2009-06-16 at 00:30 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
Hi Amos,
Thanks for that. So I need to use carp and sourcehash to do load balancing, right?

Only the one you want.

But where do I specify in Squid to monitor the ports? I mean, if port 8080 is down on 'Server A', how will Squid know that it should send the request to 'Server B' on port 8080?

It's automatic in the background. The latest 2.HEAD and 3.1 have options to configure how long detection takes. Other Squid versions attempt ~10 connects and then fail over.

Amos

[earlier messages quoted above]
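A minimal squid.conf sketch of the sourcehash approach Amos describes (peer names and IPs here are hypothetical; sourcehash pins each client IP to one peer, which gives the session stickiness asked about, and a peer that stops answering is marked dead so traffic rehashes to the survivors):

```
# Hypothetical back-ends: all requests from a given client IP
# hash to the same peer (sourcehash), approximating sessions.
cache_peer 192.0.2.11 parent 8080 0 no-query originserver sourcehash name=serverA
cache_peer 192.0.2.12 parent 8080 0 no-query originserver sourcehash name=serverB
# When a peer's TCP connects fail repeatedly, Squid marks it dead
# and the hash redistributes its clients to the remaining peer(s).
```

There is no separate port-monitor directive to configure; as Amos says, the failure detection is the connect attempts themselves.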
[squid-users] 3rd email for RPC Over HTTPS issue
Hi All,

This is my 3rd email for the below-mentioned problem. I am writing this email in the hope that someone will reply and say whether it can be done or not. Just yes or no will do for me, so that I know whether it is possible.

Successfully configured a reverse proxy for HTTPS, but not proxying RPC over HTTPS.
Squid 2.7.STABLE6, Windows 2008, Exchange 2007.

Having an issue with RPC over HTTPS; below is the error message:

Attempting to ping RPC Endpoint 6001 (Exchange Information Store) on server hubsexchange.airarabiauae.com
Failed to ping Endpoint
Additional Details: An RPC Error was thrown by the RPC Runtime. Error 1818 1818

Please let me know what could be the problem, some hint.

//Remy
Re: [squid-users] 3rd email for RPC Over HTTPS issue
Thanks Amos for the reply. I will go through the provided link. If anyone has a working configuration, could you please send it to me?

//Remy

On Tue, 2009-06-16 at 14:38 +1200, Amos Jeffries wrote:
On Mon, 15 Jun 2009 22:44:33 +0400, Mario Remy Almeida malme...@isaaviation.ae wrote:
[3rd-email question quoted above]

Many people are successfully using RPC over Squid, configured as per
http://wiki.squid-cache.org/ConfigExamples/Reverse/OutlookWebAccess

Error 1818 appears to be the problem. I cannot help any further, sorry. This is not a Squid issue AFAICT. Look for RPC or Exchange documentation, or even the MS errors information, to find out what that means.

Amos
[squid-users] RPC Over HTTPS
Hi All,

I have successfully configured a reverse proxy, but have an issue with RPC over HTTPS. Testing my setup with the following link:
https://www.testexchangeconnectivity.com/
I get the below error:

Attempting to ping RPC Endpoint 6001 (Exchange Information Store) on server hubsexchange.airarabiauae.com
Failed to ping Endpoint
Additional Details: An RPC Error was thrown by the RPC Runtime. Error 1818 1818

What could be the problem?
Squid Cache: Version 2.7.STABLE6

Need help please.

//Remy
[squid-users] RPC Over HTTPS
Hi All,

I have successfully configured a reverse proxy, but have an issue with RPC over HTTPS. Testing my setup with the following link:
https://www.testexchangeconnectivity.com/
I get the below error:

Attempting to ping RPC Endpoint 6001 (Exchange Information Store) on server hubsexchange.airarabiauae.com
Failed to ping Endpoint
Additional Details: An RPC Error was thrown by the RPC Runtime. Error 1818 1818

What could be the problem?

squid -v
==
Squid Cache: Version 2.7.STABLE6
configure options: '--host=x86_64-redhat-linux-gnu' '--build=x86_64-redhat-linux-gnu' '--target=x86_64-redhat-linux' '--program-prefix=' '--prefix=/usr' '--exec-prefix=/usr' '--bindir=/usr/bin' '--sbindir=/usr/sbin' '--sysconfdir=/etc' '--includedir=/usr/include' '--libdir=/usr/lib64' '--libexecdir=/usr/libexec' '--sharedstatedir=/usr/com' '--mandir=/usr/share/man' '--infodir=/usr/share/info' '--exec_prefix=/usr' '--bindir=/usr/sbin' '--libexecdir=/usr/lib64/squid' '--localstatedir=/var' '--datadir=/usr/share' '--sysconfdir=/etc/squid' '--enable-epoll' '--enable-snmp' '--enable-removal-policies=heap,lru' '--enable-storeio=aufs,coss,diskd,null,ufs' '--enable-ssl' '--with-openssl=/usr/kerberos' '--enable-delay-pools' '--enable-linux-netfilter' '--enable-linux-tproxy' '--with-pthreads' '--enable-ntlm-auth-helpers=SMB,fakeauth' '--enable-external-acl-helpers=ip_user,ldap_group,unix_group,wbinfo_group' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-digest-auth-helpers=password' '--enable-useragent-log' '--enable-referer-log' '--disable-dependency-tracking' '--enable-cachemgr-hostname=localhost' '--enable-basic-auth-helpers=LDAP,MSNT,NCSA,PAM,SMB,YP,getpwnam,multi-domain-NTLM,SASL' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-cache-digests' '--enable-ident-lookups' '--enable-follow-x-forwarded-for' '--enable-wccpv2' '--enable-x-accelerator-vary' '--enable-xmalloc-statistics' '--enable-icmp' '--enable-kill-parent-hack' '--enable-arp-acl' '--enable-default-err-language=English' '--enable-err-languages=English' '--disable-http-violations' '--enable-large-cache-files' '--with-dl' '--with-maxfd=16384' 'build_alias=x86_64-redhat-linux-gnu' 'host_alias=x86_64-redhat-linux-gnu' 'target_alias=x86_64-redhat-linux' 'CFLAGS=-fPIE -Os -g -pipe -fsigned-char -O2 -g -m64 -mtune=generic' 'LDFLAGS=-pie'
==

squid.conf as below
=
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32 10.200.8.20
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
acl PURGE method PURGE
acl localnet src 10.200.2.0/24
acl snmppublic snmp_community public
acl OWA dstdomain mail.airarabia.ae
http_access allow manager localhost
http_access deny manager
http_access allow localhost PURGE
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow OWA all
http_access allow localnet
http_access allow localhost
http_access deny all
icp_access allow localnet
icp_access deny all
miss_access allow OWA
reply_body_max_size 52428800 allow all
follow_x_forwarded_for allow localnet
follow_x_forwarded_for allow localhost
follow_x_forwarded_for deny all
acl_uses_indirect_client on
delay_pool_uses_indirect_client on
log_uses_indirect_client on
ssl_unclean_shutdown on
sslproxy_flags DONT_VERIFY_PEER
http_port 8080
http_port 10.200.8.20:80 accel defaultsite=mail.airarabia.ae vhost
https_port 10.200.8.20:443 accel \
  cert=/etc/squid/keys/airarabia_key.pem \
  key=/etc/squid/keys/airarabia_key.pem defaultsite=mail.airarabia.ae
cache_peer proxy1.emirates.net.ae parent 8080 0 no-query default
cache_peer mail.airarabia.ae parent 443 0 no-query \
  originserver front-end-https=on login=PASS name=owaServer \
  ssl sslcert=/etc/squid/keys/airarabia_crt.pem \
  sslkey=/etc/squid/keys/airarabia_key.pem sslflags=DONT_VERIFY_PEER
cache_peer_access owaServer allow OWA
cache_peer_access proxy1.emirates.net.ae allow !OWA
hierarchy_stoplist cgi-bin ?
cache_mem 600 MB
maximum_object_size_in_memory 20 KB
memory_replacement_policy heap GDSF
cache_replacement_policy heap GDSF
cache_dir aufs /cache 29000 16 256
store_dir_select_algorithm least-load
max_open_disk_fds 0
minimum_object_size 0 KB
maximum_object_size 1096 MB
cache_swap_low 90
cache_swap_high 95
logformat squid %ts.%03tu %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
logformat mysql_columns %ts.%03tu %6tr %>a %Ss %03Hs %<st %rm %ru %un %Sh %<A %mt
access_log
[squid-users] reverse proxy with SSL offloader issue
Hi All,

I downloaded an SSL certificate from Verisign and exported the private key from a Windows 2003 server.

In squid.conf I have this:
https_port 10.200.22.49:443 accel \
  cert=/etc/squid/keys/mail.airarabia.ae_cert.pem \
  key=/etc/squid/keys/pvtkey.pem defaultsite=mail.airarabia.ae

When accessing https://mail.airarabia.ae the browser gives this error:

Secure Connection Failed
mail.airarabia.ae uses an invalid security certificate. The certificate is not trusted because the issuer certificate is unknown. (Error code: sec_error_unknown_issuer)
* This could be a problem with the server's configuration, or it could be someone trying to impersonate the server.
* If you have connected to this server successfully in the past, the error may be temporary, and you can try again later.

And in cache.log I get this:
clientNegotiateSSL: Error negotiating SSL connection on FD 23: error:14094418:SSL routines:SSL3_READ_BYTES:tlsv1 alert unknown ca (1/0)

What could be the problem? Please help.

//Remy
Re: [squid-users] reverse proxy with SSL offloader issue
Hi Amos,

I don't know how to check the chain of trust. I concatenated the CSR and the certificate, but how to do it properly I don't know. Can you please tell me?

===
squid.conf
https_port 10.200.22.49:443 accel \
  cert=/etc/squid/keys/mail.airarabia.ae_cert.pem \
  key=/etc/squid/keys/newpvtkey.pem defaultsite=mail.airarabia.ae
cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
  front-end-https=on name=owaServer sslflags=DONT_VERIFY_PEER
===

//Remy

On Wed, 2009-06-03 at 12:51 +1200, Amos Jeffries wrote:
On Tue, 02 Jun 2009 16:56:08 +0400, Mario Remy Almeida malme...@isaaviation.ae wrote:
[problem report quoted above]

SSL chain of trust is broken on one of the SSL links. Two things to try:

1) Adding sslflags=DONT_VERIFY_PEER - if that works, it's the cache_peer link that is broken. If it still fails, then it's the https_port certificate.

2) Next, look at the certificate itself; see if it contains the whole chain of trust (concatenated certificate + signing authority cert).

I'm a bit hazy about whether the https_port needs the signing authority in it or not when the certs are of the unlinked chain type (I forget what the right name even is). But I think cache_peer needs the full chain to be in the cert.

Amos
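Amos's concatenated-chain suggestion can be sketched like this. Note it is the issued certificate that gets concatenated with the signing-authority certificate, not the CSR; the file names are hypothetical, and the two `printf` lines only create stand-in files so the sketch is self-contained.

```shell
# Stand-ins for illustration; in reality these are the issued
# server certificate and the intermediate/signing-authority cert.
printf '%s\n' '-----BEGIN CERTIFICATE-----' 'server' '-----END CERTIFICATE-----' > server_cert.pem
printf '%s\n' '-----BEGIN CERTIFICATE-----' 'intermediate' '-----END CERTIFICATE-----' > intermediate_ca.pem

# Build the chained file: server certificate first, then the
# signing-authority certificate(s), in issuing order.
cat server_cert.pem intermediate_ca.pem > chained_cert.pem
```

squid.conf would then point cert= at the chained file, which is what lets clients walk the chain past the "issuer certificate is unknown" error.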
Re: [squid-users] Reverse Proxy
Hi Amos,

One thing I forgot to mention: /etc/hosts has this entry:
10.200.22.12    mail.airarabia.ae

Output of 'host mail.airarabia.ae' from DNS is:
mail.airarabia.ae has address 10.200.9.20

The user (browser) reads the hosts file on the individual PCs:
cat /etc/hosts | grep mail.airarabia.ae
10.200.22.49    mail.airarabia.ae

10.200.22.49 - squid proxy IP
10.200.22.12 - OWA IP

Please find the answers below.

//Remy

On Sun, 2009-05-17 at 18:16 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
Hi Amos,

I followed the instructions as per http://wiki.squid-cache.org/ConfigExamples/Reverse/OutlookWebAccess but I am somehow failing to configure HTTPS.

My squid.conf:
https_port 443 defaultsite=mail.airarabia.ae \
  cert=/etc/squid/keys/cert.pem key=/etc/squid/keys/key.pem

Okay, two extra things about the port:
1) Unless you have a wildcard cert, it's best to specify the IP:port combo and generate the cert for that IP:port. That way you can use other IPs for other domains and be sure Squid is sending SSL on the right IP.

Changed it to:
https_port 10.200.22.49:443 defaultsite=mail.airarabia.ae \
  cert=/etc/squid/keys/cert.pem key=/etc/squid/keys/key.pem

2) Check that the cert/key are correct for the IP:port Squid is listening on.

I used this command to generate the SSL certificate:
openssl req -x509 -days 365 -newkey rsa:1024 -keyout key.pem -nodes \
  -out cert.pem

cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
  front-end-https=on name=owaServer

So OWA is listening on port 80?

Yes, on port 80, no issue.

cache_peer_access owaServer allow OWA
acl OWA dstdomain mail.airarabia.ae
http_access allow OWA
miss_access allow OWA
miss_access deny all

Missing:
never_direct allow OWA

Actually I forgot to mention it here; it is specified in squid.conf.

That bit is important to prevent Squid even attempting to request a connection direct to OWA without the peerage settings.

Amos

cache.log:
2009/05/17 13:32:12| fwdNegotiateSSL: Error negotiating SSL connection on FD 24: error::lib(0):func(0):reason(0) (5/-1/104)
(two more similar lines follow)

Error on the browser:
While trying to retrieve the URL: https://mail.airarabia.ae/exchweb/
The following error was encountered:
* Connection to 10.200.22.12 Failed
The system returned: (71) Protocol error
The remote host or network may be down. Please try the request again.

Please help.

//Remy

On Fri, 2009-05-15 at 16:35 +1200, Amos Jeffries wrote:
Mario Remy Almeida wrote:
Hi All,

Need to set up a reverse proxy. I have Squid 2.7.STABLE6 on CentOS; the web server is Microsoft Outlook Web Access, SSL enabled, port 443.

My squid config is as below:
acl vhosts1_domains dstdomain mail.airarabiauae.com
http_port 443 accel defaultsite=mail.airarabiauae.com vhost
cache_peer 10.200.22.12 parent 443 0 no-query originserver name=vhost1 ssl
cache_peer_access vhost1 allow vhosts1_domains

Please someone tell me if that is the right way to configure it.

No. Here is the tutorial:
http://wiki.squid-cache.org/ConfigExamples/Reverse/OutlookWebAccess

Port 443 is often encrypted. It requires the https_port option instead of http_port, and the certificate as well. The peer part may be correct, or further SSL-related options may be needed. It depends on your peer, so I can't say for certain unless you actually hit a problem.

Amos
Re: [squid-users] Reverse Proxy
Please correct me if I am wrong:

    10.200.22.49 - Squid proxy
    10.200.22.12 - OWA
    10.200.2.22  - DNS server

DNS entries: mail.airarabia.com pointing to 10.200.22.12, OR
mail.airarabia.com pointing to 10.200.22.49?

On the Squid proxy server:

    cat /etc/resolv.conf
    nameserver 10.200.2.22

    cat /etc/hosts
    (empty)

User (browser) proxy settings: 10.200.22.49 port 80.

Do I have to bypass mail.airarabia.com?

//Remy

On Sun, 2009-05-17 at 19:33 +1200, Amos Jeffries wrote:

Mario Remy Almeida wrote:

[snip: /etc/hosts and DNS details quoted earlier in the thread]

This could cause you some problems administering it. My advice on this
is to set up DNS pointing at Squid for the HTTPS domain name, set
squid.conf with the right OWA IP as a peer, and not have the individual
hosts-file overrides.

The fact that the public IP for the domain is different to both the
Squid IP and the real OWA/Exchange IP is worrying. I trust that you know
what the destinations should be.

Amos

[snip: earlier exchange about the https_port certificate quoted in full]
Changed it to:

    https_port 10.200.22.49:443 defaultsite=mail.airarabia.ae \
      cert=/etc/squid/keys/cert.pem key=/etc/squid/keys/key.pem

2) Check that the cert/key are correct for the IP:port Squid is
listening on.

Used this command to generate the SSL certificate:

    openssl req -x509 -days 365 -newkey rsa:1024 -keyout key.pem -nodes \
      -out cert.pem

The keys do need to be signed in some way before they are valid for use.
This looks like a key-creation-only command, though with SSL certs I
only know enough to follow the tutorials. Doing that (for all key steps)
I've never had a problem.

Amos

[snip: cache_peer settings, the never_direct discussion and the
cache.log errors quoted earlier in the thread]
[snip: the original "Reverse Proxy" question and Amos's first reply,
quoted earlier in the thread]
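As an aside on the certificate step discussed above: here is a sketch of generating the self-signed key/cert pair non-interactively. The -subj CN value is taken from the domain used in this thread; rsa:2048 is a substitution for the rsa:1024 shown, and the modulus comparison at the end is a common way to confirm the key and certificate belong together.

```shell
# Generate a self-signed certificate and key without interactive prompts.
# The CN should match the defaultsite= name the https_port serves.
openssl req -x509 -days 365 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem \
  -subj "/CN=mail.airarabia.ae"

# Sanity check: the key's modulus and the cert's modulus must be identical.
openssl rsa  -noout -modulus -in key.pem  > key.mod
openssl x509 -noout -modulus -in cert.pem > cert.mod
cmp -s key.mod cert.mod && echo "key and certificate match"
```

Note that a self-signed cert will still trigger browser warnings unless it (or its issuing CA) is imported into the clients' trust stores; a CA-signed certificate avoids that.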
Re: [squid-users] Reverse Proxy
My squid.conf:

    acl all src all
    acl manager proto cache_object
    acl localhost src 127.0.0.1/32
    acl to_localhost dst 127.0.0.0/8
    acl SSL_ports port 443
    acl Safe_ports port 80          # http
    acl Safe_ports port 21          # ftp
    acl Safe_ports port 443         # https
    acl Safe_ports port 70          # gopher
    acl Safe_ports port 210         # wais
    acl Safe_ports port 1025-65535  # unregistered ports
    acl Safe_ports port 280         # http-mgmt
    acl Safe_ports port 488         # gss-http
    acl Safe_ports port 591         # filemaker
    acl Safe_ports port 777         # multiling http
    acl CONNECT method CONNECT
    acl localnet src 10.200.2.0/24
    acl OWA dstdomain webmail.airarabia.ae
    http_access allow manager localhost
    http_access deny manager
    http_access deny !Safe_ports
    http_access deny CONNECT !SSL_ports
    http_access allow OWA all
    http_access allow localnet
    http_access allow localnet
    http_access allow localhost
    http_access deny all
    icp_access allow localnet
    icp_access deny all
    miss_access allow OWA
    miss_access deny all
    http_port 10.200.22.49:80 defaultsite=webmail.airarabia.ae
    https_port 10.200.22.49:443 defaultsite=webmail.airarabia.ae \
      cert=/etc/squid/keys/proxycert.pem key=/etc/squid/keys/proxykey.pem
    cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
      front-end-https=on name=owaServer
    cache_peer proxy1.emirates.net.ae parent 8080 0 no-query default
    cache_peer_access owaServer allow OWA
    hierarchy_stoplist cgi-bin ?
    cache_dir aufs /cache 29000 16 256
    logformat squid %ts.%03tu %6tr %a %Ss/%03Hs %st %rm %ru %un %Sh/%A %mt
    logformat mysql_columns %ts.%03tu %6tr %a %Ss %03Hs %st %rm %ru %un %Sh %A %mt
    access_log /var/log/squid/access.log squid
    access_log daemon:/usr/lib64/squid/db.cf mysql_columns
    logfile_daemon /usr/lib64/squid/logmysqldb_daemon
    pid_filename /var/run/squid.pid
    refresh_pattern ^ftp:             1440  20%  10080
    refresh_pattern ^gopher:          1440  0%   1440
    refresh_pattern -i (/cgi-bin/|\?) 0     0%   0
    refresh_pattern .                 0     20%  4320
    acl shoutcast rep_header X-HTTP09-First-Line ^ICY.[0-9]
    upgrade_http0.9 deny shoutcast
    acl apache rep_header Server ^Apache
    broken_vary_encoding allow apache
    prefer_direct off
    never_direct allow OWA
    coredump_dir /var/spool/squid

OUTPUT of `host webmail.airarabia.ae` taken from DNS:

    webmail.airarabia.ae has address 10.200.22.12

Clients' browser proxy is set to 10.200.22.49 port 80, with NO bypass.

Now I am confused about DNS: what should the DNS entries be? The clients
will not bypass. Should the DNS entry for webmail.airarabia.ae point to
the OWA IP or to the Squid proxy? Please help as I am confused.

//Remy

On Sun, 2009-05-17 at 19:33 +1200, Amos Jeffries wrote:

[snip: the /etc/hosts discussion and earlier exchange quoted in full]
[snip: the https_port certificate exchange quoted earlier in the thread]
Re: [squid-users] Reverse Proxy
    ...@airarabia.com
    mail_program mail
    cache_effective_user squid
    cache_effective_group squid
    httpd_suppress_version_string on
    visible_hostname vsquid-01-shj
    umask 027
    snmp_port 3401
    snmp_access allow snmppublic localhost
    snmp_access deny all
    icon_directory /usr/share/squid/icons
    global_internal_static on
    short_icon_urls on
    nonhierarchical_direct on
    prefer_direct off
    never_direct allow OWA
    max_filedescriptors 0
    check_hostnames off
    allow_underscore on
    dns_timeout 2 minutes
    hosts_file /etc/hosts
    ignore_unknown_nameservers on
    ipcache_size 2048
    ipcache_low 90
    ipcache_high 95
    fqdncache_size 1024
    forwarded_for on
    cachemgr_passwd disable all
    client_db off
    uri_whitespace strip
    coredump_dir /var/spool/squid
    windows_ipaddrchangemonitor off

Thanks for the help.

//Remy

On Mon, 2009-05-18 at 00:57 +1200, Amos Jeffries wrote:

Mario Remy Almeida wrote:

[snip: the full squid.conf quoted from the previous message]

OUTPUT of `host webmail.airarabia.ae` taken from DNS:

    webmail.airarabia.ae has address 10.200.22.12

Clients' browser proxy is set to 10.200.22.49 port 80, with NO bypass.
Now I am confused about DNS: what should the DNS entries be? The clients
will not bypass. Should the DNS entry point to the OWA IP or to the
Squid proxy? Please help as I am confused.

Oh, I see...
You need this:

    10.200.22.49 - Squid proxy
    10.200.22.12 - OWA
    10.200.2.22  - DNS server

DNS entries:

    webmail.airarabia.com pointing to 10.200.22.49  (HTTP, HTTPS stuff)
    mail.airarabia.com pointing to 10.200.22.12     (SMTP stuff)

On the Squid proxy server, /etc/resolv.conf:

    nameserver 10.200.2.22

/etc/hosts:

    127.0.0.1 localhost

squid.conf as above, but:

    http_port 10.200.22.49:80 accel defaultsite=webmail.airarabia.ae
    https_port 10.200.22.49:443 accel defaultsite=webmail.airarabia.ae \
      cert=/etc/squid/keys/proxycert.pem key=/etc/squid/keys/proxykey.pem
    cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
      front-end-https=on name=owaServer
    cache_peer_access owaServer allow OWA
    cache_peer proxy1.emirates.net.ae parent 8080 0 no-query default
    cache_peer_access proxy1.emirates.net.ae allow !OWA

NOTE the 'accel' option on the ports and !OWA on the default parent peer
access.

Amos

//Remy

On Sun, 2009-05-17 at 19:33 +1200, Amos Jeffries wrote:

Mario Remy Almeida wrote:

[snip: the /etc/hosts details quoted earlier in the thread]
Re: [squid-users] Reverse Proxy
Thanks Amos, finally got it working. Once again, thanks for all the
support.

Any idea where to start for scanning of https sites? I mean
documentation.

//Remy

On Mon, 2009-05-18 at 02:04 +1200, Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi Amos, thanks for the configuration. I managed to access http and
https (mail.airarabia.ae); webmail.airarabia.ae is discarded. Now one
more issue: I can access any external http site, but not https, e.g.
https://gmail.com is not accessible.

In the access.log file I get:

    1242580515.608 0 10.200.2.172 TCP_DENIED/400 1570 CONNECT :0 - NONE/- text/html
    1242580517.224 0 10.200.2.172 TCP_DENIED/400 1570 CONNECT :0 - NONE/- text/html
    1242580536.539 0 10.200.2.172 TCP_DENIED/400 1570 CONNECT :0 - NONE/- text/html
    1242580538.999 0 10.200.2.172 TCP_DENIED/400 1570 CONNECT :0 - NONE/- text/html

In the browser I get:

    While trying to process the request:
    CONNECT www.google.com:443 HTTP/1.0
    User-Agent: Opera/9.64 (X11; Linux i686; U; en) Presto/2.1.1
    Host: www.google.com:443

    The following error was encountered: Invalid Request
    Some aspect of the HTTP Request is invalid. Possible problems:
    Missing or unknown request method
    Missing URL
    Missing HTTP Identifier (HTTP/1.0)
    Request is too large
    Content-Length missing for POST or PUT requests
    Illegal character in hostname; underscores are not allowed

I think you are trying to use a reverse-proxy port (as configured below)
as a forward proxy (general web requests). The accel ports we set up
below for OWA are not applicable for general web access. To use it for
general access you need to set up a basic http_port 3128 and configure
that in the client browsers.
Amos

My squid.conf is as below:

    acl all src all
    acl manager proto cache_object
    acl localhost src 127.0.0.1/32
    acl to_localhost dst 127.0.0.0/8
    acl SSL_ports port 443
    acl Safe_ports port 80          # http
    acl Safe_ports port 21          # ftp
    acl Safe_ports port 443         # https
    acl Safe_ports port 70          # gopher
    acl Safe_ports port 210         # wais
    acl Safe_ports port 1025-65535  # unregistered ports
    acl Safe_ports port 280         # http-mgmt
    acl Safe_ports port 488         # gss-http
    acl Safe_ports port 591         # filemaker
    acl Safe_ports port 777         # multiling http
    acl CONNECT method CONNECT
    acl localnet src 10.200.2.0/24
    acl snmppublic snmp_community public
    acl OWA dstdomain mail.airarabia.ae
    http_access allow manager localhost
    http_access deny manager
    http_access deny !Safe_ports
    http_access allow OWA all
    http_access deny CONNECT !SSL_ports
    http_access allow localnet
    http_access allow localhost
    http_access deny all
    icp_access allow localnet
    icp_access deny all
    reply_body_max_size 52428800 allow all
    follow_x_forwarded_for allow localnet
    follow_x_forwarded_for allow localhost
    follow_x_forwarded_for deny all
    acl_uses_indirect_client on
    delay_pool_uses_indirect_client on
    log_uses_indirect_client on
    ssl_unclean_shutdown on
    http_port 10.200.22.49:80 accel defaultsite=mail.airarabia.ae vhost
    https_port 10.200.22.49:443 accel cert=/etc/squid/keys/proxycert.pem \
      key=/etc/squid/keys/proxykey.pem defaultsite=mail.airarabia.ae
    cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
      front-end-https=on name=owaServer
    cache_peer proxy1.emirates.net.ae parent 8080 0 no-query default
    cache_peer_access owaServer allow OWA
    cache_peer_access proxy1.emirates.net.ae allow !OWA
    hierarchy_stoplist cgi-bin ?
    cache_mem 600 MB
    maximum_object_size_in_memory 20 KB
    memory_replacement_policy heap GDSF
    cache_replacement_policy heap GDSF
    cache_dir aufs /cache 29000 16 256
    store_dir_select_algorithm least-load
    max_open_disk_fds 0
    minimum_object_size 0 KB
    maximum_object_size 1096 MB
    cache_swap_low 90
    cache_swap_high 95
    logformat squid %ts.%03tu %6tr %a %Ss/%03Hs %st %rm %ru %un %Sh/%A %mt
    logformat mysql_columns %ts.%03tu %6tr %a %Ss %03Hs %st %rm %ru %un %Sh %A %mt
    access_log /var/log/squid/access.log squid
    access_log daemon:/usr/lib64/squid/db.cf mysql_columns
    logfile_daemon /usr/lib64/squid/logmysqldb_daemon
    cache_log /var/log/squid/cache.log
    cache_store_log /var/log/squid/store.log
    logfile_rotate 30
    emulate_httpd_log on
    log_ip_on_direct on
    mime_table /etc/squid/mime.conf
    log_mime_hdrs on
    useragent_log /var/log/squid/useragent.lo
    referer_log /var/log/squid/referer.log
    pid_filename /var/run/squid.pid
    debug_options ALL,1
    log_fqdn off
    strip_query_terms on
    buffered_logs off
    netdb_filename /var/log/squid/netdb.state
    ftp_list_width 64
    ftp_passive on
    ftp_sanitycheck on
    ftp_telnet_protocol on
    diskd_program /usr/lib64/squid/diskd-daemon
    unlinkd_program /usr/lib64
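Amos's fix above (a separate plain forward-proxy port) would look something like the following; port 3128 is only the conventional Squid default, not something specified in the thread:

```
# Sketch, Squid 2.7 syntax: keep the accel ports dedicated to OWA and
# add a plain port for general forward-proxy browsing.
http_port 3128

# The existing accelerator ports remain unchanged, e.g.:
#   http_port 10.200.22.49:80 accel defaultsite=mail.airarabia.ae vhost
#   https_port 10.200.22.49:443 accel defaultsite=mail.airarabia.ae ...
```

Client browsers would then be pointed at 10.200.22.49:3128, so that CONNECT requests for external HTTPS sites arrive on a port Squid treats as a forward proxy rather than on the reverse-proxy (accel) ports.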
Re: [squid-users] Reverse Proxy
Hi Amos,

I followed the instructions as per
http://wiki.squid-cache.org/ConfigExamples/Reverse/OutlookWebAccess
but I am somehow failing to configure https.

My squid.conf:

    https_port 443 defaultsite=mail.airarabia.ae \
      cert=/etc/squid/keys/cert.pem key=/etc/squid/keys/key.pem
    cache_peer 10.200.22.12 parent 80 0 no-query originserver login=PASS \
      front-end-https=on name=owaServer
    cache_peer_access owaServer allow OWA
    acl OWA dstdomain mail.airarabia.ae
    http_access allow OWA
    miss_access allow OWA
    miss_access deny all

cache.log:

    2009/05/17 13:32:12| fwdNegotiateSSL: Error negotiating SSL connection \
      on FD 24: error::lib(0):func(0):reason(0) (5/-1/104)
    2009/05/17 13:32:12| fwdNegotiateSSL: Error negotiating SSL connection \
      on FD 24: error::lib(0):func(0):reason(0) (5/-1/104)
    2009/05/17 13:32:13| fwdNegotiateSSL: Error negotiating SSL connection \
      on FD 24: error::lib(0):func(0):reason(0) (5/-1/104)

Error on the browser:

    While trying to retrieve the URL: https://mail.airarabia.ae/exchweb/
    The following error was encountered:
    * Connection to 10.200.22.12 Failed
    The system returned: (71) Protocol error
    The remote host or network may be down. Please try the request again.

Please help.

//Remy

On Fri, 2009-05-15 at 16:35 +1200, Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi All, I need to set up a reverse proxy. I have Squid 2.7.STABLE6, OS
CentOS; the web server is Microsoft Outlook Web Access, SSL enabled on
port 443. My squid config is as below:

    acl vhosts1_domains dstdomain mail.airarabiauae.com
    http_port 443 accel defaultsite=mail.airarabiauae.com vhost
    cache_peer 10.200.22.12 parent 443 0 no-query originserver name=vhost1 \
      ssl
    cache_peer_access vhost1 allow vhosts1_domains

Please, someone tell me if that is the right way to configure it.

No. Here is the tutorial:
http://wiki.squid-cache.org/ConfigExamples/Reverse/OutlookWebAccess

Port 443 is often encrypted. It requires the https_port option instead
of http_port, and the certificate as well.
The peer part may be correct, or further ssl-related options may be
needed. It depends on your peer, so I can't say for certain unless you
actually hit a problem.

Amos
[squid-users] Reverse Proxy
Hi All,

I need to set up a reverse proxy. I have Squid 2.7.STABLE6, OS CentOS;
the web server is Microsoft Outlook Web Access, SSL enabled on port 443.

My squid config is as below:

    acl vhosts1_domains dstdomain mail.airarabiauae.com
    http_port 443 accel defaultsite=mail.airarabiauae.com vhost
    cache_peer 10.200.22.12 parent 443 0 no-query originserver name=vhost1 \
      ssl
    cache_peer_access vhost1 allow vhosts1_domains

Please, someone tell me if that is the right way to configure it.

--
Remy Almeida
Re: [squid-users] Log Request Header
Hi,

I have not enabled squidmime, but my logformat is:

    logformat headers %ts.%03tu %tg %a %rp [%h] [%h]

Regards,
Remy

On Tue, 2009-05-12 at 16:14 -0800, Chris Robertson wrote:

Mario Remy Almeida wrote:

Hi All, can someone help me understand why there is NONE:// [-] for the
Request Header in the logs?

    logformat - %ts.%03tu %tg %a %ru [%h] [%h]

For [%h] I get NONE:// [-] in the logs. Need help.

Regards,
Remy

Do you get the expected results using the predefined squidmime log
format?

Chris

--
Mario Remy Almeida
Linux System Administrator
ISA
O: 06588817 M: 0508643912
E: malme...@isaaviation.ae
[squid-users] Log Request Header
Hi All,

Can someone help me understand why there is NONE:// [-] for the Request
Header in the logs?

    logformat - %ts.%03tu %tg %a %ru [%h] [%h]

For [%h] I get NONE:// [-] in the logs. Need help.

Regards,
Remy
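A hedged note on the %h code above: in Squid's logformat syntax the header codes carry a direction marker, so a bare %h may not resolve to any header at all. A sketch of what was probably intended (the format name `withhdrs` and log path are made up; the codes follow the Squid 2.6/2.7 logformat documentation):

```
# %>h logs the request header(s), %<h the reply header(s); a single
# named header can be selected with %{Header-Name}>h.
logformat withhdrs %ts.%03tu %tg %>a %ru [%>h] [%<h]
access_log /var/log/squid/access_hdrs.log withhdrs
```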
Re: [squid-users] load balancing
Hi All,

What I mean to say is, e.g.:

    SP 1 = 10.200.2.1
    SP 2 = 10.200.2.2
    LAN USERS = 10.200.2.x

All LAN users should connect to SP1 or SP2 depending upon the load, and
if one of the SPs is down the other should take the load.

One way of achieving load balance is with DNS:

    proxy1.example.com IN A 10.200.2.1
    proxy1.example.com IN A 10.200.2.2

But what if the DNS server is down? And also, how to do failover?

//Remy

On Tue, 2008-12-23 at 09:05 -0600, Luis Daniel Lucio Quiroz wrote:

Just remember when using load balancing: if you use digest auth, then
you MUST use source persistence.

On Tuesday 23 December 2008 08:38:27 Ken Peng wrote:

Hi All, any links on how to configure load balancing of squid?

See the default squid.conf :)
RE: [squid-users] load balancing
Hi All,

I was on leave so could not reply. What I mean to say is, e.g.:

    SP 1 = 10.200.2.1
    SP 2 = 10.200.2.2
    LAN USERS = 10.200.2.x

All LAN users should connect to SP1 or SP2 depending upon the load, and
if one of the SPs is down the other should take the load.

One way of achieving load balance is with DNS:

    proxy1.example.com IN A 10.200.2.1
    proxy1.example.com IN A 10.200.2.2

But what if the DNS server is down? And also, how to do failover?

//Remy

On Tue, 2008-12-23 at 13:43 +, Mehmet CELIK wrote:

Hi, what do you want? So you mean load-balance.

--
Mehmet CELIK

From: malme...@isaaviation.ae
To: squid-users@squid-cache.org
Date: Tue, 23 Dec 2008 16:21:58 +0400
Subject: [squid-users] load balancing

Hi All, any links on how to configure load balancing of squid?

Regards,
Mario
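One way to get client-side failover between the two proxies without depending on a single DNS server is a proxy auto-config (PAC) file: browsers try each PROXY entry in order and move on when one is unreachable. A minimal sketch using the SP addresses from this thread (the proxy port 3128 is an assumption, as the thread never states it):

```javascript
// proxy.pac (hypothetical): served to browsers via their proxy
// auto-configuration URL. If SP1 (10.200.2.1) is down, the browser
// fails over to SP2 (10.200.2.2), and finally goes direct.
function FindProxyForURL(url, host) {
  return "PROXY 10.200.2.1:3128; PROXY 10.200.2.2:3128; DIRECT";
}
```

Note that PAC failover is per-client and somewhat slow, since each browser discovers a dead proxy on its own; fully transparent failover needs other mechanisms.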
[squid-users] load balancing
Hi All, any links on how to configure load balancing of squid Regards, Mario
[squid-users] GET and POST Method Characters
Hi All, Can someone tell me what is the max number of characters allowed in GET and POST method. When I access the below URL (mentioned in the access.log file) I get Invalid URL ERROR message int he browser message in access.log file 1229585541.757 - 10.200.2.75 TCP_DENIED/400 5595 GET http://xbe.airarabia.com/ccentre/private/showCancelReservationConfirmUpdate.do?hdnAction=CANCELRESERVATIONpnrNo=16407471st rPassData=0%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0% 20^2628.8%20^%20^ACC%20^18684301%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^46%20^1%20^%20^A%20^AD%20^%20^0 %20^~1%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^ %20^ACC%20^18684302%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^47% 20^1%20^%20^A%20^AD%20^%20^1%20^~ 2%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^% 20^ACC%20^18684303%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^48% 20^1%20^%20^A%20^AD%20^%20^2%20^~3%20^ %20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC% 20^18684304%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^49%20^1%20^ %20^A%20^AD%20^%20^3%20^~4%20^%20^T %20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC% 20^18684254%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^50%20^1%20^ %20^A%20^AD%20^%20^4%20^~5%20^%20^T%20B% 20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684253% 20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^51%20^1%20^%20^A%20^AD% 20^%20^5%20^~6%20^%20^T%20B%20A%2 0^T%20B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684252%20^% 20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^52%20^1%20^%20^A%20^AD%20^% 20^6%20^~7%20^%20^T%20B%20A%20^T%2 0B%20A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684251%20^%20^% 20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^53%20^1%20^%20^A%20^AD%20^% 20^7%20^~8%20^%20^T%20B%20A%20^T%20B%20 A%20^32%20^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684250%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^54%20^1%20^%20^A%20^AD%20^%20^8%20^~9% 
20^%20^T%20B%20A%20^T%20B%20A%20^ 32%20^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684249%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^55%20^1%20^%20^A%20^AD%20^%20^9% 20^~10%20^%20^T%20B%20A%20^T%20B%20A%20^32%2 0^2628.8%20^.0%20^2628.8%20^%20^ACC%20^18684248%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^56%20^1%20^%20^A%20^AD%20^%20^10% 20^~11%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2 628.8%20^.0%20^2628.8%20^%20^ACC%20^18684247%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^57%20^1%20^%20^A%20^AD%20^%20^11% 20^~12%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628 .8%20^.0%20^2628.8%20^%20^ACC%20^18684246%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^58%20^1%20^%20^A%20^AD%20^%20^12% 20^~13%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8% 20^.0%20^2628.8%20^%20^ACC%20^18684245%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^59%20^1%20^%20^A%20^AD%20^%20^13% 20^~14%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^ .0%20^2628.8%20^%20^ACC%20^18684244%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^60%20^1%20^%20^A%20^AD%20^%20^14% 20^~15%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0% 20^2628.8%20^%20^ACC%20^18684243%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^61%20^1%20^%20^A%20^AD%20^%20^15% 20^~16%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^ 2628.8%20^%20^ACC%20^18684242%20^%20^%20^SSR% 20^2079.0,80.0,469.8,.0,.0,.0,%20^62%20^1%20^%20^A%20^AD%20^%20^16% 20^~17%20^%20^T%20B%20A%20^T%20B%20A%20^32%20^2628.8%20^.0%20^262 8.8%20^%20^ACC%20^18684241%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0, %20^63%20^1%20^%20^A%20^AD%20^%20^17%20^~18%20^%20^T%20B%20A%20^T%20B% 20A%20^32%20^2628.8%20^.0%20^2628.8 %20^%20^ACC%20^18684240%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,% 20^64%20^1%20^%20^A%20^AD%20^%20^18%20^~19%20^%20^T%20B%20A%20^T%20B%20A %20^32%20^2628.8%20^.0%20^2628.8%20 ^%20^ACC%20^18684239%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,% 20^65%20^1%20^%20^A%20^AD%20^%20^19%20^~20%20^%20^T%20B%20A%20^T%20B%20A 
%20^32%20^2628.8%20^.0%20^2628.8%20^%2 0^ACC%20^18684238%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^66% 20^1%20^%20^A%20^AD%20^%20^20%20^~21%20^%20^T%20B%20A%20^T%20B%20A% 20^32%20^2628.8%20^.0%20^2628.8%20^%20^A CC%20^18684237%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^67%20^1% 20^%20^A%20^AD%20^%20^21%20^~22%20^%20^T%20B%20A%20^T%20B%20A%20^32% 20^2628.8%20^.0%20^2628.8%20^%20^ACC% 20^18684236%20^%20^%20^SSR%20^2079.0,80.0,469.8,.0,.0,.0,%20^68%20^1%20^ %20^A - NONE/- text/html
Re: [squid-users] GET and POST Method Characters
OK, thanks Amos. The size of the requested URL is 12 KB and my Squid
version is 2.6.STABLE20. I'll be moving to Squid 2.7.STABLE5, just
waiting for the new hardware. Any other suggestions?

//Remy

On Fri, 2008-12-19 at 00:03 +1300, Amos Jeffries wrote:

Mario Remy Almeida wrote:

Hi All, can someone tell me the maximum number of characters allowed in
the GET and POST methods? When I access the below URL (mentioned in the
access.log file) I get an Invalid URL error message in the browser.

Message in access.log file: [snip huge URL]

Depends on your Squid version. Older Squids have increased the limit
from 2KB to 4KB, and the most recent releases have bumped it again to
8KB.

Amos
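To illustrate the limits Amos quotes, here is a throwaway check that builds a ~12 KB URL (like the one snipped above, though the host and query string here are made up) and measures it against the 8 KB ceiling:

```shell
# Build a long query string: 12000 'a' characters.
pad=$(printf 'a%.0s' $(seq 1 12000))
url="http://example.com/booking?data=$pad"
len=$(printf '%s' "$url" | wc -c | tr -d ' ')
echo "URL is $len bytes"
# Anything over 8192 bytes will be rejected even by the enlarged
# request-line buffer in recent Squid releases.
```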
Re: [squid-users] always_direct for src
Sorry, my mistake: I used src instead of dst. //Remy

On Tue, 2008-12-16 at 21:22 +1300, Amos Jeffries wrote:
Mario Remy Almeida wrote:
Hi All, I am using squid 3.1.0.2. I want Squid to connect to the IP 213.42.24.11 directly, without going through the parent squid. Below are the settings:
acl intranet_src src 213.42.24.11
always_direct allow intranet_src
But it's going to the parent proxy; in the log file I get this:
1229406697.569 4897 10.200.2.172 TCP_MISS/503 1005 GET http://213.42.24.11/ - DEFAULT_PARENT/proxy1.emirates.net.ae text/html
If I use dstdomain instead of src it works fine.
Destination domain (dstdomain) and source client IP (src) are two very opposite things. Are you sure you didn't mean to use destination IP (dst)?
How can I use src to have a direct connection?
Should work exactly as you configured. If indeed the source client IP was what you were meaning to catch. Amos
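For the record, the destination-IP form Amos suggests would look like this (a sketch; the IP comes from the original post, the acl name is illustrative):

```
# Match by destination IP rather than source, then bypass the parent:
acl intranet_dst dst 213.42.24.11
always_direct allow intranet_dst
```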
[squid-users] always_direct for src
Hi All, I am using squid 3.1.0.2. I want Squid to connect to the IP 213.42.24.11 directly, without going through the parent squid. Below are the settings:
acl intranet_src src 213.42.24.11
always_direct allow intranet_src
But it's going to the parent proxy; in the log file I get this:
1229406697.569 4897 10.200.2.172 TCP_MISS/503 1005 GET http://213.42.24.11/ - DEFAULT_PARENT/proxy1.emirates.net.ae text/html
If I use dstdomain instead of src it works fine. How can I use src to have a direct connection? //Remy
RE: [squid-users] errors while opening sites
Are you using /etc/resolv.conf or dns_nameservers in squid.conf? If /etc/resolv.conf is used, how many entries are there, i.e. how many name servers? Do the following. E.g. if you have 3:
nameserver 10.0.0.1
nameserver 10.0.0.2
nameserver 10.0.0.3
Comment them out one by one, run "time host google.com", and check how much time is taken by each of them. //Remy

On Mon, 2008-12-01 at 11:47 +, Sameer Shinde wrote:
Dear Amos, Thanks for the reply. We are using the same DNS servers for our existing proxy server, which is ISA2000 and is working properly without any problems. We want to shift from the MS-ISA2000 proxy server to a Linux Squid proxy server, so we are building up this new proxy server. Moreover, if I access the internet from the Ubuntu proxy server itself, we don't get any such errors. Can you / anyone let me know how we can resolve this issue? Also, for your information: so far the Ubuntu server is just a member server in our environment; there is no DNS server configured on the Ubuntu server. Maybe this will help you in finding some outputs.
~~~ Thanks & Regards, Sameer Shinde. Sr. Customer Support Engineer, Email:- [EMAIL PROTECTED] M:- +91 98204 61580 http://www.geocities.com/s9sameer If everyone is thinking alike, then somebody isn't thinking.

Date: Mon, 1 Dec 2008 23:19:58 +1300 From: [EMAIL PROTECTED] To: [EMAIL PROTECTED] CC: squid-users@squid-cache.org Subject: Re: [squid-users] errors while opening sites
Sameer Shinde wrote:
Dear All, I've installed squid 2.6 on my Ubuntu 8.04 server. I'm getting errors like the attached one frequently. It says, "Unable to determine the IP address from host name" for anything, but after some time, say even after 5 mins, the same page / site opens. I'm not able to figure out why this is happening. The error appears for any domain; it is not particular to any domain. Can anyone help me out with this? I'm not able to bring the server into live mode due to this problem.
The DNS server being used by your system for lookups is having major problems.
You will need to do some serious debugging to find out why. Things to look for are DNS server overload, traffic congestion, lack of recursive queries, etc. Amos

--
Error Message: The requested URL could not be retrieved. While trying to retrieve the URL: http://www.squidguard.org/Doc/install.html The following error was encountered: Unable to determine IP address from the host name for www.squidguard.org The dnsserver returned: Refused: The name server refuses to perform the specified operation. This means that: The cache was not able to resolve the hostname presented in the URL. Check if the address is correct.

--
Please be using Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10 Current Beta Squid 3.1.0.2
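If one of the resolvers turns out to be slow or to refuse queries, squid.conf can also name the healthy servers directly rather than relying on /etc/resolv.conf. A minimal sketch, reusing the placeholder addresses from the example above:

```
# squid.conf: query these resolvers directly, ignoring /etc/resolv.conf
dns_nameservers 10.0.0.1 10.0.0.2
```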
[squid-users] ICAP help
Hi All, Need help on how to configure c-icap to scan HTTP, HTTPS and FTP requests. Sample virus to test: http://www.eicar.org/download/eicar.com

My configuration is as below. To test my setup I used the above link, but it was not scanned for viruses and I was able to download it. Nothing is working; what am I missing? Can someone help me with this?

#squid.conf
icap_enable on
icap_preview_enable on
icap_preview_size 128
icap_send_client_ip on
icap_service service_avi_req reqmod_precache 0 icap://localhost:1344/srv_clamav
icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav

#c-icap.conf
PidFile /var/run/c-icap.pid
CommandsSocket /var/run/c-icap/c-icap.ctl
Timeout 300
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 600
StartServers 3
MaxServers 10
MinSpareThreads 10
MaxSpareThreads 20
ThreadsPerChild 10
MaxRequestsPerChild 0
Port 1344
User proxy
Group nobody
TmpDir /var/tmp
MaxMemObject 131072
ServerLog /usr/local/c_icap/var/log/server.log
AccessLog /usr/local/c_icap/var/log/access.log
DebugLevel 3
ModulesDir /usr/lib/c_icap
Module logger sys_logger.so
sys_logger.Prefix C-ICAP:
sys_logger.Facility local1
Logger file_logger
AclControllers default_acl
acl localsquid_respmod src 127.0.0.1 type respmod
acl localsquid_options src 127.0.0.1 type options
acl localsquid src 127.0.0.1
acl externalnet src 0.0.0.0/0.0.0.0
acl localnet_respmod src 10.200.2.0/255.255.255.0 type respmod
acl localnet_options src 10.200.2.0/255.255.255.0 type options
acl localnet src 10.200.2.0/255.255.255.0
icap_access allow localsquid_respmod
icap_access allow localsquid_options
icap_access allow localsquid
icap_access allow localnet_respmod
icap_access allow localnet_options
icap_access allow localnet
icap_access deny externalnet
icap_access log localsquid
icap_access log localnet
icap_access log externalnet
ServicesDir /usr/lib/c_icap
Service echo_module srv_echo.so
Service url_check_module srv_url_check.so
Service antivirus_module srv_clamav.so
ServiceAlias avscan srv_clamav?allow204=on&sizelimit=off&mode=simple
srv_clamav.ScanFileTypes TEXT DATA EXECUTABLE ARCHIVE GIF JPEG MSOFFICE
srv_clamav.SendPercentData 5
srv_clamav.StartSendPercentDataAfter 2M
srv_clamav.Allow204Responces off
srv_clamav.MaxObjectSize 5M
srv_clamav.ClamAvMaxFilesInArchive 0
srv_clamav.ClamAvMaxFileSizeInArchive 100M
srv_clamav.ClamAvMaxRecLevel 5
srv_clamav.VirSaveDir /tmp/download/
srv_clamav.VirHTTPServer http://fortune/cgi-bin/get_file.pl?usename=%f&remove=1&file=
srv_clamav.VirUpdateTime 15
srv_clamav.VirScanFileTypes ARCHIVE EXECUTABLE

//Remy
Re: [squid-users] ICAP help
Hi Christos, I used icap_class and icap_access but I get this:
2008/11/27 17:07:44| Processing Configuration File: /etc/squid/squid.conf (depth 0)
2008/11/27 17:07:44| WARNING: 'icap_class' is depricated. Use 'adaptation_service_set' instead
2008/11/27 17:07:44| WARNING: 'icap_access' is depricated. Use 'adaptation_access' instead
2008/11/27 17:07:44| Initializing https proxy context
//Remy

On Thu, 2008-11-27 at 07:53 -0500, Christos Tsantilas wrote:
Hi All, Need help on how to configure c-icap to scan HTTP, HTTPS and FTP requests. Sample virus to test: http://www.eicar.org/download/eicar.com My configuration is as below. To test my setup I used the above link, but it was not scanned for viruses and I was able to download it. Nothing is working; what am I missing? Can someone help me with this?
#squid.conf
icap_enable on
icap_preview_enable on
icap_preview_size 128
icap_send_client_ip on
icap_service service_avi_req reqmod_precache 0 icap://localhost:1344/srv_clamav
icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav
You need to define an icap_class and an access list for this icap_class. Why do you need virus scan for http requests? Your configuration should also contain something like the following:
icap_class class_avi service_avi
icap_access class_avi allow all
Regards, Christos
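Going by the deprecation warnings above, the Squid 3.1 spelling of Christos' suggestion would be something like the following sketch (the set and service names reuse those from the earlier squid.conf; check your release's documentation for the exact syntax):

```
# Squid 3.1 replaces icap_class / icap_access with:
adaptation_service_set class_avi service_avi
adaptation_access class_avi allow all
```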
Re: [squid-users] compilation errors on rhel 5
Thanks JD. After installing compat-libstdc++-* the debug.o errors were solved and I was able to compile it. But what about the errors of missing files in config.log?

conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
conftest.c:92:18: error: sasl.h: No such file or directory
conftest.c:59:18: error: sasl.h: No such file or directory
conftest.c:67:28: error: ac_nonexistent.h: No such file or directory
conftest.c:105:21: error: bstring.h: No such file or directory
conftest.c:72:21: error: bstring.h: No such file or directory
conftest.c:113:23: error: gnumalloc.h: No such file or directory
conftest.c:80:23: error: gnumalloc.h: No such file or directory
conftest.c:114:23: error: ip_compat.h: No such file or directory
conftest.c:81:23: error: ip_compat.h: No such file or directory
conftest.c:114:27: error: ip_fil_compat.h: No such file or directory
conftest.c:81:27: error: ip_fil_compat.h: No such file or directory
conftest.c:114:20: error: ip_fil.h: No such file or directory
conftest.c:81:20: error: ip_fil.h: No such file or directory
conftest.c:114:20: error: ip_nat.h: No such file or directory
conftest.c:81:20: error: ip_nat.h: No such file or directory
conftest.c:114:17: error: ipl.h: No such file or directory
conftest.c:81:17: error: ipl.h: No such file or directory
conftest.c:114:18: error: libc.h: No such file or directory
conftest.c:81:18: error: libc.h: No such file or directory
conftest.c:118:19: error: mount.h: No such file or directory
conftest.c:85:19: error: mount.h: No such file or directory
conftest.c:121:35: error: netinet/ip_fil_compat.h: No such file or directory
conftest.c:88:35: error: netinet/ip_fil_compat.h: No such file or directory
conftest.c:140:23: error: sys/bswap.h: No such file or directory
conftest.c:107:23: error: sys/bswap.h: No such file or directory
conftest.c:140:24: error: sys/endian.h: No such file or directory
conftest.c:107:24: error: sys/endian.h: No such file or directory
conftest.c:144:21: error: sys/md5.h: No such file or directory
conftest.c:111:21: error: sys/md5.h: No such file or directory
conftest.c:162:18: error: glib.h: No such file or directory
conftest.c:129:18: error: glib.h: No such file or directory
conftest.c:165:24: error: nss_common.h: No such file or directory
conftest.c:132:24: error: nss_common.h: No such file or directory
conftest.c:168:28: error: sys/capability.h: No such file or directory
conftest.c:135:28: error: sys/capability.h: No such file or directory
conftest.c:171:44: error: linux/netfilter_ipv4/ip_tproxy.h: No such file or directory
conftest.c:194:31: error: netinet/ip_compat.h: No such file or directory
conftest.c:194:28: error: netinet/ip_fil.h: No such file or directory
conftest.c:195:28: error: netinet/ip_nat.h: No such file or directory
conftest.c:195:25: error: netinet/ipl.h: No such file or directory
conftest.c:195:23: error: net/pfvar.h: No such file or directory
conftest.c:177:27: error: libxml/parser.h: No such file or directory
conftest.c:144:27: error: libxml/parser.h: No such file or directory
conftest.c:177:27: error: libxml/parser.h: No such file or directory
conftest.c:144:27: error: libxml/parser.h: No such file or directory

Secondly, any idea where I am failing to compile it on Ubuntu 8.10? Errors:

g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I. -I. -I../include -I. -I. -I../include -I../include -I../lib/libTrie/include -I../lib -I../lib -I/usr/include/libxml2 -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc -fPIC -DPIC -o ICAP/.libs/AsyncJob.o
cc1plus: warnings being treated as errors
ICAP/AsyncJob.cc: In member function ‘virtual const char* AsyncJob::status() const’:
ICAP/AsyncJob.cc:158: error: format not a string literal and no format arguments
make[1]: *** [ICAP/AsyncJob.lo] Error 1
make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
make: *** [all-recursive] Error 1

Regards, Remy

On Wed, 2008-11-26 at 02:12 -0800, John Doe wrote:
Maybe missing a C++ lib? But I never tried Squid 3.x, only 2.x... On my CentOS I have these (rpm -qa | grep ++):
libstdc++-4.1.2-42.el5
libstdc++-devel-4.1.2-42.el5
libsigc++20-2.0.17-1.el5.rf
gcc-c++-4.1.2-42.el5
compat-libstdc++-296-2.96-138
compat-libstdc++-33-3.2.3-61
JD

- Original Message
From: Mario Remy Almeida [EMAIL PROTECTED]
To: Squid Users squid-users@squid-cache.org
Sent: Wednesday, November 26, 2008 6:42:08 AM
Subject: [squid-users] compilation errors on rhel 5
Hi All, Since compilation failed on Ubuntu 8.10 I thought to give it a try on RHEL 5 32-bit, but no luck; had all the No such file
[squid-users] scanning request with squid
Hi All, Can someone tell me how I can scan HTTP, HTTPS and FTP requests for viruses etc. with Squid 3.1.x? Is it possible without DG? Regards, Mario
Re: [squid-users] compilation errors on rhel 5
Thanks, Christos. After applying the patch I managed to install it. Regards, Remy

On Wed, 2008-11-26 at 10:14 -0500, Christos Tsantilas wrote:
Hi Remy,
Secondly, any idea where I am failing to compile it on Ubuntu 8.10? Errors:
g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I. -I. -I../include -I. -I. -I../include -I../include -I../lib/libTrie/include -I../lib -I../lib -I/usr/include/libxml2 -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc -fPIC -DPIC -o ICAP/.libs/AsyncJob.o
cc1plus: warnings being treated as errors
ICAP/AsyncJob.cc: In member function ‘virtual const char* AsyncJob::status() const’:
ICAP/AsyncJob.cc:158: error: format not a string literal and no format arguments
make[1]: *** [ICAP/AsyncJob.lo] Error 1
make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
make: *** [all-recursive] Error 1
I think this is bug 2527: http://www.squid-cache.org/bugs/show_bug.cgi?id=2527
There is a small patch for this bug; can you try it? Regards, Christos
[squid-users] error compiling squid-3.1.0.2 on ubuntu 8.10
Hi All, I tried to compile squid-3.1.0.2 on Ubuntu 8.10 with the following options:

./configure \
 --prefix=/usr \
 --localstatedir=/var \
 --libexecdir=${prefix}/lib/squid \
 --srcdir=. \
 --datadir=${prefix}/share/squid \
 --sysconfdir=/etc/squid \
 --with-default-user=prox \
 --with-logdir=/var/log \
 --enable-arp-acl

Below are all the errors found in the config.log file:

conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
./configure: line 6626: --version: command not found
configure:6628: $? = 127
configure:6635: -v 5
./configure: line 6636: -v: command not found
configure:6638: $? = 127
configure:6645: -V 5
./configure: line 6646: -V: command not found
configure:6648: $? = 127
configure:6656: checking whether we are using the GNU Fortran 77 compiler
configure:6675: -c conftest.F 5
./configure: line 6676: -c: command not found
/home/remy/rnd/squid-3.1.0.2/conftest.c:56: undefined reference to `shl_load'
/home/remy/rnd/squid-3.1.0.2/conftest.c:56: undefined reference to `dlopen'
/home/remy/rnd/squid-3.1.0.2/configure:10955: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
configure:10974: $? = 0
/lib/: cannot read file data: Is a directory
conftest.c:68:23: error: gnumalloc.h: No such file or directory
conftest.c:69:27: error: ip_fil_compat.h: No such file or directory
conftest.c:69:20: error: ip_nat.h: No such file or directory
conftest.c:102:17: error: ipl.h: No such file or directory
conftest.c:73:19: error: mount.h: No such file or directory
conftest.c:128:23: error: sys/bswap.h: No such file or directory
conftest.c:95:24: error: sys/endian.h: No such file or directory
conftest.c:132:21: error: sys/md5.h: No such file or directory
conftest.c:117:18: error: glib.h: No such file or directory
conftest.c:153:24: error: nss_common.h: No such file or directory
conftest.c:154:16: error: db.h: No such file or directory
conftest.c:154:20: error: db_185.h: No such file or directory
conftest.c:121:28: error: sys/capability.h: No such file or directory
conftest.c:157:44: error: linux/netfilter_ipv4/ip_tproxy.h: No such file or directory
conftest.c:180:31: error: netinet/ip_compat.h: No such file or directory
conftest.c:180:28: error: netinet/ip_fil.h: No such file or directory
conftest.c:181:28: error: netinet/ip_nat.h: No such file or directory
conftest.c:181:25: error: netinet/ipl.h: No such file or directory
conftest.c:181:23: error: net/pfvar.h: No such file or directory
conftest.c:163:27: error: libxml/parser.h: No such file or directory

I executed make and got this error:

de -I../lib/libTrie/include -I../lib -I../lib -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF $depbase.Tpo -c -o ICAP/AsyncJob.lo ICAP/AsyncJob.cc; \
then mv -f $depbase.Tpo $depbase.Plo; else rm -f $depbase.Tpo; exit 1; fi
mkdir ICAP/.libs
g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I. -I. -I../include -I. -I. -I../include -I../include -I../lib/libTrie/include -I../lib -I../lib -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc -fPIC -DPIC -o ICAP/.libs/AsyncJob.o
cc1plus: warnings being treated as errors
ICAP/AsyncJob.cc: In member function ‘virtual const char* AsyncJob::status() const’:
ICAP/AsyncJob.cc:158: error: format not a string literal and no format arguments
make[1]: *** [ICAP/AsyncJob.lo] Error 1
make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
make: *** [all-recursive] Error 1

Can someone tell me what other packages are required to fix the above errors? Regards, Remy
[squid-users] compilation errors on rhel 5
Hi All, Since compilation failed on Ubuntu 8.10, I thought to give it a try on RHEL 5 32-bit, but no luck: I had all the "No such file or directory" error messages posted in the earlier mail, and also got this error message. I ran make and got the below errors:

debug.o: In function `operator<< <char, std::char_traits<char> >':
/usr/include/c++/4.3/ostream:517: undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)'
/usr/include/c++/4.3/ostream:517: undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)'
debug.o: In function `operator<< <char, std::char_traits<char> >':
/home/remy/rnd/squid-3.1.0.2/src/debug.cc:776: undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)'
debug.o: In function `operator<< <char, std::char_traits<char> >':
/usr/include/c++/4.3/ostream:517: undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)'
/usr/include/c++/4.3/ostream:517: undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)'
debug.o:/home/remy/rnd/squid-3.1.0.2/src/debug.cc:778: more undefined references to `std::basic_ostream<char, std::char_traits<char> >& std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*, int)' follow
collect2: ld returned 1 exit status
make[1]: *** [cf_gen] Error 1
make[1]: Leaving directory `/root/squid-3.1.0.2/src'
make: *** [all-recursive] Error 1

Need help: what should I do next to solve the above errors? Regards, Remy
Re: [squid-users] squid-3.1.0.2 compilation error
Thanks. Yes, I had to do that: I downloaded the latest daily snapshot last night and copied the file into squid-3.1.0.2, and it works fine now, no problem. Regards, Remy

On Mon, 2008-11-24 at 10:34 +1300, Amos Jeffries wrote:
Hi All, Tried to compile Squid on Ubuntu 8.04 and got the below error message:
squid_kerb_auth.c:121:20: error: base64.h: No such file or directory
The base64.h file was not found under the directory helpers/negotiate_auth/squid_kerb_auth/; instead a base64.c file was found. I would like to know if anyone had the above problem and how it was solved. I downloaded the squid-3.1.0.2.tar.bz2 file twice from the below location: http://www1.it.squid-cache.org/Versions/v3/3.1/
Thank you for testing. This is a known problem with the bundle. I believe it to be corrected in the daily snapshots. Please also note the updated compile details for 3.1 on Debian & Ubuntu: http://wiki.squid-cache.org/SquidFaq/CompilingSquid Amos
[squid-users] squid-3.1.0.2 compilation error
Hi All, I tried to compile Squid on Ubuntu 8.04 and got the below error message:
squid_kerb_auth.c:121:20: error: base64.h: No such file or directory
The base64.h file was not found under the directory helpers/negotiate_auth/squid_kerb_auth/; instead a base64.c file was found. I would like to know if anyone had the above problem and how it was solved. I downloaded the squid-3.1.0.2.tar.bz2 file twice from the below location: http://www1.it.squid-cache.org/Versions/v3/3.1/ Regards, Remy