On 21/04/2016 8:18 a.m., Markey, Bruce wrote:
> I'm curious as to why this is happening.
> 
> Proxy was implemented last week and since then I've been dealing with all the 
> sites that don't work. Not a problem, knew it was going to happen. I'd like 
> to understand why the following is happening.
> 
> 
> 1.       User goes to https://www.whatever.com
> 
> 2.       Browser, mostly Chrome, gives the following error:   Connection not 
> private. NET::ERR_CERT_AUTHORITY_INVALID
> 

Typing that error code into a search engine produces threads explaining that
it is the message the browser shows when HSTS is in effect on a website and
the server cert is not trusted by the browser.
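
You can confirm the HSTS part yourself from a machine that does not go
through the proxy; just a sketch with your example hostname substituted in:

  # look for a Strict-Transport-Security header in the response
  curl -sI https://www.whatever.com | grep -i strict-transport-security

If that header is present, HSTS is in effect for the site.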



> 3.       If you view the cert it shows the dynamic cert listed.
> 
> 4.       Click the "Proceed to www.whatever.com (unsafe)" link.
> 
> 5.       Now I get a squid error.  Requested url could not be retrieved.  
> Access denied while trying to retrieve https:// some ip address/*
> 

And that #5 explains why. It was not actually the web server producing
the cert, but Squid doing SSL-Bumping in order to show you the error page.
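
That dynamic cert you saw in step 3 is one Squid mints on the fly, signed
by whatever CA the bumping port is configured with. A minimal sketch of the
directives involved, assuming a Squid 3.5 build on Debian (the port, paths
and sizes below are examples only, not taken from your config):

  # signing CA plus on-the-fly certificate generation for bumped connections
  http_port 3128 ssl-bump cert=/etc/squid3/ssl_cert/myCA.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB

  # helper that creates and caches those generated certificates
  sslcrtd_program /usr/lib/squid3/ssl_crtd -s /var/lib/squid3/ssl_db -M 4MB

Unless the clients have imported that signing CA, every bumped connection,
including the ones bumped only to deliver an error page, produces exactly
the warning you describe in step 2.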



> Thing is I don't have an acl blocking that ip?   ( Small sub question here, 
> is there a way to tell which acl blocks something? )
> 

Something clearly is. But not what you expect, or you would not be here
asking about it.
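
To the sub-question: there is no single "which acl denied it" report, but
you can temporarily raise the debug level on the access control code
(debug section 28) and watch cache.log while repeating the failing request.
A sketch, to be reverted once you have your answer:

  # squid.conf: everything at level 1, ACL processing detail at level 3
  debug_options ALL,1 28,3

  # then reload and follow the log (Debian squid3 path shown)
  squid -k reconfigure
  tail -f /var/log/squid3/cache.log

The section-28 lines show each ACL as it is tested, which makes it fairly
clear which http_access line ended the check.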

> What I've had to do to get around this is add 
> www.whatever.com to my broken_sites.acl.  Then add 
> the ip to an allowed_ips.acl.
> 
> Then I http_access allow the ips list
> 
> And skip peeking at the broken site.
> 
> acl broken_sites ssl::server_name_regex "/etc/squid3/acls/http_broken.txt"
> ssl_bump peek !broken_sites
> ssl_bump splice all
> 
> I'm trying to understand why this is breaking and if I'm doing the right 
> thing in fixing it.
> 

Please provide your whole squid.conf (except empty or # comment lines).
We might need to see it all to find what the problem is.
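
In the meantime, for comparison: the usual shape for a "leave these broken
sites completely alone" exception is to splice them explicitly before any
peeking happens. A sketch only, reusing your file path as a placeholder;
whether it fits depends on the rest of that config:

  acl broken_sites ssl::server_name_regex "/etc/squid3/acls/http_broken.txt"
  # pass the known-problem sites straight through untouched
  ssl_bump splice broken_sites
  # peek at everything else to learn the SNI, then splice it
  ssl_bump peek all
  ssl_bump splice all

Keep in mind ssl::server_name_regex can only match what Squid has learned
at each step, so it behaves differently before and after the peek.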


> 
> The second error I'm getting is:
> 
> 
> The following error was encountered while trying to retrieve the URL: 
> https://*.agentimediaservices.com/*
> 
> Failed to establish a secure connection to 63.240.52.151
> 
> The system returned:
> 
> (71) Protocol error (TLS code: X509_V_ERR_UNABLE_TO_GET_ISSUER_CERT_LOCALLY)
> 
> SSL Certificate error: certificate issuer (CA) not known: /C=GB/ST=Greater 
> Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Organization 
> Validation Secure Server CA
> 
> Same question.  From what I've read this means that I don't have the correct 
> root ca?  Is that correct?  If so is the fix to then go try to find the 
> correct .crt and add it to the standard ca-cert store? ( I'm on debian so 
> /usr/share/ca-certificates/Mozilla )
> 
> Again, is this correct as to what is going wrong and the correct fix?

Well, the first step is to ensure your ca-certificates package is up to
date. That usually solves these errors.
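
On Debian that is normally just a package update, run as root:

  # refresh package lists and pull in the current CA bundle
  apt-get update
  apt-get install --only-upgrade ca-certificates

That regenerates /etc/ssl/certs/ca-certificates.crt, the bundle most
software on the box (including Squid, if pointed at it) verifies against.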

But not always, especially if the CA has been caught doing bad things and
suddenly been dropped, or if it has begun issuing certs to clients before
being accepted by the Mozilla CA list people.

It could also be that the intermediate cert is simply being omitted by the
server. In that case you will need to add it to your system-wide cert store,
or configure Squid to load it.
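
A sketch of that second part, assuming you have fetched the missing COMODO
intermediate from the CA in PEM form (the filename below is a placeholder):

  # add the intermediate to the local store and rebuild the system bundle
  cp comodo-rsa-ov-intermediate.crt /usr/local/share/ca-certificates/
  update-ca-certificates

  # squid.conf: verify outgoing TLS against the rebuilt system bundle
  sslproxy_cafile /etc/ssl/certs/ca-certificates.crt

Then "squid -k reconfigure" and retest. Check the release notes for your
3.x version to confirm sslproxy_cafile is the directive it offers for this.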

Amos
