On Wed, Jul 30, 2014 at 12:17 PM, Kathleen Wilson <kwil...@mozilla.com> wrote:
> On 7/28/14, 11:00 AM, Brian Smith wrote:
>>
>> I suggest that, instead of including the cross-signing certificates in
>> the NSS certificate database, the mozilla::pkix code should be changed
>> to look up those certificates when attempting to find them through NSS
>> fails. That way, Firefox and other products that use NSS will have a
>> lot more flexibility in how they handle the compatibility logic.
>
> There's already a bug for fetching missing intermediates:
> https://bugzilla.mozilla.org/show_bug.cgi?id=399324
>
> I think it would help with removal of roots (the remaining 1024-bit roots,
> non-BR-compliant roots, SHA1 roots, retired roots, etc.), and IE has been
> supporting this capability for a long time.

First of all, there is no such thing as a SHA1 root. Unlike the public
key algorithm, the hash algorithm is NOT fixed per root. That means
any RSA-2048 root can already issue certificates signed using SHA256
instead of SHA1. AFAICT, there's no reason for a CA to insist on
adding new roots for SHA256 support.
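To see that the hash is chosen per signature rather than fixed per root, one can self-sign the same RSA-2048 key with SHA-256 and inspect the result. This is only an illustration driving the openssl CLI (it assumes openssl is on the PATH; the filenames are placeholders):

```python
import subprocess

# Self-sign an RSA-2048 key using SHA-256 (filenames are illustrative).
subprocess.run(
    ["openssl", "req", "-x509", "-newkey", "rsa:2048", "-sha256", "-nodes",
     "-keyout", "root.key", "-out", "root.pem",
     "-subj", "/CN=Example Root", "-days", "365"],
    check=True, capture_output=True)

# Dump the certificate and look at its signature algorithm.
text = subprocess.run(
    ["openssl", "x509", "-in", "root.pem", "-noout", "-text"],
    check=True, capture_output=True, text=True).stdout

# The hash is a property of each signature, not of the root's key.
assert "sha256WithRSAEncryption" in text
```

Re-running the signing step with -sha1 would yield sha1WithRSAEncryption from the very same key, which is why "SHA1 root" is not a meaningful category.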

Other desktop browsers do support AIA certificate fetching, but many
mobile browsers don't. For example, Chrome on Android does not support
AIA fetching (at least, at the time I tried it) but Chrome on desktop
does support it. So, if Firefox were to add support for AIA
certificate fetching, it would be encouraging website administrators
to create websites that don't work on all browsers.

The AIA fetching mechanism is not reliable, for the same reasons that
OCSP fetching is not reliable. So, if Firefox were to add support for
AIA certificate fetching, it would be encouraging administrators to
create websites that don't work reliably.

The AIA fetching process and OCSP fetching are both very slow--much
slower than the combination of all other SSL handshaking and
certificate verification. So, if Firefox were to add support for AIA
certificate fetching, it would be encouraging administrators to create
slow websites.

The AIA fetching mechanism and OCSP fetching require an HTTP
implementation in order to verify certificates, and both of those
mechanisms require (practically, if not theoretically) the fetching to
be done over unauthenticated and unencrypted channels. It is not a
good idea to add the additional attack surface of an entire HTTP stack
to the certificate verification process.

If we are willing to encourage administrators to create websites that
don't work with all browsers, then we should just preload the
commonly-missing intermediate certificates into Firefox and/or NSS.
This would avoid all the performance problems, reliability problems,
and additional attack surface, and still provide a huge compatibility
benefit. In fact, most misconfigured websites would then work better
(faster, more reliably) in Firefox than in other browsers.
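The lookup-on-miss idea behind preloading can be sketched very simply; none of the names below come from NSS or mozilla::pkix -- they are hypothetical placeholders illustrating that a miss costs only a local table lookup, with no network fetch:

```python
# Hypothetical preloaded table: issuer distinguished name -> DER bytes of a
# commonly-missing intermediate (placeholder values, not real certificates).
PRELOADED_INTERMEDIATES = {
    "CN=Example Intermediate CA,O=Example Trust": b"<DER bytes>",
}

def find_issuer(issuer_name, database):
    """Consult the certificate database first, then the preloaded table."""
    cert = database.get(issuer_name)
    if cert is not None:
        return cert
    # No AIA fetch: a database miss costs only a dictionary lookup.
    return PRELOADED_INTERMEDIATES.get(issuer_name)

# A database that lacks the intermediate still resolves it locally.
db = {}
assert find_issuer("CN=Example Intermediate CA,O=Example Trust", db) == b"<DER bytes>"
assert find_issuer("CN=Unknown CA", db) is None
```

The point of the sketch is that the compatibility benefit comes with none of the latency, reliability, or attack-surface costs of fetching over HTTP.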

One of the motivations for creating mozilla::pkix was to make it easy
for Firefox to preload these certificates without having them
preloaded into NSS, because Wan-Teh had objected to preloading them
into NSS when I proposed it a couple of years ago. So, I think the
best course of action would be for us to try the preloading approach
first, and then re-evaluate whether AIA fetching is necessary later,
after measuring the results of preloading.

Cheers,
Brian
_______________________________________________
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy
