Re: Unknown Intermediates

2017-06-22 Thread Alex Gaynor via dev-security-policy
I definitely consider increased visibility into the vast iceberg that is
the public PKI to be a good thing!

What set of intermediates are you using? If it's reasonably complete, I
doubt we'll do any better than you, though maybe someone here has a
particularly clever technique for processing these.

As these are all from PDFs, an interesting follow-up project for someone
might be to look at S/MIME signatures sent to public mailing lists and
see what interesting certificates can be found there.

Alex

On Thu, Jun 22, 2017 at 10:45 AM, Tavis Ormandy  wrote:

> I think you're right, it was probably me submitting my corpus - I hope
> that's a good thing! :-)
>
> I only submitted the ones I could verify, would you be interested in the
> others? Many are clearly not interesting, but others seem like they may be
> interesting if I had an intermediate I haven't seen.
>
> Tavis.
>
>
> On Thu, Jun 22, 2017 at 6:15 AM, Alex Gaynor  wrote:
>
>> One of my hobbies is keeping track of publicly trusted (by any of the
>> major root programs) CAs, for which there are no logged certificates.
>> There's over 1000 of these. In the last day, presumably as a result of
>> these efforts, 50-100 CAs were removed from the list.
>>
>> Cheers,
>> Alex
>>
>> On Thu, Jun 22, 2017 at 5:51 AM, Rob Stradling 
>> wrote:
>>
>>> On 19/06/17 20:41, Tavis Ormandy via dev-security-policy wrote:
>>>
 Thanks Alex, I took a look, it looks like the check pings crt.sh - is
 doing
 that for a large number of certificates acceptable Rob?

>>>
>>> Hi Tavis.  Yes, Alex's tool uses https://crt.sh/gen-add-chain to find a
>>> suitable cert chain and build the JSON that can then be submitted to a
>>> log's /ct/v1/add-chain.  It should be fine to do that for a large number of
>>> certs.  crt.sh exists to be used.  ;-)
>>>
>>> I made a smaller set, the certificates that have 'SSL server: Yes' or
 'Any
 Purpose : Yes', there were only a few thousand that verified, so I just
 checked those and found 551 not in crt.sh.

 (The *vast* majority are code signing certificates, many are individual
 apple developer certificates)

 Is this useful? if not, what key usage is interesting?

 https://lock.cmpxchg8b.com/ServerOrAny.zip

>>>
>>> Thanks for this, Tavis.  I pointed my certscraper (
>>> https://github.com/robstradling/certscraper) at this URL a couple of
>>> days ago.  This submitted many of the certs to the Dodo and Rocketeer logs.
>>>
>>> However, it didn't manage to build chains for all of them.  I haven't
>>> yet had a chance to investigate why.
>>>
>>>
>>> Tavis.

 On Mon, Jun 19, 2017 at 7:03 AM, Alex Gaynor 
 wrote:

 If you're interested in playing around with submitting them yourself, or
> checking if they're already submitted, I've got some random tools for
> working with CT: https://github.com/alex/ct-tools
>
> Specifically ct-tools check  will get what
> you
> want. It's all serial, so for 8M certs you probably want to Bring Your
> Own
> Parallelism (I should fix this...)
>
> Alex
>
> On Mon, Jun 19, 2017 at 6:51 AM, Rob Stradling via dev-security-policy
> <
> dev-security-policy@lists.mozilla.org> wrote:
>
> On 16/06/17 20:11, Andrew Ayer via dev-security-policy wrote:
>>
>> On Fri, 16 Jun 2017 10:29:45 -0700 Tavis Ormandy wrote:
>>>
>>> 
>>
>> Is there an easy way to check which certificates from my set you're
>>>
 missing? (I'm not a PKI guy, I was collecting unusual extension OIDs
 for fuzzing).

 I collected these from public sources, so can just give you my whole
 set if you already have tools for importing them and don't mind
 processing them, I have around ~8M (mostly leaf) certificates, the
 set with isCa will be much smaller.


>>> Please do post the whole set.  I suspect there are several people on
>>> this list (including myself and Rob) who have the tools and
>>> experience
>>> to process large sets of certificates and post them to public
>>> Certificate Transparency logs (whence they will be fed into crt.sh).
>>>
>>> It would be useful to include the leaf certificates as well, to catch
>>> CAs which are engaging in bad practices such as signing non-SSL certs
>>> with SHA-1 under an intermediate that is capable of issuing SSL
>>> certificates.
>>>
>>> Thanks a bunch for this!
>>>
>>>
>> +1
>>
>> Tavis, please do post the whole set.  And thanks!
>>
>
>>> --
>>> Rob Stradling
>>> Senior Research & Development Scientist
>>> COMODO - Creating Trust Online
>>>
>>
>>
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org

Re: On GitHub, Leaked Keys, and getting practical about revocation

2017-06-22 Thread Ryan Sleevi via dev-security-policy
On Thu, Jun 22, 2017 at 3:53 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> On 22/06/2017 15:02, Ryan Sleevi wrote:
> > On Thu, Jun 22, 2017 at 1:59 PM Jakob Bohm via dev-security-policy <
> > dev-security-policy@lists.mozilla.org> wrote:
> >
>  > (Snip long repeat of the same opinion)
>
> You seem to argue:


Your summary is neither accurate nor supported by what I said. However, as
your reply doesn't seek to learn or understand, but rather chooses to
belittle and misrepresent, and provides no new or useful information to
refute the generally accepted conclusions you appear to disagree with, it
seems we should simply agree to disagree.

If you should have interest in trying to provide persuasive arguments,
factually supported, in a respectful way, perhaps there is an opportunity
to learn from each other; but as this is otherwise an unnecessarily
combative, overly reductive, and textually unsupported strawman, it might
be best if you were to take a break from this discussion.
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Unknown Intermediates

2017-06-22 Thread Tavis Ormandy via dev-security-policy
I think you're right, it was probably me submitting my corpus - I hope
that's a good thing! :-)

I only submitted the ones I could verify, would you be interested in the
others? Many are clearly not interesting, but others seem like they may be
interesting if I had an intermediate I haven't seen.

Tavis.

On Thu, Jun 22, 2017 at 6:15 AM, Alex Gaynor  wrote:

> One of my hobbies is keeping track of publicly trusted (by any of the
> major root programs) CAs, for which there are no logged certificates.
> There's over 1000 of these. In the last day, presumably as a result of
> these efforts, 50-100 CAs were removed from the list.
>
> Cheers,
> Alex
>
> On Thu, Jun 22, 2017 at 5:51 AM, Rob Stradling 
> wrote:
>
>> On 19/06/17 20:41, Tavis Ormandy via dev-security-policy wrote:
>>
>>> Thanks Alex, I took a look, it looks like the check pings crt.sh - is
>>> doing
>>> that for a large number of certificates acceptable Rob?
>>>
>>
>> Hi Tavis.  Yes, Alex's tool uses https://crt.sh/gen-add-chain to find a
>> suitable cert chain and build the JSON that can then be submitted to a
>> log's /ct/v1/add-chain.  It should be fine to do that for a large number of
>> certs.  crt.sh exists to be used.  ;-)
>>
>> I made a smaller set, the certificates that have 'SSL server: Yes' or 'Any
>>> Purpose : Yes', there were only a few thousand that verified, so I just
>>> checked those and found 551 not in crt.sh.
>>>
>>> (The *vast* majority are code signing certificates, many are individual
>>> apple developer certificates)
>>>
>>> Is this useful? if not, what key usage is interesting?
>>>
>>> https://lock.cmpxchg8b.com/ServerOrAny.zip
>>>
>>
>> Thanks for this, Tavis.  I pointed my certscraper (
>> https://github.com/robstradling/certscraper) at this URL a couple of
>> days ago.  This submitted many of the certs to the Dodo and Rocketeer logs.
>>
>> However, it didn't manage to build chains for all of them.  I haven't yet
>> had a chance to investigate why.
>>
>>
>> Tavis.
>>>
>>> On Mon, Jun 19, 2017 at 7:03 AM, Alex Gaynor 
>>> wrote:
>>>
>>> If you're interested in playing around with submitting them yourself, or
 checking if they're already submitted, I've got some random tools for
 working with CT: https://github.com/alex/ct-tools

 Specifically ct-tools check  will get what
 you
 want. It's all serial, so for 8M certs you probably want to Bring Your
 Own
 Parallelism (I should fix this...)

 Alex

 On Mon, Jun 19, 2017 at 6:51 AM, Rob Stradling via dev-security-policy <
 dev-security-policy@lists.mozilla.org> wrote:

 On 16/06/17 20:11, Andrew Ayer via dev-security-policy wrote:
>
> On Fri, 16 Jun 2017 10:29:45 -0700 Tavis Ormandy wrote:
>>
>> 
>
> Is there an easy way to check which certificates from my set you're
>>
>>> missing? (I'm not a PKI guy, I was collecting unusual extension OIDs
>>> for fuzzing).
>>>
>>> I collected these from public sources, so can just give you my whole
>>> set if you already have tools for importing them and don't mind
>>> processing them, I have around ~8M (mostly leaf) certificates, the
>>> set with isCa will be much smaller.
>>>
>>>
>> Please do post the whole set.  I suspect there are several people on
>> this list (including myself and Rob) who have the tools and experience
>> to process large sets of certificates and post them to public
>> Certificate Transparency logs (whence they will be fed into crt.sh).
>>
>> It would be useful to include the leaf certificates as well, to catch
>> CAs which are engaging in bad practices such as signing non-SSL certs
>> with SHA-1 under an intermediate that is capable of issuing SSL
>> certificates.
>>
>> Thanks a bunch for this!
>>
>>
> +1
>
> Tavis, please do post the whole set.  And thanks!
>

>> --
>> Rob Stradling
>> Senior Research & Development Scientist
>> COMODO - Creating Trust Online
>>
>
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: When are public applications embedding certificates pointing to 127.0.0.1 OK?

2017-06-22 Thread andrewm.bpi--- via dev-security-policy
On Thursday, June 22, 2017 at 6:29:17 AM UTC-5, Jakob Bohm wrote:
> The most obvious concern to me is random web servers, possibly through
> hidden web elements (such as script tags) gaining access to anything
> outside the Browser's sandbox without clear and separate user
> action.  For example, if I visit a site that carries an advertisement
> for Spotify, I don't want that site to have any access to my locally
> running Spotify software, its state or even its existence.


That's a good point. Even if you might be able to trust the software running on 
your computer not to reveal sensitive information or accept commands from 
random, unauthenticated sites, it's still a potential privacy concern if those 
sites can detect what software you're running in the first place (by, for 
example, checking to see if an image known to be hosted by that program 
successfully loads).

A properly-designed application could take steps to mitigate this problem (such 
as checking the referer header before serving resources like images to an 
external site), but not all such applications may be sensitive enough to 
privacy issues to actually implement such features.
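
For illustration, here is a minimal sketch of that kind of check, using only
the Python standard library; the port and the allowed origin are made-up
values, and a real application would also want proper authentication rather
than relying on headers alone:

# Sketch: a localhost service that refuses to serve resources to pages
# loaded from unexpected origins, so random sites can't probe for its
# existence. Port and allowed origin are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

ALLOWED_ORIGINS = {"https://app.example.com"}  # hypothetical companion site

class GuardedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        origin = self.headers.get("Origin") or self.headers.get("Referer", "")
        if not any(origin.startswith(allowed) for allowed in ALLOWED_ORIGINS):
            # Reveal nothing about the local software to other sites.
            self.send_error(403, "Forbidden")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"local resource")

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8123), GuardedHandler).serve_forever()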
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: On GitHub, Leaked Keys, and getting practical about revocation

2017-06-22 Thread Jakob Bohm via dev-security-policy

On 22/06/2017 15:02, Ryan Sleevi wrote:

On Thu, Jun 22, 2017 at 1:59 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:


> (Snip long repeat of the same opinion)

You seem to argue:

- Because the recent research on efficient central CRL distribution was
 based on a novel optimization of a previously inefficient algorithm,
 then it is nothing new and should be ignored.

- Operating a central CRL distribution service is a suspect commercial
 enterprise with a suspect business model, not a service to the
 community.

- OCSP stapling of intermediary certificates is inefficient because we
 should all just use central CRL distribution in a form where not all
 revocations are included.

- Because most/all browsers contain security holes in their revocation
 checking, making those holes bigger is not a problem.

- Revocation by the issuing CA is not trustworthy, thus nothing is,
 therefore everybody should just trust compromised keys as if they were
 not compromised.

- An attacker with a stolen/compromised key not doing OCSP stapling is
 the fault of the legitimate key holder.

- Forcing people to use OCSP stapling will magically cause software that
 allows this to spring into existence overnight.  If this doesn't happen
 it is the fault of the server operator, not because the demand was
 premature.



Enjoy

Jakob
--
Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
This public discussion message is non-binding and may contain errors.
WiseMo - Remote Service Management for PCs, Phones and Embedded
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Unknown Intermediates

2017-06-22 Thread Alex Gaynor via dev-security-policy
One of my hobbies is keeping track of publicly trusted (by any of the major
root programs) CAs, for which there are no logged certificates. There's
over 1000 of these. In the last day, presumably as a result of these
efforts, 50-100 CAs were removed from the list.

Cheers,
Alex

On Thu, Jun 22, 2017 at 5:51 AM, Rob Stradling 
wrote:

> On 19/06/17 20:41, Tavis Ormandy via dev-security-policy wrote:
>
>> Thanks Alex, I took a look, it looks like the check pings crt.sh - is
>> doing
>> that for a large number of certificates acceptable Rob?
>>
>
> Hi Tavis.  Yes, Alex's tool uses https://crt.sh/gen-add-chain to find a
> suitable cert chain and build the JSON that can then be submitted to a
> log's /ct/v1/add-chain.  It should be fine to do that for a large number of
> certs.  crt.sh exists to be used.  ;-)
>
> I made a smaller set, the certificates that have 'SSL server: Yes' or 'Any
>> Purpose : Yes', there were only a few thousand that verified, so I just
>> checked those and found 551 not in crt.sh.
>>
>> (The *vast* majority are code signing certificates, many are individual
>> apple developer certificates)
>>
>> Is this useful? if not, what key usage is interesting?
>>
>> https://lock.cmpxchg8b.com/ServerOrAny.zip
>>
>
> Thanks for this, Tavis.  I pointed my certscraper (
> https://github.com/robstradling/certscraper) at this URL a couple of days
> ago.  This submitted many of the certs to the Dodo and Rocketeer logs.
>
> However, it didn't manage to build chains for all of them.  I haven't yet
> had a chance to investigate why.
>
>
> Tavis.
>>
>> On Mon, Jun 19, 2017 at 7:03 AM, Alex Gaynor  wrote:
>>
>> If you're interested in playing around with submitting them yourself, or
>>> checking if they're already submitted, I've got some random tools for
>>> working with CT: https://github.com/alex/ct-tools
>>>
>>> Specifically ct-tools check  will get what you
>>> want. It's all serial, so for 8M certs you probably want to Bring Your
>>> Own
>>> Parallelism (I should fix this...)
>>>
>>> Alex
>>>
>>> On Mon, Jun 19, 2017 at 6:51 AM, Rob Stradling via dev-security-policy <
>>> dev-security-policy@lists.mozilla.org> wrote:
>>>
>>> On 16/06/17 20:11, Andrew Ayer via dev-security-policy wrote:

 On Fri, 16 Jun 2017 10:29:45 -0700 Tavis Ormandy wrote:
>
> 

 Is there an easy way to check which certificates from my set you're
>
>> missing? (I'm not a PKI guy, I was collecting unusual extension OIDs
>> for fuzzing).
>>
>> I collected these from public sources, so can just give you my whole
>> set if you already have tools for importing them and don't mind
>> processing them, I have around ~8M (mostly leaf) certificates, the
>> set with isCa will be much smaller.
>>
>>
> Please do post the whole set.  I suspect there are several people on
> this list (including myself and Rob) who have the tools and experience
> to process large sets of certificates and post them to public
> Certificate Transparency logs (whence they will be fed into crt.sh).
>
> It would be useful to include the leaf certificates as well, to catch
> CAs which are engaging in bad practices such as signing non-SSL certs
> with SHA-1 under an intermediate that is capable of issuing SSL
> certificates.
>
> Thanks a bunch for this!
>
>
 +1

 Tavis, please do post the whole set.  And thanks!

>>>
> --
> Rob Stradling
> Senior Research & Development Scientist
> COMODO - Creating Trust Online
>
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: On GitHub, Leaked Keys, and getting practical about revocation

2017-06-22 Thread Ryan Sleevi via dev-security-policy
On Thu, Jun 22, 2017 at 1:59 PM Jakob Bohm via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:

> Please note that Apache and NGINX are by far not the only TLS servers
> that will need working OCSP stapling code before must-staple can become
> default or the only method checked by Browsers and other TLS clients.
>
> What is needed is:
>
> 1. An alternative checking mechanism, such as the compact super-CRL
>developed by some researchers earlier this year.  This needs to be
>deployed and used *before* turning off traditional CRL and OCSP
>checking in relying party code, since leaving a relying party
>application without any checking for revoked certificates is a pretty
>obvious security hole.


There is reasonable and well-argued disagreement with you on this. Without
hardfail, it is not a security hole, and no mainstream client has ever
shipped hardfail, so it is not a reasonable requirement to introduce into
this discussion.

There's equally a host of pragmatic and practical concerns, which you can
find in the years of discussion on this topic and related solutions (the
"super-CRL" you refer to is just the application of one particular
technology, which had been outlined years prior, and for which alternatives
with similar constraints existed even before that).

I suspect this may simply resolve to irreconcilable differences, as some
people may incorrectly choose to believe that soft-fail revocation provides
a defensible security boundary, or may believe that a CA's revocation is a
trustworthy reason to deny access, when ample evidence and arguments exist
to show otherwise for both. I mention this largely to avoid rehashing the
same conversations that have been had, but if you are unfamiliar with them,
and are interested in learning of this other perspective, I would be happy
to provide them. I just figure as shorthand that we disagree on this.

>
> 2. Full OCSP stapling support in all TLS libraries, including the LTS
>branches of OpenSSL, mbedTLS, NSS, Java, Android etc.  Providing this
>only in the "next great release that is incompatible with many
>existing users" is a common malpractice among security library
>vendors, just as there are still systems that refuse to support TLS
>1.2 and ubiquitous SHA-256 signatures in todays world, simply because
>the original system/library vendor refuses to backport the needed
>changes.


There is zero reason to introduce this as a dependency beforehand. Perhaps
your assumption is that it is unreasonable to require of the ecosystem what
the ecosystem does not support; but equally, realize that the ecosystem will
not support it until it is required.

It is already true that a sufficient and meaningful majority support most
of the necessary work, and so to argue that ALL servers or libraries must
support it is to ignore market realities and to place the (unreasonable)
perfect ahead of the achievable good.

Further, you treat OCSP Multi-Staple as necessary, when in fact it is
entirely undesirable from a performance perspective and largely unnecessary
in deployment, given the existence of CRLSets, CDLs, OneCRL, et al.

>
> 3. Once #2 is achieved, actual TLS servers and clients using those
>libraries can begin to enable OCSP stapling.
>
> 4. Once #3 is achieved and deployed, then OCSP stapling might become
>mandatory by default.


This is to completely upend the only action that has been seen to
meaningfully improve the ecosystem, which is the mandate that thus spurs
implementation or innovation.

On a more pragmatic level, there is nothing wrong with saying "The only
revocation supported will be stapling and OneCRL". There is no need to go
above and beyond this, because collectively, this achieves the goal of
providing a compelling revocation story.

The disconnect that results in proposals like yours is that they presume
revocation is for the benefit of the relying party, as opposed to being for
the benefit of the site operator.

A site operator cares about revocation for cases of key compromise and
impersonation. A relying party may care about revocation for reasons like
misrepresentation (which is not, despite some views to the contrary, an
accepted concern of the Mozilla policies - c.f. malware and phishing),
apathetic server compromise (that is, they did not enable stapling; however,
the root cause/risk is the apathy, which revocation does not fix), or should
the user want to deny themselves access to a site (which no user does).

If we focus on stapling, the position is that it is not necessary for the
browser to protect the user from servers' apathy (in not enabling
stapling), or from CAs' capricious opinions about certificates (which the
so-called supercrls try to enable, as a business model), but to allow
servers to protect themselves. There is similarly no concern given to CAs
that want to use OCSP or CRLs to "rent a cert" (as some tried to in the
past), 

Re: Root Store Policy 2.5: Call For Review and Phase-In Periods

2017-06-22 Thread Gervase Markham via dev-security-policy
On 21/06/17 16:58, Doug Beattie wrote:
>> It's worth noting that if we had discovered this situation for SSL - that an
>> unconstrained intermediate or uncontrolled power of issuance had been
>> given to a company with no audit - we would be requiring the intermediate
>> be revoked today, and probably taking further action as well.
> 
> Agree

After consultation, I have decided to implement this requirement with a
phase-in period of six months, for already-existing intermediates. So
before 15th January 2018 (add a bit because of Christmas) these
customers, and any others like them at any other CA, need to have audits
(over at least 30 days of operations), move to a name-constrained
intermediate, or move to a managed service which does domain ownership
validation on each domain added to the system. I expect these two
intermediates to be revoked on or before 15th January 2018.

I realise this is not what you were hoping for, but it's not reasonable
to leave unconstrained intermediates in the hands of those not qualified
to hold them for a further 2 years. I am allowing six months because,
despite the weakness of the previous controls, you were in compliance
with them and so it's not reasonable to ask for a super-quick move.

https://github.com/mozilla/pkipolicy/commit/44ae763f24d6509bb2311d33950108ec5ec87082

(ignore the erroneously-added logfile).

> Are there any other CAs or mail vendors that have tested name-constrained 
> issuing CAs? If name-constrained issuing CAs don't work with some or all of 
> the mail applications, it seems like we might as well recommend a change to 
> the requirement.

I am open to hearing further evidence on this point.

Gerv
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: On GitHub, Leaked Keys, and getting practical about revocation

2017-06-22 Thread Jakob Bohm via dev-security-policy

On 21/06/2017 19:40, Matthew Hardeman wrote:

Hi all,

I'm sure questions of certificates leaked to the public via GitHub and other 
file sharing / code sharing / deployment repository hosting and sharing sites 
have come up before. Last night, however, I spent a couple of hours constructing 
various search criteria, none of which I think were especially clever, and I was 
still shocked and amazed at what I found:

At least 10 different Apple Development and Production environment certificates 
and key pairs.  For each of these (a minimum of 10 by my count), I validated that 
the certificate is within its validity period.  I also validated that the key 
I found matches the public key information in the certificate.  Most of these 
certificates are TLS Client Authentication certificates which also have 
additional Apple proprietary extended key usages.  These certificates are 
utilized for authenticating to the Apple Push Notification System.  A couple of 
certificates were Apple Developer ID certificates appropriate for development 
and production environment deployment of executable code to Apple devices.  
(Those developer ID certificates I have reported to Apple for revocation.)  
There were more Apple Push authentication certificates than I cared to write up 
and send over.
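
For anyone wanting to repeat that key-to-certificate matching step, here is a
rough sketch using the Python cryptography package; the file names are
placeholders, and an unencrypted key is assumed (an encrypted one would need
the passphrase found alongside it):

# Sketch: check a leaked key and certificate pair - is the cert still within
# its validity period, and does the key match the cert's public key?
import datetime
from cryptography import x509
from cryptography.hazmat.primitives import serialization

cert = x509.load_pem_x509_certificate(open("leaked_cert.pem", "rb").read())
key = serialization.load_pem_private_key(open("leaked_key.pem", "rb").read(),
                                         password=None)

now = datetime.datetime.utcnow()
print("within validity period:",
      cert.not_valid_before <= now <= cert.not_valid_after)

# Comparing serialized SubjectPublicKeyInfo works for RSA and EC alike.
cert_pub = cert.public_key().public_bytes(
    serialization.Encoding.DER,
    serialization.PublicFormat.SubjectPublicKeyInfo)
key_pub = key.public_key().public_bytes(
    serialization.Encoding.DER,
    serialization.PublicFormat.SubjectPublicKeyInfo)
print("key matches certificate:", cert_pub == key_pub)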

I was shocked at the level of improper distribution and leaking of these keys 
and certificates.

Once in a while, the key was represented in encrypted form.  In _every_ 
instance for which I found an encrypted key and dug further, either a piece of 
code, a configuration file, or sometimes a README-KEY-PASSWORD.txt (or similar) 
within the same repository successfully decrypted the encrypted key.

Additionally, I did find some TLS server certificates.  There were many more 
that I did not bother to carefully analyze.  Some were expired.  One was a 
in-validity-window DV certificate issued by Let's Encrypt.  Utilizing the 
certificate's private key, I was able to successfully use the Let's Encrypt 
ACME API to automatically request revocation of that certificate.  Minutes 
later, I verified that OCSP responses for that certificate were, in fact, 
indicating that the certificate was revoked.

Of course, revocation even with a really nice OCSP responder system is not very 
effective today.

I have this suspicion that human nature dictates that eliminating these kinds 
of key material leaks is not even a goal worth having.  Disappointment, I 
suspect, lives down that road.

Because live OCSP checks for certificates en masse are not appealing to either 
the CAs or the browsers or the end users (consequences of network delay, 
reliability, etc.), revocation means very little pragmatically today.

This only reinforces the value and importance of either/both:

- Quite short lived certificates, automatically replaced and deployed, to 
reduce the risks associated with key compromise

and/or

- OCSP must-staple, which I believe is only pragmatically gated at the moment 
by a number of really poor server-side implementations of OCSP stapling.  
Servers must cache good responses.  Servers must use those while awaiting a new 
good response further into the OCSP response validity period.  Servers must 
validate the response and not serve random garbage as if it were an OCSP 
response.  Etc, etc.  
Ryan Sleevi's work documenting the core issues is clearly a step in the right 
direction.
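
To make those requirements concrete, here is a rough sketch of the
fetch-validate-cache cycle a well-behaved stapler needs. This is not any
particular server's implementation; it assumes the Python cryptography
package and PEM files on disk, and it omits verification of the responder's
signature for brevity:

# Sketch: keep stapling the last good OCSP response, replace it only with a
# fresh response that validates, and fall back to the cached one on failure.
import datetime
import urllib.request
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.x509 import ocsp
from cryptography.x509.oid import AuthorityInformationAccessOID

cert = x509.load_pem_x509_certificate(open("server.pem", "rb").read())
issuer = x509.load_pem_x509_certificate(open("issuer.pem", "rb").read())

aia = cert.extensions.get_extension_for_class(x509.AuthorityInformationAccess)
ocsp_url = next(d.access_location.value for d in aia.value
                if d.access_method == AuthorityInformationAccessOID.OCSP)

def fetch_staple(previous_good=None):
    req = ocsp.OCSPRequestBuilder().add_certificate(
        cert, issuer, hashes.SHA1()).build()
    http_req = urllib.request.Request(
        ocsp_url, data=req.public_bytes(Encoding.DER),
        headers={"Content-Type": "application/ocsp-request"})
    try:
        resp = ocsp.load_der_ocsp_response(
            urllib.request.urlopen(http_req, timeout=10).read())
    except Exception:
        return previous_good  # responder down: keep the cached good response

    # NOTE: a real stapler must also verify the response signature here.
    now = datetime.datetime.utcnow()
    if (resp.response_status == ocsp.OCSPResponseStatus.SUCCESSFUL
            and resp.certificate_status == ocsp.OCSPCertStatus.GOOD
            and resp.this_update <= now
            and (resp.next_update is None or now < resp.next_update)):
        return resp  # a fresh, validated response worth stapling
    return previous_good  # garbage or stale: don't staple it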

Both NGINX's and Apache HTTPD's implementations of OCSP stapling are lacking in 
several material respects.

It would certainly be a significant undertaking, but I believe that 
organizations who are working to ensure a secure Web (and that reap the 
benefits of a secure and trustworthy web) could do much to achieve better 
deployment of OCSP stapling in relatively short time:

1.  Direct contribution of funds / bounty to the core developers of each of 
those two web server projects for building a server-side OCSP stapling 
implementation which is trivial to configure and which meets the needs of an 
ideal implementation with respect to caching of good results, validating new 
responses to staple, scheduling the deployment of successful new responses or 
scheduling retries of fails, etc.  Insist that the code be written with a view 
to maximal back-port capability for said implementations.

2.  If such contributions are infeasible, funding competent external 
development of code which achieves the same as item 1 above.

3.  High level engagement with major distributions.  Tackle the technical and 
administrative hurdles to get these changes into the stable and development 
builds of all currently shipping versions of at least RedHat's, Canonical's, 
and Debian's distributions.  Get these changes into the standard default 
version httpd and nginx updates.

4.  Same as above but for common docker images, prevalent VM images, etc.

5.  Ensure that the browsers are ready to support and enforce fail-hard on 
certificates which feature the OCSP must-staple extension.

6.  Monitor progress in 

Re: When are public applications embedding certificates pointing to 127.0.0.1 OK?

2017-06-22 Thread Jakob Bohm via dev-security-policy

On 21/06/2017 22:01, andrewm@gmail.com wrote:

On Wednesday, June 21, 2017 at 1:35:13 PM UTC-5, Matthew Hardeman wrote:

Regarding localhost access, you are presently incorrect.  The browsers do not 
allow access to localhost via insecure websocket if the page loads from a 
secure context.  (Chrome and Firefox at least, I believe do not permit this 
presently.)  I do understand that there is some question as to whether they may 
change that.


Right, I wasn't talking about WebSockets in particular, but about any possible 
form of direct communication between the web app and desktop application. 
That's why I pointed to plain old HTTP requests as an example.


As for whether or not access to localhost from an externally sourced web site is 
"inherently a bad thing".  I understand that there are downsides to proxying 
via the server in the middle in order to communicate back and forth with the locally 
installed application.  Having said that, there is a serious advantage:

From a security perspective, having the application make and maintain a 
connection or connections out to the server that will act as the intermediary 
between the website and the application allows for the network administrator to 
identify that there is an application installed that is being manipulated and 
controlled by an outside infrastructure.  This allows for visibility to the fact 
that it exists and allows for appropriate mitigation measures if any are needed.

For a website to silently contact a server application running on the loopback 
and influence that software while doing so in a manner invisible to the network 
infrastructure layer is begging to be abused as an extremely covert command and 
control architecture when the right poorly written software application comes 
along.


I guess I don't completely understand what your threat model here is. Are you 
saying you're worried about users installing insecure applications that allow 
remote code execution for any process that can send HTTP requests to localhost?

Or are you saying you're concerned about malware already installed on the 
user's computer using this mechanism for command and control?

Both of those are valid concerns. I'm not really sure whether they're 
significant enough though to break functionality over, since they both require 
the user to already be compromised in some way before they're of any use to 
attackers. Though perhaps requiring a permissions prompt of some kind before 
allowing requests to localhost may be worth considering...

As I said though, this is kinda straying off topic. If the ability of web apps 
to communicate with localhost is something that concerns you, consider starting 
a new topic on this mailing list so we can discuss that in detail without 
interfering with the discussion regarding TLS certificates here.



The most obvious concern to me is random web servers, possibly through
hidden web elements (such as script tags) gaining access to anything
outside the Browser's sandbox without clear and separate user
action.  For example, if I visit a site that carries an advertisement
for Spotify, I don't want that site to have any access to my locally
running Spotify software, its state or even its existence.

The most obvious way to have a local application be managed from a local
standard web browser while also using resources obtained from a central
application web site is for the local application to proxy those
resources from the web site.  Thus the Browser will exclusively be
talking to a localhost URL, probably over plain HTTP or over HTTPS with
some locally generated localhost certificate, which may or may not be
based on existing machine certificate facilities in some system
configurations.

In other words, the user might open http://localhost:45678 to see the
App user interface, consisting of local elements plus some elements which
the app backend might dynamically download from the vendor before
serving them within the http://localhost:45678/ URL namespace.

This greatly reduces the need for any mixing of origins in the Browser,
and also removes the need to have publicly trusted certificates revealed
to such local applications.
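
As a toy illustration of that pattern (the vendor URL is made up, and a real
application would add caching and some access control), the local backend
might look roughly like:

# Toy sketch: a local app backend serves its UI from localhost and proxies
# selected static resources from the vendor's site, so the browser only ever
# talks to http://localhost:45678/. Vendor URL and port are hypothetical.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

VENDOR = "https://vendor.example.com"  # hypothetical central application site

class AppHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/vendor/"):
            # Fetch the resource server-side and re-serve it from our origin.
            upstream = urllib.request.urlopen(VENDOR + self.path[len("/vendor"):])
            body = upstream.read()
            ctype = upstream.headers.get("Content-Type", "application/octet-stream")
        else:
            body = b"<html><body>Local app UI</body></html>"
            ctype = "text/html"
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 45678), AppHandler).serve_forever()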

For some truly complex scenarios, more complex techniques are needed to
avoid distributing private keys, but that's not needed for the cases
discussed here.


Enjoy

Jakob
--
Jakob Bohm, CIO, Partner, WiseMo A/S.  https://www.wisemo.com
Transformervej 29, 2860 Søborg, Denmark.  Direct +45 31 13 16 10
This public discussion message is non-binding and may contain errors.
WiseMo - Remote Service Management for PCs, Phones and Embedded
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy


Re: Unknown Intermediates

2017-06-22 Thread Rob Stradling via dev-security-policy

On 19/06/17 20:41, Tavis Ormandy via dev-security-policy wrote:

Thanks Alex, I took a look, it looks like the check pings crt.sh - is doing
that for a large number of certificates acceptable Rob?


Hi Tavis.  Yes, Alex's tool uses https://crt.sh/gen-add-chain to find a 
suitable cert chain and build the JSON that can then be submitted to a 
log's /ct/v1/add-chain.  It should be fine to do that for a large number 
of certs.  crt.sh exists to be used.  ;-)
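
For anyone doing this step by hand, here is a rough sketch of the final
add-chain call as defined in RFC 6962; the log URL and PEM file names are
placeholders, and the crt.sh gen-add-chain step that assembles the chain for
you is elided:

# Sketch: submit a certificate chain to a CT log's /ct/v1/add-chain endpoint
# (RFC 6962). The log URL and PEM file names are placeholders; the "chain"
# array is leaf-first, base64-encoded DER.
import base64
import json
import urllib.request
from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding

LOG_URL = "https://ct.example.org/ct/v1/add-chain"  # placeholder log

def pem_chain_to_json(paths):
    chain = []
    for path in paths:
        cert = x509.load_pem_x509_certificate(open(path, "rb").read())
        chain.append(base64.b64encode(cert.public_bytes(Encoding.DER)).decode())
    return json.dumps({"chain": chain}).encode()

req = urllib.request.Request(
    LOG_URL,
    data=pem_chain_to_json(["leaf.pem", "intermediate.pem", "root.pem"]),
    headers={"Content-Type": "application/json"})
print(urllib.request.urlopen(req).read().decode())  # SCT JSON on success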



I made a smaller set, the certificates that have 'SSL server: Yes' or 'Any
Purpose : Yes', there were only a few thousand that verified, so I just
checked those and found 551 not in crt.sh.

(The *vast* majority are code signing certificates, many are individual
apple developer certificates)

Is this useful? if not, what key usage is interesting?

https://lock.cmpxchg8b.com/ServerOrAny.zip
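
As a rough way to reproduce that 'SSL server' / 'Any Purpose' filter with the
Python cryptography package (this only approximates OpenSSL's purpose check by
looking at the Extended Key Usage extension; the corpus directory is a
placeholder):

# Sketch: keep certs whose EKU includes serverAuth or anyExtendedKeyUsage, or
# that have no EKU at all (which OpenSSL treats as unrestricted).
import glob
from cryptography import x509
from cryptography.x509.oid import ExtendedKeyUsageOID

ANY_PURPOSE = x509.ObjectIdentifier("2.5.29.37.0")  # anyExtendedKeyUsage
INTERESTING = {ExtendedKeyUsageOID.SERVER_AUTH, ANY_PURPOSE}

for path in glob.glob("corpus/*.pem"):
    cert = x509.load_pem_x509_certificate(open(path, "rb").read())
    try:
        eku = set(cert.extensions.get_extension_for_class(x509.ExtendedKeyUsage).value)
    except x509.ExtensionNotFound:
        eku = None  # no EKU: usable for any purpose
    if eku is None or eku & INTERESTING:
        print(path)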


Thanks for this, Tavis.  I pointed my certscraper 
(https://github.com/robstradling/certscraper) at this URL a couple of 
days ago.  This submitted many of the certs to the Dodo and Rocketeer logs.


However, it didn't manage to build chains for all of them.  I haven't 
yet had a chance to investigate why.



Tavis.

On Mon, Jun 19, 2017 at 7:03 AM, Alex Gaynor  wrote:


If you're interested in playing around with submitting them yourself, or
checking if they're already submitted, I've got some random tools for
working with CT: https://github.com/alex/ct-tools

Specifically ct-tools check  will get what you
want. It's all serial, so for 8M certs you probably want to Bring Your Own
Parallelism (I should fix this...)

Alex
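
Since ct-tools check runs serially, a crude bring-your-own-parallelism wrapper
might look like the sketch below; the exact command-line shape and the chunk
size are assumptions:

# Sketch: shard the corpus into chunks and run one `ct-tools check` process
# per chunk. Corpus paths and the argument shape are placeholders.
import glob
import subprocess
from concurrent.futures import ThreadPoolExecutor

def check(paths):
    # Each worker handles a slice of the ~8M certificates.
    return subprocess.run(["ct-tools", "check", *paths]).returncode

all_certs = sorted(glob.glob("corpus/*.pem"))
chunks = [all_certs[i:i + 1000] for i in range(0, len(all_certs), 1000)]

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(check, chunks))

print(f"{sum(r != 0 for r in results)} of {len(results)} chunks reported errors")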

On Mon, Jun 19, 2017 at 6:51 AM, Rob Stradling via dev-security-policy <
dev-security-policy@lists.mozilla.org> wrote:


On 16/06/17 20:11, Andrew Ayer via dev-security-policy wrote:


On Fri, 16 Jun 2017 10:29:45 -0700 Tavis Ormandy wrote:





Is there an easy way to check which certificates from my set you're

missing? (I'm not a PKI guy, I was collecting unusual extension OIDs
for fuzzing).

I collected these from public sources, so can just give you my whole
set if you already have tools for importing them and don't mind
processing them, I have around ~8M (mostly leaf) certificates, the
set with isCa will be much smaller.



Please do post the whole set.  I suspect there are several people on
this list (including myself and Rob) who have the tools and experience
to process large sets of certificates and post them to public
Certificate Transparency logs (whence they will be fed into crt.sh).

It would be useful to include the leaf certificates as well, to catch
CAs which are engaging in bad practices such as signing non-SSL certs
with SHA-1 under an intermediate that is capable of issuing SSL
certificates.

Thanks a bunch for this!



+1

Tavis, please do post the whole set.  And thanks!


--
Rob Stradling
Senior Research & Development Scientist
COMODO - Creating Trust Online
___
dev-security-policy mailing list
dev-security-policy@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security-policy