On Saturday, March 10, 2018 at 4:57:43 PM UTC+1, Ryan Sleevi wrote:
> On Fri, Mar 9, 2018 at 6:51 PM, syrine.tl--- via dev-security-policy <
> dev-security-policy@lists.mozilla.org> wrote:
> 
> >
> >
> > We do use another CA's tool to check the validity of CSRs, just as we use
> > the crt.sh tool, also developed by another CA, to check pre-certificates
> > before issuance. So why is using a tool to check CSRs problematic, whereas
> > using crt.sh is approved? The point here is to use an efficient tool
> > to perform certificate checks regardless of the CA that owns it. Besides,
> > given that the Tunisian government does not have a Mozilla-trusted CA, we
> > are forced to buy SSL certificates for Tunisian e-commerce websites from
> > the CA that owns the CSR check tool that we use.
> > In order to have a consistent RA process, we use the CSR check tool both
> > for the certificates we buy from that trusted CA and to check our own
> > certificates.
> >
> 
> It is important to highlight here that it is not intrinsically bad to use
> another tool - indeed, open source and information sharing are good. The
> point is that your own examination of your software and practices was so
> deficient and lacking that you were unable to do even the most basic
> operations of a CA correctly, and that you misrepresented to Mozilla what
> degree of checking you were doing.
> 
> A CSR check, however, is so basic that even limited technical
> competency suffices to do it. That this relies on an online check is deeply
> disappointing and highlights a lack of technical competency - which the
> issues bore out - including the OCSP failures that seemingly still persist.
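For reference, such an offline CSR check needs nothing more than a standard
crypto library. A minimal sketch in Python with the cryptography package (the
file name is hypothetical, and this is only an illustration, not any CA's
actual tooling):

    # Verify the CSR's self-signature and inspect the requested names locally,
    # with no dependency on a third party's online tool.
    from cryptography import x509

    with open("example.csr", "rb") as f:  # hypothetical path
        csr = x509.load_pem_x509_csr(f.read())

    print("self-signature valid:", csr.is_signature_valid)
    print("subject:", csr.subject.rfc4514_string())
    try:
        san = csr.extensions.get_extension_for_class(x509.SubjectAlternativeName)
        print("requested SANs:", san.value.get_values_for_type(x509.DNSName))
    except x509.ExtensionNotFound:
        print("no SAN extension requested")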
> 
> Recall - the tool was written by a recently added CA, because they simply
> read the specifications and wanted to ensure their systems followed them.
> It would appear TunRootCA2 either did not read the specifications, or could
> not be bothered to read them, or simply relies on off-the-shelf software to
> be a CA - when so much more is expected.
> 
> 
> > > Yet, at the same time, there are still-trusted CAs that have demonstrated
> > > similar issues - although perhaps not to the same degree of becoming a
> > > problematic pattern or an insufficient response - and they still remain
> > > trusted. A recommendation to instill trust in such a new CA, one that has
> > > demonstrated problematic patterns already, suggests that CAs may continue
> > > to display such patterns without risk of distrust - which would overall be
> > > harmful. Yet, if the recommendation is not to trust, what should the
> > > remediation steps be to find a positive path forward?
> >
> >
> > Since there are other still-trusted CAs that have the same problems, why is
> > the Tunisian CA treated with a presumption of untrustworthiness? The
> > decision-making process should be objective and fair for all CAs.
> >
> 
> You can see I specifically addressed that. To repeat:
> 1) The Tunisian CA has demonstrated a problematic pattern of misissuance
> and misconfiguration that has not stopped, even as of 2018-02-22, the most
> recent review.
> 2) If we believe in a fair and objective process, then accepting the
> Tunisian CA will make the ecosystem worse, by setting a new low bar of
> competency and correctness.
> 
> So, the fair and objective basis is to look at the pattern and trends - one
> which would reasonably start a discussion of possible distrust - and simply
> say the risk is not worth it.
> 
> 
> > > I do not believe this CA should be trusted, given these patterns. I do not
> > > feel the evidence supports a confident understanding of the critical role
> > > that CAs play, nor an understanding of the technical risks and mitigations.
> > > While it is good that TunRootCA2 has adopted practices such as linting, it
> > > simply moves the problem to be more opaque - how many certificates fail
> > > that check will not be known, nor will it be known how many failures are
> > > now for policy reasons, rather than technical reasons. The community
> > > largely relies on the information provided in audits - with the
> > > expectations that CAs will self-report and self-disclose these issues - and
> > > yet the audits not calling this information out is deeply worrying, both as
> > > to quality and to completeness.
> >
> > During the entire Mozilla review process, our team showed willingness and
> > seriousness in the treatment of these incidents. We have implemented all
> > the technical checks required to prevent the occurrence of other mis-issued
> > certificates (pre-issuance linting). We have also reported all misissued
> > certificates of our root CA since its creation. There are in total 15
> > misissued certificates, as listed in Olfa's last message:
> > https://groups.google.com/d/msg/mozilla.dev.security.policy/wCZsVq7AtUY/fFcZ3SepAQAJ
> 
> 
> Disclosure is not something that is exceptional - it is the minimum
> required of a CA. The fact that new misissued certificates continued to be
> issued throughout the process, since its inception, shows a problem. That
> Baseline Requirements dates continued to be missed is equally problematic -
> and shows a process failure at an organization that is not mature enough
> yet to operate a publicly trusted CA.
> 
> Further, the selection of your audit scheme is one that lacks sufficient
> transparency, given the issues, for the degree of trust being requested -
> both to the community and the auditor.
> 
> 
> > The auditor who performed the conformity check of our CA is a Qualified
> > Auditor as defined in Baseline Requirements section 8.2 and as required
> > by Mozilla. I believe that your recommendation to select a new auditor
> > is unacceptable. Calling into question the accreditation of the auditor
> > goes too far.
> >
> 
> If you look through these archives, you will find auditors routinely
> questioned, and some explicitly prohibited as unacceptable. Similarly,
> there are some auditors who, due to the quality of the audits relative to
> the risk overseen as part of their examination, are no longer accepted.
> 
> The reality is that for several years, you have had critical control
> failures. Under an ETSI scheme, that can result in the suspension of your
> certificate, but, unfortunately, under ETSI, it can be reinstated after
> remediation. Given that the auditor has not disclosed whether any
> suspensions took place - or whether the Tunisian Government even reported
> these material findings to their auditors - we must naturally question the
> entire sequence of audits since this root's creation.
> 
> Given that, the only potential risk reduction the community has is to start
> over, such that the point of trust begins from a known trustworthy time
> period, without there being a continuing question as to what other
> misissuance has been conducted in the past 2+ years.
> 
> 
> > > These recommendations are based on the balance of the potential global risk
> > > and the potential limited use that the existing CA is intended for.
> > I do believe that your recommendations are alarmist and lack
> > objectivity, considering the potential limited use of our national CA.
> 
> 
> Everything I stated was supported by the facts. Similarly, trust in a CA
> is global, and amounts to giving the Tunisian Government keys to the
> Internet. In particular, Section 2.1 details that CAs must provide
> "some service relevant to _typical_ users of our software products"
> (emphasis added). Given the global nature of the Internet, a CA limited to
> a single country is arguably not typical, if the certificates it issues,
> and the people who rely on them, are limited to that country. Similarly, if
> the purpose is to provide certificates that will be 'used' (relied upon) by
> only a limited number of users, then, by definition, they are not typical
> users.
> 
> As a CA, however, you surely understand the process of risk assessment and
> mitigation. For example, it's unlikely that your CA is hardened against
> solar radiation causing bit flips within critical systems. Why is this
> acceptable? Because we know the probability of such flips is low, and while
> the consequence of such a flip is high, the cost of protecting and
> shielding against such stray particles is prohibitive relative to the risk.
> Conversely, we know that jumping from 39,000 meters in the sky is
> possible - as Felix Baumgartner showed - yet the risk in doing so is
> incredibly high. One must mitigate that with ample and careful training;
> otherwise, it's foolishness to just do it.
> 
> Trusting a CA is like that. Operating a CA requires a high degree of
> competence and excellence, and each CA applying for inclusion should be at
> least as competent, at least as skilled, and at least as valuable as the
> CAs already trusted, as they otherwise bring the ecosystem down rather than
> lifting it up.

Your effort to lift up the CA ecosystem will not pay off by rejecting new CA 
applications. 
You should also consider rejecting trusted CAs that still have misissuance 
concerns despite their well-established certificate issuance processes, and 
this is a fact. You have many more renewal requests than new inclusions.

If you do have a list of unacceptable auditors, it should be clearly stated in 
Mozilla Policy so that all CAs are informed. 
Searching through the archives is not an appropriate way of obtaining 
information for a selection process as demanding as this one.

Having a fair and objective process requires applying the same acceptance or 
rejection criteria to all CAs. 
Otherwise it will be a double-standard process.
Anyway, we are looking forward to the official outcome from Mozilla, and we 
will spare no effort to be listed among Mozilla's trusted CAs.
