Now, we have done some initial work on postquantum extensions to TLS for 
privacy; see the (now expired, soon to be refreshed) draft 
https://datatracker.ietf.org/doc/draft-ietf-tls-hybrid-design/

Might I suggest that any comments you make be in reference to that draft?  I 
don’t mind if you disagree with the draft (that’s rather the point of an IETF 
draft, to see if someone can suggest better ideas), but we have already 
discussed some of these issues in the working group.

In any case, I have some responses inline

From: TLS <tls-boun...@ietf.org> On Behalf Of Phillip Hallam-Baker
Sent: Friday, August 5, 2022 2:54 PM
To: Thom Wiggers <t...@thomwiggers.nl>
Cc: <tls@ietf.org> <tls@ietf.org>
Subject: [TLS] Before we PQC... Re: PQC key exchange sizes

Before we dive into applying the NIST algorithms to TLS 1.3 let us first 
consider our security goals and recalibrate accordingly.

I have the luxury of having 0 users. So I can completely redesign my 
architecture if needs be.

I would disagree; after all, we do have existing TLS 1.3 implementations.  I 
believe it is important to avoid unnecessary changes, for two reasons:

  *   To avoid disrupting the existing implementations (and make it easier to 
upgrade to postquantum)
  *   To keep the existing TLS 1.3 security proofs valid (after all, a large 
point of the TLS 1.3 design was to be provable; it would appear shortsighted to 
discard that)
Now, if there were a real need to rearchitect things to be postquantum, well, 
we’ll have to live with it.  However, at least for the security goal of 
privacy, I don’t see the need.

But the deeper I have got into Quantum Computer Cryptanalysis (QCC) and PQC, 
the less that appears to be necessary.

First off, let us level set. Nobody has built a QC capable of QCC, and it is 
highly unlikely anyone will in the next decade.
So what we are concerned about today is data we exchange today being 
cryptanalyzed in the future. We can argue about that if people want, but can we 
at least agree that our #1 priority should be confidentiality?
Agreed; that’s what we conclude in the draft.

So the first proposal I have is to separate our concerns into two separate 
parts with different timelines:

#1 Confidentiality, we should aim to deliver a standards based proposal by 2025.
#2 Fully QCC hardened spec before 2030.

That immediately reduces our scope to confidentiality. QCC of signature keys is 
irrelevant as far as the priority is concerned. At the very least, TLS can wait 
for the results of round 4 before diving into signatures.
Of course, none of the round 4 candidates are signature schemes :-)
On a less snide note, a ‘fully QCC hardened spec’ would most likely depend on 
postquantum certificates; we’re working on that in the lamps working group…

[This is not the case for the Mesh as I need a mechanism that enables me to 
upgrade from my legacy base to a PQC system. The WebPKI should probably give 
some thought to these concerns as well. We should probably be talking about 
deploying PQC root keys but that is not in scope for TLS.]

Second observation is that all we have at this point is the output of the NIST 
competition, and that is not a KEM. No, sorry: NIST has not approved a primitive 
to which we can pass a private key and receive that key back wrapped under the 
specified public key. What NIST actually tested was a function to which we pass 
a public key and get back a shared secret generated by the function, plus a blob 
that decrypts to that secret.

NIST did not approve 'KYBER'; at least, it has not done so yet. The only 
primitive we have at this point is what NIST actually tested. Trying to extract 
the Kyber function from that code and use it independently is not kosher for a 
standards-based protocol. The final report might well provide that function, but 
it might not, and even if it does, the commentary from the cryptographers 
strongly suggests that any use of the inner function is going to be accompanied 
by a lot of caveats.

I’m trying to figure out what you’re saying; at least one of us is confused.  
NIST asked for a Key Encapsulation Mechanism (KEM), and Kyber meets that 
definition (which is essentially what you describe; both sides get a shared 
secret).  That is the functionality that TLS needs, and is the functionality 
that NIST (and others) evaluated.  Yes, there are internal functions within 
Kyber; no one is suggesting those be used directly.  And, yes, NIST might tweak 
the precise definition of Kyber before it is formally approved; any such tweak 
would be minor (and there might not be any at all); if they do make such a 
change, it should not be difficult to modify any draft we put out to account 
for that change.
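To make the KEM interface concrete, here is a toy sketch in Python. This is a Diffie-Hellman-based stand-in, not Kyber, and the parameters are deliberately tiny and insecure; it only illustrates the keygen/encapsulate/decapsulate shape that TLS consumes:

```python
import hashlib
import secrets

# Toy KEM built from classic Diffie-Hellman over the Mersenne prime
# 2^127 - 1.  NOT Kyber and NOT secure; illustration of the API only.
P = 2**127 - 1
G = 3

def keygen():
    # Receiver's keypair: the public key goes into the ClientHello/ServerHello.
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    # Sender derives a fresh shared secret plus a ciphertext ("blob")
    # that the receiver can turn back into the same secret.
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)
    ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
    return ct, ss

def decapsulate(sk, ct):
    # Receiver recovers the identical shared secret from the ciphertext.
    return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver
```

Note that neither side ever transports a pre-existing private key; both ends simply end up holding the same fresh secret, which is exactly what the TLS key schedule needs.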

Since we won't have that report within the priority timeline, I suggest people 
look at the function NIST tested which is a non-interactive key establishment 
protocol. If you want to use Kyber instead of the NIST output, you are going to 
have to wait for the report before we can start the standards process.

Third observation is that people are looking at how to replace existing modules 
in the TLS exchange rather than asking where the most appropriate point to 
deploy PQC is. I have faced that exact same problem in the Mesh and I have a 
rather bigger problem because the Mesh is all about Threshold cryptography and 
there is no NIST threshold PQC algorithm yet.

The solution I am currently working with is to regard QCC at the same level as 
a single defection. So if Alice has established a separation of the decryption 
role between Bob and the Key Service, both have to defect (or be breached) for 
disclosure to occur. Until I get Threshold PQC, I am going to have to accept a 
situation in which the system remains secure against QCC but only if the key 
service does not defect.

Applying the same principles to TLS we actually have two key agreements in 
which we might employ PQC:

1) The primary key exchange
2) The forward secrecy / ephemeral / rekey

It was my understanding that TLS 1.3 didn’t do rekeys (renegotiation was 
removed; KeyUpdate refreshes traffic keys but performs no fresh key agreement).  
Instead, the key agreement is done only at connection establishment time, and 
could be broken into:

  *   Full negotiation (the client not having any context)
  *   Resumption (0-RTT or 1-RTT) negotiation (where the client has some secret 
data from a previous full negotiation)
Because those are both fresh negotiations, I don’t see any alternative to using 
a postquantum key exchange for both.
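For reference, the hybrid draft handles both cases the same way: the classical and postquantum shared secrets are concatenated and fed into the key schedule where the (EC)DHE secret went. A minimal sketch (placeholder 32-byte secrets standing in for the X25519 and KEM outputs):

```python
import hashlib
import hmac

def hkdf_extract(salt, ikm):
    # HKDF-Extract (RFC 5869): PRK = HMAC-Hash(salt, IKM)
    return hmac.new(salt, ikm, hashlib.sha256).digest()

# Placeholder shared secrets; in a real handshake these would come from
# the classical (EC)DH exchange and the KEM encapsulation respectively.
ecdh_secret = bytes(32)
kem_secret = bytes(32)

# Per draft-ietf-tls-hybrid-design: concatenate the shared secrets and
# use the result as the key-schedule input in place of the (EC)DHE secret.
combined = ecdh_secret + kem_secret
handshake_input = hkdf_extract(bytes(32), combined)
```

Because both secrets are extracted together, an attacker would have to break both the classical and the postquantum component to recover the traffic keys.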

Looking at most of the proposals, they seem to be dropping the PQC scheme into 
the ephemeral rekeying. That is one way to do it, but does the threat of QCC 
really justify the performance impact of doing that?
As Uri said, the main cost is the size of the key shares; for wired networks, 
that’s fairly small.  For wireless networks, it’s more annoying, but if we’re 
interested in postquantum security, I don’t see an alternative.

PQC hardening the initial key exchange should suffice provided that we fix the 
forward secrecy exchange so that it includes the entropy from the primary. This 
would bring TLS in line with Noise and with best practice. It would be a change 
to the TLS key exchange but one that corrects an oversight in the original 
forward secrecy mechanism.
If you want to suggest that we modify how TLS provides forward secrecy, well, 
that would not appear to me to depend on whether we’re using postquantum 
crypto.  Might I suggest making that a separate proposal for the working group 
to consider?
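As an aside, the TLS 1.3 key schedule already chains its stages, so later secrets depend on every earlier input. A simplified sketch of RFC 8446 section 7.1 (the Derive-Secret transcript steps are elided for brevity, and the inputs are placeholders):

```python
import hashlib
import hmac

def hkdf_extract(salt, ikm):
    # HKDF-Extract (RFC 5869): PRK = HMAC-Hash(salt, IKM)
    return hmac.new(salt, ikm, hashlib.sha256).digest()

# Simplified TLS 1.3 key schedule: each stage's output salts the
# extraction of the next stage's input, so the master secret depends
# on both the PSK (if any) and the handshake key-exchange output.
zeros = bytes(32)
psk = zeros                                # no PSK in a full handshake
shared_secret = bytes.fromhex("ab" * 32)   # placeholder key-exchange output

early_secret = hkdf_extract(zeros, psk)
handshake_secret = hkdf_extract(early_secret, shared_secret)
master_secret = hkdf_extract(handshake_secret, zeros)
```

Any change to how entropy flows between stages would be a key-schedule change, which is why it seems cleaner to raise it as its own proposal.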


PHB



_______________________________________________
TLS mailing list
TLS@ietf.org
https://www.ietf.org/mailman/listinfo/tls
