On Tuesday, December 29, 2015 02:10:25 pm Brian Smith wrote:
> Note that NIST Special Publication 800-133 [1] defines these separate
> terms, and I suggest we use them in this conversation to avoid confusion:
> 
> Key update: A procedure in which a new cryptographic key is computed as a
> function of the (old) cryptographic key that it will replace.
> 
> Rekey: A procedure in which a new cryptographic key is generated in a
> manner that is independent of the (old) cryptographic key that it will
> replace.
> 
> [1] http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-133.pdf

The current spec mostly uses the former term, so yes, we really should stop 
saying "rekey" given that NIST has defined it this way. If we keep things 
as-is, the single use of "rekey" in the doc should probably be changed to 
"key update".


On Tuesday, December 29, 2015 02:33:38 pm Eric Rescorla wrote:
> Note: the keys here are *not* derived from the old keys. Rather, they are 
> derived via
> a KDF from the same secret that was used to generate the old keys.

Yes, but they are derived from the same entropy as the old keys. That's not 
the same thing, granted, but they're not totally independent new keys either. 
If we're only doing this as a hack to work around ciphers with data volume 
problems (which really should just be fixed instead), then this is fine. I 
think what's giving some people pause is just the fact that we _could_ add a 
full rekey mechanism without a fantastic amount of additional effort.

If what we're really talking about is AES-GCM's data volume limits, what would 
be the best fix that just deals with precisely its issue, rather than doing a 
key update?
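To put rough numbers on the data volume question, here's a back-of-the-envelope 
sketch (my own, not from any spec) using the standard PRP/PRF distinguishing 
bound for AES, adv ~= sigma^2 / 2^128, where sigma is the number of 16-byte 
blocks encrypted under one key; the 2^-60 advantage target is an illustrative 
choice, not a mandated one:

```python
import math

def max_blocks(target_adv_log2: int = -60) -> float:
    # sigma^2 / 2^128 <= 2^target  =>  sigma <= 2^((128 + target) / 2)
    return 2 ** ((128 + target_adv_log2) / 2)

blocks = max_blocks()            # 2^34 sixteen-byte blocks per key
records = blocks / 2 ** 10       # a full 16 KiB record is 2^10 AES blocks
print(math.log2(blocks), math.log2(records))  # 34.0 24.0
```

So on these assumptions a connection could push roughly 2^24 full-size records 
(about 256 GiB) before wanting fresh keys, which is why some mechanism here 
keeps coming up at all.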

> Hmm... It seems to me that there are (at least) three separate
> possible designs:
> 
> 1. Generate the new keys by a new application of the KDF with no additional 
> inputs.
> 2. Generate the new keys by a new application of the KDF with additional 
> random inputs.
> 3. Generate the new keys by a new PK key exchange (potentially also using the 
> old
>     keys and new random inputs).
> 
> These all have different security properties and it's not clear to me which 
> buckets
> these fall into.
> 
> As far as complexity goes, 3 is more complicated than 2 which is more
> complicated than 1.
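For concreteness, design 1 could be sketched roughly like this, re-running a 
KDF (HKDF-Expand from RFC 5869) over the *same* traffic secret with a 
generation counter mixed into the info input. The label and key length here 
are illustrative placeholders, not the actual TLS 1.3 key schedule:

```python
import hmac, hashlib

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # HKDF-Expand per RFC 5869: T(n) = HMAC(PRK, T(n-1) || info || n)
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def traffic_key(secret: bytes, generation: int) -> bytes:
    # Illustrative label; the real spec would define its own encoding.
    info = b"key update " + generation.to_bytes(4, "big")
    return hkdf_expand(secret, info, 16)  # 16-byte AES-128 key

secret = bytes(32)  # placeholder traffic secret
k0 = traffic_key(secret, 0)
k1 = traffic_key(secret, 1)
assert k0 != k1  # each generation differs, yet both depend only on `secret`
```

This is exactly the property being debated above: the new keys are fresh 
outputs of the KDF, but there's no new entropy input, so compromise of 
`secret` reveals every generation at once.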

From a certain perspective, doing a full new (EC)DHE with new randoms for a 
full rekey is less of a change to the spec than a simple key update or 
something in between. It is somewhat akin to a stripped-down version of 
renegotiation that renegotiates just the ephemeral key, but not any of the 
other parameters. I think it's worth considering, but I don't know if we 
absolutely need something here at all.


Dave

_______________________________________________
TLS mailing list
TLS@ietf.org
https://www.ietf.org/mailman/listinfo/tls