Mike's considering an encoding that allows dealing with cofactor > 1 curves as if the cofactor were 1.
Below I try to list the costs of cofactor > 1 for common algorithms. The goal is to see how annoying the cofactor actually is, so we can weigh whether proposals like Mike's are worthwhile.

Public-key validation
---
There's no need for public-key validation with twist-secure curves and single-coordinate or compressed encodings, so there's nothing for the cofactor to affect.

DH
---
For DH it seems typical and easy to clear the cofactor (as done in Curve25519 by setting the private key to 0 mod the cofactor). This means a public key could be tampered into a few different-but-equivalent values by adding points of low order. But a similar DH tampering is possible regardless of cofactor (multiply both public keys by the same scalar).

This sort of tampering doesn't accomplish much. At worst, you could imagine Alice depositing something with Bob over a secure channel that she plans to retrieve later based on her long-term DH public key. Alice's naive assumption that Bob must be seeing the "correct" encoding of her public key could be violated by tampering.

All this is easily prevented by the common practice of hashing encoded DH public keys into the session key, or binding them with signatures or MACs.

Signatures
---
Ed25519 verification can clear the cofactor (fast batch verification) or not (fast single-signature verification). In an anonymity situation an attacker might be able to use a signature that passes the first implementation but fails the second to learn which choice was taken.

I suggested standardizing the stricter check (don't clear the cofactor) since:

 * it's the common implementation
 * batch verification seems rarely (if ever?) used
 * most verifiers won't worry about leaking this tiny bit of information

Robert suggested standardizing the other way, since clearing the cofactor has a very small impact on single-signature performance, but allows a ~2x optimization for batch verification.

I'm interested in other opinions. Also, are there real uses for batch verification?
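To make the DH cofactor-clearing above concrete, here's a small Python sketch of RFC 7748-style scalar clamping, plus the all-zero output check that catches low-order inputs. The bit operations follow RFC 7748; the function names are mine, not from any particular library.

```python
# Sketch of X25519-style cofactor clearing and output checking.
# Bit manipulations follow RFC 7748; names are illustrative only.
import os

def clamp(k: bytes) -> int:
    """RFC 7748 clamping: force the scalar to 0 mod 8 (the cofactor)."""
    b = bytearray(k)
    b[0] &= 248    # clear low 3 bits: the scalar is a multiple of 8, so any
                   # low-order component of the peer's key is multiplied
                   # away to the identity
    b[31] &= 127   # clear the top bit
    b[31] |= 64    # set bit 254, fixing the scalar's bit length
    return int.from_bytes(b, "little")

def check_output(shared: bytes) -> bytes:
    """Reject the identity: X25519 yields all zeros for low-order inputs."""
    if shared == bytes(len(shared)):
        raise ValueError("low-order public key forced the identity")
    return shared

s = clamp(os.urandom(32))
assert s % 8 == 0 and 2**254 <= s < 2**255
```

With clamping in place the output check is redundant for plain DH, which is why implementations like Curve25519's typically skip it; it only matters for the "contributory" concerns discussed below.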
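To illustrate the signature-verification divergence, here's a toy model (plain modular arithmetic, NOT real Ed25519 or a real curve): an additive group of order 8*q stands in for a cofactor-8 curve, with a Schnorr-style signature over it. A signer who sneaks a low-order component into R produces a signature that the cofactored check accepts but the strict check rejects. All parameters and names here are illustrative assumptions.

```python
# Toy model: Z_n with n = 8*q mimics a cofactor-8 curve group. Shows how
# a low-order component in R splits the two verification checks.
import hashlib

q = 1019            # "prime subgroup" order (toy-sized)
n = 8 * q           # full group order: cofactor 8
B = 8               # generates the order-q subgroup
L = q               # low-order element: 8 * L % n == 0

def H(*vals):       # toy hash into scalars mod q
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "little") % q

def sign(a, msg, sneak_low_order=False):
    A = a * B % n
    r = H("nonce", a, msg)
    R = (r * B + (L if sneak_low_order else 0)) % n
    k = H(R, A, msg)
    return A, R, (r + k * a) % q           # (pubkey, R, S)

def verify_strict(A, R, S, msg):           # S*B == R + k*A
    return S * B % n == (R + H(R, A, msg) * A) % n

def verify_cofactored(A, R, S, msg):       # 8*S*B == 8*(R + k*A)
    return 8 * S * B % n == 8 * (R + H(R, A, msg) * A) % n

A, R, S = sign(123, "hi")
assert verify_strict(A, R, S, "hi") and verify_cofactored(A, R, S, "hi")

A, R, S = sign(123, "hi", sneak_low_order=True)
assert verify_cofactored(A, R, S, "hi")    # cofactored check accepts...
assert not verify_strict(A, R, S, "hi")    # ...but the strict check rejects
```

Batch verification naturally computes the cofactored equation, which is why the two standardization choices disagree on exactly these edge-case signatures.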
"Contributory" key agreement
---
The term "contributory" pops up in discussions of 25519. The idea seems to be that some protocols are insecure if someone can force different DH shared secrets to the same value, so with 25519 you'd need to either check for low-order inputs, or check for the identity element on output.

These checks are easy, but easier still with cofactor=1, because there are no low-order inputs, so there's nothing to check.

However: do protocols that require this property actually exist? AFAIK the term "contributory" comes from Group Key Agreements, describing the calculation of a shared key from DH public keys contributed by several parties. But this doesn't seem to be either a security property or a requirement on the DH: a malicious insider to the GKA doesn't need to force the session key, she just executes the GKA and learns the session key. George Danezis has a good analysis [1].

One could argue that cofactor=1 is more robust because forcing the same key in different sessions might assist attacks if the protocol has other mistakes. That actually happened with the TLS triple handshake, so it can't be dismissed, but it wouldn't be relevant to a well-designed protocol.

Summary
---
I'm not seeing cofactor > 1 as a big deal, or worth much effort to change. It seems like cofactor=1 might be a tiny bit more robust if your protocol has other flaws, and a smidgen more anonymous if you want to batch-verify signatures.

Is that it? What am I missing?

Trevor

[1] https://conspicuouschatter.wordpress.com/2014/06/28/should-group-key-agreement-be-symmetric-and-contributory/
_______________________________________________
Curves mailing list
[email protected]
https://moderncrypto.org/mailman/listinfo/curves
