Why are we discussing ad hoc crypto restrictions on n & e, instead of simply 
fixing an obviously flawed method of fingerprinting a key?

--
James Manger

----- Reply message -----
From: "Nat Sakimura" <[email protected]>
Date: Wed, Feb 26, 2014 7:11 pm
Subject: [security] OIC self-issued mode is insecure
To: "Manger, James" <[email protected]>
Cc: "Mike Jones" <[email protected]>, 
"[email protected]" <[email protected]>

Hi James,

Thanks for pointing it out.

2014-02-25 16:48 GMT-08:00 Manger, James 
<[email protected]<mailto:[email protected]>>:
>> Having implementations verify that RSA key lengths are powers of two
>> seems like it could be one mitigation.
>
> I don’t think so. There are 1536-bit keys (just as historically there were
> 768-bit keys). I’m sure some will pick 3072-bit keys.
> Also, many supposedly 2048-bit keys are actually 2047-bit keys (still a product
> of two 1024-bit primes). Reject those and you will break things.
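[James's point about nominally 2048-bit keys that are really 2047 bits can be seen with plain integer arithmetic; the sketch below uses odd 1024-bit integers as stand-ins for 1024-bit primes:]

```python
# Two 1024-bit integers just above 2**1023 (stand-ins for 1024-bit primes;
# real primes in this range behave the same way for bit-length purposes)
p = 2**1023 + 1
q = 2**1023 + 3
n = p * q

print(p.bit_length(), q.bit_length())  # 1024 1024
print(n.bit_length())                  # 2047: the product of two 1024-bit
                                       # values need not reach 2048 bits
```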


ANSI X9.31 requires the key length to be a multiple of 256 bits, does it not?
In which case, would 2047-bit keys not be rejected?
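[If the multiple-of-256 rule does apply to the modulus bit length, a verifier could check it directly; a minimal sketch, assuming that reading of the requirement:]

```python
def x931_length_ok(n: int) -> bool:
    # Reject moduli whose bit length is not a multiple of 256 bits
    # (the ANSI X9.31-style rule as read above -- an assumption here)
    return n.bit_length() % 256 == 0

print(x931_length_ok(2**2047))      # True: exactly 2048 bits
print(x931_length_ok(2**2046 + 1))  # False: only 2047 bits
```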

Also, I was wondering if requiring the exponent to be selected from a small set
of candidates, or even just requiring a single value, would help.

i.e., either e={3, 5, 17, 257, 65537} or e=65537.

Some specs require e >= 65537, so just requiring e = 65537 is not unreasonable,
I think. There is also a study [1] finding that over 95% of the e values in use
are 65537. Add the fact that Windows' CAPI only accepts 65537 as e in key
generation, and I think most if not all libraries will accept 65537 as an input
value for signature verification. So in this particular case of the self-issued
provider, it would not hurt to mandate that e be 65537. Don't you think that
would help in the case of RSA?
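[A verifier-side check along these lines might look like the sketch below; the names and the strict/non-strict split are illustrative, not from any spec:]

```python
# Illustrative exponent whitelist for RSA public keys accepted by a
# self-issued OP verifier; 65537 alone is the stricter variant suggested above.
ALLOWED_EXPONENTS = {3, 5, 17, 257, 65537}

def exponent_ok(e: int, strict: bool = False) -> bool:
    """Accept e only from the small candidate set, or only 65537 if strict."""
    return e == 65537 if strict else e in ALLOWED_EXPONENTS

print(exponent_ok(65537))           # True in both modes
print(exponent_ok(3))               # True: in the candidate set
print(exponent_ok(3, strict=True))  # False: only 65537 allowed
```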

[1] https://eprint.iacr.org/2012/064.pdf
--
Nat Sakimura (=nat)
Chairman, OpenID Foundation
http://nat.sakimura.org/
@_nat_en
_______________________________________________
security mailing list
[email protected]
http://lists.openid.net/mailman/listinfo/openid-security
