> On Jan 20, 2015, at 9:59 AM, Tero Kivinen <kivi...@iki.fi> wrote:
> ...
> Also there is no such thing as running out of entropy. Entropy is not
> consumed when it is used. If you have 256 bits of randomness in the
> system, if you use it once, the attacker still needs to know all 256
> bits of randomness to break the system (we are assuming this is a
> secure crypto system). If you use it 2^256 times, the attacker still
> needs to know all 256 bits to be able to break any one of them.
> ...
Yes, “entropy used up” is an unfortunate misconception created by the authors of early /dev/random implementations.

I would put it not quite as strongly as you did, though. If the RNG uses a strong bit-stream extender to extend the entropy pool’s content, your statement holds, because the strong extender hides the content of the pool. But if the RNG uses a weak extender, then its security rests essentially on the entropy in the pool, and knowledge of the pool’s content leaks through the RNG output. I believe some early /dev/random implementations were indeed of that second form, so the “entropy used up” notion was perhaps valid back then. It isn’t any longer.

The Yarrow paper by Schneier et al. makes this analysis quite clear (and, as a result, they reach the same conclusion you did).

paul

_______________________________________________
IPsec mailing list
IPsec@ietf.org
https://www.ietf.org/mailman/listinfo/ipsec
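[Editor’s note: to make the strong/weak extender distinction above concrete, here is a minimal Python sketch. The names weak_extender and strong_extender are illustrative only and do not correspond to any real /dev/random code; the strong variant is a toy hash-based construction in the spirit of a DRBG, not a vetted design.]

```python
import hashlib

def weak_extender(pool: bytes, n: int) -> bytes:
    # Weak extender (toy): emits pool bytes directly, cycling as needed.
    # Every output byte reveals pool content, so an attacker who observes
    # enough output reconstructs the pool -- the sense in which such a
    # design really does "use up" its entropy.
    return bytes(pool[i % len(pool)] for i in range(n))

def strong_extender(pool: bytes, n: int) -> bytes:
    # Strong extender (toy): hash the pool together with a counter.
    # Each output block is a one-way function of the pool, so observing
    # outputs does not reveal the pool; the attacker still needs all
    # pool bits no matter how much output is drawn.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(pool + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]
```

With the weak extender the output literally is the pool; with the strong one, recovering the pool from output requires inverting SHA-256.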