On 2002-07-27, David Wagner uttered to [EMAIL PROTECTED]:

>In particular, there is no function f which doesn't waste entropy,
>unless |Y|=1. The proof is simple enough that you can probably find it
>with a moment's contemplation, but let me know if you want to see it.
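(For concreteness, the waste is easy to exhibit exhaustively on a toy domain. The sketch below — names and framing my own — enumerates every function f from two fair bits to one output bit and measures the Shannon entropy of f(X) under uniform input: no such f preserves the full 2 bits.)

```python
from itertools import product
from math import log2

def output_entropy(f_table):
    """Shannon entropy of f(X), X uniform over the 4 two-bit inputs.
    f_table[x] is the output of f on input x."""
    counts = {}
    for x in range(4):
        y = f_table[x]
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / 4) * log2(c / 4) for c in counts.values())

# Every f: {0,1}^2 -> {0,1} is a truth table of length 4; try them all.
best = max(output_entropy(t) for t in product((0, 1), repeat=4))
print(best)  # 1.0 -- at least 1 of the 2 input bits is always lost
```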
I would be led to pigeonhole as well; still, this does seem like an eminently teachable moment to me.

However, what we're working with in the case of a typical RNG isn't functions between finite buffer-fulls of data, but functions between infinite sets of entire bitstreams, which have to be implemented within a finite memory constraint. Whatever the algorithm, it can keep state. I think that is the reasoning which leads people to believe the problem is simple. In this framework one can very well gather statistics and try to normalize the input based on what one has seen so far. (The minimalist example here is bias removal.) After that, what goes into the hash is approximately white, and even under minimal assumptions (i.e. that the hash is a one-way permutation) our RNG would seem to approach ideality.

True, such an approach will always be a race in modelling the statistics. Still, I would think the generator has the advantage if sufficient margin is left for error. (That is, hash some constant multiple of the number of bits you suspect contain the entropy, based on the best model you've got.)
--
Sampo Syreeni, aka decoy - mailto:[EMAIL PROTECTED], tel:+358-50-5756111
student/math+cs/helsinki university, http://www.iki.fi/~decoy/front
openpgp: 050985C2/025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
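(As an aside, the "minimalist example" of bias removal mentioned in the body could be the classic von Neumann extractor. A minimal sketch, assuming independent but possibly biased input bits; names are my own:)

```python
def von_neumann(bits):
    """Von Neumann extractor: read independent, possibly biased bits
    in pairs; emit 0 for the pair (0,1), 1 for (1,0), and discard the
    pairs (0,0) and (1,1). If the input bits are i.i.d. with any fixed
    bias, the surviving outputs are unbiased -- at the cost of
    throwing many input bits away."""
    out = []
    it = iter(bits)
    for a, b in zip(it, it):  # walk the stream two bits at a time
        if a != b:
            out.append(a)
    return out

print(von_neumann([0, 0, 0, 1, 1, 0, 1, 1]))  # [0, 1]
```

Note how the output rate depends on the (unknown) bias — which is exactly why one hashes a generous multiple of the suspected entropy rather than trusting any single model.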