Re: [cryptography] random number generator

2014-11-22 Thread James A. Donald

On 2014-11-22 03:01, d...@deadhat.com wrote:



Rather than me listing names, why not just let it rip and run your own
randomness tests on it?


Because that won't tell me if you are performing entropy extraction.

Jytter assumes an x86 machine with multiple asynchronous clocks and
nondeterministic physical devices. This is not a safe assumption. Linux
assumes entropy in interrupt timing and this was the result
https://factorable.net/weakkeys12.extended.pdf.

This falls under the third model of source in my earlier email. Your
extractor might look simple, but your system is anything but simple and
entropy extracted from rdtsc and interrupts amounts to squish.

Looking at the timing on your system and saying it looks random to me
does not cut it. Portable code has to have a way to know system timing is
random on every platform it runs on. The above paper shows that it isn't.

Jytter does something neat but the broad claims you are making and the
broader claims the Jytter web site makes do not pass the sniff test.



By and large, interrupt timing is somewhat random and, if not random, 
unknowable to the adversary.


But this is not guaranteed, and likely to be untrue if you have several 
identical systems, such as routers, which need randomness at boot up. 
All your routers are likely to wind up generating keys from a rather 
small set of possible keys.


It is extremely easy to get true randomness, or at least randomness 
unknowable to the adversary.  It is extremely hard to get true 
randomness reliably on an unknown or arbitrary system.  You really have 
to tailor your entropy collection to your situation, to your particular 
system.


128 bits of entropy is enough forever, so the big problem is startup.

A long-running system is bound to have plenty of entropy - anything more 
than 128 bits is plenty.  So if it writes a unique secret key to each 
boot-up image, and each boot-up has access to a good approximation of the 
current time, we are golden.
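A minimal sketch of that per-image secret scheme (the seed path, function name, and file handling here are hypothetical; a real deployment would write the secret at provisioning time and protect the file):

```python
import hashlib
import os
import time

def boot_entropy(seed_path, boot_time=None):
    """Derive boot-time key material from a persistent per-image secret.

    seed_path is a hypothetical per-image location; the secret is created
    once (ideally at provisioning) and reused on every boot.
    """
    try:
        with open(seed_path, "rb") as f:
            secret = f.read()
    except FileNotFoundError:
        secret = os.urandom(32)            # one-time 256-bit secret
        with open(seed_path, "wb") as f:
            f.write(secret)
    if boot_time is None:
        boot_time = time.time_ns()         # approximate current time
    # Mix the persistent secret with the time so two boots of the
    # same image still diverge.
    return hashlib.sha256(secret + boot_time.to_bytes(16, "little")).digest()
```

The point of the mix is that neither input alone needs to be secret and fresh: the secret supplies unpredictability, the clock supplies uniqueness across boots.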



___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] random number generator

2014-11-22 Thread James A. Donald

On 2014-11-22 06:31, d...@deadhat.com wrote:

OK, if you think my Jytter TRNG is weak,


I did not say it was weak. I said Jytter (and any other algorithm) is
deterministic when run on an entropy free platform. This is a simple fact.


All platforms have entropy.

If they boot from a physical disk, microturbulence creates true 
randomness in data availability.


If they are on the net, packets arrive at random times with random delays.

If they are using wifi, not only are packets arriving at random times, 
but they are also affected by random noise.


The question is, does all this entropy show up in Jytter?  I rather 
think it does.
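As a toy illustration of the kind of collection under discussion (this is not Jytter's actual algorithm, which reads the x86 rdtsc counter in assembly; the function names here are hypothetical), one can sample a high-resolution timer in a tight loop and keep the low bits of the deltas:

```python
import hashlib
import time

def jitter_samples(n=256):
    # Low bits of consecutive high-resolution timer deltas. On a live
    # system these wobble with interrupts, cache misses, DRAM refresh,
    # and the other asynchronous activity discussed above.
    out = []
    prev = time.perf_counter_ns()
    while len(out) < n:
        now = time.perf_counter_ns()
        if now != prev:
            out.append((now - prev) & 0xFF)
            prev = now
    return out

def jitter_seed():
    # Condense the raw deltas with a hash. Whether they contain real
    # entropy depends entirely on the platform, which is exactly the
    # point in dispute.
    return hashlib.sha256(bytes(jitter_samples())).digest()
```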



Re: [cryptography] random number generator

2014-11-22 Thread stef
On Sat, Nov 22, 2014 at 08:13:31PM +1000, James A. Donald wrote:
 The question is, does all this entropy show up in Jytter?  I rather think it
 does.

the question is: is your adversary nature, or human nature?

-- 
otr fp: https://www.ctrlc.hu/~stef/otr.txt


Re: [cryptography] random number generator

2014-11-22 Thread Russell Leidich
All, in the interest of clarity:

1. Let's do the math. Let's assume that we have a really dumb entropy
extractor which waits around for 128 interrupts to occur. It just sits in a
loop sampling the timestamp until this criterion is satisfied. It saves all
these time stamps to a big chunk of memory. Suppose, further, that this
system is so very quiescent (but not _perfectly_ so) that the timing of
each interrupt arrives predictably, but for an error of 1 CPU clock tick,
at random. So for instance the time between interrupts is either 10 or 11
ticks. Thus 128 interrupts gives us 128 bits of entropy. In other words,
taken in the aggregate, the timestamp stream will be worth 128 bits of
entropy. But of course we need to produce a short output in the interest
of practicality, so let's say we hash this long timestamp stream through a
cryptographically wonderful PRNG, yielding 128 bits of noise. Applying the
reflexive density constant, we expect that (1-1/e) or so of the 2^128
_theoretically_ possible hashes will be _actually_ possible. So, roughly
speaking, we drop down to 127 bits of entropy. Next, adjust for the fact
that maybe our PRNG ain't so wonderful after all because it has unseen
biases, and maybe we're down to 120 bits. Whatever. We still have a
freaking strong random number at the end of the day -- all from a very
coldbootish system.
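This arithmetic is easy to simulate. A hypothetical sketch, with SHA-256 standing in for the "cryptographically wonderful" hash and a seeded PRNG playing the role of the 1-tick jitter; it also checks the (1 - 1/e) density claim at a small scale:

```python
import hashlib
import random

def dumb_extractor(rng):
    # 128 inter-interrupt gaps of 10 or 11 ticks: one bit of entropy each.
    t, stamps = 0, []
    for _ in range(128):
        t += 10 + rng.getrandbits(1)
        stamps.append(t)
    raw = b"".join(s.to_bytes(8, "little") for s in stamps)
    return hashlib.sha256(raw).digest()[:16]        # 128-bit output

# Small-scale check of the density claim: hashing 2^n distinct inputs
# into an n-bit space covers roughly (1 - 1/e) ~ 63.2% of it.
n = 16
seen = {hashlib.sha256(i.to_bytes(4, "little")).digest()[:2]
        for i in range(1 << n)}
coverage = len(seen) / (1 << n)                     # ~0.632
```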

2. Empirically, it seems to me that entropy extractors have historically
paid too little attention to hash quality. So, for instance, we end up
doing a commutative operation on the aforementioned timestamp stream. In
that case, the _sum_ would be uncertain by something like +/-11 (root 128).
So then we somehow manage to get this code into general circulation, and no
one looks at it until stuff starts getting hacked. Finally, we end up
with a paper like the one in DJ's link, which makes some of us assume that
coldbootishness is to blame for poor entropy, when in fact it was the bias
of the timestamp stream hash. Jytter's hash, in particular, is
noncommutative and reasonably fair; there are vastly better hashes which
could be implemented, but not without touching memory in such a way as to
risk autopseudorandomness (i.e. confusing the TRNG's own pseudorandom
timing with true background system entropy).
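The damage a commutative extractor does is easy to demonstrate numerically. In this hypothetical sketch the sum plays the commutative extractor and SHA-256 the order-sensitive one:

```python
import hashlib
import random

def gaps(seed):
    # The same 10-or-11-tick interrupt gaps as before, one bit each.
    rng = random.Random(seed)
    return [10 + rng.getrandbits(1) for _ in range(128)]

trials = 2000
# Commutative extractor: order is lost, and the sum is a binomial
# clustered around 1344 with sigma ~ sqrt(128)/2, so only a few dozen
# of the 129 arithmetically possible values ever occur.
sums = {sum(gaps(i)) for i in range(trials)}
# Order-sensitive extractor: every distinct gap sequence hashes to a
# distinct output, preserving all 128 bits of jitter.
hashes = {hashlib.sha256(bytes(gaps(i))).digest() for i in range(trials)}
```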

3. World's leading experts: Stu has spoken to way more people than I have
about Jytter, but most of his conversations are covered under NDA. I only
met one of these experts personally, who said he had run the gamut of
statistical tests on Jytter and couldn't find a weakness. Then again, he
didn't do the biasing trick that I previously suggested. I would recommend
that everyone assume that no one has ever tested Jytter at all, and
therefore you need to test it yourself and demonstrate how weak it is.

Russell Leidich





Re: [cryptography] random number generator

2014-11-22 Thread Kevin

If this was already brought up, I apologize, but how about looking into 
the NIST Randomness Beacon?



--
Kevin





Re: [cryptography] random number generator

2014-11-22 Thread Sandy Harris
On Sat, Nov 22, 2014 at 11:58 PM, Russell Leidich pke...@gmail.com wrote:

 1. Let's do the math. Let's assume that we have a really dumb entropy
 extractor ... that the timing of each
 interrupt arrives predictably, but for an error of 1 CPU clock tick, at
 random. ... 128 interrupts gives us 128 bits of entropy. ...
 ... let's say we hash this long timestamp stream through a
 cryptographically wonderful PRNG, yielding 128 bits of noise. Applying the
 reflexive density constant, we expect that (1-1/e) or so of the 2^128
 _theoretically_ possible hashes will be _actually_ possible. So, roughly
 speaking, we drop down to 127 bits of entropy. Next, adjust for the fact
 that maybe our PRNG ain't so wonderful after all because it has unseen
 biases, and maybe we're down to 120 bits. Whatever. We still have a freaking
 strong random number at the end of the day -- all from a very coldbootish
 system.

John Denker's Turbid paper treats the math for this in some detail,
with explicit, fairly weak assumptions about the properties of the hash.
It shows that, given a reliable figure for minimum input entropy per
sample (in Turbid, proven, but you could use an estimate and get a
weaker result), you can get within epsilon of full output entropy by
using slightly more inputs.

in your case, hash 128+N samples to get, say, 127.99 bits of entropy
per hash output. N is small, under 20 I think.
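That convergence can be checked numerically at small scale. A hypothetical sketch, using SHA-256 as the hash and Shannon entropy of the output distribution as the measure (Turbid's actual bound is phrased in different terms, but the trend is the same):

```python
import hashlib
import math
from collections import Counter

def output_entropy(m, n):
    # Hash 2^m equally likely inputs into an n-bit output space and
    # measure the Shannon entropy of the resulting distribution.
    counts = Counter(
        int.from_bytes(hashlib.sha256(i.to_bytes(4, "little")).digest()[:4],
                       "little") % (1 << n)
        for i in range(1 << m)
    )
    total = 1 << m
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# With exactly n bits of input entropy the output falls ~0.83 bits short
# of full; each surplus input bit roughly halves the deficit, so a
# handful of extra samples gets within a hundredth of a bit of n.
deficit_0 = 12 - output_entropy(12, 12)   # n bits in: ~0.83 bits short
deficit_5 = 12 - output_entropy(17, 12)   # n+5 bits in: ~0.02 bits short
```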


Re: [cryptography] random number generator

2014-11-22 Thread Russell Leidich
in your case, hash 128+N samples to get, say, 127.99 bits of entropy per
hash output. N is small, under 20 I think.

Yeah, this is certainly inspiring with respect to milking decent entropy
from coldbootish environments. If we assume the use of a good hash, then
the problem reduces to one of asking how much entropy a sample is worth.

But this is where Pandora's box opens up: modern systems -- even mobile
phones -- are so complicated that autopseudorandomness can look very
convincingly like a TRNG. For instance, we could have predictable cache
stalls, bus stalls, pipeline stalls, etc. which interact like a decent PRNG
and render the appearance of physical entropy even in the absence of
interrupts. But we could still end up with a painfully narrow set of
possible outputs, one nonetheless too large for the deficiency to be
perceived. For instance, our 128-bit random number might be worth only 70
bits, and we likely wouldn't detect that weakness until it comes back to
bite us in the future.

Massively parallel testing is the way to go. While we can't hope to cover
any decent fraction of a 128-bit space, we _can_ smell the quality of a
TRNG from the LMICBR test with rootish numbers of samples relative to the
whole space: http://jytter.blogspot.com/2012/08/characterization.html
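Setting LMICBR itself aside, the rootish-sample idea can be sketched with a toy birthday test (hypothetical code; `weak_trng` is an invented example, not Jytter): a generator advertising 32-bit outputs whose hidden state carries only 16 bits of entropy betrays itself through collisions after only a few thousand samples:

```python
import hashlib
import random

def weak_trng(rng):
    # Advertises 32-bit outputs, but the hidden state has only 16 bits
    # of entropy, so outputs come from a set of at most 2^16 values.
    state = rng.getrandbits(16)
    return hashlib.sha256(state.to_bytes(2, "little")).digest()[:4]

rng = random.Random(0)
samples = [weak_trng(rng) for _ in range(2000)]
collisions = len(samples) - len(set(samples))
# Expect roughly 2000^2 / (2 * 2^16) ~ 30 collisions; a true 32-bit
# source would show essentially none at this sample count.
```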

Russell Leidich





Re: [cryptography] random number generator

2014-11-22 Thread James A. Donald

On 2014-11-23 09:47, Russell Leidich wrote:

in your case, hash 128+N samples to get, say, 127.99 bits of entropy
per hash output. N is small, under 20 I think.

Yeah, this is certainly inspiring with respect to milking decent entropy
from coldbootish environments. If we assume the use of a good hash,
then the problem reduces to one of asking how much entropy a sample is
worth.

But this is where Pandora's box opens up: modern systems -- even mobile
phones -- are so complicated that autopseudorandomness can look very
convincingly like a TRNG. For instance, we could have predictable cache
stalls, bus stalls, pipeline stalls, etc. which interact like a decent
PRNG and render the appearance of physical entropy even in the absence
of interrupts. But we could still end up with a painfully narrow set of
possible outputs, one nonetheless too large for the deficiency to be
perceived. For instance, our 128-bit random number might be worth only
70 bits, and we likely wouldn't detect that weakness until it comes back
to bite us in the future.


If there is any true randomness in the system, autopseudorandomness will 
mix it with everything else, and so Jytter will collect it.


But in a coldboot system, there may well be very little true randomness.

So every boot image should have its own unique 128- or 256-bit secret, 
unpredictable to an adversary.

