Re: very high speed hardware RNG

2009-01-04 Thread Ivan Krstić

On Dec 30, 2008, at 5:11 PM, Jerry Leichter wrote:
This is a highly chaotic process, and after a short time, you see a  
completely random-looking dispersion of dye through the liquid.   
Present this to someone and any likely test will say this is quite  
random.  But ... if you slowly turn the inner cylinder backwards -  
"slowly", for both directions of turn, depending on details of the  
system - the original line of dye miraculously reappears.


(Laminar flow: .)

--
Ivan Krstić  | http://radian.org

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


Re: very high speed hardware RNG

2008-12-30 Thread Jon Callas


On Dec 30, 2008, at 2:11 PM, Jerry Leichter wrote:


On Dec 30, 2008, at 4:40 PM, Jon Callas wrote:
We don't have a formal definition of what we mean by random. My  
definition is that it needs to be unguessable. If I have a random  
number, the work factor for you to guess it is more or less its  
randomness. It's a Shannonesque way of looking at things, but not  
precisely information-theoretic.
I don't think this quite captures the situation.  It's easy to give  
a formal definition of randomness; it's just not clear that such a  
thing can ever be realized.


And that is, pretty much, the point. A formal definition that offers  
no guidance to people trying to build things is mathematics as art. Not  
that art is a bad thing -- I adore mathematics as art, and my time as  
a mathematician was all in the art department. It's just not useful to  
the part of me that's an engineer. Pretty has worth, just different  
worth than useful.





Here's one definition:  A random bitstream generator is an  
"isolated" source of an infinite stream of bits, numbered starting  
at zero, with the property that a Turing machine, given as input  
anything about the universe except the internal state of the  
"isolated" source, and bits 0 to n generated by the source, has no  
better than a 50% chance of correctly guessing bit n+1.  The  
difficulty is entirely in that quoted "isolated".  It's not so much  
that we can't define it as that given any definition that captures  
the intended meaning, there are no known systems that we can  
definitely say are "isolated" in that sense.  (Well, there's kind of  
an exception:  Quantum mechanics tells us that the outcome of  
certain experiments is "random", and Bell's Theorem gives us some  
kind of notion of "isolation" by saying there are no hidden  
variables - but this is very technically complex and doesn't really  
say anything nearly so simple.)


A while back, I wrote to this list about some work toward a stronger  
notion of "computable in principle", based on results in quantum  
mechanics that limit the amount of computation - in the very basic  
sense of bit flips - that can be done in a given volume of space- 
time.  The argument is that a computation that needs more than this  
many bit flips can't reasonably be defined as possible "in  
principle" just because we can describe what such a computation  
would look like, if the universe permitted it!  One might produce a  
notion of "strongly computationally random" based on such a theory.   
Curiously, as I remarked in that message, somewhere between 128 and  
256 bits of key, a brute force search transitions from "impractical  
for the foreseeable future" to "not computable in principle".  So at  
least for brute force attacks - we're actually at the limits  
already.  Perhaps it might actually be possible to construct such a  
"random against any computation that's possible in principle" source.


A deterministic, but chaotic system that is sufficiently opaque  
gets pretty close to random. Let's just suppose that the model they  
give of photons bouncing in their laser is Newtonian. If there's  
enough going on in there, we can't model it effectively and it can  
be considered random because we can't know its outputs.
I don't like the notion of "opaqueness" in this context.  That just  
means we can't see any order that might be in there.  There's a  
classic experiment - I think Scientific American had pictures of  
this maybe 10 years back - in which you take a pair of concentric  
cylinders, fill the gap with a viscous fluid in which you draw a  
line with dye parallel to the cylinders' common axis.  Now slowly  
turn the inner cylinder, dragging the dye along.  This is a highly  
chaotic process, and after a short time, you see a completely random- 
looking dispersion of dye through the liquid.  Present this to  
someone and any likely test will say this is quite random.  But ...  
if you slowly turn the inner cylinder backwards - "slowly", for both  
directions of turn, depending on details of the system - the  
original line of dye miraculously reappears.


That's why it's not enough to have chaos, not enough to have  
opaqueness.  The last thing you want to say is "this system is so  
complicated that I can't model it, so my opponent can't model it  
either, so it's random".  To the contrary, you *want* a model that  
tells you something about *why* this system is hard to predict!


Exactly. You've described a chaotic but easily reversible system, and  
that makes it unsuitable to be random by my "unguessability" metric.  
There exists an algorithm that's faster than guessing, and therefore  
it isn't random.





However, on top of that, there's a problem that hardware people  
(especially physicists) just don't get about useful randomness,  
especially cryptographic random variables. Dylan said that to live  
outside the law, you must be honest. A cryptographic random  
variable has to look a certain way, 

Re: very high speed hardware RNG

2008-12-30 Thread Jerry Leichter

On Dec 30, 2008, at 4:40 PM, Jon Callas wrote:
We don't have a formal definition of what we mean by random. My  
definition is that it needs to be unguessable. If I have a random  
number, the work factor for you to guess it is more or less its  
randomness. It's a Shannonesque way of looking at things, but not  
precisely information-theoretic.
I don't think this quite captures the situation.  It's easy to give a  
formal definition of randomness; it's just not clear that such a thing  
can ever be realized.


Here's one definition:  A random bitstream generator is an "isolated"  
source of an infinite stream of bits, numbered starting at zero, with  
the property that a Turing machine, given as input anything about the  
universe except the internal state of the "isolated" source, and bits  
0 to n generated by the source, has no better than a 50% chance of  
correctly guessing bit n+1.  The difficulty is entirely in that quoted  
"isolated".  It's not so much that we can't define it as that given  
any definition that captures the intended meaning, there are no known  
systems that we can definitely say are "isolated" in that sense.   
(Well, there's kind of an exception:  Quantum mechanics tells us that  
the outcome of certain experiments is "random", and Bell's Theorem  
gives us some kind of notion of "isolation" by saying there are no  
hidden variables - but this is very technically complex and doesn't  
really say anything nearly so simple.)
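
One illustrative way to write that condition out, with M ranging over
Turing machines and aux standing for any side information about the
universe that excludes the source's internal state:

    for every M and every n >= 0:
        Pr[ M(aux, b_0 b_1 ... b_n) = b_{n+1} ] <= 1/2

where the probability is over the source's output (and any coins M
flips). All of the difficulty is hiding in what aux is allowed to
contain.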


A while back, I wrote to this list about some work toward a stronger  
notion of "computable in principle", based on results in quantum  
mechanics that limit the amount of computation - in the very basic  
sense of bit flips - that can be done in a given volume of space- 
time.  The argument is that a computation that needs more than this  
many bit flips can't reasonably be defined as possible "in principle"  
just because we can describe what such a computation would look like,  
if the universe permitted it!  One might produce a notion of "strongly  
computationally random" based on such a theory.  Curiously, as I  
remarked in that message, somewhere between 128 and 256 bits of key, a  
brute force search transitions from "impractical for the foreseeable  
future" to "not computable in principle".  So at least for brute force  
attacks - we're actually at the limits already.  Perhaps it might  
actually be possible to construct such a "random against any  
computation that's possible in principle" source.
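
As a rough sanity check on that 128-to-256-bit crossover, here is a
back-of-the-envelope sketch (an illustration only, not the
quantum-mechanical argument itself) that charges the Landauer minimum of
kT ln 2 joules per irreversible bit operation, at an assumed temperature
of about 3 K:

import math

k = 1.380649e-23                 # Boltzmann constant, J/K
T = 3.0                          # ~cosmic microwave background, K (assumed)
per_op = k * T * math.log(2)     # Landauer minimum, ~2.9e-23 J per bit operation

for n in (128, 192, 256):
    print(f"2^{n} operations: ~{per_op * 2**n:.2e} J")

# Prints roughly:
#   2^128 operations: ~9.77e+15 J   (large, but physically conceivable)
#   2^192 operations: ~1.80e+35 J
#   2^256 operations: ~3.32e+54 J   (the sun's entire mass-energy is ~1.8e47 J)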


A deterministic, but chaotic system that is sufficiently opaque gets  
pretty close to random. Let's just suppose that the model they give  
of photons bouncing in their laser is Newtonian. If there's enough  
going on in there, we can't model it effectively and it can be  
considered random because we can't know its outputs.
I don't like the notion of "opaqueness" in this context.  That just  
means we can't see any order that might be in there.  There's a  
classic experiment - I think Scientific American had pictures of this  
maybe 10 years back - in which you take a pair of concentric  
cylinders, fill the gap with a viscous fluid in which you draw a line  
with dye parallel to the cylinders' common axis.  Now slowly turn the  
inner cylinder, dragging the dye along.  This is a highly chaotic  
process, and after a short time, you see a completely random-looking  
dispersion of dye through the liquid.  Present this to someone and any  
likely test will say this is quite random.  But ... if you slowly turn  
the inner cylinder backwards - "slowly", for both directions of turn,  
depending on details of the system - the original line of dye  
miraculously reappears.


That's why it's not enough to have chaos, not enough to have  
opaqueness.  The last thing you want to say is "this system is so  
complicated that I can't model it, so my opponent can't model it  
either, so it's random".  To the contrary, you *want* a model that  
tells you something about *why* this system is hard to predict!


However, on top of that, there's a problem that hardware people  
(especially physicists) just don't get about useful randomness,  
especially cryptographic random variables. Dylan said that to live  
outside the law, you must be honest. A cryptographic random variable  
has to look a certain way, it has to be honest. It's got to be  
squeaky clean in many ways. A true random variable does not. A true  
random variable can decide that it'll be evenly distributed today,  
normal tomorrow, or perhaps Poisson -- the way we decide what  
restaurant to go to. No, no, not Italian; I had Italian for lunch.


That's why we cryptographers always run things through a lot of  
software. It's also why we want to see our hardware randomness, so  
we can correct for the freedom of the physical process. Imagine a  
die that is marked with a 1, four 4s, and a 5. This die is crap to  
play craps with, but we can still feed an RNG with it. We just need  
to know that it's not what it seems.

Re: very high speed hardware RNG

2008-12-30 Thread Jon Callas


The thing that bothers me about this description is the too-easy  
jump between "chaotic" and "random".  They're different concepts,  
and chaotic doesn't imply random in a cryptographic sense:  It may  
be possible to induce bias or even some degree of predictability in  
a chaotic system by manipulating its environment.  I believe there  
are also chaotic systems that are hard to predict in the forward  
direction, but easy to run backwards, at least sometimes.


That's not to say this system isn't good - it probably is - but just  
saying it's chaotic shouldn't be enough.




You are saying pretty much what I've been saying about this (and some  
other things).


We don't have a formal definition of what we mean by random. My  
definition is that it needs to be unguessable. If I have a random  
number, the work factor for you to guess it is more or less its  
randomness. It's a Shannonesque way of looking at things, but not  
precisely information-theoretic.
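
One illustrative way to pin that down is min-entropy: if an attacker's
best single guess at the value succeeds with probability p_max, then

    H_min = -log2(p_max)

and an optimal guesser needs on the order of 2^H_min attempts, so "work
factor to guess" and "bits of randomness" are two readings of the same
number.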


A deterministic, but chaotic system that is sufficiently opaque gets  
pretty close to random. Let's just suppose that the model they give of  
photons bouncing in their laser is Newtonian. If there's enough going  
on in there, we can't model it effectively and it can be considered  
random because we can't know its outputs.


However, on top of that, there's a problem that hardware people  
(especially physicists) just don't get about useful randomness,  
especially cryptographic random variables. Dylan said that to live  
outside the law, you must be honest. A cryptographic random variable  
has to look a certain way, it has to be honest. It's got to be squeaky  
clean in many ways. A true random variable does not. A true random  
variable can decide that it'll be evenly distributed today, normal  
tomorrow, or perhaps Poisson -- the way we decide what restaurant to  
go to. No, no, not Italian; I had Italian for lunch.


That's why we cryptographers always run things through a lot of  
software. It's also why we want to see our hardware randomness, so we  
can correct for the freedom of the physical process. Imagine a die  
that is marked with a 1, four 4s, and a 5. This die is crap to play  
craps with, but we can still feed an RNG with it. We just need to know  
that it's not what it seems.
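
To put illustrative numbers on that die (assuming the faces read
1, 4, 4, 4, 4, 5 and it is otherwise fair):

import math

# Loaded die from the example above: faces 1, 4, 4, 4, 4, 5, otherwise fair.
p = {1: 1/6, 4: 4/6, 5: 1/6}

shannon = -sum(q * math.log2(q) for q in p.values())   # ~1.25 bits/roll
min_ent = -math.log2(max(p.values()))                  # ~0.585 bits/roll
rolls   = math.ceil(128 / min_ent)                     # 219 rolls for 128 bits

print(shannon, min_ent, rolls)

# The source is still usable: you just credit ~0.585 bits per roll rather
# than the log2(6) ~ 2.58 bits a fair-looking die would suggest -- which is
# exactly the knowledge "that it's not what it seems".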


So yeah -- it's a glib confusion between chaotic and random, but  
chaotic enough might be good enough. And the assumption that hardware  
can just be used is bad. Hardware that helpfully whitens is worse.
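
A minimal sketch of that "run it through software" step, with SHA-256
standing in for whatever conditioning function one trusts, and the
entropy rate treated as an input that has to come from analysis of the
raw source rather than from the hardware's own claims:

import hashlib

def conditioned_seed(raw_samples: bytes, min_entropy_per_byte: float) -> bytes:
    # Credit only the externally estimated min-entropy of the raw samples,
    # and refuse to emit a seed until at least 256 bits have accumulated.
    credited = len(raw_samples) * min_entropy_per_byte
    if credited < 256:
        raise ValueError(f"only ~{credited:.0f} bits credited, need 256")
    return hashlib.sha256(raw_samples).digest()

# e.g. a source credited at 0.5 bits per raw byte needs >= 512 raw bytes
# before we hash out a 256-bit seed -- and the raw bytes stay visible for
# testing, which pre-whitened hardware output would not be.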


Jon

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


Re: very high speed hardware RNG

2008-12-30 Thread Jack Lloyd
On Tue, Dec 30, 2008 at 11:45:27AM -0500, Steven M. Bellovin wrote:

> Of course, every time a manufacturer has tried it, assorted people
> (including many on this list) complain that it's been sabotaged by the
> NSA or by alien space bats or some such.

Well, maybe it has. Or maybe it was just not competently implemented,
or perhaps it has a failure mode that was not accounted for. The
design might be perfect but the physical implementation that happens
to be in your computer has a manufacturing flaw such that if the CPU
core voltage is slightly low and the ambient temperature is above 95F,
the raw output becomes biased from a uniform distribution in a subtle
way - how do you detect something like that? I personally would not
trust the direct output of any physical hardware source for anything,
precisely because you cannot measure it, or test for failures, in any
meaningful way. That does not mean it is not a useful thing to have.
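
For what it's worth, the on-line checks that are feasible look roughly
like the sketch below (illustrative thresholds, loosely in the spirit of
the usual repetition-count and bit-frequency tests) -- and they make the
point: a stuck or grossly broken source gets caught, while a subtly
biased one sails straight through.

def health_check(bits, window=20000, max_run=34, freq_slack=300):
    # `bits` is a sequence of 0/1 integers from the raw source.
    run, prev, ones = 0, None, 0
    for i, b in enumerate(bits[:window]):
        ones += b
        run = run + 1 if b == prev else 1
        prev = b
        if run >= max_run:      # astronomically unlikely for an unbiased source
            return False, f"repetition-count failure at bit {i}"
    if abs(ones - window // 2) > freq_slack:   # ~4 sigma for an unbiased source
        return False, f"frequency failure: {ones} ones in {window} bits"
    return True, "passed (which proves very little)"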

> It's not obvious to me that you're right.  In particular, we need to
> consider how such an instruction would interact with a virtual machine
> hypervisor.  Is it a bug or a feature that the hypervisor can't
> intercept the request?  Remember that reproducibility is often a virtue.

We already have this problem with rdtsc and equivalent cycle counter
reading instructions. ISTR that some architectures allow the kernel to
forbid access to the cycle counter - if so, similar techniques could
be used for a rdrandom instruction. For those that don't, the
non-reproducibility ship has already sailed (think of rdtsc as a
rdrandom that has a very bad probability distribution).

Reproducibility is sometimes a virtue, but sometimes not. I recall
discussions last year, I believe on this list, about how to design a
PRNG that was able to safely deal with VM state rollbacks. A
user-accessible RNG instruction would easily alleviate this problem.
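
The rollback problem in one line: a purely deterministic PRNG whose state
is captured in a VM snapshot will replay the same outputs after a
restore. Folding fresh, non-snapshottable hardware bits into every
request breaks the replay -- a sketch, with os.urandom standing in for
the hypothetical user-accessible instruction:

import hashlib, os

def next_output(state: bytes) -> tuple:
    # Even if `state` is restored from an old snapshot, the fresh bits differ,
    # so the output does not repeat. os.urandom is only a stand-in here for a
    # per-core rdrandom-style instruction that the hypervisor cannot replay.
    fresh = os.urandom(16)
    block = hashlib.sha256(state + fresh).digest()
    return block[:16], block[16:]      # (output, new state)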

> The JVM could just as easily open /dev/urandom today.

Except when it doesn't exist - in which case most Java software seems
to default to things like seeding a PRNG with the timestamp, because
the other alternatives that are feasible in Java, like interleaving
counter reads among multiple threads, are slow, difficult to implement
correctly, and even more difficult to test.

-Jack

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


Re: very high speed hardware RNG

2008-12-30 Thread Steven M. Bellovin
On Sun, 28 Dec 2008 23:49:06 -0500
Jack Lloyd  wrote:

> On Sun, Dec 28, 2008 at 08:12:09PM -0500, Perry E. Metzger wrote:
> > 
> > Semiconductor laser based RNG with rates in the gigabits per second.
> > 
> > http://www.physorg.com/news148660964.html
> > 
> > My take: neat, but not as important as simply including a decent
> > hardware RNG (even a slow one) in all PC chipsets would be.

Of course, every time a manufacturer has tried it, assorted people
(including many on this list) complain that it's been sabotaged by the
NSA or by alien space bats or some such.
 
> I've been thinking that much better than a chipset addition (which is
> only accessible by the OS kernel in most environments) would be a
> simple ring-3 (or equivalent) accessible instruction that writes 32 or
> 64 bits of randomness from a per-core hardware RNG, something like
> 
> ; write 32 bits of entropy from the hardware RNG to eax register
> rdrandom %eax
> 
> Which would allow user applications to access a good hardware RNG
> directly, in addition to allowing the OS to read bits to seed the
> system PRNG (/dev/random, CryptoGenRandom, or similar)

It's not obvious to me that you're right.  In particular, we need to
consider how such an instruction would interact with a virtual machine
hypervisor.  Is it a bug or a feature that the hypervisor can't
intercept the request?  Remember that reproducibility is often a virtue.
> 
> I think the JVM in particular could benefit from such an extension, as
> the abstractions it puts into place otherwise prevent most of the
> methods one might use to gather high-quality entropy for a PRNG seed.
> 
The JVM could just as easily open /dev/urandom today.

--Steve Bellovin, http://www.cs.columbia.edu/~smb

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


Re: very high speed hardware RNG

2008-12-30 Thread Jack Lloyd
On Sun, Dec 28, 2008 at 08:12:09PM -0500, Perry E. Metzger wrote:
> 
> Semiconductor laser based RNG with rates in the gigabits per second.
> 
> http://www.physorg.com/news148660964.html
> 
> My take: neat, but not as important as simply including a decent
> hardware RNG (even a slow one) in all PC chipsets would be.

I've been thinking that much better than a chipset addition (which is
only accessible by the OS kernel in most environments) would be a
simple ring-3 (or equivalent) accessible instruction that writes 32 or
64 bits of randomness from a per-core hardware RNG, something like

; write 32 bits of entropy from the hardware RNG to eax register
rdrandom %eax

Which would allow user applications to access a good hardware RNG
directly, in addition to allowing the OS to read bits to seed the
system PRNG (/dev/random, CryptoGenRandom, or similar)

I think the JVM in particular could benefit from such an extension, as
the abstractions it puts into place otherwise prevent most of the
methods one might use to gather high-quality entropy for a PRNG seed.

-Jack

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


Re: very high speed hardware RNG

2008-12-30 Thread Jerry Leichter

On Dec 28, 2008, at 8:12 PM, Perry E. Metzger wrote:



Semiconductor laser based RNG with rates in the gigabits per second.

http://www.physorg.com/news148660964.html

My take: neat, but not as important as simply including a decent
hardware RNG (even a slow one) in all PC chipsets would be.

True.

The thing that bothers me about this description is the too-easy jump  
between "chaotic" and "random".  They're different concepts, and  
chaotic doesn't imply random in a cryptographic sense:  It may be  
possible to induce bias or even some degree of predictability in a  
chaotic system by manipulating its environment.  I believe there are  
also chaotic systems that are hard to predict in the forward  
direction, but easy to run backwards, at least sometimes.


That's not to say this system isn't good - it probably is - but just  
saying it's chaotic shouldn't be enough.


BTW, a link from this article - at least when I looked at it - went to  
http://www.physorg.com/news147698804.html : "Quantum Computing:  
Entanglement May Not Be Necessary".  There are still tons of surprises  
in the quantum computing arena.

-- Jerry





Perry
--
Perry E. Metzger    pe...@piermont.com

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com


very high speed hardware RNG

2008-12-28 Thread Perry E. Metzger

Semiconductor laser based RNG with rates in the gigabits per second.

http://www.physorg.com/news148660964.html

My take: neat, but not as important as simply including a decent
hardware RNG (even a slow one) in all PC chipsets would be.

Perry
-- 
Perry E. Metzger    pe...@piermont.com

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com