I seem to be channeling mathematicians this morning...

Cheers,
RAH

--- begin forwarded text


Status:  U
From: Somebody with a sheepskin...
To: "R. A. Hettinga" <[EMAIL PROTECTED]>
Subject: Re: Two ideas for random number generation
Date: Wed, 24 Apr 2002 08:44:41 -0600

Bob,

Tim's examples are unnecessarily complicated.

The logistic function f(x) = Ax(1-x) maps the interval [0,1] into itself for
A in the range [0,4].  Hence, for any such A, it can be iterated.

That is, one may start with an x|0 and get x|1 = f(x|0) -- where x|j means
x sub j -- and repeat: x|(n+1) = f(x|n).

For small enough values of A, the iteration provably converges to a single
value.  For slightly larger values, it converges to a pair of values that
alternate -- known as a period 2 sequence.  For still larger values of A it
converges to 4 values that come up over and over again -- a period 4
sequence.  Some of this is provable, too.

This period doubling continues, with each successive doubling arriving over
smaller and smaller changes in the parameter A.  At some point (near A = 3.57)
the period becomes infinite, and the sequence is no longer detectably
different from random.  This is an empirical fact, not yet proven so far as
I know.
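
A quick way to see this numerically -- the sketch below is mine, and the
particular values of A are illustrative choices, not anything prescribed
above:

    # Iterate the logistic map for several values of A, discard the
    # transient, and print a short tail of the settled orbit.
    def logistic(a, x):
        return a * x * (1.0 - x)

    for a in (2.8, 3.2, 3.5, 3.9):
        x = 0.3                    # arbitrary seed in (0,1)
        for _ in range(1000):      # let the transient die out
            x = logistic(a, x)
        tail = []
        for _ in range(8):         # sample the long-run behavior
            x = logistic(a, x)
            tail.append(round(x, 4))
        print(a, tail)

Run it and 2.8 should settle onto one repeated value, 3.2 onto two
alternating values, 3.5 onto four, while 3.9 shows no obvious repetition.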

Note that the function is completely deterministic.  If you know x|0 exactly,
you know x|n -- exactly.  But if you know x|0 to only finite precision, you
know very little about x|n.  Specifically, you know only that it is in the
range [0,1].
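
To make the finite-precision point concrete, here is another small sketch of
mine (A = 4.0 and the seeds are arbitrary choices): two starting values that
agree to twelve decimal places part company within a few dozen iterations.

    # Track the gap between two nearly identical seeds under x -> 4x(1-x).
    A = 4.0
    x = 0.123456789012
    y = x + 1e-12                 # same seed, perturbed in the 12th place
    for n in range(1, 61):
        x = A * x * (1.0 - x)
        y = A * y * (1.0 - y)
        if n % 10 == 0:
            print(n, abs(x - y))  # the gap grows until it is order 1

At A = 4 the gap roughly doubles per iteration on average, so after forty or
so iterations the two orbits bear no resemblance to one another.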

So.... Pick A large enough.  Pick an arbitrary double precision floating
point number in (0,1) (roughly 15-16 significant digits for 64-bit
arithmetic) on a given machine.  Pick an integer N.  Iterate the logistic
function N times on it.  Take the sequence of the 7 least significant digits.
They're probably uniformly distributed over the 7-digit integers.
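
One plausible reading of that recipe in code -- the particular A, seed,
burn-in count, and the way I slice off the low-order digits are my own
assumptions, not anything specified above:

    # Burn in N iterations, then read the 7 least significant digits of
    # each successive iterate (of its ~15 printed digits) as the output.
    A = 3.99
    x = 0.271828459045            # the secret seed (illustrative)
    N = 1000                      # burn-in iterations
    for _ in range(N):
        x = A * x * (1.0 - x)

    outputs = []
    for _ in range(5):
        x = A * x * (1.0 - x)
        low7 = f"{x:.15f}"[-7:]   # last 7 of the 15 digits printed
        outputs.append(int(low7))
    print(outputs)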

If you don't know the seed, you don't know the sequence, so I guess you can
encrypt with the thing, too.

But you can't prove squat about it!
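
For what it's worth, here is a toy illustration of the "encrypt with it"
remark.  The construction is mine and is emphatically not a serious cipher;
it just XORs the low-order digits, reduced mod 256, against the plaintext
bytes.

    # Toy stream "cipher": both sides regenerate the keystream from
    # (A, seed, burn-in), so knowing those three values breaks it.
    def keystream(a, x, burn, count):
        for _ in range(burn):
            x = a * x * (1.0 - x)
        out = []
        for _ in range(count):
            x = a * x * (1.0 - x)
            out.append(int(f"{x:.15f}"[-7:]) % 256)
        return out

    msg = b"attack at dawn"
    ks = keystream(3.99, 0.271828459045, 1000, len(msg))
    ct = bytes(m ^ k for m, k in zip(msg, ks))    # encrypt
    pt = bytes(c ^ k for c, k in zip(ct, ks))     # decrypt
    assert pt == msg

And note the catch: if the two ends' floating point arithmetic is not
bit-for-bit identical, the chaos amplifies the discrepancy and decryption
fails -- one more sense in which you can't prove squat about it.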


<Somebody>

----- Original Message -----
From: "R. A. Hettinga" <[EMAIL PROTECTED]>
To: <A math geek or two...>
Sent: Tuesday, April 23, 2002 6:16 PM
Subject: Re: Two ideas for random number generation


>
> --- begin forwarded text
>
>
> Status:  U
> Date: Tue, 23 Apr 2002 09:42:32 -0700
> Old-Subject: Re: Two ideas for random number generation
> From: Tim May <[EMAIL PROTECTED]>
> To: [EMAIL PROTECTED]
> Subject:  Re: Two ideas for random number generation
> Sender: [EMAIL PROTECTED]
>
> On Monday, April 22, 2002, at 11:23  PM, Joseph Ashwood wrote:
> >
> > From: <[EMAIL PROTECTED]>
> >> If a RNG runs off Johnson noise, then the ability to predict its
> >> output would imply the ability to violate the second law of
> >> thermodynamics.  If it runs off shot noise, then the ability to
> >> predict its output would disprove quantum mechanics.
> >
> > Actually there are models that fit the universe that are entirely
> > deterministic.
>
> Could you mention what they are?
>
> Bohm's "hidden variables" model is generally discredited (some would
> say "disproved"). Alternatives to the Copenhagen Interpretation, notably
> EWG/"many worlds," Hartle's "consistent histories," and Cramer's
> transactional model, are still not deterministic, in that the world an
> observer is in ("finds himself in") is still not predictable in advance.
> Operationally, all interpretations give the same results, i.e., the
> Uncertainty Principle. (Which is why I mentioned "hidden variables," the
> only alternative theory which _might_ have restored classical
> Lagrange/Laplace predictability, in theory.)
>
> And even if the world were Newtonian, in a classical billiard ball
> sense, with Planck's constant precisely equal to zero, predictability is
> a chimera. Consider a game of billiards, with perfectly spherical
> billiard balls, a perfectly flat table, etc. Trajectories depend on
> angles to a precision that keeps going deeper and deeper into the
> decimals. For example, predicting the table state after, say, 3 seconds,
> might require knowing positions, speeds, and angles (in other words, the
> vectors) to a precision of one part in a thousand. Doable, one might say.
>
> But after 30 seconds, any "errors" that are greater than one part in a
> billion would lead to "substantially different" table states. Fail to
> know the mass or position or elasticity or whatever of just one of the
> billiard balls to one part in a billion and the outcome is no longer
> "predictable."
>
> After a couple of minutes, the table positions are different if anything
> is not known to one part in, say, 10^50.
>
> Even talk of a "sufficiently powerful computer" is meaningless when the
> dimensions of objects must be known to better than the Planck-Wheeler
> scale (even ignoring issues of whether there's a quantum foam at these
> dimensions).
>
> I feel strongly about this issue, and have thought about it for many
> years. The whole "in principle the Universe can be calculated" was a
> foray down a doomed path WHETHER OR NOT quantum mechanics ever came out.
>
> The modern name for this outlook is "chaos theory," but I believe
> "chaos" gives almost mystical associations to something which is really
> quite understandable: divergences in decimal expansions.
>
> Discrepancies come marching in, fairly rapidly, from "out there in the
> expansion."
>
> Another way of looking at unpredictability is to say that real objects
> in real space and subject to real forces from many other real objects
> are in the world of the "real numbers" and any representation of a real
> number as a 25-digit number (diameter of the solar system to within 1
> centimeter) or even as a 100-digit number (utterly beyond all hope of
> measurement!) is just not enough.
>
> (Obscure aside: What if Wheeler and others are right in some of their
> speculations that the universe is ultimately quantized, a la "It from
> Bit"? Notwithstanding that we're now back into quantum realms and
> theories, even if the universe were quantized at Planck-scale levels,
> e.g., at 10^-33 cm levels, the gravitational pull of a distant galaxy
> would be out somewhere at the several hundredth decimal place, as a wild
> guess. The point being, even a grid-based positional system would yield
> the same "real number" issues.)
>
> In short, predictability is a physical and computational chimera: it
> does not, and cannot, exist.
>
>
> > Admittedly the current line of thinking is that entropy
> > exists, but there we still have not proven that it must exist.
>
> Just as no sequence can ever be proved to be "random," so, too, one can
> never prove that entropy "exists" except in the definition sense.
>
> Our best definition of what we mean by random is that of a thing
> (sequence, for example) with no shorter description than itself. This is
> the Kolmogorov-Chaitin definition. Much more is available on the Web
> about this, which is generally known as "algorithmic information theory."
>
> --Tim May
>
> --- end forwarded text
>
>
> --
> -----------------
> R. A. Hettinga <mailto: [EMAIL PROTECTED]>
> The Internet Bearer Underwriting Corporation <http://www.ibuc.com/>
> 44 Farquhar Street, Boston, MA 02131 USA
> "... however it may deserve respect for its usefulness and antiquity,
> [predicting the end of the world] has not been found agreeable to
> experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
>

--- end forwarded text


-- 
-----------------
R. A. Hettinga <mailto: [EMAIL PROTECTED]>
The Internet Bearer Underwriting Corporation <http://www.ibuc.com/>
44 Farquhar Street, Boston, MA 02131 USA
"... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
