Aloha!
John Denker wrote:
> On 09/15/2013 03:49 AM, Kent Borg wrote:
>
>> When Bruce Schneier last put his hand to designing an RNG he
>> concluded that estimating entropy is doomed. I don't think he
>> would object to some coarse order-of-magnitude confirmation that
>> there is entropy coming in [...]
John Kelsey wrote:
> I think the big problem with (b) is in quantifying the entropy you get.
Maybe don't.
When Bruce Schneier last put his hand to designing an RNG he concluded that
estimating entropy is doomed. I don't think he would object to some coarse
order-of-magnitude confirmation that there is entropy coming in, but I
think trying to meter entropy [...]
On 09/15/2013 10:19 AM, John Kelsey wrote:
But those are pretty critical things, especially (a). You need to know
whether it is yet safe to generate your high-value keypair. For that,
you don't need super precise entropy estimates, but you do need at
least a good first cut entropy estimate [...]
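A "good first cut" of the kind Kelsey describes can be as simple as the most-common-value idea from NIST SP 800-90B. A minimal sketch (the function name and sample data are mine, not from the thread, and this omits 90B's confidence bound):

```python
from collections import Counter
from math import log2

def min_entropy_per_sample(samples):
    """First-cut min-entropy estimate: -log2 of the most common
    value's observed frequency.  Deliberately pessimistic: it only
    credits the source for its least-surprising symbol."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -log2(p_max)

# A source emitting 0 and 1 equally often scores 1 bit per sample:
print(min_entropy_per_sample([0, 1] * 500))  # 1.0
```

Nobody claims this is precise; the point is that a coarse, conservative number is enough to answer "is it safe to generate the keypair yet?"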
On 15/09/13 00:38 AM, Kent Borg wrote:
On 09/14/2013 03:29 PM, John Denker wrote:
And once we have built such vaguely secure systems, why reject entropy
sources within those systems, merely because you think they look like
"squish"? If there is a random component, why toss it out?
[...]
On Sep 14, 2013, at 5:38 PM, Kent Borg wrote:
>> Things like clock skew are usually nothing but squish ... not reliably
>> predictable, but also not reliably unpredictable. I'm not interested in
>> squish, and I'm not interested in speculation about things that "might" be
>> random.
>
> I see [...]
On 09/15/2013 03:49 AM, Kent Borg wrote:
> When Bruce Schneier last put his hand to designing an RNG he
> concluded that estimating entropy is doomed. I don't think he would
> object to some coarse order-of-magnitude confirmation that there is
> entropy coming in, but I think trying to meter entropy [...]
Previously I said we need to speak more carefully about these
things. Let me start by taking my own advice:
Alas on 09/14/2013 12:29 PM, I wrote:
> a) In the linux "random" device, /any/ user can mix stuff into the
> driver's pool. This is a non-privileged operation. The idea is that
> it can't [...]
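The design idea behind that non-privileged mix can be shown with a toy hash-based pool. This is an illustration only, not the actual Linux driver (which uses an LFSR-style mixing function); class and method names are mine:

```python
import hashlib

class ToyPool:
    """Toy model of a driver entropy pool that anyone may mix into.
    Mixing is a one-way combine, so attacker-chosen input cannot
    cancel or reveal entropy already present in the state."""

    def __init__(self):
        self.state = b"\x00" * 32

    def mix(self, data: bytes) -> None:
        # One-way combine of old state and new input.
        self.state = hashlib.sha256(self.state + data).digest()

    def read(self) -> bytes:
        # Output is derived from, but does not expose, the state.
        return hashlib.sha256(b"out" + self.state).digest()
```

Because the combine is one-way, an attacker who mixes known data into a pool already holding a secret learns nothing and destroys nothing, which is why the operation can safely be unprivileged: it can't hurt, and it might help.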
On Sat, Sep 14, 2013 at 12:29 PM, John Denker wrote:
>
> This discussion will progress more smoothly and more rapidly
> if we clarify some of the concepts and terminology.
[...]
>
> Things like clock skew are usually nothing but squish ... not
> reliably predictable, but also not reliably unpredictable. [...]
On Sep 15, 2013, at 6:49 AM, Kent Borg wrote:
> John Kelsey wrote:
>> I think the big problem with (b) is in quantifying the entropy you get.
>
> Maybe don't.
>
> When Bruce Schneier last put his hand to designing an RNG he concluded that
> estimating entropy is doomed. I don't think he would object to some
> coarse order-of-magnitude confirmation that there is entropy coming
> in [...]
Let me a try a different way of stating (what I think is) Denker's point.
From docs for my RNG, at:
ftp://ftp.cs.sjtu.edu.cn:990/sandy/maxwell/
Discussing Denker's Turbid, found at:
http://www.av8n.com/turbid/paper/turbid.htm
(Quoting)
The unique advantage of Turbid is that it provably delivers [...]
At 10:04 AM 9/12/2013, John Denker wrote:
Quantum noise is the low-temperature asymptote, and thermal noise is
the high-temperature asymptote of the /same/ physical process.
So ... could we please stop talking about "radioactive" random number
generators and "quantum" random number generators?
Your first two categories are talking about the distribution of entropy--we
assume some unpredictability exists, and we want to quantify it in terms of
bits of entropy per bit of output. That's a useful distinction to make, and as
you said, if you can get even a little entropy per bit and know [...]
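The "a little entropy per bit is enough" point is just arithmetic: with h bits of entropy per raw bit, condensing (say, hashing) n/h raw bits collects n bits. A sketch, with illustrative numbers of my own choosing:

```python
from math import ceil

def raw_bits_needed(target_bits, entropy_per_bit):
    """How many raw input bits must be condensed to accumulate
    `target_bits` of entropy, when each raw bit carries only
    `entropy_per_bit` bits of entropy."""
    return ceil(target_bits / entropy_per_bit)

# Even at 0.1 bits of entropy per raw bit, a 256-bit key only
# needs 2560 raw bits -- a trivial amount of input to hash.
print(raw_bits_needed(256, 0.1))  # 2560
```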
On 09/14/2013 03:29 PM, John Denker wrote:
Things like clock skew are usually nothing but squish ... not reliably
predictable, but also not reliably unpredictable. I'm not interested
in squish, and I'm not interested in speculation about things that
"might" be random.
I see "theoretical" the [...]
This discussion will progress more smoothly and more rapidly
if we clarify some of the concepts and terminology.
There are at least four concepts on the table:
1) At one extreme, there is 100% entropy density, for example
32 bits of entropy in a 32-bit word. I'm talking about real
physics [...]
Executive summary:
The soundcard on one of my machines runs at 192000 Hz. My beat-up
old laptop runs at 96000. An antique server runs at "only" 48000.
There are two channels and several bits of entropy per sample.
That's /at least/ a hundred thousand bits per second of real
industrial-strength entropy.
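The back-of-envelope arithmetic behind that summary, with the per-sample entropy left as an assumed placeholder (Denker says only "several bits"):

```python
def entropy_rate_bits_per_s(sample_rate_hz, channels, entropy_bits_per_sample):
    """Raw entropy throughput of a soundcard noise source."""
    return sample_rate_hz * channels * entropy_bits_per_sample

# The antique 48 kHz server, credited a single conservative bit
# per sample, already clears a hundred thousand bits per second:
print(entropy_rate_bits_per_s(48000, 2, 1))   # 96000
# The 192 kHz card with a few bits per sample does far better:
print(entropy_rate_bits_per_s(192000, 2, 3))  # 1152000
```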