Re: Two ideas for random number generation

2002-04-29 Thread Jim Choate


On Thu, 25 Apr 2002, Ken Brown wrote:

> "Trei, Peter" wrote:
> 
> [...]
> 
> > >Exactly what is the Choatian definition of a PRNG which requires
> > >it to repeat, anyway?
> 
> Possibly confusion between 2 common English meanings of "repeat".
> 
> (1) repeatable, so if someone else runs the same algorithm on similar
> hardware with the same initial conditions they get the same results,
> which a program to calculate pi will be.
> 
> (2) repetitive, so that if you run the algorithm for long enough a
> given sequence comes round again. Which pi isn't.

Exactly, PRNG's -will- repeat in one of these two ways. RNG's will repeat,
strictly speaking, only for small k-distributions of characters. The smaller
the better.

As to the second, wrong. pi has lots of examples of repeats (visit the Pi
page and see for yourself) at different k-distribution scales. What pi
won't do is repeat the entire sequence; 314159...314159...

If it did, that would make it rational (e.g. 6 or 328328328328...328...).

Not the same thing at all.


 --


 The law is applied philosophy and a philosophical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






RE: Two ideas for random number generation

2002-04-29 Thread Jim Choate


On Thu, 25 Apr 2002, Trei, Peter wrote:

> My point, I hope it is clear, was to prove that there are deterministic
> algorithms which do not repeat.

There are, AND they are continuous and -not- based on NOT-AND-OR. I
-never- said there were not deterministic algorithms but then again those
algorithms won't run on a digital computer without losing precision which
-proves- my point, not yours.

Thanks.

> When Jim realized what a fool
> he'd made of himself, he decided to change the subject;

You're the fool, lying and then expecting me not to call you on it.

> first by  claiming this would be a pretty lousy PRNG to use for a cipher

Pi -is- a pretty lousy seed for anything. You are of course welcome to cut
your own head off any time you want any way you want. I'll be a tad more
cautious and considerate of the future and unknowns.

> (of course it is

Unhuh, keep telling yourself that...

> - my point concerned repeated sequences, not 
> making a good cipher), and then to blather about k-distribution 
> (which may be a characteristic of a good PRNG, but is irrelevant 
> to my point).

If you don't understand the importance of k-distribution to what a RNG can
do then you don't know as much as you think. You should perhaps read
Knuth's blathering on the subject, you might learn something.

> I suspect that if Jim were correct, he might actually 
> have a solution to the Halting Problem

There is no solution to the Halting Problem, silly goose. And that you
think there could be is a prime example of the level you operate at.

Peter, as usual, you're full of shit.


 --


 The law is applied philosophy and a philosophical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Two ideas for random number generation

2002-04-29 Thread Jim Choate


Comments inline...

On Wed, 24 Apr 2002, R. A. Hettinga wrote:

> I seem to be channeling mathematicians this morning...
> 
> Cheers,
> RAH
> 
> --- begin forwarded text
> 
> 
> Status:  U
> From: Somebody with a sheepskin...
> To: "R. A. Hettinga" <[EMAIL PROTECTED]>
> Subject: Re: Two ideas for random number generation
> Date: Wed, 24 Apr 2002 08:44:41 -0600
> 
> Bob,
> 
> Tim's examples are unnecessarily complicated.
> 
> The logistic function f(x) = Ax(1-x) maps the interval [0,1] into itself for
> A in the range [0,4].  Hence, for any such A, it can be iterated.
> 
> That is, one may start with an x|0 get x|1= f(x|0) -- where x|j means x sub
> j -- and repeat, thus: x|(n+1) = f(x|n).
> 
> For small enough values of A, the iteration provably converges to a single
> value. For slightly larger values, it converges to a pair of values that
> alternate every other time -- known as a period 2 sequence.  For a slightly
> larger value of A it converges to 4 values that come up over and over
> again -- a period 4 sequence.  Some of this is provable, too.

This is called 'bifurcation'...a mathematician should know that.

> This increase in multiple period states continues briefly for smaller and
> smaller changes in the parameter A.  At some point the period becomes
> infinite, and the sequence becomes not detectably different from random.
> This is an empirical fact, not yet proven so far as I know.

This person should start with Mandelbrot's book... x=(x^2)+1 is
'deterministic' as well...but there is a catch to this...

> Note that the function is completely deterministic.  If you know x exactly,
> you know x|n -- exactly.  But if you know x to only finite precision, you
> know very little about x|n.  Specifically, you know only that it is in the
> range [0,1].

You -can't-, even in -principle-, know the precision to finite levels. This
is the entire basis for chaos theory.
 
> So Pick A large enough.  Pick an arbitrary double precision floating
> point number (about 14 digits for 64 bit arithmetic) on a given machine.

The selection of a special analysis point to prove a general behaviour of
the function is a logical fallacy.

> Pick an integer N. Iterate the logistic function N times on it.  Take the
> sequence of 7 least significant digits.  They're probably uniformly
> distributed in the 7 digit integers.

Probably not, in general (there is a special name for this distribution
but I don't remember and I've got four days mail to catch up on) but the
numbers, [0...9], do -not- appear in nature in equal distribution. If you
are basing anything on this then you're in for a world of hurt.

> If you don't know the seed, you don't know the sequence, so I guess you can
> encrypt with the thing, too.

Actually that's overly general. Other keys could generate that sequence
(PRNG's repeat, and RNG's aren't predictable) so this test is really
worthless.
 
> But you can't prove squat about it!

Hardly.


 --


 The law is applied philosophy and a philosophical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Re: Two ideas for random number generation

2002-04-29 Thread Jim Choate


On Wed, 24 Apr 2002 [EMAIL PROTECTED] wrote:

> That is, to get the infinite cycle, you'd have to have some method of
> generating a uniform random integer 0 to infinity for the
> initial state, and you'd need an infinite amount of memory
> to store  the current internal state.  Neither of which
> is achievable in practice.

Actually neither of these is a requirement for RNG's. Your first comment
about 'some method' tends to shoot your argument in the foot right off.

As to the second, no you don't need infinite RAM to create a RNG. Please
demonstrate the proof. What you need is -continuity-, not -infinity-.
They're not the same thing. If all it took was infinite memory then a
standard Turing machine could, by your thesis, generate a RNG and we know
from basic principles that isn't possible. Why? Its tape is of infinite
length and zero access time.

> Conversely, a PRNG whose cycle is "only" 2^256 bits long
> will never repeat itself during the lifetime of the device, or
> the lifetime of the universe for that matter.

Depends on the key; in general the modulus of the PRNG is key dependent.
This is another weakness of PRNG's.

In addition, the vast majority of PRNG's have a modulus considerably below
2^256. Most of them I've worked with have had modulus lengths less than a
couple of million.


 --


 The law is applied philosophy and a philosophical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org





Re: Two ideas for random number generation

2002-04-29 Thread Jim Choate


On Wed, 24 Apr 2002, David Howe wrote:

> > No it isn't. You -want- a RNG but you can't have one. Nobody
> > -wants- a PRNG, they -settle- for it.
> I think there is some confusion here - if you are using a PRNG as a stream
> cypher, the last thing in the world you want is for it to be truly random -
> you need to sync up two prngs in order to decrypt the message, and
> randomness would defeat that (I can see a case where you introduce a little
> randomness and use some redundant method to strip it out before encryption,
> but that's only a second layer of obscurity of little value if the
> mainstream crypto is broken.)

You (and others I'm sure) seem to miss the point.

I am -not- saying that -all- applications -want- a RNG. Why?

Repeatability. One of the two factors that describe a PRNG.

-If- you have an application using a RNG then, unless you're willing to
break out of the NOT-AND-OR straitjacket, you're stuck with a PRNG.

You want a RNG but can't have one...

You folks can go back to sleep now.


 --


 The law is applied philosophy and a philosophical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: RE: Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-27 Thread Joseph Ashwood
- Original Message -
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Saturday, April 27, 2002 12:11 PM
Subject: CDR: RE: Re: disk encryption modes (Re: RE: Two ideas for random number generation)

> Instead of adding 16 bytes to the size of each sector for sector IV's how
> about having a separate file (which could be stored on a compact flash
> card, CDRW or other portable media) that contains the IV's for each disk
> sector?

Not a very good solution.

> You could effectively wipe the encrypted disk merely by wiping the IV
> file, which would be much faster than securely erasing the entire disk.

Actually that wouldn't work, at least not in CBC mode (which is certainly my
preference, and seems to be generally favored for disk encryption). In CBC
mode, not having the IV (setting the IV to 0) only destroys the first block;
after that everything decrypts normally, so the only wiped portion of the
sector is the first block.

> If the IV file was not available, decryption would be impossible even if
> the main encryption key was rubberhosed or otherwise leaked. This could be
> a very desirable feature for the tinfoil-hat-LINUX crowd--as long as you
> have possession of the compact flash card with the IV file, an attacker
> with your laptop isn't going to get far cracking your encryption,
> especially if you have the driver constructed to use a dummy IV file on
> the laptop somewhere after X number of failed passphrase entries to
> provide plausible deniability for the existence of the compact flash card.

And then the attacker would just get all of your data except the first block
of each sector (assuming the decryption key is found).

> To keep the IV file size reasonable, you might want to encrypt logical
> blocks (1K-8K, depending on disk size, OS, and file system used, vs 512
> bytes) instead of individual sectors, especially if the file system thinks
> in terms of blocks instead of sectors. I don't see the value of encrypting
> below the granularity of what the OS is ever going to write to disk.

That is a possibility, and actually I'm sure it's occurred to the hard drive
manufacturers that the next time they do a full overhaul of the wire protocol
they should enable larger blocks (if they haven't already; like I said
before, I'm not a hard drive person). This would serve them very well, as
they would have to store less information, increasing the disk size
producible per cost (even if not by much, every penny counts when you sell a
billion devices). Regardless, this could be useful for disk encryption, but
assuming the worst case won't lose us anything in the long run and should
let the best case be done more easily, so for the sake of simplicity, and to
satisfy the worst case, I'll keep on calling them sectors until there's a
reason not to.

Joe


RE: Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-27 Thread JonathanW





Instead of adding 16 bytes to the size of each sector for sector IV's how about having a separate file (which could be stored on a compact flash card, CDRW or other portable media) that contains the IV's for each disk sector? You could effectively wipe the encrypted disk merely by wiping the IV file, which would be much faster than securely erasing the entire disk. If the IV file was not available, decryption would be impossible even if the main encryption key was rubberhosed or otherwise leaked. This could be a very desirable feature for the tinfoil-hat-LINUX crowd--as long as you have possession of the compact flash card with the IV file, an attacker with your laptop isn't going to get far cracking your encryption, especially if you have the driver constructed to use a dummy IV file on the laptop somewhere after X number of failed passphrase entries to provide plausible deniability for the existence of the compact flash card.

To keep the IV file size reasonable, you might want to encrypt logical blocks (1K-8K, depending on disk size, OS, and file system used, vs 512 bytes) instead of individual sectors, especially if the file system thinks in terms of blocks instead of sectors. I don't see the value of encrypting below the granularity of what the OS is ever going to write to disk.




Re: Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-27 Thread Joseph Ashwood

- Original Message -
From: "Adam Back" <[EMAIL PROTECTED]>

> Joseph Ashwood wrote:
> > Actually I was referring to changing the data portion of the block
> > from {data} to {IV, data}
>
> Yes I gathered, but this what I was referring to when I said not
> possible.  The OSes have 512-byte sectors ingrained into them.  I think you'd
> have a hard time changing it.  If you _could_ change that magic
> number, that'd be a big win and make the security easy: just pick a
> new CPRNG generated IV everytime you encrypt a block.  (CPRNG based on
> SHA1 or RC4 is pretty fast, or less cryptographic could be
> sufficient depending on threat model).

From what I've seen of a few OSs there really isn't that much binding to 512
bytes in the OS per se, but the file system depends on it completely.
Regardless, the logical place IMO to change this is at the disk level, if the
drive manufacturers can be convinced to produce drives that offer 512+16
byte sectors. Once that initial break happens, all the OSs will play catchup
to support the drive, that will break the hardwiring and give us our extra
space. Of course convincing the hardware vendors to do this without a
substantial hardware reason will be extremely difficult. On our side though
is that I know that hard disks store more than just the data, they also
store a checksum, and some sector reassignment information (SCSI drives are
especially good at this, IDE does it under the hood if at all), I'm sure
there's other information, if this could be expanded by 16 bytes, that'd
supply the necessary room. Again convincing the vendors to supply this would
be a difficult task, and would require the addition of functionality to the
hard drive to either decrypt on the fly, or hand the key over to the driver.

> > Yeah the defragmentation would have to be smart, it can't simply copy the
> > disk block (with the disk block based IV) to a new location.
>
> Well with the sector level encryption, the encryption is below the
> defragmentation so file chunks get decrypted and re-encrypted as
> they're defragmented.
>
> With the file system level stuff the offset is likely logical (file
> offset etc) rather than absolute so you don't mind if the physical
> address changes.  (eg. loopback in a file, or file system APIs on
> windows).

That's true, I was thinking more as something that will for now run in
software and in the future gets pushed down to the hardware and we can use a
smartcard/USBKey/whatever comes out next to feed it the key. A
meta-filesystem would be useful as a short term measure, but it still keeps
all the keys in system memory where programs can access them, if we can
maintain the option of moving it to hardware later on, I think that would be
a better solution (although also a harder one).

I feel like I'm missing something that'll be obvious once I've found it.
Hmm, maybe there is a halfway decent solution (although not at all along the
same lines). For some reason I was just remembering SAN networks, it's a
fairly known problem to design and build secure file system protocols
(although they don't get used much). So it might actually be a simpler
concept to build a storage area network using whatever extra hardened OSs we
need, with only the BIOS being available without a smartcard, put the smart
card in, the smartcard itself decrypts/encrypts sector keys (or maybe some
larger grouping), the SAN host decrypts the rest. Pull out the smartcard,
the host can detect that, flush all caches and shut itself off. This has
some of the same problems, but at least we're not going to have to design a
hard drive, and since it's a remote file system I believe most OSs assume
very little about sector sizes. Of course as far as I'm concerned this
should still be just a stopgap measure until we can move that entire SAN
host inside the client computer.

Now for the biggest question, how do we get Joe Public to actually use this
correctly (take the smart card with them, or even not choose weak
passwords)?
Joe




Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-27 Thread Adam Back

Joseph Ashwood wrote:
> Adam Back Wrote:
> > > This becomes completely redoable (or if you're willing to sacrifice
> > > a small portion of each block you can even explicitly store the IV).
> >
> > That's typically not practical, not possible, or anyway very
> > undesirable for performance (two disk hits instead of one),
> > reliability (write one without the other and you lose data).
> 
> Actually I was referring to changing the data portion of the block
> from {data} to {IV, data}

Yes I gathered, but this what I was referring to when I said not
possible.  The OSes have 512-byte sectors ingrained into them.  I think you'd
have a hard time changing it.  If you _could_ change that magic
number, that'd be a big win and make the security easy: just pick a
new CPRNG generated IV everytime you encrypt a block.  (CPRNG based on
SHA1 or RC4 is pretty fast, or less cryptographic could be
sufficient depending on threat model).

> placing all the IVs at the head of every read. This of course will
> sacrifice k bits of the data space for little reason.

Security / space trade off with no performance hit (other than needing
to write 7% or 14% more data depending on size of IV) is probably more
desirable than having to doubly encrypt the block and take a 2x cpu
overhead hit.  However as I mentioned I don't think it's practical /
possible due to OS design.

> > Note in the file system level scenario an additional problem is file
> > system journaling, and on-the-fly disk defragmentation -- this can
> > result in the file system intentionally leaving copies of previous or
> > the same plaintexts encrypted with the same key and logical position
> > within a file.
> 
> Yeah the defragmentation would have to be smart, it can't simply copy the
disk block (with the disk block based IV) to a new location. 

Well with the sector level encryption, the encryption is below the
defragmentation so file chunks get decrypted and re-encrypted as
they're defragmented.

With the file system level stuff the offset is likely logical (file
offset etc) rather than absolute so you don't mind if the physical
address changes.  (eg. loopback in a file, or file system APIs on
windows).

> > Another approach was Paul Crowley's Mercy cipher which has a 4Kbit
> > block size (= 512 bytes = sector sized).  But it's a new cipher and I
> > think already had some problems, though performance is much better
> > than eg AES with double CBC, and it means you can use ECB mode per
> > block and key derived with a key-derivation function salted by the
> > block-number (the cipher includes such a concept directly in its
> > key-schedule), or CBC mode with an IV derived from the block number
> > and only one block, so you don't get the low-tide mark of edits you
> > get with CBC.
> 
> It's worse than that, there's actually an attack on the cipher. Paul details
> this fairly well on his page about Mercy.

Yes, that's what I was referring to by "already had some problems".

Adam
--
http://www.cypherspace.org/adam/



Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-27 Thread Joseph Ashwood

- Original Message -
From: "Adam Back" <[EMAIL PROTECTED]>

> On Fri, Apr 26, 2002 at 11:48:11AM -0700, Joseph Ashwood wrote:
> > From: "Bill Stewart" <[EMAIL PROTECTED]>
> > > I've been thinking about a somewhat different but related problem lately,
> > > which is encrypted disk drives.  You could encrypt each block of the disk
> > > with a block cypher using the same key (presumably in CBC or some similar
> > > mode), but that just feels weak.
> >
> > Why does it feel weak? CBC is provably as secure as the block cipher (when
> > used properly), and a disk drive is really no different from many others. Of
> > course you have to perform various gyrations to synchronise everything
> > correctly, but it's doable.
>
> The weakness is not catastrophic, but depending on your threat model
> the attacker may see the ciphertexts from multiple versions of the
> plaintext in the edit, save cycle.

That could be a problem, you pointed out more information in your other
message, but obviously this would have to be dealt with somehow. I was going
to suggest that maybe it would be better to encrypt at the file level, but
this can very often leak more information, and depending on how you do it,
will leak directory structure. There has to be a better solution.

> > Well it's not all that complicated. Take that same key, and encrypt the disk
> > block number, or address or anything else.
>
> Performance is often at a premium in disk driver software --
> everything moving to-and-from the disk goes through these drivers.
>
> Encrypt could be slow, encrypt for IV is probably overkill.  IV
> doesn't have to be unique, just different, or relatively random
> depending on the mode.
>
> The performance hit for computing IV depends on the driver type.
>
> Where the driver is encrypting a disk block at a time, then say 512 bytes
> (the standard smallest disk block size) divided into AES block sized
> chunks of 16 bytes each is 32 encrypts per IV generation.  So if IV
> generation is done with a block encrypt itself that'll slow the system
> down by 3.125% right there.
>
> If the driver is higher level, using file-system APIs etc, it may have
> to encrypt 1 cipher block at a time, each with a different IV; if you use
> encrypt to derive IVs in this scenario, it'll be a 100% slowdown
> (encryption will take twice as long).

That is a good point, of course we could just use the old standby solution,
throw hardware at it. The hardware encrypts at disk (or even disk cache)
speed on the drive, eliminating all issues of this type. Not a particularly
cost-effective solution in many cases, but a reasonable option for others.

> > This becomes completely redoable (or if you're willing to sacrifice
> > a small portion of each block you can even explicitly store the IV).
>
> That's typically not practical, not possible, or anyway very
> undesirable for performance (two disk hits instead of one),
> reliability (write one without the other and you lose data).

Actually I was referring to changing the data portion of the block from
{data}
to
{IV, data}

placing all the IVs at the head of every read. This of course will sacrifice
k bits of the data space for little reason.

> > > I've been thinking that Counter Mode AES sounds good, since it's easy
> > > to find the key for a specific block.   Would it be good enough just to use
> > >  Hash(Hash(Key, block#))
> > > or some similar function instead of a more conventional crypto
function?
> >
> > Not really, you'd have to change the key every time you write to
> > disk; not exactly a good idea, it makes key distribution a
> > nightmare. Stick with CBC for disk encryption.
>
> CBC isn't ideal as described above.  Output feedback modes like OFB
> and CTR are even worse, as you can't reuse the IV, or an attacker who
> is able to see a previous disk image gets the XOR of two plaintext versions.
>
> You could encrypt twice (CBC in each direction or something), but that
> will again slow you down by a factor of 2.
>
> Note in the file system level scenario an additional problem is file
> system journaling, and on-the-fly disk defragmentation -- this can
> result in the file system intentionally leaving copies of previous or
> the same plaintexts encrypted with the same key and logical position
> within a file.

Yeah the defragmentation would have to be smart, it can't simply copy the
disk block (with the disk block based IV) to a new location. This problem
disappears in the {IV, data} block type, but that has other problems that
are at least as substantial.

> So it's "easy" if performance is not an issue.

Or if you decide to throw hardware at it.

> Another approach was Paul Crowley's Mercy cipher which has a 4Kbit
> block size (= 512 bytes = sector sized).  But it's a new cipher and I
> think already had some problems, though performance is much better
> than eg AES with double CBC, and it means you can use ECB mode per
> block and key derived with a key-derivation function salted by the
> block-number (the ciph

Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-26 Thread Adam Back

Right, it sounds like the same approach I alluded to, except I didn't
use a salt -- I just used a fast pseudo-random number generator to
make the IV less structured than using the block number directly.

I did some experiments with a used disk and found that if you use the
block number directly for the IV, with CBC mode the block number and
plaintext difference cancel to result in the same input text to the
block cipher, resulting in the same ciphertext in a fair proportion of
cases (don't have the figures handy, but clearly this
not-insignificant number of collisions represents a leakage about the
plaintext on the disk).
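To make the cancellation concrete, here is a minimal Python sketch (the
SHA-256-based function below is only a stand-in for the block cipher, and the
sector numbers and plaintexts are made-up values for illustration). If two
sectors' first plaintext blocks differ by exactly the XOR of their sector
numbers, a raw sector-number IV makes the CBC inputs, and therefore the first
ciphertext blocks, identical:

    import hashlib

    def E(key, block):
        # stand-in for a 16-byte block cipher (not a real cipher)
        return hashlib.sha256(key + block).digest()[:16]

    def cbc_first_block(key, iv, p):
        # CBC: C1 = E_K(P1 XOR IV)
        return E(key, bytes(a ^ b for a, b in zip(p, iv)))

    key  = b"sixteen byte key"
    iv_a = (1000).to_bytes(16, "big")   # IV = sector number, as warned against
    iv_b = (1001).to_bytes(16, "big")
    p_a  = b"AAAAAAAAAAAAAAA\x00"       # plaintext blocks whose XOR equals
    p_b  = b"AAAAAAAAAAAAAAA\x01"       # the XOR of the two sector numbers

    print(cbc_first_block(key, iv_a, p_a) ==
          cbc_first_block(key, iv_b, p_b))   # True: equal ciphertext blocks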

With the aforementioned fast pseudo-random number generator I got no
collisions on the disk sample size (10Gig disk almost full of windows
application software and data).

I figure that's good empirical evidence of the soundness of the
approach, however another glitch may be if you consider that the
attacker can work partly from the inside -- eg influencing the
plaintext choice, as well as having read-only access to the ciphertext
-- in this case he could perhaps build up a partial codebook for the
cipher with the disk key, by influencing plaintext choices to create
values equal to the suspected differences between plaintext and
predictable IVs.

How do you salt the random number generator?  Is it resistant to the
above type of attack do you think?

Adam

On Sat, Apr 27, 2002 at 11:19:04AM +1000, Julian Assange wrote:
> > You could encrypt twice (CBC in each direction or something), but that
> > will again slow you down by a factor of 2.
> 
> You can't easily get away with storing the IV, as multiple parts of the IO
> pipe like to see blocks in 2^n form. 
> 
> The approach I take in Rubberhose is to calculate the IV from a
> very fast checksum routine and salt.



Re: disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-26 Thread Julian Assange

> You could encrypt twice (CBC in each direction or something), but that
> will again slow you down by a factor of 2.

You can't easily get away with storing the IV, as multiple parts of the IO
pipe like to see blocks in 2^n form. 

The approach I take in Rubberhose is to calculate the IV from a
very fast checksum routine and salt.
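A rough sketch of that shape of construction in Python (this is not
Rubberhose's actual routine; the checksum, salt handling, and sizes are
placeholder choices for illustration):

    import zlib

    def sector_iv(salt, sector_number, iv_len=16):
        # IV = very fast checksum over (per-volume salt, sector number),
        # stretched out to the cipher's block size. Repeatable, cheap,
        # and less structured than the raw sector number.
        out = b""
        i = 0
        while len(out) < iv_len:
            raw = salt + sector_number.to_bytes(8, "big") + bytes([i])
            out += zlib.crc32(raw).to_bytes(4, "big")
            i += 1
        return out[:iv_len]

    print(sector_iv(b"per-volume secret salt", 12345).hex())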

--
 Julian Assange|If you want to build a ship, don't drum up people
   |together to collect wood or assign them tasks and
 [EMAIL PROTECTED]  |work, but rather teach them to long for the endless
 [EMAIL PROTECTED]  |immensity of the sea. -- Antoine de Saint Exupery



disk encryption modes (Re: RE: Two ideas for random number generation)

2002-04-26 Thread Adam Back

On Fri, Apr 26, 2002 at 11:48:11AM -0700, Joseph Ashwood wrote:
> From: "Bill Stewart" <[EMAIL PROTECTED]>
> > I've been thinking about a somewhat different but related problem lately,
> > which is encrypted disk drives.  You could encrypt each block of the disk
> > with a block cypher using the same key (presumably in CBC or some similar
> > mode), but that just feels weak.
> 
> Why does it feel weak? CBC is provably as secure as the block cipher (when
> used properly), and a disk drive is really no different from many others. Of
> course you have to perform various gyrations to synchronise everything

The weakness is not catastrophic, but depending on your threat model
the attacker may see the ciphertexts from multiple versions of the
plaintext in the edit, save cycle.

This could happen for example if the attacker has read access to your
disk, or if the attacker gained temporary physical access to the
machine but didn't have enough resources to install software trojan,
or if good disk checksumming is in place, didn't have enough resources
to install hardware trojan.

> > So you need some kind of generator of pretty-random-looking keys so
> > that each block of the disk gets a different key, or at the very
> > least a different IV for each block of the disk, so in some sense
> > that's a PRNG.  (You definitely need a different key for each block
> > if you're using RC4, but that's only usable for Write-Once media,
> > i.e. boring.)  Obviously you need repeatability, so you can't use a
> > real random number generator.
> 
> Well it's not all that complicated. Take that same key, and encrypt the disk
> block number, or address or anything else. 

Performance is often at a premium in disk driver software --
everything moving to-and-from the disk goes through these drivers.

Encrypt could be slow, encrypt for IV is probably overkill.  IV
doesn't have to be unique, just different, or relatively random
depending on the mode.

The performance hit for computing IV depends on the driver type.

Where the driver is encrypting a disk block at a time, then say 512 bytes
(the standard smallest disk block size) divided into AES block sized
chunks of 16 bytes each is 32 encrypts per IV generation.  So if IV
generation is done with a block encrypt itself that'll slow the system
down by 3.125% right there.

If the driver is higher level, using file-system APIs etc, it may have
to encrypt 1 cipher block at a time, each with a different IV; if you use
encrypt to derive IVs in this scenario, it'll be a 100% slowdown
(encryption will take twice as long).

> This becomes completely redoable (or if you're willing to sacrifice
> a small portion of each block you can even explicitly store the IV).

That's typically not practical, not possible, or anyway very
undesirable for performance (two disk hits instead of one),
reliability (write one without the other and you lose data).

> > I've been thinking that Counter Mode AES sounds good, since it's easy
> > to find the key for a specific block.   Would it be good enough just to use
> >  Hash(Hash(Key, block#))
> > or some similar function instead of a more conventional crypto function?
> 
> Not really, you'd have to change the key every time you write to
> disk; not exactly a good idea, it makes key distribution a
> nightmare. Stick with CBC for disk encryption.

CBC isn't ideal as described above.  Output feedback modes like OFB
and CTR are even worse, as you can't reuse the IV, or an attacker who
is able to see a previous disk image gets the XOR of two plaintext versions.
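The keystream-reuse leak takes only a few lines to see (the keystream
function below is a hash-based stand-in for CTR/OFB output; the point is the
XOR algebra, not the particular cipher, and the key and sector contents are
made up):

    import hashlib

    def keystream(key, sector, length):
        # stand-in for a CTR-style keystream keyed by (key, sector number)
        out, ctr = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + sector.to_bytes(8, "big") +
                                  ctr.to_bytes(8, "big")).digest()
            ctr += 1
        return out[:length]

    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    key = b"disk key"
    old = b"old contents of the sector......"
    new = b"new contents of the sector......"

    ks = keystream(key, 7, len(old))          # same sector => keystream reused
    c_old, c_new = xor(old, ks), xor(new, ks)

    # an attacker with both disk images never touches the key, but:
    print(xor(c_old, c_new) == xor(old, new))  # True: XOR of plaintexts leaks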

You could encrypt twice (CBC in each direction or something), but that
will again slow you down by a factor of 2.

Note in the file system level scenario an additional problem is file
system journaling, and on-the-fly disk defragmentation -- this can
result in the file system intentionally leaving copies of previous or
the same plaintexts encrypted with the same key and logical position
within a file.

So it's "easy" if performance is not an issue.

Another approach was Paul Crowley's Mercy cipher which has a 4Kbit
block size (= 512 bytes = sector sized).  But it's a new cipher and I
think already had some problems, though performance is much better
than eg AES with double CBC, and it means you can use ECB mode per
block and key derived with a key-derivation function salted by the
block-number (the cipher includes such a concept directly in its
key-schedule), or CBC mode with an IV derived from the block number
and only one block, so you don't get the low-tide mark of edits you
get with CBC.

But Mercy as a set of design criteria is very interesting for this
application.

Adam
--
http://www.cypherspace.org/adam/




Re: RE: Two ideas for random number generation

2002-04-26 Thread Joseph Ashwood

- Original Message -
From: "Bill Stewart" <[EMAIL PROTECTED]>

> I've been thinking about a somewhat different but related problem lately,
> which is encrypted disk drives.  You could encrypt each block of the disk
> with a block cypher using the same key (presumably in CBC or some similar
> mode),
> but that just feels weak.

Why does it feel weak? CBC is provably as secure as the block cipher (when
used properly), and a disk drive is really no different from many others. Of
course you have to perform various gyrations to synchronise everything
correctly, but it's doable.

> So you need some kind of generator of
> pretty-random-looking keys so that each block of the disk gets a different key,
> or at the very least a different IV for each block of the disk,
> so in some sense that's a PRNG.  (You definitely need a different key for each
> block if you're using RC4, but that's only usable for Write-Once media,
> i.e. boring.)
> Obviously you need repeatability, so you can't use a real random number
> generator.

Well it's not all that complicated. Take that same key, and encrypt the disk
block number, or address or anything else. This becomes completely redoable
(or if you're willing to sacrifice a small portion of each block you can
even explicitly store the IV).
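In rough Python (assuming the third-party pycryptodome package for AES; the
key and sector size here are arbitrary choices for illustration), the
"encrypt the block number for the IV" idea looks like this:

    from Crypto.Cipher import AES   # pip install pycryptodome

    SECTOR = 512                    # assumed smallest disk block size

    def encrypt_sector(key, sector_number, plaintext):
        # IV derived by encrypting the sector number with the same key,
        # so it is completely redoable from the sector number alone ...
        iv = AES.new(key, AES.MODE_ECB).encrypt(sector_number.to_bytes(16, "big"))
        # ... then ordinary CBC over the 512-byte sector with that IV.
        return AES.new(key, AES.MODE_CBC, iv).encrypt(plaintext)

    def decrypt_sector(key, sector_number, ciphertext):
        iv = AES.new(key, AES.MODE_ECB).encrypt(sector_number.to_bytes(16, "big"))
        return AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext)

    k = bytes(16)                   # demo key only
    ct = encrypt_sector(k, 42, b"\x00" * SECTOR)
    assert decrypt_sector(k, 42, ct) == b"\x00" * SECTOR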

> I've been thinking that Counter Mode AES sounds good, since it's easy
> to find the key for a specific block.   Would it be good enough just to use
>  Hash(Hash(Key, block#))
> or some similar function instead of a more conventional crypto function?

Not really, you'd have to change the key every time you write to disk; not
exactly a good idea, it makes key distribution a nightmare. Stick with CBC
for disk encryption.
Joe




Re: Two ideas for random number generation

2002-04-25 Thread Ken Brown

"Trei, Peter" wrote:

[...]

> >Exactly what is the Choatian definition of a PRNG which requires
> >it to repeat, anyway?

Possibly confusion between 2 common English meanings of "repeat".

(1) repeatable, so if someone else runs the same algorithm on similar
hardware with the same initial conditions they get the same results,
which a program to calculate pi will be.

(2) repetitive, so that if you run the algorithm for long enough a
given sequence comes round again. Which pi isn't.

Of course pi will have "repeats", in the trivial sense that any given
sequence will occur many times in a sufficiently long rendition. But
there is no cycle there, just randomness. The decimal digits "12" will
occur many times, and "123" will occur rather fewer times, and so on,
but it is not repetitive in this sense because seeing "1" followed by
"2" gives you no clue that "3" comes next. And seeing
"9863770323499906322" gives you no clue that next time you see 
"986377032349990632"  you will get "2" next.  (I think - I make no
pretence to know much about maths)

Ken




Re: Two ideas for random number generation

2002-04-25 Thread Tim May

On Thursday, April 25, 2002, at 07:45  AM, Major Variola (ret) wrote:

> At 09:42 AM 4/23/02 -0700, Tim May wrote:
>>
>> And even if the world were Newtonian, in a classical billiard ball
>> sense, with Planck's constant precisely equal to zero, predictability
> is
>> a chimera. Consider a game of billiards, with perfectly spherical
>> billiard balls, a perfectly flat table, etc. Trajectories depend on
>> angles to a precision that keeps going deeper and deeper into the
>> decimals. For example, predicting the table state after, say, 3
> seconds,
>> might require knowing positions, speeds, and angles (in other words,
> the
>> vectors) to a precision of one part in a thousand. Doable, one might
> say.
>
> Predictability gets much worse if one of the walls of a pool-table is
> curved,
> then the uncertainty in a perfectly-round ball's momentum is
> magnified after reflection, compared to a pool-table of 3 or more
> flat walls.

Yes, of course. There are many sources of divergence, and curved walls 
certainly add divergence. But, as you acknowledge, the curvature of the 
spherical balls is a source. In fact, the radius of curvature of a ball 
is much smaller than that of curved side walls, so of course they are 
huge sources of divergence.

And it's important for people not to think that the curved walls or 
curved balls are important to the phenomenon: if all surfaces were 
nominally flat (say, to a sixteenth wavelength, about the best a 
telescope mirror is ground to), the divergences would _still_ occur. 
Tiny alterations in temperature would affect dimensions, friction, 
speeds, and hence would alter arrival times. At some point, objects in 
one history would bounce and in another history would miss...the changes 
at this point are _huge_.

(I actually did a project at Intel which exploited this. I devised a 
scheme whereby "known good" microprocessors would be imaged by an 
electron microscope, state by state (using "beam blanking" to only 
illuminate the chip during a specific state), and then would be 
digitally subtracted or otherwise compared to the internal states of 
chips  having some speed problem, or some voltage problem, or just plain 
failing. By examining the time evolution of divergent states, especially 
by running the history backwards, we could pinpoint the "first 
divergence," the first state where voltage levels differed noticeably. 
This was usually a place where a design needed to be tweaked, or where a 
marginality was present, or a flaw, etc. This machine, the Dynamic Fault 
Imager, was used to get speeds up on the 80286 and later processors.)


>
> You may have meant to imply this --if spherical balls
> hit other balls the uncertainty is similarly magnified-- but its worth
> noting the difference in predictability between flat and curved-wall
> abstract billiards.

And it's also useful for people to understand, deeply, that the 
divergences may "enter" at the fourth decimal place, or the sixth, or 
the twelfth. But they enter, and enter quickly, and the cascade of 
divergences doesn't much involve issues of flat vs. curved walls. The 
divergences are at a much more profound level.

BTW, someone was speculating about "history healing itself" (this is my 
term for it, a familiar trope in SF). A look at the billiard ball model 
will show how this cannot conceivably happen: as soon as one ball 
"misses" another, all of the later trajectories are radically different. 
If this is not immediately clear, spend several minutes drawing pictures.

If the writer was talking about "conserved quantities" (I think he 
mentioned vapors in an elevator, for example), this is not at all what 
we are talking about. Yes, the total energy of the billiard balls will 
remain roughly the same in both histories, though divergences will occur 
even in total energy, just more slowly. Yes, the earth's overall climate 
is not dramatically affected by butterflies.

But the point here is that the positions and velocities of the billiard 
balls are "unpredictable" after some time t, where t is probably on the 
order of tens of seconds, even if Planck's constant were zero and there 
were no quantum effects whatsoever.


--Tim May
"To those who scare peace-loving people with phantoms of lost liberty, 
my message is this: Your tactics only aid terrorists."  --John Ashcroft, 
U.S. Attorney General




RE: Two ideas for random number generation

2002-04-25 Thread Trei, Peter

> Sandy Harris[SMTP:[EMAIL PROTECTED]]
> 
> Jim Choate wrote:
> 
> > > PRNG output is fixed/repeatable too - that is a properly you *want*
> from a
> > > PRNG.
> > 
> > No it isn't. You -want- a RNG but you can't have one. Nobody -wants- a
> > PRNG, they -settle- for it. 
> 
> That is nearly true for crypto applications, but it certainly isn't for
> some others. e.g. If you're debugging simulation software, you may need
> to be able to make the PRNG produce repeatable output by giving it the
> same seed on every run.
> 
> For crypto, it absolutely clear that you need a true RNG for some
> things,
> if only seeding and re-seeding a PRNG, and that using a PRNG introduces
> one more thing that could contain dangerous weaknesses.
> 
> Given a well-designed PRNG, though, it is not clear that there's any
> real benefit to using a true RNG instead. If you're generating 128-bit
> session keys, there is no practical difference between using the true
> RNG directly and using a good PRNG with, say, 256-bit key.
> 
Here we've sort of come full circle. My first post mentioning a pi-based
PRNG was to call out Jim's nonsense.

In his post of Monday, April 22, 6:38 PM, Jim had written:
>On Mon, 22 Apr 2002, Trei, Peter wrote:

>> The defining difference between the two is that if you know the
>> algorithm and seed, the output of a PRNG can be reproduced,
>> at a different time, place. or both. There are circumstances in 
>> which this is very much a desired quality.
>
>Actually you left something out, the PRNG by definition must have a
>modulus of repetition. At some point it starts the sequence over.
>
>In general, this is -never- a desired quality and is the primary
>distinction between the cost-utility of PRNG's versus RNG's.

Peter (that's me) responded on  April 23, 10:29 AM
>As usual, Jim is wrong. There are deterministic systems which never
>repeat. For example, there is an algorithm which will give you the
>nth digit of pi. If I use this as my PRNG (one way I could seed it would
>be to use key to pick a starting point) how long does Jim think it will run
>before it repeats??

>Exactly what is the Choatian definition of a PRNG which requires
>it to repeat, anyway?

---
My point, I hope it is clear, was to prove that there are deterministic
algorithms which do not repeat. When Jim realized what a fool
he'd made of himself, he decided to change the subject; first by 
claiming this would be a pretty lousy PRNG to use for a cipher
(of course it is - my point concerned repeated sequences, not 
making a good cipher), and then to blather about k-distribution 
(which may be a characteristic of a good PRNG, but is irrelevant 
to my point). I suspect that if Jim were correct, he might actually 
have a solution to the Halting Problem

Of particular humor is his repeated insistence that anywhere one
might use a PRNG, a RNG would be better. Jim, try implementing
SSL with a true RNG instead of RC4. The ciphertext may be quite 
secure, but it's not very useful.

Peter Trei











Re: Two ideas for random number generation

2002-04-25 Thread Ben Laurie

"Major Variola (ret)" wrote:
> There is a fascinating demo-photograph that shows that reflections off
> 4 stacked steel balls form a classical fractal.

"Topology in chaotic scattering" - DAVID SWEET, EDWARD OTT & JAMES A.
YORKE 

http://www.nature.com/cgi-taf/DynaPage.taf?file=/nature/journal/v399/n6734/abs/399315a0_fs.html

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff




Re: Two ideas for random number generation

2002-04-25 Thread David Howe

<[EMAIL PROTECTED]> wrote:
> On 24 Apr 2002 at 17:41, David Howe wrote:
> > its probably a better (if much slower) stream cypher than most currently in
> > use; I can't think of any that have larger than a 256-bit internal state, and
> > that implies a 2^256 step cycle at best; for pi to be worse, it would have
> > to have less than 2^256 digits.
> This is putting silliness on top of silliness.  It's true that in principle
> the decimal expansion of pi has an infinite number of digits,
> but any practical implementation of a PRNG based on pi
> would still have to have a finite number of accessible states.
Indeed my point (the mentioned hardware implementation limitations) -
however, you don't need an infinite pi - a prng based on a subset that has
2^257 bits of the sequence has by definition a longer cycle time than a
256-bit state prng.

> Conversely, a PRNG whose cycle is "only" 2^256 bits long
> will never repeat itself during the lifetime of the device, or
> the lifetime of the universe for that matter.
which is why a subset is sufficient.





Re: Two ideas for random number generation

2002-04-25 Thread Major Variola (ret)

At 09:42 AM 4/23/02 -0700, Tim May wrote:
>
>And even if the world were Newtonian, in a classical billiard ball
>sense, with Planck's constant precisely equal to zero, predictability is
>a chimera. Consider a game of billiards, with perfectly spherical
>billiard balls, a perfectly flat table, etc. Trajectories depend on
>angles to a precision that keeps going deeper and deeper into the
>decimals. For example, predicting the table state after, say, 3 seconds,
>might require knowing positions, speeds, and angles (in other words, the
>vectors) to a precision of one part in a thousand. Doable, one might say.

Predictability gets much worse if one of the walls of a pool-table is
curved,
then the uncertainty in a perfectly-round ball's momentum is
magnified after reflection, compared to a pool-table of 3 or more
flat walls.

You may have meant to imply this --if spherical balls
hit other balls the uncertainty is similarly magnified-- but its worth
noting the difference in predictability between flat and curved-wall
abstract billiards.

There is a fascinating demo-photograph that shows that reflections off
4 stacked steel balls form a classical fractal.

>But after 30 seconds, any "errors" that are greater than one part in a
>billion would lead to "substantially different" table states. Fail to
>know the mass or position or elasticity or whatever of just one of the
>billiard balls to one part in a billion and the outcome is no longer
>"predictable."


Exactly.  This is why some of us severe quantum skeptics still accept
atomic-level generated uncertainty (resistor, junction, radioactive)
and the entropy harvested therefrom.


---
Lorenz's weather-sim was deterministic, but screwing up a small decimal
fraction as he retyped something totally hosed his expected results --
that's the concept.




Re: Two ideas for random number generation

2002-04-24 Thread georgemw

On 24 Apr 2002 at 17:41, David Howe wrote:

> > Maybe for you, I sure as hell wouldn't use it either as a key or as a
> > seed into a known hashing/whitening algorithm.
> its probably a better (if much slower) stream cypher than most currently in
> use; I can't think of any that have larger than a 256-bit internal state, and
> that implies a 2^256 step cycle at best; for pi to be worse, it would have
> to have less than 2^256 digits.
> 


This is putting silliness on top of silliness.  It's true that in principle
the decimal expansion of pi has an infinite number of digits,
but any practical implementation of a PRNG based on pi
would still have to have a finite number of accessible states.

That is, to get the infinite cycle, you'd have to have some method of
generating a uniform random integer 0 to infinity for the
initial state, and you'd need an infinite amount of memory
to store  the current internal state.  Neither of which
is achievable in practice.

Conversely, a PRNG whose cycle is "only" 2^256 bits long
will never repeat itself during the lifetime of the device, or
the lifetime of the universe for that matter.
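The arithmetic behind that claim is easy to check (the output rate and the
age-of-universe figure below are rough, generous assumptions):

    # how long a 2^256-long cycle lasts at an absurdly generous output rate
    outputs_per_second = 10**12                  # a trillion outputs per second
    seconds_per_year = 60 * 60 * 24 * 365
    age_of_universe_years = 14e9                 # rough figure

    years_to_cycle = 2**256 / (outputs_per_second * seconds_per_year)
    print("%.1e years to exhaust the cycle" % years_to_cycle)            # ~3.7e57
    print("%.1e universe lifetimes" % (years_to_cycle / age_of_universe_years))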

George




Re: Two ideas for random number generation

2002-04-24 Thread R. A. Hettinga

I seem to be channeling mathematicians this morning...

Cheers,
RAH

--- begin forwarded text


Status:  U
From: Somebody with a sheepskin...
To: "R. A. Hettinga" <[EMAIL PROTECTED]>
Subject: Re: Two ideas for random number generation
Date: Wed, 24 Apr 2002 08:44:41 -0600

Bob,

Tim's examples are unnecessarily complicated.

The logistic function f(x) = Ax(1-x) maps the interval [0,1] into itself for
A in the range [0,4].  Hence, for any such A, it can be iterated.

That is, one may start with an x|0 get x|1= f(x|0) -- where x|j means x sub
j -- and repeat, thus: x|(n+1) = f(x|n).

For small enough values of A, the iteration provably converges to a single
value. For slightly larger values, it converges to a pair of values that
alternate every other time -- known as a period 2 sequence.  For a slightly
larger value of A it converges to 4 values that come up over and over
again -- a period 4 sequence.  Some of this is provable, too.

This increase in multiple period states continues briefly for smaller and
smaller changes in the parameter A.  At some point the period becomes
infinite, and the sequence becomes not detectably different from random.
This is an empirical fact, not yet proven so far as I know.

Note that the function is completely deterministic.  If you know x exactly,
you know x|n -- exactly.  But if you know x to only finite precision, you
know very little about x|n.  Specifically, you know only that it is in the
range [0,1].

So Pick A large enough.  Pick an arbitrary double precision floating
point number (about 14 digits for 64 bit arithmetic) on a given machine.
Pick an integer N. Iterate the logistic function N times on it.  Take the
sequence of 7 least significant digits.  They're probably uniformly
distributed in the 7 digit integers.
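A minimal sketch of that recipe in Python (the choice of A = 3.99, the
warm-up count, and the digit-extraction details are arbitrary illustrative
choices, not part of the original description):

    def logistic_stream(seed, a=3.99, warmup=1000):
        # iterate f(x) = A*x*(1-x), then emit the 7 least significant
        # decimal digits of each iterate's 14-digit expansion
        x = seed
        for _ in range(warmup):
            x = a * x * (1 - x)
        while True:
            x = a * x * (1 - x)
            yield int(("%.14f" % x)[-7:])

    gen = logistic_stream(0.123456789012345)
    print([next(gen) for _ in range(5)])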

If you don't know the seed, you don't know the sequence, so I guess you can
encrypt with the thing, too.

But you can't prove squat about it!




- Original Message -
From: "R. A. Hettinga" <[EMAIL PROTECTED]>
To: 
Sent: Tuesday, April 23, 2002 6:16 PM
Subject: Re: Two ideas for random number generation


>
> --- begin forwarded text
>
>
> Status:  U
> Date: Tue, 23 Apr 2002 09:42:32 -0700
> Old-Subject: Re: Two ideas for random number generation
> From: Tim May <[EMAIL PROTECTED]>
> To: [EMAIL PROTECTED]
> Subject:  Re: Two ideas for random number generation
> Sender: [EMAIL PROTECTED]
>
> On Monday, April 22, 2002, at 11:23  PM, Joseph Ashwood wrote:
> >
> > From: <[EMAIL PROTECTED]>
> >> If a RNG runs off Johnson noise, then the ability to predict its
> >> output would imply the ability to violate the second law of
> >> thermodynamics.  If it runs off shot noise, then the ability to
> >> predict its output would disprove quantum mechanics.
> >
> > Actually there are models that fit the universe that are entirely
> > deterministic.
>
> Could you mention what they are?
>
> Boehm's "hidden variables" model is generally discredited (some would
> say "disproved"). Alternatives to the Copenhagen Interpretation, notably
> EWG/"many worlds," Hartle's "consistent histories," and Cramer's
> transactional model, are still not deterministic, in that the world an
> observer is in ("finds himself in") is still not predictable in advance.
> Operationally, all interpretations give the same results, i.e., the
> Uncertainty Principle. (Which is why I mentioned "hidden variables," the
> only alternative theory which _might_ have restored classical
> Lagrange/Laplace predictability, in theory.)
>
> And even if the world were Newtonian, in a classical billiard ball
> sense, with Planck's constant precisely equal to zero, predictability is
> a chimera. Consider a game of billiards, with perfectly spherical
> billiard balls, a perfectly flat table, etc. Trajectories depend on
> angles to a precision that keeps going deeper and deeper into the
> decimals. For example, predicting the table state after, say, 3 seconds,
> might require knowing positions, speeds, and angles (in other words, the
> vectors) to a precision of one part in a thousand. Doable, one might say.
>
> But after 30 seconds, any "errors" that are greater than one part in a
> billion would lead to "substantially different" table states. Fail to
> know the mass or position or elasticity or whatever of just one of the
> billiard balls to one part in a billion and the outcome is no longer
> "predictable."
>
> After a couple of minutes, the table positions are different if anything
> is not known to one part in, say, 10^50.
>
> Even talk of a "sufficiently powerful computer" is meaningless when the
> dimensions o

Re: Two ideas for random number generation

2002-04-24 Thread Riad S. Wahby

Optimizzin Al-gorithym <[EMAIL PROTECTED]> wrote:
> You can also use common guard structures to isolate the "HV" part of
> the chip, without dicking with the Delicate Recipes (process) which
> you Don't Want To Do And Probably Wouldn't Be Allowed To Anyway.
> Also helps keep digital switching noise out of the source.

True, but if the customer wants low supply voltages, you're screwed.

Consulting a process guide I happen to have handy (it's a pretty
recent fast bipolar process), emitter-base breakdown is quoted at 2.6V
nominal, so I guess from anything greater than 2.6V + Vcesat supplies
(maybe 3V, certainly 3.3V, 5V), you're all set.  Of course, if you're
willing to sacrifice some current protection, put resistors on your
wafer, and reduce PSRR, you could go as low as 2.7V for the cost of
one 10 ohm resistor.

> Actually, we're interested...

I don't have any detailed data on it handy, but there are some
indications in the same process guide that emitter-base breakdown
carries with it some pretty serious hot-carrier problems.  I believe
that over time this effect drops off (after the worst of it has had
its run), so the transistor won't continually degrade.  Furthermore,
most problems associated with hot-carrier effects won't concern us if
the transistor is being run only in reverse breakdown---you'll lose
some beta, BVeb might go down a little, etc.  Thus, you'll almost
certainly want to sacrifice the headroom and use a mirror to drive the
emitter instead of a resistor (else your breakdown will increase over
time, which will probably make the degradation more severe), but you
should be OK.

If I ever get the opportunity, I'll do some emitter-base burn-in
testing and see how the entropy of the output is affected by
hot-carrier-induced transistor degradation.

> man, you don't want to have had too much coffee trying to land the
> probes..  ..looking at analogue measurements with spectral analyzers
> and sampled data with statistical tools)

I've done some probing with way too much coffee.  The worst was
probing MEMS devices that way.  Man, oh man.  :-)

--
Riad Wahby
[EMAIL PROTECTED]
MIT VI-2/A 2002




Re: Two ideas for random number generation

2002-04-24 Thread David Howe

> No it isn't. You -want- a RNG but you can't have one. Nobody
> -wants- a PRNG, they -settle- for it.
I think there is some confusion here - if you are using a PRNG as a stream
cypher, the last thing in the world you want is for it to be truly random -
you need to sync up two prngs in order to decrypt the message, and
randomness would defeat that (I can see a case where you introduce a little
randomness and use some redundant method to strip it out before encryption,
but that's only a second layer of obscurity of little value if the
mainstream crypto is broken.)

> What one wants is a bit sequence which is
> -random-.
if you want random numbers, get a rng - a prng is never going to be a rng,
and everyone knows it. given you are using a prng in any case, does it
matter if the prng sequence being used happens to be a sequence of pi, or
any other fixed sequence?

> > any subset of the digits of pi is as close to RNG output as you would
> > need to satisfy any entropy tests - unless you *knew* you had derived it
> > from pi you couldn't distinguish it from a true random string of the
same
> > size.
> Satisfying an -entropy test- is -not- equivalent to -being- a RNG. It only
> says that within a particular error margin you're -close enogh-.
indeed so. but if someone has said a prng is truely a rng, I must have
missed it.

> Really? The offset into the sequence is a fixed width and the result is
> alaways a single character. Where do you add a bit?
what makes you think the offset is a fixed width? pi is of infinite length
(or so I am told) so any offset is also at least potentially of infinite
size. speed and physical construction constraints limit that, but not enough
to fit your claims of it being easily defeatable.

> > the single-digit-of-pi formula is too slow to form a good stream cypher,
but
> > is otherwise ok;
> Maybe for you, I sure as hell wouldn't use it either as a key or as a
> seed into a known hashing/whiting algorithm.
its probably a better (if much slower) stream cypher than most currently in
use; I can't think of any that have larger than a 256 internal state, and
that implies a 2^256 step cycle at best; for pi to be worse, it would have
to have less than 2^256 digits.

> Let me ask you a more pointy question. Are you selecting some offset and
> then taking the sequence of digits from pi, or are you selecting the
> digits out of order? In either of these cases it isn't the sequence of pi
> that is providing the randomness (which is apparently the claim) but
> rather the selection process; which is both undescribed at this point
> -and- simply moves the argument from one area to another - this -proving-
> nothing.
no, you are using a subset of a pseudo-random stream of infinite length;
there is little benefit in selecting digits at random, if you are relying on
the pseudorandomness of the stream itself.  I am at a loss to see what you
are driving at, so am forced to assume we are considering radically
different cases.




Re: Two ideas for random number generation

2002-04-24 Thread Optimizzin Al-gorithym

At 11:55 AM 4/24/02 +0300, Sampo Syreeni wrote:
>On Tue, 23 Apr 2002, Riad S. Wahby wrote:
>
>>This may take more voltage than you want to use in your process, but
you
>>can engineer the base-emitter junction if you've got a friend in
process
>>engineering.

You can also use common guard structures to isolate the "HV" part of the

chip, without dicking with the Delicate Recipes (process) which you
Don't
Want To Do And Probably Wouldn't Be Allowed To Anyway.  Also helps
keep digital switching noise out of the source.


>Aren't there dedicated avalanche diodes available with low breakdown
>voltages, precisely for this reason? I think they're used in
applications
>where zeners could be, except for higher breakdown current.
>
>>One other potential problem is long-term reliability, but that's a
>>subject for another email.

Actually, we're interested...


>Shouldn't be a problem, if you limit the breakdown current. If you're
>after entropy, you'd likely want to use a constant current source
anyway.

And constant-current sources are *sooo* tough to make out of transistors
:-)

My small point is confirming that junction/avalanche RNG sources are
very compatible with standard CMOS fabrication.  (I've actually probed
test
structures on production wafers in a stuffy metal room examining this...

man, you don't want to have had too much coffee trying to land the
probes..
..looking at analogue measurements with spectral analyzers and sampled
data with
statistical tools)

The junction structures are "louder" than resistors --they produce more
entropy per watt.
Intel may have had other, valid reasons for using resistive sources in
its real RNG.




Re: Two ideas for random number generation

2002-04-24 Thread Sunder


On Tue, 23 Apr 2002 [EMAIL PROTECTED] wrote:

> --
> Jim Choate wrote:
> > > > If you can't develop a RNG in software (ie you'd be in a
> > > > state of sin), what makes you think you can do it using
> > > > -only- digital gates in hardware? You can't.
> 
> James A. Donald:
> > > Classic Choatian physics.
> > >
> > > Of course you can.
> 
> Jim Choate:
> > Not if you use -only- digital gates and derived functions (eg
> > flip flop or a shift register)
> 
> One can build a true random generator using a two resistors, a
> capacitor, three unbuffered inverters and a shift register.  A
> second shift register and an XOR gate will serve to quite
> adequately whiten the output. 

Yeah, the shit for brains will probably tell you that resistors and
capacitors aren't digital gates.

In which case you can just use a bunch of flip-flop that feed back to
themselves at different rates, so you get different streams of 1's and
0's, and xor'em all together.  Inefficient as compared to using analog
components, but it would work if the timing between them is different
enough and you only get data from them sporadically.  Lots of whitening to
do afterwards of course...




Re: Two ideas for random number generation

2002-04-24 Thread Ben Laurie

Tim May wrote:
> 
> On Monday, April 22, 2002, at 11:23  PM, Joseph Ashwood wrote:
> >
> > From: <[EMAIL PROTECTED]>
> >> If a RNG runs off Johnson noise, then the ability to predict its
> >> output would imply the ability to violate the second law of
> >> thermodynamics.  If it runs off shot noise, then the ability to
> >> predict its output would disprove quantum mechanics.
> >
> > Actually there are models that fit the universe that are entirely
> > deterministic.
> 
> Could you mention what they are?
> 
> Boehm's "hidden variables" model is generally discredited (some would
> say "disproved").

As I understand it, Bell's inequality definitively cannot be explained
by hidden variables, hence the whole action-at-a-distance thing.

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff




Re: Two ideas for random number generation

2002-04-24 Thread Sandy Harris

Jim Choate wrote:

> > PRNG output is fixed/repeatable too - that is a properly you *want* from a
> > PRNG.
> 
> No it isn't. You -want- a RNG but you can't have one. Nobody -wants- a
> PRNG, they -settle- for it. 

That is nearly true for crypto applications, but it certainly isn't for
some others. e.g. If you're debugging simulation software, you may need
to be able to make the PRNG produce repeatable output by giving it the
same seed on every run.

For crypto, it absolutely clear that you need a true RNG for some
things,
if only seeding and re-seeding a PRNG, and that using a PRNG introduces
one more thing that could contain dangerous weaknesses.

Given a well-designed PRNG, though, it is not clear that there's any
real benefit to using a true RNG instead. If you're generating 128-bit
session keys, there is no practical difference between using the true
RNG directly and using a good PRNG with, say, 256-bit key.




Re: Two ideas for random number generation

2002-04-24 Thread Riad S. Wahby

Sampo Syreeni <[EMAIL PROTECTED]> wrote:
> Aren't there dedicated avalanche diodes available with low breakdown
> voltages, precisely for this reason? I think they're used in applications
> where zeners could be, except for higher breakdown current.

Sure.  I was thinking of an IC design, in which case you're a lot more
likely to be able to make a bipolar than you are to have essentially
<5V zeners.

>> [mention of reliability issues]
> Shouldn't be a problem, if you limit the breakdown current. If you're
> after entropy, you'd likely want to use a constant current source anyway.

To first order, yes, but at least a couple of the processes I've
worked with warn against even controlled emitter-base breakdown.  Of
course, I suppose they're assuming you want to use the transistor some
other way, too...

--
Riad Wahby
[EMAIL PROTECTED]
MIT VI-2/A 2002




Re: Two ideas for random number generation

2002-04-24 Thread Jim Choate


On Wed, 24 Apr 2002, David Howe wrote:

> "Jim Choate" <[EMAIL PROTECTED]> wrote:
> > But that changes the game in the middle of play, the sequence of digits
> > in pi is fixed, not random. You can't get a random number from a constant.
> > Otherwise it wouldn't be a constant.

> PRNG output is fixed/repeatable too - that is a properly you *want* from a
> PRNG.

No it isn't. You -want- a RNG but you can't have one. Nobody -wants- a
PRNG, they -settle- for it. What one wants is a bit sequence which is
-random-. There are many definitions of random, but they boil down to
-unpredictable- outside of chance with respect to predicting individual
bit results as well as sequences of bits (they are not the same
statistically speaking, re probability distributions).

Ideally what one would want is a situation where each bit has a 50/50
chance of being in either state and there are -no- inter-bit dependencies.
That implies no modulo (though it doesn't prevent clustering which can
fool you if you don't test your sub-strings well enough - re Gamblers
Ruin).

Which raises an interesting aspect for me, what happens if you put a PRNG
into a 'Garden of Eden' state?

> any subset of the digits of pi is as close to RNG output as you would
> need to satisfy any entropy tests - unless you *knew* you had derived it
> from pi you couldn't distinguish it from a true random string of the same
> size.

Satisfying an -entropy test- is -not- equivalent to -being- a RNG. It only
says that within a particular error margin you're -close enogh-.
 
> > You can't stop them from using their tables. Slow them down, not stop
> > them. You can't use that huge a seed, hardware limitations. They can match
> > you.

> *shrug* given that adding a bit to the seed doubles the quantity of data
> they would have to cache in their tables, it can quickly become unworkable;

Really? The offset into the sequence is a fixed width and the result is
alaways a single character. Where do you add a bit?

> the single-digit-of-pi formula is too slow to form a good stream cypher, but
> is otherwise ok;

Maybe for you, I sure as hell wouldn't use it either as a key or as a
seed into a known hashing/whiting algorithm.

Let me ask you a more pointy question. Are you selecting some offset and
then taking the sequence of digits from pi, or are you selecting the
digits out of order? In either of these cases it isn't the sequence of pi
that is providing the randomness (which is apparently the claim) but
rather the selection process; which is both undescribed at this point
-and- simply moves the argument from one area to another - this -proving-
nothing.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org





Re: Two ideas for random number generation

2002-04-24 Thread David Howe

"Jim Choate" <[EMAIL PROTECTED]> wrote:
> But that changes the game in the middle of play, the sequence of digits
> in pi is fixed, not random. You can't get a random number from a constant.
> Otherwise it wouldn't be a constant.
PRNG output is fixed/repeatable too - that is a properly you *want* from a
PRNG.  any subset of the digits of pi is as close to RNG output as you would
need to satisfy any entropy tests - unless you *knew* you had derived it
from pi you couldn't distinguish it from a true random string of the same
size.

> You can't stop them from using their tables. Slow them down, not stop
> them. You can't use that huge a seed, hardware limitations. They can match
> you.
*shrug* given that adding a bit to the seed doubles the quantity of data
they would have to cache in their tables, it can quickly become unworkable;
the single-digit-of-pi formula is too slow to form a good stream cypher, but
is otherwise ok; if you aren't constrained to matching a real world sequence
(pi in this case) but are happy with *any* non-repeating but deterministic
stream, you can probably find something much faster.






Re: Two ideas for random number generation

2002-04-24 Thread Sampo Syreeni

On Tue, 23 Apr 2002, Riad S. Wahby wrote:

>This may take more voltage than you want to use in your process, but you
>can engineer the base-emitter junction if you've got a friend in process
>engineering.

Aren't there dedicated avalanche diodes available with low breakdown
voltages, precisely for this reason? I think they're used in applications
where zeners could be, except for higher breakdown current.

>One other potential problem is long-term reliability, but that's a
>subject for another email.

Shouldn't be a problem, if you limit the breakdown current. If you're
after entropy, you'd likely want to use a constant current source anyway.

Sampo Syreeni, aka decoy - mailto:[EMAIL PROTECTED], tel:+358-50-5756111
student/math+cs/helsinki university, http://www.iki.fi/~decoy/front
openpgp: 050985C2/025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2




Re: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Tue, 23 Apr 2002, Riad S. Wahby wrote:

> Another nice way to get an RNG is Avalanche breakdown.

I like using radiation on diodes myself. Reverse bias them and then
amplify the noise. Use a Schmitt Trigger. Use one for each bit.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Tue, 23 Apr 2002 [EMAIL PROTECTED] wrote:

> One can build a true random generator using a two resistors, a

A resistor isn't a Boolean gate.

Go back to sleep.

I'm still working on your Chomsky page. I don't think you'll be happy.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: CDR: Re: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Tue, 23 Apr 2002, Alan Braggins wrote:

> > > Actually you left something out, the PRNG by definition must have a
> > > modulus of repetition. At some point it starts the sequence over.
> > >
> > As usual, Jim is wrong. There are deterministic systems which never
> > repeat. For example, there is an algorithm which will give you the
> > nth digit of pi.
> 
> http://www.mathsoft.com/asolve/plouffe/plouffe.html
> 
> 
> >  If I use this as my PRNG (one way I could seed it would
> > be to use key to pick a starting point)
> 
> Or just mix pi with a more conventional PRNG, which will avoid you having
> to use huge seeds to stop your output stream matching anyone's pregenerated
> lookup tables of lots of digits of pi.

But that changes the game in the middle of play, the sequence of digits
in pi is fixed, not random. You can't get a random number from a constant.
Otherwise it wouldn't be a constant.

You can't stop them from using their tables. Slow them down, not stop
them. You can't use that huge a seed, hardware limitations. They can match
you.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org





RE: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Tue, 23 Apr 2002, Trei, Peter wrote:

> Exactly what is the Choatian definition of a PRNG which requires
> it to repeat, anyway?

Wrong question, the -right- questions is...

What is -random-?

It means unpredictable, this means unrepeatable. If it repeats then it
-must- be predictable; that means it ain't -random-. A PRNG isn't
unpredictable, only taking such a long time that by the time the repeat
gets there you don't care. The length of that sequence is called the
modulus of the generator. -ALL- PRNG's will repeat, if they didn't they
wouldn't be -pseudo-.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






RE: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Tue, 23 Apr 2002, Trei, Peter wrote:

> As usual, Jim is wrong. There are deterministic systems which never
> repeat. For example, there is an algorithm which will give you the
> nth digit of pi.

Ok. The distribution of a single digit is -not- the same as pi itself...

> If I use this as my PRNG

The -only- thing in question there is the index of the digit you started
with. From there the sequence is ABSOLUTELY deterministic. It's a basic
select and compare problem at that point. pi may be irrational, but its
value is constant. So is the sequence of digits that your algorithm
generates, irrespective of the starting digit -provided- we can ever find
an overlap in the two sequences (your original and my search). There's
been some pretty nifty string manipulation algorithms in the last few
years coupled with some of the new parallel and distributed systems

Not a very good PRNG, not even as a seed source to a hash function.

Look. Software is nothing more than instructions to -general purpose
computing HARDWARE-. The question is what does that mean when we talk
about -hardware-? The hardware that my computer runs on is binary. And it
uses some derivation of at least two of three basic functions (ie NOT,
AND, OR). If I can't write it in software which is simply commands to
specific meshes of these basic gates (assuming of course the two meshes
can be made comparable - cost v complexity) execute specific sequences of
operations on specific bits of data. Now we take out the abstraction layer
that the -general purpose- throws in there. We take that program and build
it using the -exact- same sort of logical gate meshes. What changed?

Nothing. That's the -universal- in 'Universal Turing Machine'. What is the
-primary- dependency of the UTM? That it is -binary-.

-If- you want to do a -real- RNG then it must have some -continous-
component and include some sort of feedback(forward) dependency.

The strength of hardware comes from the fact that you can do things that
aren't representable in some other languages (eg Boolean Algebra), A/D <>
D/A conversion, dependencies on manufacturing tolerances, etc.). Those
come from the -analog- characteristic of the -representation- of the logic
in hardware. It's a -hidden- variable, so to speak.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org







complexity theory and information warfare (was: Re: Two ideas for random number generation)

2002-04-23 Thread Faustine

-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

Tim wrote:

>The modern name for this outlook is "chaos theory," but I believe 
>"chaos" gives almost mystical associations to something which is really 
>quite understandable: divergences in decimal expansions.

>Discrepancies come marching in, fairly rapidly, from "out there in the 
>expansion."

>Another way of looking at unpredictabality is to say that real objects
>in real space and subject to real forces from many other real objects 
>are in the world of the "real numbers" and any representation of a real
>number as a 25-digit number (diameter of the solar system to within 1 
>centimeter) or even as a 100-digit number (utterly beyond all hope of 
>meaurement!) is just not enough.

>(snip) In short, predictability is a physical and computational chimera: it 
>does not, and cannot, exist.


Fascinating post on a fascinating subject, but since I'm too short of time
for the kind of reply it deserves, here's a minor aside for anyone interested
in developing practical applications of complexity theory on cypherpunk
themes: you might find some of the works listed here relevant and
useful...

Complexity, Global Politics and National Security

Complexity And Chaos:
A Working Bibliography

School of Information Warfare and Strategy
National Defense University
Washington, D.C.

http://www.ndu.edu/ndu/inss/books/complexity/bibliogr.html


The fact that NDU is putting so much stock in R&D in these areas
as part of their information warfare efforts is interesting in its
own right. Even if it ultimately proves to be nothing more than 
a dead-end bunch of hooey, libertarians of all persuasions ought to
at least be aware of the kinds of research going on, where analysts 
are trying to take this field. Especially those with a background
like yours who are in the perfect position to make a real 
open-literature counter-contribution someday if the alchemists
of predictability ever do come across their philosopher's stone.

Improbable--crazy, even--but when did that ever stop a mathematician:

"All stable processes we shall predict. All unstable processes we
shall control." --John von Neumann.

This is what's ultimately at stake. Fascinating, terrifying.
The only way to counter math is with better math.
 
Oh well, so it seems to me. 


~Faustine.


***

He that would make his own liberty secure must guard even his enemy
from oppression; for if he violates this duty he establishes a precedent
that will reach to himself.

- --Thomas Paine

-BEGIN PGP SIGNATURE-
Version: PGPsdk version 1.7.1 (C) 1997-1999 Network Associates, Inc. and its 
affiliated companies. (Diffie-Helman/DSS-only version)

iQA/AwUBPMXr0fg5Tuca7bfvEQIfJACgz1DxiddKDkm1bw6ZfrGGMUQ6D3wAoMrP
lQBfq2Wfh2qMxdFkbHnJnDdr
=mCZt
-END PGP SIGNATURE-




Re: Two ideas for random number generation

2002-04-23 Thread Riad S. Wahby

gfgs pedo <[EMAIL PROTECTED]> wrote:
> why exactly is avalanvche break down a good RNG?
> Thank u.

Avalanche noise is just about as good as Johnson / Johnson-Nyquist /
thermal noise (all names for the same phenomenon) for collecting
entropy.  The spectral density is flat, but the amplitude distribution
isn't Gaussian, so you should probably do a little hashing of the
output.

Avalanche noise has the advantage that its amplitude is greater than
thermal noise, which makes it easier to detect and build circuits
around.  A 10 kOhm resistor produces thermal noise on the order of
10^{-8} V / sqrt(Hz).  A diode in avalanche breakdown with .5mA of
current running through it produces on the order of 10^{-7} V /
sqrt(Hz).

For a description (not by me) of how to make an avalanche RNG, consult
http://www.jfet.org/hw-rng.html

A few reasonable introductions to noise in solid-state devices are:

"Intrinsic Noise in Electronic Systems"
http://web.mit.edu/6.121/www/Resources/handouts/int_noise.pdf

"Op Amps for Everyone"
Chapter 10: Op Amp Noise Theory and Applications
http://www-s.ti.com/sc/psheets/sloa082/sloa082.pdf

"Analog Integrated Circuit Design" by Johns and Martin
Chapter 4: Noise Analysis and Modeling

"Analysis and Design of Analog Integrated Circuits" by Gray and Meyer
Chapter 11: Noise in Integrated Circuits

"The Design of CMOS Radio-Frequency Integrated Circuits" by Thomas Lee
Chapter 10: Noise

"Electronic Circuits and Applications" by Senturia and Wedlock
Chapter 17: Noise

"The Art of Electronics" by Horowitz and Hill
Chapter 7: Precision Circuits and Low-Noise Techniques
   Section 3: Amplifier Noise
   Section 4: Noise Measurements and Noise Sources

--
Riad Wahby
[EMAIL PROTECTED]
MIT VI-2/A 2002




Re: Two ideas for random number generation

2002-04-23 Thread Ken Brown

Tim May wrote:

> Boehm's "hidden variables" model is generally discredited (some would
> say "disproved"). Alternatives to the Copenhagen Interpretation, notably
> EWG/"many worlds," Hartle's "consistent histories," and Cramer's
> transactional model, are still not deterministic, in that the world an
> observer is in ("finds himself in") is still not predictable in advance.
> Operationally, all interpretations give the same results, i.e., the
> Uncertainty Principle. (Which is why I mentioned "hidden variables," the
> only alternative theory which _might_ have restored classical
> Lagrange/Laplace predictability, in theory.)

For no other reason than namedropping I want to point out that Bohm
worked out much of this theory in the building I'm in now - Birkbeck
College gave him a job after he'd been more or less forced out of the
USA by HUAAC.  I exchanged a few words with one of his ex-collaborators
(in the theory, not the Party) in the lift (elevator) a couple of hours
ago. About the soup, not about physics. I can't claim to understand the
physics.

As far as I know (not far, because I can only follow the English
descriptions, not the maths) Bohm's "hidden variables" never claimed to
make the universe predictable, even if deterministic. What they were
after is trying to find some way of describing what they thought was
really going on, instead of what we can observe. (& what they thought
was really going on seems to end up at God, more or less). And one of
the criticisms of it was that even if it were true (some claimed) it
would be impossible to tell experimentally from non-hidden-variable
descriptions. Which might come to the same thing as saying ">
Operationally, all interpretations give the same results" (I tend to be
wary of sentences that start with the word "operationally")

Back nearer to on-topic, Tim's explanation why the world could not be
predicted even if it were locally (microscopically) predictable sounds
spot-on. I may not know much about physics but I know enough about
biology, and the mathematical modelling of biology,  to see how big the
numbers get and how quickly. In a billiard-ball universe we couldn't
predict the exact behaviour of a medium-sized protein molecule, never
mind a whole cell. The world is too complex and detailed to predict. We
hit the combinatorial explosion almost as soon as we set out. I'm
working on some individually-based models of some stylised biochemical
pathways in model microorganisms. The simplest inquiries smash into
impossible numbers at resolutions many orders of magnitude above
anything that quantum considerations are relevant to.


> And even if the world were Newtonian, in a classical billiard ball
> sense, with Planck's constant precisely equal to zero, predictability is
> a chimera. Consider a game of billiards, with perfectly spherical
> billiard balls, a perfectly flat table, etc. Trajectories depend on
> angles to a precision that keeps going deeper and deeper into the
> decimals. For example, predicting the table state after, say, 3 seconds,
> might require knowing positions, speeds, and angles (in other words, the
> vectors) to a precision of one part in a thousand. Doable, one might say.
> 
> But after 30 seconds, any "errors" that are greater than one part in a
> billion would lead to "substantially different" table states. Fail to
> know the mass or position or elasticity or whatever of just one of the
> billiard balls to one part in a billion and the outcome is no longer
> "predictable."
> 
> After a couple of minutes, the table positions are different if anything
> is not known to one part in, say, 10^50.
> 
> Even talk of a "sufficiently powerful computer" is meaningless when the
> dimensions of objects must be known to better than the Planck-Wheeler
> scale (even ignoring issues of whether there's a quantum foam at these
> dimensions).
> 
> I feel strongly about this issue, and have thought about it for many
> years. The whole "in principle the Universe can be calculated" was a
> foray down a doomed path WHETHER OR NOT quantum mechanics ever came out.
> 
> The modern name for this outlook is "chaos theory," but I believe
> "chaos" gives almost mystical associations to something which is really
> quite understandable: divergences in decimal expansions.
> 
> Discrepancies come marching in, fairly rapidly, from "out there in the
> expansion."
> 
> Another way of looking at unpredictabality is to say that real objects
> in real space and subject to real forces from many other real objects
> are in the world of the "real numbers" and any representation of a real
> number as a 25-digit number (diameter of the solar system to within 1
> centimeter) or even as a 100-digit number (utterly beyond all hope of
> meaurement!) is just not enough.
> 
> (Obscure aside: What if Wheeler and others are right in some of their
> speculations that the universe is ultimately quantized, a la "It from
> Bit"? Notwithstanding that we're now back into quantum real

Re: Two ideas for random number generation

2002-04-23 Thread Tim May

On Monday, April 22, 2002, at 11:23  PM, Joseph Ashwood wrote:
>
> From: <[EMAIL PROTECTED]>
>> If a RNG runs off Johnson noise, then the ability to predict its
>> output would imply the ability to violate the second law of
>> thermodynamics.  If it runs off shot noise, then the ability to
>> predict its output would disprove quantum mechanics.
>
> Actually there are models that fit the universe that are entirely
> deterministic.

Could you mention what they are?

Boehm's "hidden variables" model is generally discredited (some would 
say "disproved"). Alternatives to the Copenhagen Interpretation, notably 
EWG/"many worlds," Hartle's "consistent histories," and Cramer's 
transactional model, are still not deterministic, in that the world an 
observer is in ("finds himself in") is still not predictable in advance. 
Operationally, all interpretations give the same results, i.e., the 
Uncertainty Principle. (Which is why I mentioned "hidden variables," the 
only alternative theory which _might_ have restored classical 
Lagrange/Laplace predictability, in theory.)

And even if the world were Newtonian, in a classical billiard ball 
sense, with Planck's constant precisely equal to zero, predictability is 
a chimera. Consider a game of billiards, with perfectly spherical 
billiard balls, a perfectly flat table, etc. Trajectories depend on 
angles to a precision that keeps going deeper and deeper into the 
decimals. For example, predicting the table state after, say, 3 seconds, 
might require knowing positions, speeds, and angles (in other words, the 
vectors) to a precision of one part in a thousand. Doable, one might say.

But after 30 seconds, any "errors" that are greater than one part in a 
billion would lead to "substantially different" table states. Fail to 
know the mass or position or elasticity or whatever of just one of the 
billiard balls to one part in a billion and the outcome is no longer 
"predictable."

After a couple of minutes, the table positions are different if anything 
is not known to one part in, say, 10^50.

Even talk of a "sufficiently powerful computer" is meaningless when the 
dimensions of objects must be known to better than the Planck-Wheeler 
scale (even ignoring issues of whether there's a quantum foam at these 
dimensions).

I feel strongly about this issue, and have thought about it for many 
years. The whole "in principle the Universe can be calculated" was a 
foray down a doomed path WHETHER OR NOT quantum mechanics ever came out.

The modern name for this outlook is "chaos theory," but I believe 
"chaos" gives almost mystical associations to something which is really 
quite understandable: divergences in decimal expansions.

Discrepancies come marching in, fairly rapidly, from "out there in the 
expansion."

Another way of looking at unpredictabality is to say that real objects 
in real space and subject to real forces from many other real objects 
are in the world of the "real numbers" and any representation of a real 
number as a 25-digit number (diameter of the solar system to within 1 
centimeter) or even as a 100-digit number (utterly beyond all hope of 
meaurement!) is just not enough.

(Obscure aside: What if Wheeler and others are right in some of their 
speculations that the universe is ultimately quantized, a la "It from 
Bit"? Notwithstanding that we're now back into quantum realms and 
theories, even if the universe were quantized at Planck-scale levels, 
e.g., at 10^-33 cm levels, the gravitational pull of a distant galaxy 
would be out somewhere at the several hundredth decimal place, as a wild 
guess. The point being, even a grid-based positional system would yield 
the same "real number" issues.)

In short, predictability is a physical and computational chimera: it 
does not, and cannot, exist.


> Admittedly the current line of thinking is that entropy
> exists, but there we still have not proven that it must exist.

Just as no sequence can ever be proved to be "random," so, too, one can 
never prove that entropy "exists" except in the definition sense.

Our best definition of what we mean by random is that of a thing 
(sequence, for example) with no shorter description than itself. This is 
the Kolmogorov-Chaitin definition. Much more is available on the Web 
about this, which is generally known as "algorithmic information theory."

--Tim May




Re: Two ideas for random number generation

2002-04-23 Thread Riad S. Wahby

"Trei, Peter" <[EMAIL PROTECTED]> wrote:
> You can build analog devices out of silicon, and get Johnson noise
> from resistors or diodes.  You can also build radiation detectors in
> silicon, though in the absence of a supplied radiation source your
> data rate will be low

Another nice way to get an RNG is Avalanche breakdown.  You can do
this with reverse emitter-base breakdown in a BJT, or with a diode.
This may take more voltage than you want to use in your process, but
you can engineer the base-emitter junction if you've got a friend in
process engineering.  One other potential problem is long-term
reliability, but that's a subject for another email.

--
Riad Wahby
[EMAIL PROTECTED]
MIT VI-2/A 2002




Re: Two ideas for random number generation

2002-04-23 Thread jamesd

--
Jim Choate wrote:
> > > If you can't develop a RNG in software (ie you'd be in a
> > > state of sin), what makes you think you can do it using
> > > -only- digital gates in hardware? You can't.

James A. Donald:
> > Classic Choatian physics.
> >
> > Of course you can.

Jim Choate:
> Not if you use -only- digital gates and derived functions (eg
> flip flop or a shift register)

One can build a true random generator using a two resistors, a
capacitor, three unbuffered inverters and a shift register.  A
second shift register and an XOR gate will serve to quite
adequately whiten the output. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 ZzpcuzbxAQDBClChxGOXr7wbgY5MtGBoFjAZXIU5
 4eBWSkC558m56GA2t+Tty15qNDQ5EJWuRdv8IKZ8A




Re: Re: Re: Two ideas for random number generation

2002-04-23 Thread jamesd

--
Joseph Ashwood
> > > Because with a pRNG we can sometimes prove very important 
> > > things, while with a RNG we can prove very little (we can't 
> > > even prove that entropy actually exists, let alone that we 
> > > can collect it).

James A. Donald:
> > Don't be silly.  Of course we know that entropy exists, and we 
> > can collect it.
> >
> > If a RNG runs off Johnson noise, then the ability to predict 
> > its output would imply the ability to violate the second law 
> > of thermodynamics.  If it runs off shot noise, then the 
> > ability to predict its output would disprove quantum 
> > mechanics.

Joseph Ashwood
> Actually there are models that fit the universe that are 
> entirely deterministic.

These models are entirely incoherent, and I would summarize them 
as "God knows".   And if these models allowed us to predict the 
outcome of a true RNG, they would not fit the universe.

James A. Donald:
> > > > And if ofne is implementing a PRNG in software, it is 
> > > > trivial to have lots of internal state (asymptotically 
> > > > approaching one-time pad properties).

Joseph Ashwood
> > > The problem is not having that much internal state, but what 
> > > do you do with it? Currently the best options on that front 
> > > involve using block ciphers in various modes, but this has a 
> > > rather small state,
> >
> > RC4 has 1684 bits of state, which should prove sufficient to 
> > defeat guessing.
>
> And RC4 is far from a good RNG of any type, it's distinguishable 
> from random fairly easily, and unless it's used very carefully 
> it's weak. If one were to try to guess all 1684 bits it would be 
> exceedingly difficult, but to start with, it's only a 
> permutation so the space is much smaller, in addition the state 
> itself has more attacks available

Wrong.  1684 bits of entropy.  Count them.

The state is a permutation 256, which requires 2048 bits to 
describe (256 *8) but contains 1684 bits of entropy, not 1684 
bits.

2048 bit description, but because it is a permutation, 1684 bits
actual entropy. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 MjdAfFTXXtA7qo/FzKsFLPFEYgVQ8bY2lfseYhYX
 4P9O7sqp2z5todA8tcLMmb8wQiZ9lLBz/la5zhU+f




Re: Two ideas for random number generation

2002-04-23 Thread Alan Braggins

> > Actually you left something out, the PRNG by definition must have a
> > modulus of repetition. At some point it starts the sequence over.
> >
> As usual, Jim is wrong. There are deterministic systems which never
> repeat. For example, there is an algorithm which will give you the
> nth digit of pi.

http://www.mathsoft.com/asolve/plouffe/plouffe.html


>If I use this as my PRNG (one way I could seed it would
> be to use key to pick a starting point)

Or just mix pi with a more conventional PRNG, which will avoid you having
to use huge seeds to stop your output stream matching anyone's pregenerated
lookup tables of lots of digits of pi.

-- 
Alan Braggins  mailto:[EMAIL PROTECTED]  http://www.ncipher.com/
nCipher Corporation Ltd.  +44 1223 723600  Fax: +44 1223 723601




RE: Two ideas for random number generation

2002-04-23 Thread Trei, Peter

>   Jim Choate[SMTP:[EMAIL PROTECTED]]
> 
> On Mon, 22 Apr 2002, Trei, Peter wrote:
> 
> > The defining difference between the two is that if you know the
> > algorithm and seed, the output of a PRNG can be reproduced,
> > at a different time, place. or both. There are circumstances in 
> > which this is very much a desired quality.
> 
> Actually you left something out, the PRNG by definition must have a
> modulus of repetition. At some point it starts the sequence over.
> 
As usual, Jim is wrong. There are deterministic systems which never
repeat. For example, there is an algorithm which will give you the
nth digit of pi. If I use this as my PRNG (one way I could seed it would
be to use key to pick a starting point) how long does Jim think it will run
before it repeats??

Exactly what is the Choatian definition of a PRNG which requires
it to repeat, anyway?

> In general, this is -never- a desired quality and is the primary
> distinction between the cost-utility of PRNG's versus RNG's.
> 
Agreed it's not a desireable property, but it's not always present, as
I demonstrate above. It's the 'primary distinction' only on ChoateWorld,
of course.

> And on another statement by somebody about hardware v software RNG's:
> 
> If you can't develop a RNG in software (ie you'd be in a state of sin),
> what makes you think you can do it using -only- digital gates in hardware?
> You can't.
> 
Choate is the only one who has said anything about using "-only-" digital
gates. 
You can build analog devices out of silicon, and get Johnson noise from 
resistors or diodes.  You can also build radiation detectors in silicon,
though
in the absence of a supplied radiation source your data rate will be low
(Tim
should be able to confirm this).

It really bugs me that we have to keep correcting Choate's nonsense - but
we have to, lest the less experienced take his views as reasonable.

Peter Trei
Disclaiimer: The above is my opinion, not neccesarily my employer's.




Re: Two ideas for random number generation

2002-04-23 Thread Jim Choate


On Mon, 22 Apr 2002 [EMAIL PROTECTED] wrote:

> --
> On 22 Apr 2002 at 17:38, Jim Choate wrote:
> > If you can't develop a RNG in software (ie you'd be in a state
> > of sin), what makes you think you can do it using -only- digital
> > gates in hardware? You can't.
> 
> Classic Choatian physics.
>
> Of course you can.

Not if you use -only- digital gates and derived functions (eg flip flop
or a shift register). If you want a -real- RNG in hardware it -must- have
an analog stage, that analog stage must be non-linear (ie sensitive to
small changes in input), and it must use feedback.

If you can do it in hardware gates then you -can- do it in software gates.

And -real- RNG's can't be done in software. Very complicated and long
modulo PRNG's can but RNG's can't.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Two ideas for random number generation

2002-04-22 Thread Jim Choate


On Mon, 22 Apr 2002, Trei, Peter wrote:

> The defining difference between the two is that if you know the
> algorithm and seed, the output of a PRNG can be reproduced,
> at a different time, place. or both. There are circumstances in 
> which this is very much a desired quality.

Actually you left something out, the PRNG by definition must have a
modulus of repetition. At some point it starts the sequence over.

In general, this is -never- a desired quality and is the primary
distinction between the cost-utility of PRNG's versus RNG's.

And on another statement by somebody about hardware v software RNG's:

If you can't develop a RNG in software (ie you'd be in a state of sin),
what makes you think you can do it using -only- digital gates in hardware?
You can't.


 --


 The law is applied philosophy and a philosphical system is
 only as valid as its first principles.
 
James Patrick Kelly - "Wildlife"
   
 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Re: Two ideas for random number generation: Q for Eugene

2002-04-22 Thread gfgs pedo

hi,

I get the point.Thanx for all the replies.

regards Data.

--- Joseph Ashwood <[EMAIL PROTECTED]> wrote:
> 
> - Original Message -
> From: "gfgs pedo" <[EMAIL PROTECTED]>
> 
> > > > Oh surely you can do better than that - making
> it
> > > hard to guess the seed
> > > > is also clearly a desirable property (and one
> that
> > > the square root "rng"
> > > > does not have).
> > U can choose any arbitrary seed(greater than 100
> bits
> > as he (i forgot who) mentioned earlier.Then
> subject it
> > to the Rabin-Miller test.
> > Since the seed value is a very large number,it
> would
> > be impossible to determine the actual value.The
> > chances the intruder  find the correct seed or the
> > prime number hence generated is practically verly
> low.
> 
> You act like the only possible way to figure it out
> is to guess the initial
> seed. The truth is that the number used leaves a
> substantial amount of
> residue in it's square root, and there are various
> rules that can be applied
> to square roots as well. Since with high likelihood
> you will have a lot of
> small factors but few large ones, it's a reasonable
> beginning to simply
> store the roots of the first many primes, this gives
> you a strong network to
> work from when looking for those leftover
> signatures. With decent likelihood
> the first 2^32 primes would be sufficient for this
> when you choose 100 bit
> numbers, and this attack will be much faster than
> brute force. So while you
> have defeated brute force (no surprise there, brute
> force is easy to defeat)
> you haven't developed a strong enough generation
> sequence to really get much
> of anywhere.
> 
> > > Of course, finding the square root of a 100
> digit
> > > number to a
> > > precision of hundreds of decimal places is a lot
> of
> > > computational
> > > effort for no good reason.
> > Yes the effort is going to be large but why no
> good
> > reason?
> 
> Because it's a broken pRNG, that is extremely
> expensive to run. If you want
> a fast pRNG you look to ciphers in CTR mode, or
> stream ciphers, if you want
> one that's provably good you go to BBS (which is
> probably faster than your
> algorithm anyway). So there's no good reason to
> implement such an algorithm.
> 
> > > BTW, the original poster seemed to be under the
> > > delusion that
> > > a number had to be prime in order for its square
> to
> > > be irrational,
> > > but every integer that is not a perfect square
> has
> > > an irrational
> > > square root (if A and B are mutually prime,
> A^2/B^2
> > > can't be
> > > simplified).
> >
> > Nope ,I'm under no such delusion :)
> 
> Just the delusion that your algorithm was good.
> Joe
> 


__
Do You Yahoo!?
Yahoo! Games - play chess, backgammon, pool and more
http://games.yahoo.com/




Re: Re: Two ideas for random number generation: Q for Eugene

2002-04-22 Thread Joseph Ashwood


- Original Message -
From: "gfgs pedo" <[EMAIL PROTECTED]>

> > > Oh surely you can do better than that - making it
> > hard to guess the seed
> > > is also clearly a desirable property (and one that
> > the square root "rng"
> > > does not have).
> U can choose any arbitrary seed(greater than 100 bits
> as he (i forgot who) mentioned earlier.Then subject it
> to the Rabin-Miller test.
> Since the seed value is a very large number,it would
> be impossible to determine the actual value.The
> chances the intruder  find the correct seed or the
> prime number hence generated is practically verly low.

You act like the only possible way to figure it out is to guess the initial
seed. The truth is that the number used leaves a substantial amount of
residue in it's square root, and there are various rules that can be applied
to square roots as well. Since with high likelihood you will have a lot of
small factors but few large ones, it's a reasonable beginning to simply
store the roots of the first many primes, this gives you a strong network to
work from when looking for those leftover signatures. With decent likelihood
the first 2^32 primes would be sufficient for this when you choose 100 bit
numbers, and this attack will be much faster than brute force. So while you
have defeated brute force (no surprise there, brute force is easy to defeat)
you haven't developed a strong enough generation sequence to really get much
of anywhere.

> > Of course, finding the square root of a 100 digit
> > number to a
> > precision of hundreds of decimal places is a lot of
> > computational
> > effort for no good reason.
> Yes the effort is going to be large but why no good
> reason?

Because it's a broken pRNG, that is extremely expensive to run. If you want
a fast pRNG you look to ciphers in CTR mode, or stream ciphers, if you want
one that's provably good you go to BBS (which is probably faster than your
algorithm anyway). So there's no good reason to implement such an algorithm.

> > BTW, the original poster seemed to be under the
> > delusion that
> > a number had to be prime in order for its square to
> > be irrational,
> > but every integer that is not a perfect square has
> > an irrational
> > square root (if A and B are mutually prime, A^2/B^2
> > can't be
> > simplified).
>
> Nope ,I'm under no such delusion :)

Just the delusion that your algorithm was good.
Joe




Re: Re: Two ideas for random number generation

2002-04-22 Thread Joseph Ashwood

- Original Message -
From: "Eugen Leitl" <[EMAIL PROTECTED]>

> On Mon, 22 Apr 2002, Tim May wrote:
>
> > What real-life examples can you name where Gbit rates of random digits
> > are actually needed?
>
> Multimedia streams, routers. If I want to secure a near-future 10 GBit
> Ethernet stream with a symmetric cypher for the duration of a few years
> (periodic rekeying from a RNG might help?) I need both lots of internal
> state (the PRNG can't help leaking information about its state in the
> cypher stream, though the rate of leakage is the function of smarts of the
> attacker) and a high data rate.

Actually that's not necessarily the case. Let's use your example of a
Multimedia stream server that is filling a 10GBit/s connection. Right now
the current minimum seems to be 56kbit/s. So that means that if every
available connection is taken in the same second, the server would only need
a rate of 2.2 million bits/sec from it's RNG to build a 128-bit key for
each. A good design for this though has the client doing most of the random
number choosing, where the only purpose of the server random number is to
prevent the client of biasing the result, so 128-bits is more than
sufficient. So 2.2 Mbit/sec seems to be the peak for that. Finding
situations where a decent design will yield a need for an RNG to run about 1
Gbit/sec is extremely difficult. With poor designs it's actually rather
easy, take a RNG that is poor enough (or a situation where that is a basic
assumption) that it has to be distilled to 1 billionth it's size, obviously
to support that multimedia stream server would require 2.2 million Gigabits
per second (approximately).

> > In any case, if someone wants Gbits per second of random numbers,
> > it'll cost 'em, as it should. Not something I think we need to worry
> > much about.
>
> Maybe, but it's neat trying to see how the constraints of 2d and 3d layout
> of cells, signal TOF and fanout issues influence PRNG design if lots of
> state bits and a high data rate are involved. It is not very useful right
> now, agreed.

I think it would be a good process to go through to develop a design for
one, or at least a basic outline for how it could be done, but the basic
idea that comes to mind looks a lot like /dev/random, but run in parallel
collecting from several sources including a custom hardware pool similar to
the Intel RNG.
Joe




RE: Two ideas for random number generation

2002-04-22 Thread Jack Lloyd

On Mon, 22 Apr 2002, Trei, Peter wrote:

> So my suggestion is that while hardware accelaration of PRNGs may have
> some usefulness, true RNGs need not have the same performance. I'd
> rather see people work on making the true RNGs *trustworthy*, which is
> a much more difficult problem.

Out of curiosity, does anyone know what kind of data rates one can get from
the RNG built into those oh-so-commodity i8xx motherboards? I don't have
one, else I would just go time a read loop. And has anyone ever done an
analysis of the output? Last I heard (~2.5 years ago?) Intel was refusing
to release the design for inspection. Has that changed since?

-Jack




Re: Two ideas for random number generation: Q for Eugene

2002-04-22 Thread Sandy Harris

Ben Laurie wrote:
> 
> gfgs pedo wrote:
> >
> > hi,
> >
> > --- [EMAIL PROTECTED] wrote:
> > > On 22 Apr 2002 at 0:08, Ben Laurie wrote:
> >
> > > > Oh surely you can do better than that - making it
> > > hard to guess the seed
> > > > is also clearly a desirable property (and one that
> > > the square root "rng"
> > > > does not have).
> > U can choose any arbitrary seed(greater than 100 bits
> > as he (i forgot who) mentioned earlier.Then subject it
> > to the Rabin-Miller test.
> > Since the seed value is a very large number,it would
> > be impossible to determine the actual value.The
> > chances the intruder  find the correct seed or the
> > prime number hence generated is practically verly low.
> 
> Uhuh - and how do you choose this large seed?

Suppose an attacker knows some of your outputs.

The whole sitiuation can be written as three numbers:

x = integer part of root plus unknown "random"
  bits previously generated
y = "random" bits attacker knows
z = future "random" bits

Shift everything left until you get:

root*10^a = x*10^b + y + z  with z < 1

We may not know a since that depends on the number of x
bits, but we do know b since that depends only on the
number of y bits.

squaring to find the prime:

prime*10^2a = x^2 * 10^2b + y^2 + z^2 + 2xy*10^b + 2xz*10^b + 2yz

which is obviously zero mod 2a, so all bits to the right of that
are zero. So it's also zero mod 10^b, since b < 2a, and we get:

mod 10^b
0 = y^2 + z^2 + 2yz   

with z^2 < 1 so for bits to the left of the decimal, we have:

y^2 + 2yz = 0 mod 10^b

and since y and b are known, this is easliy solved for some
z bits. 

Iterate with y' = y + discovered z bits. Methinks this is
a fatal weakness, irrespective of the size of the prime.



RE: Two ideas for random number generation

2002-04-22 Thread Trei, Peter

> [EMAIL PROTECTED][SMTP:[EMAIL PROTECTED]] wrote:
> 
> Why would one want to implement a PRNG in silicon, when one can 
> easily implement a real RNG in silicon?
> 
RNGs and PRNGs serve somewhat different purposes in current
cryptographic systems. Both are useful, but it's not clear to me
that the Gbit true RNGs are needed.

The defining difference between the two is that if you know the
algorithm and seed, the output of a PRNG can be reproduced,
at a different time, place. or both. There are circumstances in 
which this is very much a desired quality.

For example, in communications using a stream cipher, is it
neccesary that both ends be able to produce that same 
pseudorandom bitstream. You could not replace RC4 in SSL
with an RNG, since both ends need to generate the same 
sequence.

True RNGs are needed for much more limited purposes:
generating session keys, initialization vectors, candidate
RSA prime numbers, etc. The only high volume use I can
think of for a true RNG is the mass production of OTP 
key material.

So my suggestion is that while hardware accelaration of
PRNGs may have some usefulness, true RNGs need not
have the same performance. I'd rather see people work
on making the true RNGs *trustworthy*, which is a much
more difficult problem.

Peter Trei




Re: Two ideas for random number generation: Q for Eugene

2002-04-22 Thread Ben Laurie

gfgs pedo wrote:
> 
> hi,
> 
> --- [EMAIL PROTECTED] wrote:
> > On 22 Apr 2002 at 0:08, Ben Laurie wrote:
> 
> > > Oh surely you can do better than that - making it
> > hard to guess the seed
> > > is also clearly a desirable property (and one that
> > the square root "rng"
> > > does not have).
> U can choose any arbitrary seed(greater than 100 bits
> as he (i forgot who) mentioned earlier.Then subject it
> to the Rabin-Miller test.
> Since the seed value is a very large number,it would
> be impossible to determine the actual value.The
> chances the intruder  find the correct seed or the
> prime number hence generated is practically verly low.

Uhuh - and how do you choose this large seed?

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff



Re: Two ideas for random number generation: Q for Eugene

2002-04-22 Thread gfgs pedo

hi,



--- [EMAIL PROTECTED] wrote:
> On 22 Apr 2002 at 0:08, Ben Laurie wrote:


> > Oh surely you can do better than that - making it
> hard to guess the seed
> > is also clearly a desirable property (and one that
> the square root "rng"
> > does not have).
U can choose any arbitrary seed(greater than 100 bits
as he (i forgot who) mentioned earlier.Then subject it
to the Rabin-Miller test.
Since the seed value is a very large number,it would
be impossible to determine the actual value.The
chances the intruder  find the correct seed or the
prime number hence generated is practically verly low.

> > 
> 
> Of course, finding the square root of a 100 digit
> number to a 
> precision of hundreds of decimal places is a lot of
> computational
> effort for no good reason.
Yes the effort is going to be large but why no good
reason?  



> BTW, the original poster seemed to be under the
> delusion that
> a number had to be prime in order for its square to
> be irrational,
> but every integer that is not a perfect square has
> an irrational 
> square root (if A and B are mutually prime, A^2/B^2
> can't be
> simplified).

Nope ,I'm under no such delusion :)
> George
> > Cheers,
> > 
> > Ben.
> > 
> > --
> > http://www.apache-ssl.org/ben.html  
> http://www.thebunker.net/
> > 
> > "There is no limit to what a man can do or how far
> he can go if he
> > doesn't mind who gets the credit." - Robert
> Woodruff
> 


__
Do You Yahoo!?
Yahoo! Games - play chess, backgammon, pool and more
http://games.yahoo.com/



Re: Two ideas for random number generation

2002-04-22 Thread Eugen Leitl

On Mon, 22 Apr 2002, Tim May wrote:

> What real-life examples can you name where Gbit rates of random digits 
> are actually needed?

Multimedia streams, routers. If I want to secure a near-future 10 GBit
Ethernet stream with a symmetric cypher for the duration of a few years
(periodic rekeying from a RNG might help?) I need both lots of internal
state (the PRNG can't help leaking information about its state in the
cypher stream, though the rate of leakage is a function of how smart the
attacker is) and a high data rate.
 
> In any case, if someone wants Gbits per second of random numbers,
> it'll cost 'em, as it should. Not something I think we need to worry
> much about.

Maybe, but it's neat trying to see how the constraints of 2d and 3d layout
of cells, signal TOF and fanout issues influence PRNG design if lots of
state bits and a high data rate are involved. It is not very useful right
now, agreed.




Re: Two ideas for random number generation (h/w comments)

2002-04-22 Thread Major Variola (ret)

At 08:09 AM 4/22/02 +0200, Eugen Leitl wrote:
>> And if one is implementing a PRNG in software, it is trivial to
>> have lots of internal state (asymptotically approaching one-time
>> pad properties).
>
>Yes, but software is too slow to be able to handle >GBit data rates. It's
>inefficient use of CPU silicon real estate.

Comment on the latter: There is a lot of (high-margin) hardware made which
does something that software can do, but much faster.  This sounds obvious,
but figure out how long you have to make a routing decision if your bits are
coming in at 10^10 / sec and a new packet arrives every 2400 bits: that works
out to roughly 240 ns per packet, i.e. about four million routing decisions a
second.  Your basic Linux/486+2NICs hobbyist router does not have a chance :-)
though it's a perfectly viable solution at lower rates.  Neither does your
latest bleeding-edge general purpose Pentium-whatever.

So a general-purpose CPU is not just 'inefficient', it's also 'insufficient'.
Also, they tend to burn a lot more watts than an ASIC would.

Comment on the former:
If you *really* had a reason and money for a hardware PRNG *and* you needed
a lot of state, you'd just synthesize up a block of memory on your chip, or
an interface to whatever kind of external D/SRAM you preferred.

\begin{understatement}
Software is a lot easier than h/ware if you don't need performance.
\end{understatement}




Re: Two ideas for random number generation

2002-04-22 Thread Tim May

On Sunday, April 21, 2002, at 11:09  PM, Eugen Leitl wrote:

> On Sun, 21 Apr 2002 [EMAIL PROTECTED] wrote:
>
>> Why would one want to implement a PRNG in silicon, when one can
>> easily implement a real RNG in silicon?
>
> Both applications are orthogonal. PRNG != entropy.
>
>> And if one is implementing a PRNG in software, it is trivial to
>> have lots of internal state (asymptotically approaching one-time
>> pad properties).
>
> Yes, but software is too slow to be able to handle >GBit data rates. It's
> inefficient use of CPU silicon real estate.
>

What real-life examples can you name where Gbit rates of random digits 
are actually needed?

Even high-bandwidth transfers of MPEGs, for example, will be done with 
conventional ciphers using only a tiny fraction of this bandwidth for 
the random number parts of the ciphers.

Speaking of real world issues, it's been half a dozen years since 
Goldberg and Wagner broke the Netscape "time of day random number 
generator." I've heard of no serious attacks when a PRNG is used 
properly. Not to say it can't happen, or won't happen, or hasn't already 
happened.

In any case, if someone wants Gbits per second of random numbers, it'll 
cost 'em, as it should. Not something I think we need to worry much 
about.

--Tim May
"As my father told me long ago, the objective is not to convince someone
  with your arguments but to provide the arguments with which he later
  convinces himself." -- David Friedman




Re: Two ideas for random number generation

2002-04-21 Thread Eugen Leitl

On Sun, 21 Apr 2002 [EMAIL PROTECTED] wrote:

> Why would one want to implement a PRNG in silicon, when one can 
> easily implement a real RNG in silicon?

Both applications are orthogonal. PRNG != entropy.
 
> And if one is implementing a PRNG in software, it is trivial to 
> have lots of internal state (asymptotically approaching one-time 
> pad properties). 

Yes, but software is too slow to be able to handle >GBit data rates. It's 
inefficient use of CPU silicon real estate.




Re: Two ideas for random number generation

2002-04-21 Thread Tim May

On Sunday, April 21, 2002, at 09:53  PM, Joseph Ashwood wrote:

> - Original Message -
> From: <[EMAIL PROTECTED]>
> To: "Tim May" <[EMAIL PROTECTED]>; "Eugen Leitl" <[EMAIL PROTECTED]>
> Cc: <[EMAIL PROTECTED]>
> Sent: Sunday, April 21, 2002 1:33 PM
> Subject:  Re: Two ideas for random number generation
>
>
>> Why would one want to implement a PRNG in silicon, when one can
>> easily implement a real RNG in silicon?
>
> Because with a pRNG we can sometimes prove very important things, while 
> with
> a RNG we can prove very little (we can't even prove that entropy 
> actually
> exists, let alone that we can collect it).

Not to be pedantic, but we cannot _really_ prove that entropy is being 
collected from a PRNG, either.

In fact, there really _is_ no entropy from a PRNG: the bits are
deterministic from the PRNG, meaning they actually have no uncertainty,
no entropy. The person who selected the program and the seed certainly
sees no randomness, no entropy. Relativity. Usual Kolmogorov/Chaitin
stuff here.

Operationally, when faced with a purported source of "random numbers," 
naive calculations of entropy are often made. This is true for the 
output of a Johnson noise diode, or a radioactive decay circuit, or the 
RAND Corporation's Table of One Million Random Digits.

But all of the calculated ("measured") entropy numbers collapse to zero
or some small number if and when someone figures out how to predict the
nth digit.

The usual arguments I hear about why PRNGs are better are that they a)
can give reproducible sequences for testing software (often much more
useful than physically-derived numbers would be, for obvious statistical
reasons), and b) in some sense give more assurance that a BBS generator,
for example, has not been manipulated.

If I wanted a good PGRNG (Pretty Good RNG) I'd favor one that mixes a 
physical source, mixes entropy extracted from user actions (keystroke 
intervals, mouse movements, over a lot of days), and distills it all 
some more with a BBS.
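
(To make the BBS step concrete, a toy Blum-Blum-Shub sketch in Python. The
primes and seed below are far too small for real use and are purely
illustrative; a real instance uses large secret primes p and q, each
congruent to 3 mod 4, with a seed coprime to their product.)

    # Toy Blum-Blum-Shub generator: square modulo N = P*Q and emit the
    # low bit of each state. Parameters here are illustrative only.
    P, Q = 100003, 100019            # toy primes, both congruent to 3 mod 4
    N = P * Q

    def bbs_bits(seed, count):
        x = (seed * seed) % N        # seed should be coprime to N
        bits = []
        for _ in range(count):
            x = (x * x) % N
            bits.append(x & 1)       # take the least significant bit
        return bits

    print(bbs_bits(seed=123456789, count=32))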

(The meta-solution for good numbers is then to mix different kinds of
RNGs, to take a BBS output and XOR it with a Lava Lamp sequence, to seed
a PRNG with mouse strokes, etc. As with ciphers, composition of many
functors tends to help, unless one of the stages has bad properties,
e.g., avalanche conditions. Blowfish(3DES(Bassomatic(foo))) will be at
least as strong as Blowfish(foo), all other things being equal.)

--Tim May
"You don't expect governments to obey the law because of some higher 
moral development. You expect them to obey the law because they know 
that if they don't, those who aren't shot will be hanged." - -Michael 
Shirley




Re: Re: Two ideas for random number generation

2002-04-21 Thread Joseph Ashwood


- Original Message -
From: <[EMAIL PROTECTED]>
To: "Tim May" <[EMAIL PROTECTED]>; "Eugen Leitl" <[EMAIL PROTECTED]>
Cc: <[EMAIL PROTECTED]>
Sent: Sunday, April 21, 2002 1:33 PM
Subject: CDR: Re: Two ideas for random number generation


> Why would one want to implement a PRNG in silicon, when one can
> easily implement a real RNG in silicon?

Because with a pRNG we can sometimes prove very important things, while with
a RNG we can prove very little (we can't even prove that entropy actually
exists, let alone that we can collect it).

> And if one is implementing a PRNG in software, it is trivial to
> have lots of internal state (asymptotically approaching one-time
> pad properties).

The problem is not having that much internal state, but what you do with
it. Currently the best options on that front involve using block ciphers in
various modes; this gives a rather small state, but again we can quite
often prove things about the construct.
Joe




Re: Two ideas for random number generation: Q for Eugene

2002-04-21 Thread georgemw

On 21 Apr 2002 at 10:00, Major Variola (ret) wrote:

> At 11:22 AM 4/21/02 +0200, Eugen Leitl wrote:
> 
> >I disagree here somewhat. Cryptography ttbomk doesn't have means of
> >construction of provably strong PRNGs, especially scalable ones, and with
> >lots of internal state (asymptotically approaching one-time pad
> >properties), and those which can be mapped to silicon real estate
> >efficiently both in time (few gate delays, >GBps data rates) and in space
> >(the silicon real estate consumed for each bit of PRNG state).
> 
> What is a "provably strong" PRNG?  Strong against what?
> If I'm supposed to know this, and have forgotten it, a
> pointer will suffice.  I know what the attacks are for a crypto-strong
> plain-ole-analog-based-RNG.
> 
> It's quite easy to generate apparently-random output (i.e., PRNGs) from
> block ciphers being fed, say, integers, or their own output, etc.
> These can be made small and fast in hardware.  Large families of
> these can be constructed, e.g., by varying bits in Blowfish's
> S-tables.
> 
> 
Yes.  If you know what PRNG somebody is using and you know the
seed, you know the output.  Seems to me the best a PRNG
could hope to achieve is a situation where, looking at a long stream
of output, there's no way of predicting the future output that's
more efficient than guessing the initial seed.  I don't think
achieving that is all that hard in practice.

George




Re: Two ideas for random number generation: Q for Eugene

2002-04-21 Thread Major Variola (ret)

At 11:22 AM 4/21/02 +0200, Eugen Leitl wrote:

>I disagree here somewhat. Cryptography ttbomk doesn't have means of
>construction of provably strong PRNGs, especially scalable ones, and with
>lots of internal state (asymptotically approaching one-time pad
>properties), and those which can be mapped to silicon real estate
>efficiently both in time (few gate delays, >GBps data rates) and in space
>(the silicon real estate consumed for each bit of PRNG state).

What is a "provably strong" PRNG?  Strong against what?
If I'm supposed to know this, and have forgotten it, a
pointer will suffice.  I know what the attacks are for a crypto-strong
plain-ole-analog-based-RNG.

It's quite easy to generate apparently-random output (i.e., PRNGs) from
block ciphers being fed, say, integers, or their own output, etc.
These can be made small and fast in hardware.  Large families of
these can be constructed, e.g., by varying bits in Blowfish's
S-tables.
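
(For instance, a minimal Python sketch of a block cipher fed integers:
encrypt an incrementing counter under a fixed key and use the ciphertext
blocks as the pseudorandom stream. This assumes the pyca/cryptography
package, with AES standing in for whatever cipher one prefers; the key
size and block count are illustrative.)

    # Feed successive integers through AES and concatenate the outputs.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def counter_prng(key, n_blocks):
        enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
        out = b""
        for i in range(n_blocks):
            block = i.to_bytes(16, "big")   # the integer being fed in
            out += enc.update(block)        # 16 pseudorandom bytes per integer
        return out + enc.finalize()

    stream = counter_prng(os.urandom(32), 4)  # 64 bytes of output
    print(stream.hex())

The whole family is then parameterized by the key (or, as above, by tweaks
to the cipher's tables), and the only state to carry around is the key plus
the counter.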




Re: Two ideas for random number generation

2002-04-21 Thread Eugen Leitl

On Sat, 20 Apr 2002, Tim May wrote:

> As a meta-point, the world is not in short supply of lots of good RNGs, 
> ranging from Johnson noise detectors to very strong Blum-Blum-Shub 
> generators. The interesting stuff in crypto lies in other places.

I disagree here somewhat. Cryptography ttbomk doesn't have a means of
constructing provably strong PRNGs, especially scalable ones with
lots of internal state (asymptotically approaching one-time pad
properties) which can also be mapped to silicon real estate
efficiently both in time (few gate delays, >GBps data rates) and in space
(the silicon real estate consumed for each bit of PRNG state).

It is rather hard to get all these requirements under one hat.




Re: Two ideas for random number generation

2002-04-20 Thread Tim May

On Saturday, April 20, 2002, at 01:51  PM, gfgs pedo wrote:

> hi,
>
> Here are two ideas which came up in my mind.
> Since I have done a few diagrams for illustration and
> thought that it would not be a good idea as an
> attachment, I have put the ideas at the following URL:
> http://www.ircsuper.net/~neo/
>
> I sincerely appreciate your comments. Thank you for your
> time.
>

After including a huge amount of stuff which looks to be taken directly 
from a textbook, you write:

"Therefore square root(5)

2.23606797749978969640917366873128..

We consider only the decimal part 23 60 67 97 74 99 78 96 96 40 91 73 66 
87 31 28

Above we break the decimal into sets of 2 digits, proceeding from left to
right, which gives us the random numbers 23, 60, 67, 97, 74, 99, 78, 96, 96,
40, 91, 73, 66, 87, 31, 28

As above we obtain 16 random numbers between 0 and 100.

By extending this idea to 3 digits, grouping the decimals as 3 digits,
we can get as many random numbers between 0 and 1000.

This idea can be extended to any higher order.

Is this a good idea for a random number generator too?

Thank you for your time.
"

These decimal digits in the expansion of SQRT(5) are NOT "random." They
have a shorter description than themselves, namely, "SQRT(5)," and hence
are neither random by accepted definitions nor are they as hard to
"predict" as better random-like numbers would be. In short, all somebody
has to do is guess "Maybe he's using the SQRT of a natural number as his
source of 'random' numbers."

Your scheme is not even as good as most PRNGs, which at least purport to 
make the sequence far-removed from any of the "very short description" 
sequences.

As a meta-point, the world is not in short supply of lots of good RNGs, 
ranging from Johnson noise detectors to very strong Blum-Blum-Shub 
generators. The interesting stuff in crypto lies in other places.

--Tim May
"He who fights with monsters might take care lest he thereby become a 
monster. And if you gaze for long into an abyss, the abyss gazes also 
into you." -- Nietzsche




Re: Two ideas for random number generation

2002-04-20 Thread Sandy Harris

gfgs pedo wrote:
> 
> hi,
> 
> Here are two ideas which came up in my mind.
> Since I have done a few diagrams for illustration and
> thought that it would not be a good idea as an
> attachment, I have put the ideas at the following URL:
> http://www.ircsuper.net/~neo/
> 
> I sincerely appreciate your comments. Thank you for your
> time.

Random numbers used in any security application must
meet far stronger requirements than in other applications,
such as simulations etc. The standard reference is:
http://www.ietf.org/rfc/rfc1750.txt

A draft of an updated version is also available:
http://search.ietf.org/internet-drafts/draft-eastlake-randomness2-02.txt

There's also some good material at www.counterpane.com,
in the papers section under "yarrow".

My guess is that neither of your suggestions is useful for
security applications. 

In one case, the input or seed is a text file, not a remarkably
random object, and completely useless if the enemy can discover
or guess what file is used. Also, I cannot tell if your method
is either as secure as or more efficient than a more direct
approach using standard crypto primitives. For example, one
might just hash the textfile and use the result to key a block
cipher in counter mode.
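
(A rough sketch of that more direct approach, assuming the pyca/cryptography
package; the file name, nonce and output length are placeholders, and in
real use the nonce must never be reused with the same key.)

    # Hash the text file, use the digest as an AES key, and run the
    # cipher in counter mode; encrypting zeros yields the keystream.
    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def stream_from_file(path, nbytes, nonce=bytes(16)):
        key = hashlib.sha256(open(path, "rb").read()).digest()   # 32-byte key
        enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        return enc.update(bytes(nbytes))

    # keystream = stream_from_file("some_seed_file.txt", 64)

Whether this is secure still depends entirely on the enemy not knowing or
guessing which file was used, which is the weakness noted above.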

In the other case, the key seems to be choice of a prime
number. You need a large prime, perhaps 100 bits or more,
to make brute force guessing sufficiently hard. Is your
arithmetic then acceptably efficient? Can an enemy deduce
the prime, or even narrow down the search dangerously,
if he sees enough of your output?



Re: Two ideas for random number generation

2002-04-20 Thread Morlock Elloi

For the start, before deeper analysis, it would be a good idea to run Diehard
on the output, just to check for the obvious problems.







Two ideas for random number generation

2002-04-20 Thread gfgs pedo

hi,

Here are two ideas which came up in my mind.
Since I have done a few diagrams for illustration and
thought that it would not be a good idea as an
attachment, I have put the ideas at the following URL:
http://www.ircsuper.net/~neo/

I sincerely appreciate your comments. Thank you for your
time.

Regards Data.


