Cryptography-Digest Digest #322, Volume #13 Wed, 13 Dec 00 17:13:01 EST
Contents:
Re: YAPRNG (John Myre)
electronic voting article in comp.risks (John Myre)
Re: How to embed a key in executable code? ([EMAIL PROTECTED])
Re: Protocol for computer go (Paul Rubin)
Re: Virtual memory security hole? (Paul Rubin)
Re: Software PRNG.. ([EMAIL PROTECTED])
Re: On using larger substitutions (Simon Best)
Re: Sr. Cryptographer/mathematician (John Savard)
Re: Unguessable sequence of unique integers? ("Brian Gladman")
Re: Probability of collision in hash ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: John Myre <[EMAIL PROTECTED]>
Subject: Re: YAPRNG
Date: Wed, 13 Dec 2000 14:03:17 -0700
David Wagner wrote:
>
> If you xor two linear keystream generators together,
> the result is still linear.
<snip>
Well, assuming XOR is part of the linear algebra in the
first place.
Just to be argumentative, suppose that we have an LCG, except
instead of (ax + b) we compute (x^a * b). (I don't mean
to imply that xor'ing two such streams would be secure, or
practical, only that the result is not so obviously linear).
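For concreteness, here is a minimal sketch of such a generator in
Python; the modulus, exponent, and multiplier are arbitrary
illustrative choices, and no security claim is intended:

  # A "power generator" update x <- (x**a * b) mod m, in place of the
  # LCG update x <- (a*x + b) mod m.  Parameters are illustrative only.
  def power_generator(seed, a=3, b=7, m=2**31 - 1):
      x = seed % m or 1   # keep the state nonzero so it never sticks at 0
      while True:
          x = (pow(x, a, m) * b) % m
          yield x

  # XOR of two such streams, purely to illustrate the point above;
  # this is not claimed to be secure or practical.
  g1, g2 = power_generator(12345), power_generator(67890)
  print([next(g1) ^ next(g2) for _ in range(8)])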
JM
------------------------------
From: John Myre <[EMAIL PROTECTED]>
Subject: electronic voting article in comp.risks
Date: Wed, 13 Dec 2000 14:10:17 -0700
Pursuant to a rather active recent thread here, about
electronic or internet voting, there is an interesting
article in comp.risks (a moderated newsgroup), in the
entry "Risks Digest 21.14".
JM
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: How to embed a key in executable code?
Date: Wed, 13 Dec 2000 21:15:24 GMT
In article <pICZ5.1700$[EMAIL PROTECTED]>,
"Matt Timmermans" <[EMAIL PROTECTED]> wrote:
> She knows what the computer did, but it can take a lot more than
> skill at reading assembly to understand it. To have the kind of
> power you attribute to her, she needs to be able to do things like
> solve stopping problems -- another thing which is common in
> practice, even though the problem is known to be undecidable in
> general.
No she does not need to solve the stopping problem. She only needs to
solve the reverse-engineering problem, which is substantially easier.
>
> This is the crux of our disagreement. My argument is "it is not
> proven, in theory, that Alice can do this". You keep saying "yes she
> can". "yes she can" does not constitute a proof, and so it does not
> refute my argument, even though you have lots of empirical evidence
> to back you up.
Correction, your argument seems to be "No she can't" while mine is
based on known proven facts about the problems involved. Alice
possesses the puzzle, the solution, the steps taken to generate the
solution, the verifier for the solution, and an oracle to generate as
many of these as needed. With that information there are no problems
that cannot be solved. Given that information every hash problem, every
encryption problem, every problem with steps to achieve that solution
can be solved. You are merely presenting your personal blindness as
mathematical fact.
> In my reckless youth I wrote programs that I cannot understand today
> (in programming, too, power accrues more quickly than wisdom ;-).
> Code that is well obfuscated by a keyed process that is designed to
> obfuscate might very well be incomprehensible to anyone who doesn't
> have the key.
There still exists an entity that understands the programs you wrote.
Simply because you have chosen to change that entity in such a way
that it no longer remembers does not change the fact of that
existence.
>
> > Claims about no algorithm being known that is polynomial time make
> > no difference to possible/impossible, the presence of an algorithm
> > proves possible, regardless of the running time.
>
> P-time matters, because I said "feasible", not "possible" -- reverse
> engineering can be infeasible in the same way that finding SHA
> collisions is infeasible (it certainly is algorithmically possible,
> even though no one could do it). P-time is simply the best working
> definition of "feasible" that we have.
This discussion only became a discussion of feasibility when you chose
to take your half of it there. I have from the beginning been
discussing possibility. It is impossible to embed data in an
executable that someone analyzing the executable cannot find. It is
impossible to have a computer do something that a properly equipped
user cannot uncover.
>
> > Reverse engineering is very possible, the engineer is unlikely to
> > attain an exact copy of the starting code, however a functional
> > equivalent is all that is necessary.
>
> Comparing programs for equivalence is also undecidable.
It is undecidable for a computer. However, we are talking about
engineers, not computers. A human is much more powerful than a Turing
machine.
> The trust model is a different question
The trust model is exactly what dictates the security requirements; it
is a base question. You are considering things based on assumptions
you do not realize you have. You have asked what color the sky is,
without realizing that the questions you have implicitly assumed are
how high you are and what constitutes air.
> , but that works too. With the method of embedding personal
> information, it works because Alice doesn't want to release the
> software in its original form.
All she needs to do is create a system where users can place their own
information in there. A scheme that embeds a name can of course be
given a false name; phone, address, etc. can also be false. If you
embed a credit card number the user only has to make use of the
transience of credit cards, especially if you accept American Express,
which now offers one-time numbers.
> You have to trust the process that gets you Alice's personal
> information, but that's outside the scope of this discussion,
If you intend to use it as a deterrent it is very much inside the
scope of this discussion. <applicable to US only, all other areas may
differ greatly> It is entirely possible for an individual to register
a legitimate alias with the police, use that alias to open a post
office box, get a credit card for that alias, use that credit card to
purchase, and register the termination of that alias. Those acts are
perfectly legal, and without a court order (or the prior commission of
a felony) the police are bound by law not to reveal that information.
That gives a perfectly legal, verifiable name, shipping address,
credit card, etc., and no traceability to the individual.
> and you must trust that Alice doesn't want to go
> to jail, which is required for _all_ of our legal protection.
Legal arguments will get you nowhere in a discussion of piracy. First
you were discussing possible, and you failed; then you discussed
feasible, which is also failing; now you're moving on to legislation,
which has nothing to do with the problem, because distribution through
freedomnet will certainly take care of any possible prosecution.
> We do not need to trust Alice or her computer or her disk -- she can
> redistribute the software in its original form if she likes, but
> that makes it likely that she will get caught.
Only if she's stupendously stupid (how do you prosecute someone who
doesn't exist?). You also have to trust that she will not remove the
data, and/or alter the data-verification mechanism.
>
> We _do_ need to trust that, even with ICE et cetera, and all of the
> code available to her, Alice cannot modify the software in such a
> way that will preserve its functionality while removing her personal
> information. There is no reason, in principle, to dismiss this as
> impossible.
There are plenty of reasons. Alice has possession of the puzzle, the
solution to the puzzle, instructions to solve the puzzle, and an oracle
to generate the solution and instructions given a puzzle.
Joe
Sent via Deja.com
http://www.deja.com/
------------------------------
From: Paul Rubin <[EMAIL PROTECTED]>
Subject: Re: Protocol for computer go
Date: 13 Dec 2000 13:34:38 -0800
Francois Grieu <[EMAIL PROTECTED]> writes:
> On the computer-go mailing list [1] there is ongoing debate
> on how to design a realistic protocol for computer go.
The corresponding question for chess gets discussed on
rec.games.chess.computer fairly regularly. You might want to ask
there instead of sci.crypt.
Basically though it's a hard problem. It was a subject of dispute
after the Kasparov-Deeper Blue match when Kasparov accused IBM of
having a human intervene with the machine's thinking and give it
some good moves.
The strongest chess programs are nondeterministic. They use multiple
processors and might choose a different variation if one processor
finishes evaluating some subtree a little earlier than another one,
which in turn depends on the exact sequence of cache hits or whatever.
So the exact move they pick is decided partly by the whole system, not
just the high level algorithm.
Think of a random number generator that's using all kinds of sources
of system entropy, like disk operation timings, to produce output, and
the difficulty in getting such a system to produce the same output
sequence twice.
------------------------------
From: Paul Rubin <[EMAIL PROTECTED]>
Subject: Re: Virtual memory security hole?
Date: 13 Dec 2000 13:37:56 -0800
"Harvey Rook" <[EMAIL PROTECTED]> writes:
> > Why would your memory get written to disk in a crash? I mean it
> > might happen because of some bug, but your files might also get
> > broadcast over the network because of some bug.
>
> It is very common for OS's to dump a program's memory to disk when
> the program commits an access violation such as accessing a NULL
> pointer. The dumped memory can be forwarded to the programmer for
> diagnosis. The UNIX term is 'core dump'; the Windows NT term is
> "crash dump".
At least in unix, those dumps end up in the file system, not the swap
area. If you're using an encrypted file system, the dump will be
encrypted.
> The whole issue of accessing the swap file, and memory dumps raises
> a number of hard questions. Under what circumstances would your
> attacker be able to read the swap file, yet not be able to place a
> keyboard bug (Either hardware or software) on your system. A hole in
> your network security? A stolen laptop?
It may be easier for someone to seize or steal your machine with your
knowledge than for them to tamper with it without your knowledge.
If you think someone has been messing with your secure machine, you
probably should not trust the machine after that.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Software PRNG..
Date: Wed, 13 Dec 2000 21:30:47 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
[Paul Crowley said RC4 has a slight bias]
> Although, I've a question about this term "biased", what do you mean
> with that something is "biased"?
It means, as was said before, that there is something about a long
stream of numbers that makes the likelihood of predicting a bit higher
than 1/2. In the particular case of RC4 it means that the probability
of an (8-bit) value being immediately followed by the same value is
about 1/256 + 1/2^24, whereas for an unbiased stream it would be
exactly 1/256. Like Paul said, it's a slight bias, but it can be very
harmful in some situations. There is another known issue with RC4: the
first byte is biased towards 0. This shows up whenever the first byte
of the key plus the second byte of the key is 0 mod 256 (the sum is
0, 256, 512, ..., k*256).
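To make the first point concrete, here is a minimal sketch
(illustrative only; the key and stream length are arbitrary) of
counting how often a keystream byte is immediately followed by the
same byte:

  # Plain RC4 (KSA + PRGA), then count repeated adjacent bytes.  The
  # bias discussed above is only on the order of 2^-24, so a short
  # stream like this will not show it; detecting it reliably needs
  # very long keystreams.
  def rc4_keystream(key, n):
      S = list(range(256))
      j = 0
      for i in range(256):                          # key scheduling
          j = (j + S[i] + key[i % len(key)]) % 256
          S[i], S[j] = S[j], S[i]
      i = j = 0
      out = []
      for _ in range(n):                            # keystream generation
          i = (i + 1) % 256
          j = (j + S[i]) % 256
          S[i], S[j] = S[j], S[i]
          out.append(S[(S[i] + S[j]) % 256])
      return out

  ks = rc4_keystream(b"an illustrative key", 1_000_000)
  repeats = sum(1 for x, y in zip(ks, ks[1:]) if x == y)
  print(repeats / (len(ks) - 1), "vs ideal", 1 / 256)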
Sent via Deja.com
http://www.deja.com/
------------------------------
From: Simon Best <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: On using larger substitutions
Date: Wed, 13 Dec 2000 21:41:25 +0000
Hello!
I'm new here (but have been lurking awhile)...
Mok-Kong Shen wrote:
[...]
> One generates four 8-bit substitution tables. The two
> bytes of the given 16 bits are first transformed by the
> first two tables respectively. The result is circularly
> shifted 4 bits and the remaining two substitution tables
> are applied in the same manner.
[...]
> M. K. Shen
> -------------------------------
> http://home.t-online.de/home/mok-kong.shen
Or how about:
1: Apply the first two S-boxes to the two bytes.
2: Use a method similar to Rijndael's column mixing to mix the two
bytes.
3: Apply the second two S-boxes.
? It's the same as yours, but with something a little different for the
step in the middle.
For that middle step, you could use:
(1) b[1]x + b[0] = ( c[1]x + c[0] )( a[1]x + a[0] ) mod x^2 + 1
where:
('[i]' means 'subscript i',)
a[1], a[0] are the original high and low bytes,
b[1], b[0] are the final high and low bytes, and
c[1], c[0] are constants.
The bytes would be treated as elements of GF(2^8) just like in Rijndael,
where things are done modulo x^8 + x^4 + x^3 + x + 1 (or '11B' for
short).
That middle step boils down nicely to:
(2) b[1] = c[0]a[1] + c[1]a[0]
(3) b[0] = c[1]a[1] + c[0]a[0]
This doesn't work for any old constant bytes, as you want the thing to
be invertible (presumably). '01' and '02' work, as do '10' and '11'
(hexadecimal). '10' and '11' are interesting, as this function is then
an involution, and you get:
(4) b[1] = '11'a[1] + '10'a[0]
(5) b[0] = '10'a[1] + '11'a[0]
(6) a[1] = '11'b[1] + '10'b[0]
(7) a[0] = '10'b[1] + '11'b[0]
It's nice'n'easy to implement, too, as you can do it with an extra S-box
and some xors. (4) and (5) can be expressed as:
(8) b[1] = a[1] + '10'( a[1] + a[0] )
(9) b[0] = a[0] + '10'( a[1] + a[0] )
which both include the term "'10'( a[1] + a[0] )". That's easy to
compute, as all you need do is xor a[1] and a[0] together (GF(2^8)
addition), and use the result to index the S-box used to do the
multiplication by '10'. Then, just xor what came back from the S-box
onto the two bytes, and you've done it!
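For what it's worth, here is a minimal sketch of that middle step in
Python (the reduction polynomial is Rijndael's '11B'; the test values
are arbitrary):

  def gf_mul(a, b):
      # Multiply two GF(2^8) elements modulo x^8 + x^4 + x^3 + x + 1
      # (0x11B).
      r = 0
      for _ in range(8):
          if b & 1:
              r ^= a
          b >>= 1
          a <<= 1
          if a & 0x100:
              a ^= 0x11B
      return r

  MUL10 = [gf_mul(x, 0x10) for x in range(256)]     # the extra S-box

  def mix(a1, a0):
      # Equations (8) and (9):
      #   b1 = a1 + '10'( a1 + a0 ),  b0 = a0 + '10'( a1 + a0 )
      t = MUL10[a1 ^ a0]
      return a1 ^ t, a0 ^ t

  # With the constants '10'/'11' the map is an involution.
  assert mix(*mix(0xAB, 0xCD)) == (0xAB, 0xCD)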
Of course, there are lots of other ways to make big S-boxes out of small
ones. Block ciphers do it all the time... It's what many block ciphers
are, isn't it?
Simon
--
_______________________________________________________________________________
Personal: [EMAIL PROTECTED]
Yellow Skies: [EMAIL PROTECTED] http://www.yellowskies.com
Everyone does their own signature to be different. How does that work?
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Sr. Cryptographer/mathematician
Date: Wed, 13 Dec 2000 20:49:42 GMT
On Tue, 12 Dec 2000 15:17:36 GMT, Tom St Denis <[EMAIL PROTECTED]>
wrote, in part:
>In article <2IfZ5.17746$[EMAIL PROTECTED]>,
> "Kevin" <[EMAIL PROTECTED]> wrote:
>> techniques. You will also be expected to design new applications in
>> the above areas for incorporation into our secret-hiding, tamper
>> proof software encoding tools and to program key components to
>> incorporate the designs into our tool set.
>Tamper proof encoding tools? Shaw right.
Others have already taken you to task for confusing combinatorics
(counting Hamiltonian graphs and stuff) - a mathematical specialty
with, admittedly, some relevance to computation - with computing
science, and for confusing number theory (prime numbers and stuff)
with numerical analysis (solving differential equations numerically on
computers and stuff).
But here, you _almost_ made a valid criticism.
Tamper-resistant, and _nearly_ tamper-proof, cryptographic *hardware*
is a valid goal; it may be imperfect, but making the best possible
effort along those lines is still worthwhile, and thus military cipher
machines do this.
Tamper proof _software_ encoding tools?
Again, it's not unreasonable to try to make it as difficult as
possible to reverse-engineer software in order to defeat
piracy-prevention code (e.g. dongle detection). But this *is* known to
be inherently very limited; it cannot be done with the same security
as is provided even by a mediocre form of encryption.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: "Brian Gladman" <[EMAIL PROTECTED]>
Subject: Re: Unguessable sequence of unique integers?
Date: Wed, 13 Dec 2000 21:44:51 -0000
"Bryan Olson" <[EMAIL PROTECTED]> wrote in message
news:914rn0$1gh$[EMAIL PROTECTED]...
>
> John Savard wrote:
>
> > Johan Hanson wrote:
> > >I suppose I could do this
> [generate unguessable 32-bit integers without repetition]
> > > by using a simple counter
> > >and encrypting the result with a symmetric algorithm.
> >
> > That generally would be believed to be the only safe way of doing
> > this; however, the problem exists that if you use a block cipher with
> > a 32-bit block, to guarantee uniqueness of the 32-bit units output, it
> > is not believed that a block cipher with so short a block can be made
> > secure.
>
> A 32-bit block cipher should be fine. The small-codebook
> problem is the _only_ security issue that pushes us to
> larger block sizes. That problem is irrelevant in this
> case; what he wants is the permutation, not the cipher.
Maybe I have not fully understood what is proposed but would there not be a
danger that an adversary could obtain the full sequence from a small number
of samples in such a situation?
What I am thinking of is the proposed approach in which a 32-bit cipher is
fed by a simple counter. In this case, if the cipher key was short, it
would be possible to do an exhaustive key search by 'decrypting' two
adjacent numbers to look for keys that resulted in original 32-bit values
that differed by one. And if two blocks were not a good enough
discriminant, additional blocks could be decrypted in those situations where
trial decryptions gave adjacent values.
Of course I don't have a threat model (criteria to clarify what the
originator meant by 'unguessable') but I am assuming that being able to
recreate the whole sequence given a small contiguous sub-sequence would be a
problem.
Hence while the cipher might have a short block length, the key length would
need to be significantly longer. And here I admit that I have not given any
thought to the consequences of using short block, long key ciphers in this
application. Would this be adequately secure given threats of the above
type (or others)?
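For concreteness, here is a minimal sketch (illustrative only, not a
vetted cipher) of the construction under discussion: a counter fed
through a keyed 32-bit permutation, built here as a short Feistel
network over 16-bit halves with HMAC-SHA1 as the round function, so
the key can be much longer than the block:

  import hmac, hashlib

  def feistel32(value, key, rounds=4):
      # A keyed permutation on 32-bit values: any Feistel network is
      # invertible, so distinct counters give distinct outputs.
      left, right = value >> 16, value & 0xFFFF
      for r in range(rounds):
          msg = bytes([r]) + right.to_bytes(2, "big")
          digest = hmac.new(key, msg, hashlib.sha1).digest()
          f = int.from_bytes(digest[:2], "big")
          left, right = right, left ^ f
      return (left << 16) | right

  key = b"an illustrative long secret key"
  print([feistel32(counter, key) for counter in range(10)])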
Brian Gladman
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Probability of collision in hash
Date: Wed, 13 Dec 2000 21:38:10 GMT
In article <917vo9$7kd$[EMAIL PROTECTED]>,
Pawel Krawczyk <[EMAIL PROTECTED]> wrote:
> 1. How to estimate probability of collision for a hash trimmed to
> given length - for example 96 bits out of total 160 for SHA1 etc.
> (like in HMAC-SHA1).
Given a good hash algorithm (which we currently believe MD5 and SHA-1
to be), the estimate is the simple birthday bound: a collision becomes
likely once you have hashed about 2^(output size / 2) values. So if
you truncate either hash to 96 bits, you can expect a collision after
roughly 2^(96/2) = 2^48 fingerprints.
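As a rough illustration of that estimate (the figures below are just
the standard birthday approximation, not a property of SHA-1 itself):

  import math

  def collision_probability(k, n_bits):
      # Birthday approximation: P(collision among k random n-bit
      # values) is about 1 - exp(-k*(k-1) / 2^(n+1)); it reaches ~50%
      # near k = 2^(n/2).
      return 1.0 - math.exp(-k * (k - 1) / 2.0 ** (n_bits + 1))

  # For a 96-bit truncated hash, collisions only become likely
  # around 2^48 fingerprints.
  for k in (2**30, 2**40, 2**48):
      print(k, collision_probability(k, 96))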
>
> 2. Is the avalanche effect distributed equally along the result hash?
> I mean, are the terminating bits of the hash equally depending on
> changes in plaintext as those from beginning?
Again, that is currently believed to be the case. However, if you are
concerned about this, simply take the part you chopped off and XOR it
with the remaining part, or just leave the hash at its full size.
> I was wondering how to fingerprint keys best, that's why I'm asking.
> Thanks for your replies or just pointers to some documents.
I'd suggest that you leave the hash at its full size (don't clip it)
and express it to the user in hex so that it is relatively easy to
read over the phone. You might also want to look into using a word
list like PGP does; some people really like being able to say "choking
aimless cowbell..." instead of "34C". Also, depending on how many
people you are expecting and how strong you want the collision
resistance to be, you may actually want to move to a larger hash, like
Tiger-192, SHA-256, SHA-384, or SHA-512.
Joe
Sent via Deja.com
http://www.deja.com/
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************