Cryptography-Digest Digest #607, Volume #14 Wed, 13 Jun 01 20:13:01 EDT
Contents:
Re: Karnaugh maps (try #2) (Sam Yorko)
Re: shifts are slow? (Bob Jenkins)
Re: Alice and Bob Speak MooJoo ("Robert J. Kolker")
Re: Unicity distance and compression for AES (Tim Tyler)
Re: IV (Tim Tyler)
Re: IV (Tim Tyler)
Re: Yarrow PRNG (Tim Tyler)
FIPS 140-1 test ("Dobs")
RNG ("Dobs")
Re: RNG (SCOTT19U.ZIP_GUY)
Re: Sophie-Germain Primes for sale (Anton Stiglic)
----------------------------------------------------------------------------
From: Sam Yorko <[EMAIL PROTECTED]>
Subject: Re: Karnaugh maps (try #2)
Date: Wed, 13 Jun 2001 13:23:25 -0800
Jeffrey Walton wrote:
>
> : So the Kernaugh map is just a way to
> : optimize the expressions for where a 1 occurs in the table?
>
> I find it easier and less error prone than reducing by hand. As jlcooke
stated, it can be done with the maxterms (the 0s) also.
>
> Also, this method only works for 4 inputs (possibly 6 if you can do this
> with a cube - that would be impressive, but not impossible).
>
Actually, I had to do 5 and 6 inputs in college. What you do is to
split each of the squares in the table into two or four pieces using
diagonal lines. Then, you have to be real careful in making the
circles. It's a real pain, but can be done....
------------------------------
From: [EMAIL PROTECTED] (Bob Jenkins)
Subject: Re: shifts are slow?
Date: 13 Jun 2001 13:29:56 -0700
"Niels Jørgen Kruse" <[EMAIL PROTECTED]> wrote in message
news:<xhJU6.173$[EMAIL PROTECTED]>...
> In article <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] (Bob Jenkins) wrote:
> > My old model of the world had +-^|&~ take 1 cycle, tab[] take 2,
> > if() take 5 if it guesses wrong, * take 10, and / take 20. That's
> > apparently no longer close to reality. What is the new reality?
>
> This depends very much on the CPU. For the PPC7450, the timings are
> (latency,throughput):
>
> +-^|&~ (1,1) (except arithmetic right shift
> and other oddities)
> tab[], int/vector (3,1)
> tab[], floating point (4,1) (distance from L1 to FP register file is
> larger than to int and vector r. files)
> mispredict 6 (minimum)
> 32*8 bit (3,1)
> 32*16 bit (3,1)
> 32*32 bit (4,2)
> / (23,23)
>
> These are the latencies as far as forwarding is concerned. Getting condition
> codes ready takes another cycle.
>
> What was your old world? Pentium MMX? I believe load is 3 cycle latency on
> Pentium III.
My old model was a gestalt of many chips from about 1992.
Those timings on multiplication look very good. I should look into
replacing shifts with multiplications. I suppose I should investigate
tab[] too.
------------------------------
From: "Robert J. Kolker" <[EMAIL PROTECTED]>
Subject: Re: Alice and Bob Speak MooJoo
Date: Wed, 13 Jun 2001 17:07:27 -0400
David A Molnar wrote:
>
> I think the issue here is in the model, then. Normally we say Eve has
> access only to the communication between Alice and Bob. As you point out,
> given these assumptions about language, this means Eve gets noise as long
> as she cannot observe Alice and Bob's referents.
How would Eve know whether A/B are discussing the weather,
the stock market or the war du jour? Is Eve in a position to
force a topic of discussion on A/B? If so, some kind of referent
could be teased out; otherwise no. Is there anything corresponding
to the chosen-plaintext attack here? I don't think so.
In the case of the Navajo Code Talkers, the Japanese who were
eavesdropping on their communication had some idea of what
the Code Talkers were talking about, but were unable to tease
out any specific words and meanings. If the Navajo Code talk
had a lot of synonyms then the frequency analysis would fail,
as the synonyms would serve as homophones.
>
> Except in the real world, Eve may observe some referents of Alice and Bob.
> Troop movements, stock prices, whatever. Alice and Bob have to talk about
> *something*, since the claim is that all language must refer to
> *something*. So is this something accessible to Eve "in the real world" or
> is it not?
The nature of eavesdropping is that there is no ostension. We learn our
first language with the aid of a pointing finger. This method of
indicating the referent non-verbally is completely lacking when
all Eve can do is listen to a conversation but not * see * what
is being referred to.
>
>
> If it is accessible, then Eve can learn to speak MooJoo.
> If it is not accessible, then what's the point of Eve observing Alice and
> Bob? They won't affect her anyway; any effect they might have would create
> a referent which would allow Eve to start learning MooJoo.
My point exactly. Alice and Bob will have a secure conversation.
>
>
> So the model of Eve having access only to Alice and Bob's talk seems
> possibly too restrictive to be useful here.
I brought it up to show a mode of secure communication that does not
depend on encoding or encryption.
Bob Kolker
"We have learned your Earth languages from your radio
broadcasts ..." ---- almost any sci fi movie made in the 1950's
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Wed, 13 Jun 2001 21:19:11 GMT
[EMAIL PROTECTED] wrote:
: Tim Tyler wrote:
:> Here is the model I think you should have adopted:
:>
:> Alice -> [encrypt] -> ...eve... -> [decrypt] -> Bob
:>
:> Unicity distance is as measured by Eve.
:>
:> We are interested in the effect of replacing [encrypt] and [decrypt]
:> by [compress encrypt] and [decrypt decompress] respectively.
:>
:> What you have done is not /just/ make changes to the stages represented
:> inside square brackets.
:> You have made fundamental changes to the set of messages which are sent
:> and their probabilities.
: How is this avoidable if messages are compressed from n=4 to n=3?
: There are 16 possible messages before compression and 8 possible after
: compression.
That is not how compression programs function. I'm talking about
conventional compression programs, like zip, Huffman coding, arithmetic coding, etc.
These allow you to compress a file, decompress it again and wind up with
the same result.
:> Your system after the compressor is added can't even transmit some
:> messages that were transmitted before - and *all* the probabilities of
:> the messages occurring are changed!
: This isn't surprising. [...]
It was surprising to me. How you could think comparing the resulting
systems had any meaning was most puzzling.
: Can *any* compressor compress *all* possible messages in the original
: message space?
Of course not - but that is not how compressors operate either. They
compress some files, and expand others.
:> Why should putting in a compressor affect the frequency of the messages
:> being sent? It should not.
:>
:> If you /must/ use a lossy compressor at least set the probability of
:> messages it can't handle to be 0 initially, or you are comparing different
:> systems with different message spaces.
: Again, doesn't one *have* to compare different message spaces since the
: value of n is different after compression??
No, you should be comparing the same message spaces, otherwise the
comparison is completely nonsensical.
:> Also, do not change the probabilities of the messages occurring, or else
:> you are comparing entropies in very different systems - and it is no
:> surprise that your unicity distances are all over the map.
: The probabilities in the original message space are what they are. If
: they are non-zero then assigning them zero probability isn't an option;
: *that* would be altering the message space.
Of course. But *don't* apply a compressor that can't accept these
messages as inputs and pretend that nothing has changed.
: All the probabilities in my compressed message spaces sum to one [...]
No they don't. Add up the probabilities in your second column, and you
will find they exceed one. A minor arithmetic error on your part.
I ignored it until now because there were other far more important errors
to address.
: Propose other probabilities.
Leave the probabilities completely alone, or you are not comparing
equivalent systems.
If you wish to persist with this nonsense, I recommend you do as I
suggested - consider probabilities after adding a compressor and
decompressor without modifying the rest of the system in any way at all.
In particular *don't* have Alice send Bob messages with different
probabilities in the new system.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: IV
Reply-To: [EMAIL PROTECTED]
Date: Wed, 13 Jun 2001 21:27:34 GMT
Volker Hetzer <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
: >Volker Hetzer <[EMAIL PROTECTED]> wrote:
: But, even for block ciphers, if you *want* to transmit arbitrarily
: long messages you have to transmit the length or some end sequence, thereby
: adding redundancy in a way that an attacker can use to distinguish possible
: from impossible plaintexts.
I believe you are mistaken there. The length can be sent in the clear -
and then no clues are added to distinguish possible from impossible
decrypts. This is the approach used by BICOM, for example.
Transmission protocols are such that the length is normally sent in the
clear anyway.
: OTOH, with the counter mode you don't have a full block, making analysis
: harder.
Hmm. I don't think anyone's discussing breaking the cypher here.
: How do they balance with a real world block cipher?
Your question is not clear. You're talking about Wagner's paper?
:> : However, I still fail to see the problem (my fault likely enough):
:> : Given a CTR encrypted message of one bit, i.e. a 0 or a 1, how
:> : would you go about decrypting it?
:>
:> You immediately know it is not one of the (possibly zillions)
:> of possible messages larger than 1 bit - and can narrow the
:> possibilities down to one of two plaintexts. This is not what
:> encryption ought to be about IMO.
:
: Well, do you think of the otp as an example of what encryption is
: not about too? After all it has the same flaw CTR has.
Yes - the conventional OTP is an equally flawed system when used to
transmit small messages.
As you say, it has exactly the same problem as CTR mode has.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: IV
Reply-To: [EMAIL PROTECTED]
Date: Wed, 13 Jun 2001 21:31:41 GMT
Cristiano <[EMAIL PROTECTED]> wrote:
: "Tim Tyler" <[EMAIL PROTECTED]> ha scritto nel messaggio
:> Cristiano <[EMAIL PROTECTED]> wrote:
:> : [...] could you tell me if there is any weakness in my method?
:>
:> The fact that identical plaintext blocks (every N bytes) will result in
:> identical cyphertexts /is/ a weakness - if not an earth shattering one.
:>
:> An attacker who observed repeated blocks at those intervals can tell that
:> the plaintext repeated itself. He should not be able to do this.
: In this sense also ECB mode is a weakness.
Yes indeed.
That is one reason why use of ECB mode is not normally recommended.
This problem is covered in Applied Cryptography, 2nd edition, p. 190.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Yarrow PRNG
Reply-To: [EMAIL PROTECTED]
Date: Wed, 13 Jun 2001 21:38:49 GMT
Mark Wooding <[EMAIL PROTECTED]> wrote:
: [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
:> 1. Has anyone reviewed the properties of the PRNG? Is it as good as
:> it claims?
: The paper claims 160-bit security for Yarrow-160. If you interpret that
: to mean that you can't tell the difference between Yarrow-160 and real
: random data with less than 2^{160} effort then the claim is false. [...]
FWIW, I've found the Yarrow paper (and the other paper by the same folk
about analysing PRNGs) to be very useful material.
Although the specific system they proposed has one or two problems, there
still seems to me to be much of value in the general approach.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: "Dobs" <[EMAIL PROTECTED]>
Subject: FIPS 140-1 test
Date: Thu, 14 Jun 2001 00:20:37 +0200
I am looking for source code for the FIPS 140-1 statistical tests for
randomness, which are used for high-security applications (that's what is
written in the Handbook of Applied Cryptography :). Could somebody mail me
a working copy of it? I would also be interested in any other working
statistical test code.
I've got one copy of FIPS 140-1 but it is not working. Does anybody know why
it's not working? Here is the code:
/*
Author: Pate Williams (c) 1997
The test suite is according to FIPS 140-1. See "Handbook
of Applied Cryptography" by Alfred J. Menezes et al,
Section 5.4.4, pages 181-183, and Algorithm 5.40, page 186.
*/

#include <stdio.h>

#define BIT_STRING_LENGTH 20000l
#define MONOBIT_LO 9654l
#define MONOBIT_HI 10346l
#define POKER_LO 1.03
#define POKER_HI 57.4

int FIPS_140_1(char *bit_string)
/* Statistical tests for randomness. Returns -3 if bit_string
   fails the monobit test, -2 if it fails the poker test,
   -1 if it fails the runs test, 0 if it fails the long run
   test, 1 otherwise. bit_string must hold the values 0 and 1,
   not the characters '0' and '1'. */
{
  double X3, sum = 0.0;  /* was float: printf's %lf expects double */
  long B[6] = {0}, G[6] = {0}, n[16] = {0};
  long i, index, j, k, max_run = 0, n1 = 0;
  long interval_lo[6] = {2267, 1079, 502, 223, 90, 90};
  long interval_hi[6] = {2733, 1421, 748, 402, 223, 223};

  /* monobit test */
  for (i = 0; i < BIT_STRING_LENGTH; i++)
    if (bit_string[i] == 1) n1++;

  /* poker test */
  i = 0;
  k = BIT_STRING_LENGTH / 4;
  for (j = 0; j < k; j++) {
    index = 8 * bit_string[i + 3] + 4 * bit_string[i + 2]
          + 2 * bit_string[i + 1] + bit_string[i];
    n[index]++;
    i += 4;
  }
  for (i = 0; i < 16; i++) sum += (double)n[i] * n[i];
  X3 = 16.0 * sum / k - k;

  /* runs test: count runs of 1s (blocks) */
  i = 0;
  while (i < BIT_STRING_LENGTH) {
    j = 0;
    while (i < BIT_STRING_LENGTH && bit_string[i] == 1) i++, j++;
    if (j > 0) {  /* guard: the original indexed B[-1] whenever
                     the string began with a 0 */
      if (j <= 6) B[j - 1]++; else B[5]++;
      if (j > max_run) max_run = j;
    }
    while (i < BIT_STRING_LENGTH && bit_string[i] == 0) i++;
  }
  /* runs test: count runs of 0s (gaps) */
  i = 0;
  while (i < BIT_STRING_LENGTH) {
    j = 0;
    while (i < BIT_STRING_LENGTH && bit_string[i] == 0) i++, j++;
    if (j > 0) {  /* same guard against G[-1] */
      if (j <= 6) G[j - 1]++; else G[5]++;
      if (j > max_run) max_run = j;
    }
    while (i < BIT_STRING_LENGTH && bit_string[i] == 1) i++;
  }

  /* print out results of the tests */
  printf("monobit statistic: %ld\n", n1);
  printf("poker test statistic: %lf\n", X3);
  printf("# blocks\tgaps\n");
  for (i = 0; i < 6; i++)
    printf("%ld %4ld\t\t%4ld\n", i + 1, B[i], G[i]);
  printf("long runs statistic: %ld\n", max_run);

  /* compute return value based on statistics */
  if (n1 <= MONOBIT_LO || n1 >= MONOBIT_HI) return -3;
  if (X3 <= POKER_LO || X3 >= POKER_HI) return -2;
  for (i = 0; i < 6; i++) {
    if (B[i] < interval_lo[i] || B[i] > interval_hi[i]) return -1;
    if (G[i] < interval_lo[i] || G[i] > interval_hi[i]) return -1;
  }
  if (max_run >= 34) return 0;
  return 1;
}
Thanks
Best Regards!!!
Mike
------------------------------
From: "Dobs" <[EMAIL PROTECTED]>
Subject: RNG
Date: Thu, 14 Jun 2001 01:00:42 +0200
Following Bruce Schneier's book I made a random number generator as follows:
I took two sources from the computer's system. I managed to find two such
functions: lp.dwAvailPhys from GlobalMemoryStatus(&lp) and
timebuffer.millitm from _ftime( &timebuffer ), which seem to return varying
values from the system. I take the least significant bit of each and compare
them: if they are the same I compare a new pair of bits; if they differ I
take the first bit as my output. However my output looks like
1111111111111111000001111111111111111111111111111111, which
does not seem to be random. Here is my code; could somebody tell me what I
am doing wrong? Thanks. Best Regards. Mike
#include <stdlib.h>
#include <stdio.h>      /* for sprintf */
#include <time.h>
#include <conio.h>
#include <iostream.h>
#include <windows.h>
#include <sys/timeb.h>
#define BIT_MASK 1
char *GenBit(void){
  int bit1, bit2;
  unsigned long dig1, dig2;
  static char result[2]; /* was result[1]: sprintf of "0" or "1"
                            needs two bytes including the NUL */
  MEMORYSTATUS lp;
  struct _timeb timebuffer;
  int tries = 0;
  GlobalMemoryStatus(&lp);
  _ftime( &timebuffer );
  // cout<<timebuffer.millitm<<endl;
  // cout<<lp.dwAvailPhys<<endl;
  dig2 = lp.dwAvailPhys % 1119;
  dig1 = timebuffer.millitm;
  /* von Neumann step: shift until the low bits differ; bail out
     eventually, because the original goto loop ran forever once
     both words had been shifted down to equal values */
  for (;;) {
    bit1 = dig1 & BIT_MASK;
    bit2 = dig2 & BIT_MASK;
    if (bit1 != bit2 || ++tries > 10) break;
    dig1 >>= 1;
    dig2 >>= 1;
  }
  // cout<<"Bit1 = "<<bit1<<endl;
  // cout<<"Bit2 = "<<bit2<<endl;
  sprintf(result, "%d", bit1);
  return result;
}
int main(void){          /* was void main: not standard */
  char buf[2];
  static char result[1024]; /* static: zero-filled, so the string is
                               NUL-terminated after 100 writes */
  for (int i = 0; i < 100; i++) {
    Sleep(3);
    sprintf(buf, "%s", GenBit());
    result[i] = buf[0];
  }
  cout<<endl<<result<<endl;
  return 0;
}
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: RNG
Date: 13 Jun 2001 23:14:35 GMT
[EMAIL PROTECTED] (Dobs) wrote in <9g8rnd$kvt$[EMAIL PROTECTED]>:
>According to the Bruce Schneier book I made such a random number
>generator: I took two random sources from the system of the computer- I
>menaged to find to such a function : lp.dwAvailPhys from
>GlobalMemoryStatus(&lp) and timebuffer.millitm from _ftime( &timebuffer
>) which seems to return random values from the system. I took last
>significant bit of each, compare them:if it is the same I take to
>compare new couple of bits or if it is different I took as my output the
> first bit. However my output is like
>1111111111111111000001111111111111111111111111111111 which seems not to
>be random. Here is my code , could somebody tell me what I am doing
>wrong? Thanks. Best Regards.Mike
>
Since you asked. The first mistake was reading the BS crypto book.
At least that's my humble view. I just hope you weren't foolish and
spent a lot of bucks for it.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged or
something..
No I'm not paranoid. You all think I'm paranoid, don't you!
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Sophie-Germain Primes for sale
Date: Wed, 13 Jun 2001 20:04:46 -0400
Just go to Google and type "Sophie Germain prime"; the first hit
you get will be:
http://www.utm.edu/research/primes/glossary/SophieGermainPrime.html
Read the definition of Sophie Germain prime.
I'm not buying any of those primes you're selling...
--Anton
Tom St Denis wrote:
>
> Mark Wooding wrote:
> >
> > Tom St Denis <[EMAIL PROTECTED]> wrote:
> > > David Hopwood wrote:
> > > > Tom St Denis wrote:
> > > >
> > > > > A SG prime is of the form p = 2q + 1, where q itself is prime and of
> > > > > course p mod 4 = 3.
> > > >
> > > > No. It's not two weeks since I last corrected you on this (message ID
> > > > <[EMAIL PROTECTED]>).
> > > >
> > > > If p = 2q + 1 for p and q both prime, then q is a Germain prime, and
> > > > p is a safe prime. Also it is not part of the definition that p = 3
> > > > (mod 4) (counterexample: p = 5, q = 2, although admittedly that is
> > > > the only counterexample).
> > >
> > > Ok. Well if you checked the numbers you would realize they are all 3
> > > mod 4.
> >
> > Learn to read.
> >
> > The correction is that, when p = 2 q + 1 with p and q prime, it's *q*
> > which is the Sophie Germain prime. We in the crypto community call p a
> > `safe' prime, though I don't believe it has a name among general
> > mathematicians.
>
> I don't get why. q is just any prime (even a 'safe' one itself). I
> thought SG referred to p which is what I'm interested in anyways.
>
> > You're right that if q is odd then p = 2 q + 1 = 3 (mod 4). That's why
> > David said that 5 is the only safe prime not congruent to 3 (mod 4).
> >
> > > When I said "and of course p mod 4 = 3" I meant "of course I made them
> > > such that ...", although I can see how that could have been misleading.
> >
> > You couldn't have made nontrivial safe primes to have any other residue
> > mod 4.
>
> Rightly so which is why I forced the two bits to 1.
>
> Tom
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************