Cryptography-Digest Digest #571, Volume #11      Tue, 18 Apr 00 04:13:02 EDT

Contents:
  Re: Q: Entropy (Diet NSA)
  GOST related key attack ([EMAIL PROTECTED])
  Re: AES-encryption ("David C. Oshel")
  Re: Paper on easy entropy ("John E. Kuslich")
  Re: updated paper on easy entropy (Francois Grieu)
  Re: Regulation of Investigatory Powers Bill (Philip Baker)
  Re: Paper on easy entropy (Mok-Kong Shen)
  Page-tag encryption demonstration ("C. Prichard")
  Re: Regulation of Investigatory Powers Bill ("Stou Sandalski")
  Re: Paper on easy entropy (Niklas Frykholm)
  Re: updated paper on easy entropy (Terry Ritter)

----------------------------------------------------------------------------

Subject: Re: Q: Entropy
From: Diet NSA <[EMAIL PROTECTED]>
Date: Mon, 17 Apr 2000 14:14:14 -0700


In article <
8dfm9p$925$[EMAIL PROTECTED]>, Bryan
Olson <[EMAIL PROTECTED]> wrote:
>Diet NSA wrote:
>>
>> In article <8ddauf$npq$[EMAIL PROTECTED]>
>> , Bryan Olson <[EMAIL PROTECTED]>
>> wrote:
>>
>> >Again, depending on how references define Turing
>> >machines and universal Turing machines, they may
>> >place some other conditions on the machines.
>> >
>> In terms of additional conditions, I don't
>> see how the *type* of universal Turing
>> machine (UTM) used would matter. Could
>> anyone explain a counterexample, i.e. how
>> defining the complexity of infinite
>> strings *would* be dependent on which
>> UTM is used?
>
>For a trivial example, one of the parameters
>defining a Turing machine is an alphabet.  To
>get a consistent measure of complexity, we might
>limit our consideration to machines defined
>over the same alphabet.
>
>In addition to being Turing-complete, there is
>another requirement on the languages, which is
>that they can efficiently encode and delimit
>string constants in the program text.
>

If I understand correctly, the algorithmic
complexity of *finite* strings depends on
which UTM is used. If we switched to a
new UTM then the complexities would
change by a *bounded* function. If this is
true, then wouldn't the complexities of
*infinite* strings be definable
independently of which UTM is used?


"I feel like there's a constant Cuban Missile Crisis in my pants."   
    - President Clinton commenting on the Elian Gonzalez situation



------------------------------

From: [EMAIL PROTECTED]
Subject: GOST related key attack
Date: Tue, 18 Apr 2000 04:03:40 GMT

Hello all,

GOST is in general thought to be immune to subkey rotation related-key
attacks.  The key schedule has a twist in the last eight of the thirty-two
rounds that in general prevents such attacks.  I believe I have
found a small (2^64) class of keys that can be attacked via subkey
rotation, however.

Let A and B be 32-bit round keys.  If we have two 256-bit keys

(1) ABBAABBA
(2) BAABBAAB

The key schedule for 32 rounds is

(1) ABBAABBAABBAABBAABBAABBAABBAABBA
(2)   BAABBAABBAABBAABBAABBAABBAABBAAB

Note that by sliding the (2) key schedule two rounds, the schedules
match for 28 out of 32 rounds.  This is perfect for Biham's subkey
rotation related-key attack.  The twist is canceled out because the
keys are palindromes.
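The palindrome cancellation can be sanity-checked in a few lines. This is a sketch assuming the standard GOST 28147-89 schedule (the eight key words used in order for rounds 1-24, then reversed for rounds 25-32); the function name is mine:

```python
def gost_schedule(key_words):
    # GOST 28147-89 key schedule: subkeys K1..K8 are used in order for
    # rounds 1-24, then in reverse order K8..K1 for rounds 25-32 (the twist).
    assert len(key_words) == 8
    return key_words * 3 + key_words[::-1]

s1 = gost_schedule(list("ABBAABBA"))  # key (1)
s2 = gost_schedule(list("BAABBAAB"))  # key (2)

# Because both keys are palindromes, the reversed final pass looks just
# like a forward pass, so sliding schedule (2) by two rounds lines it up
# with schedule (1) over the entire 30-round overlap.
assert s1[2:] == s2[:30]
```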

If we assume the s-boxes are known, then I believe an attack can recover
the 64 bits of secret information faster than exhaustive search.  Note
that in GOST the s-boxes are shared between trusted computers, not generated
from the key, so it is possible to have known s-boxes.

The exhaustive search would take about 2^63 guesses or 2^68 rounds of
GOST.

To obtain the keys, one input must be found that matches encryption
under key (1) for two rounds.  The input is encrypted with key (2). Two
rounds of GOST is weak and the keys can be easily recovered.  When the
A and B suggested at the start match the A and B found at the end,
the key has been found.

If (L,R) and (L',R') are the two texts, the suggested values are
F(R^R') - L and F(L^L') - R'.  The ^ is exclusive-or in this case; F is
the inverse of the GOST F function.

Obtaining two pools of 2^32 known plain/cipher texts for keys (1) and (2)
will lead to 2^64 pairs of input.  Via the birthday paradox, one of
those pairs should be a good pair. To check a pair, we get the
plaintexts for both keys and calculate the suggested values for A and B.
We then get the ciphertexts for both and calculate the suggested values
for A and B again.  If they match, we have almost certainly found the
key.  False hits can be tested against a known plain/cipher text.

The calculation of the suggested keys is equivalent to four rounds of
GOST per pair.  On average we will find the key after 2^63 pairs. The
total work is then 2^65 rounds of GOST, which is 2^3 times faster than
exhaustive search.
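The work-factor arithmetic above can be checked directly (units are single GOST rounds, following the post's own accounting):

```python
from math import log2

pairs_expected = 2 ** 63            # average number of pairs tried
work_attack = pairs_expected * 4    # checking a pair costs ~4 GOST rounds
work_exhaustive = 2 ** 63 * 32      # 2^63 key guesses at 32 rounds each

assert log2(work_attack) == 65
assert log2(work_exhaustive) == 68
# The attack is 2^(68-65) = 2^3 = 8 times faster than brute force.
```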

Not much of an attack but I don't know of -any- other subkey rotation
attacks against 32 round GOST.  Schneier, et al. found a differential
related key attack against GOST.  Wagner has applied variants of the
slide attack to reduced GOST.

This attack might be sped up by the use of structures after guessing one
of the keys. I'll work on it.

As an aside, if we wanted to use 64-bit GOST, it would not be
unreasonable to fill the rest of the schedule by flip/flopping the
key.  So if our 64-bit key was AB, we might generate AB BA AB BA as the
256-bit key. Thus every key in this 'schedule' would be open to this
attack, hmm.

--Matthew




------------------------------

From: "David C. Oshel" <[EMAIL PROTECTED]>
Subject: Re: AES-encryption
Date: Mon, 17 Apr 2000 23:35:56 -0500

In article <[EMAIL PROTECTED]>, Tom St Denis 
<[EMAIL PROTECTED]> wrote:

> You are such a crank....

Well, now.  Bandwidth is certainly cheaper by the bit for that.  So the guy
blundered and used his own initials in a name that happens to be turf owned
by nist.gov.  Maybe he's a taxpayer in Norway and wasn't paying attention to
merely Aremican (sic) raves.  Really, when it comes down to it, isn't the
contest already over, and won't the practical effect be that those algorithms
which are a) public domain, and b) untainted by political allegiances will 
be the ones in practical daily use?  What the U.S. guvmint chooses to do is
its own business.  The five finalists will be discussed for years to come,
regardless of who gets the medal, and Rijndael will probably outweigh the
rest in terms of sheer installed code base in ten years time.


>...  First AES is already in development and
> finalazion [is that even a word?] ...

"Finalization" is the grayspeak you are searching for.  Voible (sic) clarity
suggests that the AES process will probably wend its way to a conclusion
with or without any or all of the denizens of sci.crypt, including the ones
who can spell offkey on purpose :)

... remainder also snipped ...

-- 
David C. Oshel           mailto:[EMAIL PROTECTED]
Cedar Rapids, Iowa       http://pobox.com/~dcoshel
``Tension, apprehension, and dissension have begun!" - Duffy Wyg&, in Alfred
Bester's _The Demolished Man_

------------------------------

From: "John E. Kuslich" <[EMAIL PROTECTED]>
Subject: Re: Paper on easy entropy
Date: Mon, 17 Apr 2000 23:04:31 -0700

Entropy is a measure of the uncertainty in any information field.

If you enter a room and you know in advance that all the people in that room
are named either Bob or Ralph, you only have to ask one yes-or-no question
("Is your name Ralph?") to be able to determine any individual's name.  There
is one bit of uncertainty.

On the other hand, if you enter a room with people whose names may be chosen
from a field of 8 possible names (2^3), you may have to ask a maximum of
three questions to be able to determine any individual's name.  The questions
would be of the form: "Is your name Ralph or Bill or Pete or John?", for
example.  A yes would prompt a question: "Is your name Pete or John?", etc.
Here the entropy is 3 bits.
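The two examples above correspond to entropy = log2(number of names); a minimal sketch (the function name is mine):

```python
from math import ceil, log2

def questions_needed(n_names):
    # Entropy in bits is log2(n); each yes/no question that halves the
    # remaining candidates removes one bit, so ceil(log2(n)) questions suffice.
    return ceil(log2(n_names))

assert questions_needed(2) == 1   # Bob or Ralph: one question, one bit
assert questions_needed(8) == 3   # eight possible names: three questions
```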

Your remark about entropy not decreasing in a closed system is NOT correct,
and this misconception is the cause of the always hilarious attempts by
anti-evolutionists (Creationists) to get on the side of science with their
wacky theories.

The Second Law of Thermodynamics actually states that "In any closed system
*TOTAL* entropy increases with time."  The key word TOTAL makes all the
difference in the world in the meaning of this law and when it is left out
it can be used to justify just about any zany theory you can think of (and
they don't come any more zany than "Creation Science").  It means that in
parts of the system entropy can decrease with time but in the aggregate
(total) entropy must increase with time. Please bear in mind that evolution
and religion are COMPLETELY disjoint concepts (you can believe in both)
despite the efforts of the religious right whackos.

Hope this helps with understanding entropy.  In communications theory
entropy is a measure of uncertainty.

In Kansas, the Second Law of Thermodynamics does not apply :--)

JK  http://www.crak.com  Password Recovery Software





Guy Macon <[EMAIL PROTECTED]> wrote in message
news:8dfprt$[EMAIL PROTECTED]...
> O.K.  Read the paper.  Let's start at the top:
>
> "Entropy is the measure of the unknown information in a closed system".
>
> I believe that Entropy is a measure of the disorder/randomness of
> information or energy in any system, open or closed.  In a closed
> system entropy cannot decrease, but open systems have entropy too.
> (Please correct me if I have defined Entropy poorly).
>
>


------------------------------

From: Francois Grieu <[EMAIL PROTECTED]>
Subject: Re: updated paper on easy entropy
Date: Tue, 18 Apr 2000 08:41:54 +0200

Tom St Denis <[EMAIL PROTECTED]> wrote in his paper at
> paper (..) <http://24.42.86.123/files/entropy.pdf>

You got the code for your order 0 model working.

Still, the wording of your entropy definition is wrong.
At the very least, n is unrelated to 256.

You also fail to note that you assimilate the probability of
occurrence of a symbol with its frequency in the message,
which is an approximation.

I did not check the order 1 model precisely. But I can
tell that any model without an extensive dictionary will
markedly overestimate the useful entropy in
"We the people of the United States, in order to form"

And again, you miss the main source of entropy in
keyboard input: the timing of keypresses. Which has the strange
property of being perfectly usable for seeding an RNG (your
stated goal), but not for a password.

IMHO the right conclusion is:

  entropy != easy


   Francois Grieu

------------------------------

From: Philip Baker <[EMAIL PROTECTED]>
Crossposted-To: alt.security.scramdisk,alt.computer.security
Subject: Re: Regulation of Investigatory Powers Bill
Date: Tue, 18 Apr 2000 06:40:47 +0100

In article <[EMAIL PROTECTED]>, Scotty
<[EMAIL PROTECTED]> writes
>
>Philip Baker wrote in message ...
>>In article <[EMAIL PROTECTED]>, Scotty
>><[EMAIL PROTECTED]> writes
>>>
>>>Your Name wrote in message <7QRJ4.224$[EMAIL PROTECTED]>...
>>>>In article <[EMAIL PROTECTED]>,
>>>>[EMAIL PROTECTED] says...
>>>>>
>>>>>-----BEGIN PGP SIGNED MESSAGE-----
>>>>>Hash: SHA1
>>>>>
>>>>>There's little point, the first time a case comes to the courts, then
>>>>>it will fall flat on its face. If you have encrypted data on your
>>>>>hard disk, and refuse to decrypt, then the law says that you can be
>>>>>imprisoned.
>>>>>
>>>>>What this basically means is that they are removing the right of
>>>>>individuals in a criminal court to be tried as innocent until proven
>>>>>guilty. This is a breach of at least the European Declaration of
>>>>>human rights, and probably the Universal Declaration of Human Rights.
>>>
>>>This is not quite correct. There was a very slight change of emphasis in
>>>the RIP bill compared to its previous incarnation in the e-commerce bill.
>>>Whereas the e-commerce bill was going to put the burden of proof on you to
>>>show that you didn't have a decryption key, the new RIP bill changes the
>>>test to one of balance of probabilities. So it's gone from 'guilty till
>>>proven innocent' to 'balance of probabilities'.
>>
>>
>>This is not my reading of Section III of the bill but perhaps I've
>>missed something. The prosecution has to show that the accused 'is a
>>person who has or has had possession of the key'. There is no mention,
>>that I can find, of the degree of proof required. So isn't it up to the
>>prosecution to prove this beyond reasonable doubt?
>
>You might think so at first reading, but this isn't correct when you examine
>it in detail. First, the old e-commerce bill had:
>
>"(2) If it appears to any person with the appropriate permission under
>Schedule 1 that a key to the protected information-
>(a) is in the possession of any person, and
>(b) cannot reasonably be obtained by the person with that permission
>without the giving of a notice under this section,
>the person with that permission may, by notice to the person appearing to
>him to have possession of the key, require the disclosure of the key."
>
>and
>
>"12.-(1) A person is guilty of an offence if he fails to comply, in
>accordance with any section 10 notice, with any requirement of that notice
>to disclose a key to protected information."
>
>Notice here the word 'appears' in the first clause, an entirely subjective
>test. If it 'appears' to the 'person with the appropriate permission'
>(police officer etc)  that you have a key you can be served a notice to
>decrypt. That is all that the prosecution had to prove for the notice to be
>lawfully served. The prosecution would ask the police officer "did it
>*appear* to you that the defendant had a key?". If the answer is yes, the
>rest is automatic. If you don't comply you are guilty, the offence is one of
>not complying with a notice to decrypt. That means if you can't comply you
>are automatically guilty until you can show good reason why you can't.
>
>Now compare that with the new clause in the RIP bill:
>
>"(2) If any person with the appropriate permission under Schedule 1
>believes, on reasonable grounds-
> (a) that a key to the protected information is in the possession of any
>person,
><snip>
>the person with that permission may, by notice to the person whom he
>believes to have possession of the key, require the disclosure of the key."
>
>and
>
>"49. - (1) A person is guilty of an offence if-
>  (a) he fails to comply, in accordance with any section 46 notice, with any
>requirement of that notice to disclose a key to protected information; and
>  (b) he is a person who has or has had possession of the key. "
>
>Clause 2 now requires the prosecution to show 'reasonable grounds' to
>believe that you have a key. Reasonable grounds is in effect 'balance of
>probabilities'. Notice the argument is not over whether you have a key or
>not, but whether the police etc can *reasonably believe* that you do. Once
>this hurdle is passed everything is automatic as before. Failure to comply
>is still an offence and forgetting your key won't get you off, you have to
>prove you've forgotten it.


But suppose your defense was that the data referred to in the section 46
notice was not encrypted at all (but a randomly generated, meaningless
sequence of bytes), so that there was no key and you could never have been
in possession of one. The prosecution would then have to show beyond
reasonable doubt that you were lying, i.e. that you are 'a person who has
or has had possession of the key'. 

But you are right in that you have to mount an active defense especially
in regard to sect 49(4). You can't just sit silently in the dock because
then the prosecution would have to prove significantly less to win its
case. It could be argued that if you presented a defense under 49(2) or
(3) that had any degree of plausibility then the prosecution would have
to show that it was false beyond reasonable doubt to win their case.
>
>
>Rather like driving with excess alcohol or speeding, failure to comply with
>a decryption notice is an absolute offence, i.e. you're automatically guilty
>until you can show you're innocent. (For example, a defence of 'I drove with
>excess alcohol because a terrorist hijacked my car and made me do it at gun
>point'  would have to be proved by the defence beyond reasonable doubt).
>

An absolute offence is one where intent is irrelevant. The prosecution 
still has to prove the case beyond reasonable doubt: in the case of 
drink driving, that the correct procedure was followed and the apparatus 
showed you over the limit. That you were being forced to drive could 
only be presented in mitigation. You'd still be guilty.
-- 
Philip Baker
http://www.thalasson.com


------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Paper on easy entropy
Date: Tue, 18 Apr 2000 09:31:36 +0200

Tom St Denis wrote:
> 

> Well for certain chars they are more likely to occur, *and* more likely
> to occur after certain other chars.  By modelling this as best as
> possible I can estimate for a given buffer of text how much 'info' is
> actually there.  Since I presume the user is just randomly hitting keys
> and not trying to get a pattern, the output is random.
> 
> So if the user types random chars, and I estimate say 160 bits of
> entropy, and I hash it with say SHA-1 I will have about 160 random bits
> to play with.

There is no questioning of your having done the best that you can.
I just want to point out that the numerical values you compute
are dependent on a number of your assumptions and estimates and 
these are 'by definition' (of their being assumptions and estimates)
arguable (if people want to argue with you). That's why I asked
how certain/exact your figures are. In lots of measurements in
engineering and science, one gives error bounds. If you attempt
to do that, I suppose you will see my point.

M. K. Shen

------------------------------

From: "C. Prichard" <[EMAIL PROTECTED]>
Subject: Page-tag encryption demonstration
Date: Tue, 18 Apr 2000 07:33:59 GMT

Page-tag encryption is an unusual line of defense against site hacking.

The CipherText demonstration uses a Perl-wrapped CGI site where page
tags are associated with actual page locations. The tags are encrypted on
the fly and substituted into HTML links returned to clients. The
returned tags are decrypted and used to reference the requested page.
Altered tags can result in error trapping and processing to support
site anti-hacking policy. With encrypted page tags it is nearly
impossible to make accurate guesses about page tags that the site will
accept.

HTTP protocol is default html/text. Only the ampersand delimiter needs
to be specially filtered in encrypted tag strings.

Improved security is made even better with implementation of methods to
change the CipherText page-tag key with each response. This feature can
be made dynamic, used only when hacking has been detected, only for
unknown users, or it can be made standard for all users at all times.

Demonstration is at:

http://www.greentv.com/cgi-bin/cgiwrap/greentv/CRI/_nttc.cgi?page=opener&user_id=&env=x


CipherText uses key expansion to mask the key length. This capability is
even scalable, so that the number of cipher combinations can exceed the
message length. A new addition to CipherText will be a slight
data-dependent shift in the first modified key. It's thought that this
variation will improve the algorithm's performance in handling a
repeated number of short strings, as in form fields or in the CipherText
application of page-tag encryption.

CipherText's strength depends greatly on the selection of a
sophisticated key. It is not a blocking cipher.

A U.S. patent is pending on CipherText filed in November 1999.

-Charles Prichard
www.greentv.com



------------------------------

From: "Stou Sandalski" <tangui [EMAIL PROTECTED]>
Crossposted-To: alt.security.scramdisk,alt.computer.security
Subject: Re: Regulation of Investigatory Powers Bill
Date: Tue, 18 Apr 2000 00:33:37 -0700


"Trevor L. Jackson, III" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Stou Sandalski wrote:
>
> > Actually it's partially true, you do have to sign something when working
> > at a top security place that makes your rights go bye bye. But in the
> > case of school I am right.  In the US by law you are required to go to
> > school if you are under 18, also by law your rights are mostly suspended
> > while you are at school.
>
> You are oversimplifying, and thus reaching a conclusion that is false. The
> counter example is that you are not compelled to go to a school that
> requires the suspension of the 4th amendment prohibition against
> unreasonable search.

Yes, your statement is _theoretically_ true... you don't have to go to the
public school, and you are perfectly within your rights to go to some other
town and find a private school that doesn't suspend your rights...
then you are perfectly within your rights to pay 30K a year for that school.
Yup, you are right... except that when you are underage your parents own you,
and they won't exactly move you to a private school because you want certain
rights.

Is there any point in taking this argument any further?


Stou





------------------------------

From: [EMAIL PROTECTED] (Niklas Frykholm)
Subject: Re: Paper on easy entropy
Date: 18 Apr 2000 07:39:23 GMT

Francois Grieu wrote:

>On the other hand, any 1 symbol message has 0 entropy, which is 
>ridiculous. But at least one could demonstrate this formula is
>on the safe side, right ?
>
>Wrong, because symbols are not independent in real life!
>This formula fails to take into account that a user is more
>likely to enter
>        "aaaaaaaaaaaaaaaaaazzzzzzzzzzzzzzzzzz"
>than    "zaazaazazazaazzaazzzazzazzazazaaazaz"
>and will assign too much entropy (36 bits) to the first input.

Perhaps you should do a verification step by estimating the order-1
entropy. If you find that order-1 entropy << order-0 entropy you
know that the order-0 entropy should not be trusted.

Of course you should not trust the order-1 entropy either, but
verify it with an order-2 entropy estimate, and so on.

The problem with this is, I think, that you will need more and
more data to estimate order-k entropies with any accuracy as
k grows large. Since the input data is limited in your application
that might not be possible.
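The effect is easy to see numerically. Here is a sketch (function names are mine) estimating order-0 and order-1 entropy from observed counts, applied to Grieu's two example strings from upthread:

```python
from collections import Counter
from math import log2

def order0_entropy(s):
    # Shannon entropy per symbol, using observed frequencies as
    # probability estimates (itself an approximation, as noted upthread).
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

def order1_entropy(s):
    # Conditional entropy H(next symbol | previous symbol) from pair counts.
    pairs = Counter(zip(s, s[1:]))
    ctx = Counter(s[:-1])
    n = len(s) - 1
    return -sum(c / n * log2(c / ctx[a]) for (a, _), c in pairs.items())

run = "a" * 18 + "z" * 18
mixed = "zaazaazazazaazzaazzzazzazzazazaaazaz"

# Order-0 assigns both strings 1 bit/symbol (36 bits total)...
assert round(order0_entropy(run) * len(run)) == 36
# ...but order-1 exposes the long runs: order-1 << order-0 for "aaa...zzz".
assert order1_entropy(run) < 0.5 < order1_entropy(mixed)
```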

// Niklas

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: updated paper on easy entropy
Date: Tue, 18 Apr 2000 07:52:38 GMT


On Tue, 18 Apr 2000 08:41:54 +0200, in
<[EMAIL PROTECTED]>, in sci.crypt Francois
Grieu <[EMAIL PROTECTED]> wrote:

>[...]
>And again, you miss the main source of entropy in
>keyboard input: timing of keypress. 

Alas, in PC-style systems, key timing is *quantized*:  A keyboard
processor performs a "slow" and continuous scan over all keys, then
reports keypress results on a serial line.  So even when key timing is
measured with a fast clock, that information also will contain much
less entropy than one might at first think.  
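A back-of-the-envelope comparison of the two effects (the scan rate and clock speed here are hypothetical, chosen only for illustration):

```python
from math import log2

# Hypothetical figures: a keypress lands somewhere in a one-second window.
fine_clock_hz = 1_000_000   # a 1 MHz timestamp counter suggests ~20 bits
scan_rate_hz = 100          # but a 10 ms keyboard scan quantizes the timing

bits_apparent = log2(fine_clock_hz)
bits_actual = log2(scan_rate_hz)
# prints "apparent: 19.9 bits, usable: at most 6.6 bits"
print(f"apparent: {bits_apparent:.1f} bits, usable: at most {bits_actual:.1f} bits")
```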

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
