Cryptography-Digest Digest #436, Volume #10      Fri, 22 Oct 99 04:13:04 EDT

Contents:
  Re: There could be *some* truth to it (Anton Stiglic)
  Re: some information theory (very long plus 72K attchmt) (Anton Stiglic)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (Mok-Kong 
Shen)
  Re: RSA key size and encryption (Peter Gutmann)
  Cato Institute News Memo: What will the new economy mean to you? ("Cato Institute")
  Re: Which encryption for jpeg compressed pictures? (me)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column 
([EMAIL PROTECTED])
  Re: Simple random number generation (?) (David Wagner)

----------------------------------------------------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: There could be *some* truth to it
Date: Thu, 21 Oct 1999 13:58:43 -0400



It's absolutely secure, with probability of error exponentially small
(which can be made as small as you want).  This is what it is.
That is, it is secure against an *all powerful, unlimited in time and
memory* adversary, with exponentially small probability of error.

You seem to be engaged in some childish game of "I'm right, everybody
else is wrong".  I'm ending this discussion on my side.


Anton



------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: some information theory (very long plus 72K attchmt)
Date: Thu, 21 Oct 1999 14:30:43 -0400


Here is the simple mathematical explanation (once and for all) of why
compressing data before encrypting it does not reduce the amount of
information about the plaintext, given the ciphertext, in the sense of
Shannon's information theory.

Let P be the set of plaintexts,
       C the set of ciphertexts,
       S = Img(Comp),   the image of the compression function.

I denote H(X) to be the entropy of the source X.
H(X | Y) is the conditional entropy of X, given Y.
In crypto, we are interested in H(P | C), the entropy of the plaintext
given the ciphertext.  H(K | C) is also interesting, where K is the set
of keys.
For the mathematical definition of H, and some interesting theorems
about the relationships of C, K and P, entropy-wise, see the notes I
wrote in PostScript for my thesis director's graduate crypto course:

http://crypto.cs.mcgill.ca/~crepeau/CS54/crypto.ps

Now, what is to be proved is simply that H(P | C) = H(S | C).
But this is trivial if you decompose these terms.  I will not do it
here, because there are a lot of indices to print and it would suck
in ASCII, but I encourage you to do so.  You simply need to prove
that H(P | C) = H( f(P) | C ), where f is a deterministic function
that is invertible on P (in our case, f = Comp(), a lossless
compressor).

If you can tell me why this is not true, using the mathematical terms
and definitions, I'd be glad to hear from you.
Hint:  you won't be able to find anything wrong...
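
If you prefer to see it numerically rather than symbolically, here is
a minimal Python sketch of the claim.  The plaintext distribution, the
injective Comp table and the one-time-pad cipher are all made up for
the example; the point is only that enumerating the joint distribution
gives H(P | C) = H(Comp(P) | C).

from math import log2
from collections import defaultdict

msgs = {"aa": 0.5, "ab": 0.25, "ba": 0.125, "bb": 0.125}   # Pr[P = p]
comp = {"aa": 0, "ab": 1, "ba": 2, "bb": 3}                # injective Comp
keys = [0, 1, 2, 3]                                        # uniform 2-bit key

def cond_entropy(joint):
    # H(X | Y) from a dict mapping (x, y) -> probability
    py = defaultdict(float)
    for (x, y), pr in joint.items():
        py[y] += pr
    return -sum(pr * log2(pr / py[y])
                for (x, y), pr in joint.items() if pr)

joint_pc = defaultdict(float)   # Pr[P = p, C = c]
joint_sc = defaultdict(float)   # Pr[Comp(P) = s, C = c]
for p, pr_p in msgs.items():
    s = comp[p]
    for k in keys:
        c = (s + k) % 4                      # one-time pad on Comp(p)
        joint_pc[(p, c)] += pr_p / len(keys)
        joint_sc[(s, c)] += pr_p / len(keys)

print(cond_entropy(joint_pc), cond_entropy(joint_sc))      # both 1.75

Both conditional entropies come out to 1.75 bits, which here also
equals H(P), since the toy cipher is a perfect one-time pad.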

Anton






------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 21 Oct 1999 20:35:16 +0200

Brian Gladman wrote:
> 
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote
> 
> > Bruce Schneier wrote:
> 
> > In one point Mr. Schneier has clearly and definitely erred, namely
> > concerning his phrase about randomly choosing a cipher. Randomly
> > choosing among a set of ciphers can NEVER be weaker than the
> > weakest of the set. If ciphers in a set are of equal strength (
> > or not known to be of differing strength) then randomly choosing is
> > evidently a superior strategy. If one cipher is stronger than the
> > others, the usefulness of random choosing will depend on the
> > strength differences and the cardinality of the set and can't be
> > evaluated simply. On the assumption that there are 3 ciphers
> > of the final round of AES that are equally or extremely nearly
> > equally strong, it is almost sure that random choosing is a much
> > better strategy.
> 
> I do think, however, that there have to be fair ground rules for such a
> comparison and I think this is what Bruce is saying.
> 
> If there are, say, two ciphers to choose from then 1-bit is necessary to
> make this choice so it is reasonable to ask which would be stronger when
> this scheme using n-bit ciphers is compared with the use of a single cipher
> with an (n+1)-bit key.

But this argument neglects the fact that the key size of AES is fixed
(that there are a few options in key size is not relevant here) and
can't be increased by one or two bits. We thus have only the option
of using the winner, or using the winner plus one or two runners-up.
Let's call these configurations A and B. If our resources allow us to
use either A or B, the question is then which one provides higher
security. My opinion is B, on the assumption that the runners-up have
the same strength as the winner (e.g. they only fail to be the winner
due to poorer performance). One would then prefer to use B if one
(subjectively) believes that the security offered by A is less than
desirable for one's specific applications.
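
To make the A-versus-B comparison concrete, here is a small sketch
(the break probabilities are invented numbers, purely illustrative of
the argument, not estimates for any real candidate):

# Toy comparison of configuration A (winner only) with B (each message
# under one of three finalists, chosen uniformly at random).
p = {"winner": 0.01, "runner_up_1": 0.01, "runner_up_2": 0.01}

# Expected fraction of messages exposed if breaks materialise:
exposure_A = p["winner"]                 # all traffic under one cipher
exposure_B = sum(p.values()) / len(p)    # equal if equally strong

# But if any single cipher is broken outright, A loses all of its
# traffic, while B loses only the 1/len(p) of the traffic that
# happened to use the broken cipher.
loss_if_one_breaks_A = 1.0
loss_if_one_breaks_B = 1.0 / len(p)

print(exposure_A, exposure_B, loss_if_one_breaks_A, loss_if_one_breaks_B)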

> 
> If we postulate that all ciphers achieve a strength implied by their
> keyspace, we will not have achieved anything other than increasing the
> complexity of our system.  So we have to assume unequal strength ciphers in
> order to see possible advantages.  But when the two cipher scheme is
> employed with algorithms of known strength it makes no sense to choose the
> weaker of the two, so we also have to assume that we do not know which
> of the algorithms is stronger.

We discussed long ago in this group the impossibility of exactly
quantifying the strength of an encryption algorithm. There is no way
to quantitatively compare the AES candidates. There is bound to be
some degree of subjectivity (and hence disagreement) in evaluating
them. I assume, on the other hand, that there will be in the final
round 3 candidates which the majority of the experts would consider
to be equivalent in strength though not in other characteristics,
e.g. performance. (This equivalence may in the end stem from, as you
said, lack of knowledge. Experts are human. They too can't know
everything.) So we have a situation where the 'subjective
probability' that all 3 candidates are equally strong is 1. I don't
consider the case in which the winner is considered by the experts to
be very much stronger than all the rest, because I believe this is a
very unlikely outcome of the AES contest.

Let me remark, for the benefit of the general reader, that we are
presently discussing not multiple encryption, which has been the
topic of a number of previous follow-ups, but randomly choosing a
single cipher from among a few of the candidates of the final round.

M. K. Shen
========================
http://home.t-online.de/home/mok-kong.shen

------------------------------

From: [EMAIL PROTECTED] (Peter Gutmann)
Subject: Re: RSA key size and encryption
Date: 21 Oct 1999 19:21:28 GMT

[EMAIL PROTECTED] (DJohn37050) writes:

>Some worries:
>1) If the plaintext can be exhausted.

It's a 128/112-bit 3DES key; it'd be easier (relatively speaking) to
brute-force the 3DES key directly.

>2) If the plaintext has a low value and a low RSA public exponent is used.

It uses F4 for the exponent (that's F4 meaning 65537, not 0xF4 :-).
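
For readers wondering what worry (2) would look like without F4 and
padding, here is a toy sketch (the primes and message are made-up
numbers, nothing from a real key): with e = 3 and an unpadded message
m small enough that m^3 < n, the ciphertext is just m^3 over the
integers, and an integer cube root recovers m without the private key.

p, q = 1000003, 1000033          # hypothetical small primes
n, e = p * q, 3                  # toy modulus and a low exponent
m = 42                           # a "low value" plaintext, m**e < n

c = pow(m, e, n)                 # textbook RSA, no padding

def icbrt(x):
    # integer cube root by binary search
    lo, hi = 0, 1
    while hi ** 3 <= x:
        hi *= 2
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** 3 <= x:
            lo = mid
        else:
            hi = mid - 1
    return lo

print(m, icbrt(c))               # both print 42; no key needed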

Peter.

------------------------------

From: "Cato Institute" <[EMAIL PROTECTED]>
Crossposted-To: 
alt.sci.tech,sci,sci.agriculture,sci.bio,sci.bio.technology,sci.biology,sci.chem,sci.chemistry
Subject: Cato Institute News Memo: What will the new economy mean to you?
Date: Thu, 21 Oct 1999 15:51:50 -0400
Reply-To: "Cato Institute" <[EMAIL PROTECTED]>

October 21, 1999

Cato Institute News Memo:

Subject: What will the new economy mean to you?

Our high-tech society is changing at the speed of light.  Earlier this year,
Global Crossing's Gary Winnick became a billionaire in just 18 months, one
of the first mainstream investors to cash in on the cyber-revolution.
Wealth at warp speed is just one of the remarkable changes we witness in our
high-tech society.  From finance and communications to biology and law,
advances in technology are transforming our world.

The challenge for citizens in the 21st century will be to master the dynamic
changes in free and diverse markets.  Join leading scholars, entrepreneurs
and scientists as they consider the implications of the knowledge
revolution. The Cato Institute's Technology & Society 1999 Conference will
examine:

· The future impact of market forces and new technology in education;
· Online arbitration as a substitute for sluggish state-run courts;
· How to provide broadband access to homes;
· The growing controversy over genetically altered plants;
· The outcome of ICANN's Los Angeles meeting for domain name competition;
· Crypto's influence on the future of money.

Speakers will include Bill Schrader of PSINet, Tom Siebel of Siebel Systems,
education reformer Tim Draper, and Stan Williams, one of the scientists
working on the molecular computer switch.  For additional information see
www.cato.org.

What:  Cato Institute/Forbes ASAP Technology and Society Conference 1999
When/Where: November 4-5, Crowne Plaza San Jose/Silicon Valley, Milpitas,
California
How:  To register, call 202-218-4633 or e-mail to [EMAIL PROTECTED]

The Cato Institute is a nonpartisan public policy research foundation
dedicated to broadening policy debate consistent with the traditional
American principles of individual liberty, limited government, free markets,
and peace.






------------------------------

From: me <[EMAIL PROTECTED]>
Crossposted-To: comp.security.misc,comp.graphics.algorithms,comp.compression
Subject: Re: Which encryption for jpeg compressed pictures?
Date: Fri, 22 Oct 1999 01:32:22 +0100
Reply-To: me <[EMAIL PROTECTED]>

Jens Grivolla <[EMAIL PROTECTED]> wrote
>On Tue, 19 Oct 1999 00:46:34 -0700, Anthony Stephen Szopa
><[EMAIL PROTECTED]> wrote:
>>Try http://www.ciphile.com
>>You'll be glad you did.
>
>ROTFLMFAO

Don't you mean ROT13MAO?
-- 

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Fri, 22 Oct 1999 01:10:38 GMT

Brian Gladman wrote:

> Usually what matters is overall system performance and here there are
> many systems where the performance of the block cipher employed will
> not have a major impact on the overall system performance.  I agree
> that block encryption speed will often be important in securing the
> lower protocol layers but there are uses of encryption at the
> applications layer that have the opposite properties.  Here the speed
> of the block encryption will often have very little impact on overall
> system performance but the protection requirement may either be long
> term or alternatively so critical to the security of the system as a
> whole that an extra degree of safety is worthwhile.
>
> I do not think it is unreasonable to seek an AES outcome that
> accommodates both these requirement domains.

I think Rijndael meets the broadest spectrum of
requirements, with Twofish second.

I agree with what Massey said at the first AES conference,
that the number of rounds should increase with the key
size.  Of the five finalists, Rijndael is the only one
that calls for more rounds when using longer keys.
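
For reference, the rule in the Rijndael submission ties the round
count to the larger of the key and block lengths (in 32-bit words); a
quick sketch of that rule and the resulting round counts for the AES
key sizes:

# Round-count rule from the Rijndael submission: Nr = 6 + max(Nk, Nb),
# where Nk and Nb are the key and block lengths in 32-bit words.
# For the AES block size of 128 bits this gives 10, 12 and 14 rounds.

def rijndael_rounds(key_bits, block_bits=128):
    nk, nb = key_bits // 32, block_bits // 32
    return 6 + max(nk, nb)

for bits in (128, 192, 256):
    print(bits, rijndael_rounds(bits))    # 128 10, 192 12, 256 14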


> One of the difficulties with AES is that the IPR issues may dictate
> that AES algorithms have to be declared 'winners' in order that they
> can be used freely whereas all we really need is more than one
> algorithm to come through with some sort of formal status.

I don't think there are significant outstanding IPR issues.

> I don't have a problem with one winner and 1 or 2 backup algorithms
> or algorithm ranking but I am unclear whether the rules of the AES
> competition will allow these forms of result in respect of MARS and
> RC6 without declaring them 'winners'.

The statement required with the submission is clear:

    Should my submission be selected for inclusion in
    the AES, I hereby agree not to place any restrictions
    on the use of the algorithm intending it to be
    available on a worldwide, non-exclusive, royalty-free
    basis.

see:
http://csrc.nist.gov/encryption/aes/pre-round1/aes_9709.htm#sec2d

I think we need one clear winner, but the rules allow
NIST to incorporate more than one algorithm in the
standard without revisiting IP issues.

--Bryan


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Simple random number generation (?)
Date: 21 Oct 1999 18:48:31 -0700

In article <7uo07d$3k7$[EMAIL PROTECTED]>,
misterguch <[EMAIL PROTECTED]> wrote:
> One of the things I'm wondering is if there's any easy way to generate truly
> random numbers in the home.

Yes; a set of dice will do nicely.
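
For anyone who wants to turn the rolls into something a computer can
use, a minimal sketch (the roll string is a made-up example; one
unbiased bit per roll of a fair die, odd -> 1, even -> 0):

rolls = "3142562135466124"          # hypothetical hand-typed die rolls

bits = [int(d) % 2 for d in rolls if d in "123456"]
data = bytes(
    int("".join(str(b) for b in bits[i:i+8]), 2)
    for i in range(0, len(bits) - 7, 8)
)
print(data.hex())                   # 2 bytes from 16 rolls

One bit per roll wastes most of a die's log2(6) = 2.58 bits; pairing
rolls and rejecting a few outcomes extracts more, but the parity trick
is the simplest unbiased method.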

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
