Cryptography-Digest Digest #414, Volume #10      Fri, 15 Oct 99 05:13:03 EDT

Contents:
  Re: how secure is RSA? (Tom McCune)
  Re: Breaking iterated knapsacks ("rosi")
  des question ("King")
  Re: hos secure is RSA? ("Steve Myers")
  Re: CRYPTO/EUROCRYPT ("Thomas L")
  Re: classifying algorithms (wtshaw)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("rosi")
  Re: where to put the trust (wtshaw)
  Re: RSA Algorithm ("Douglas A. Gwyn")
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Roger 
Schlafly")
  Re: hos secure is RSA? ("Richard Parker")
  prime number problem (Shyh-Tyan Liaw)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Roger 
Schlafly")
  Re: prime number problem ("Richard Parker")
  Re: Six out of six for Kerckhoffs

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Tom McCune)
Subject: Re: how secure is RSA?
Date: Fri, 15 Oct 1999 03:01:09 GMT

-----BEGIN PGP SIGNED MESSAGE-----

In article <[EMAIL PROTECTED]>, Thinker
<[EMAIL PROTECTED]> wrote:

>I remember reading somewhere that RSA had been proven slightly insecure.  Is
>this even true, and if it is, where can I read up on this?

RSA remains very strong with decent key sizes.  You may be thinking of the
TWINKLE machine, which is supposed to be able to break RSA keys of up to 512
bits relatively easily, or of other recent reports of 512-bit keys being
factored.  Outlandish claims are made fairly often, but there is nothing to
indicate that 1024-bit RSA keys are at any kind of risk now or in the near
future.
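
As a rough feel for the gap between 512-bit and 1024-bit keys, here is a
back-of-the-envelope comparison based on the heuristic running time of the
general number field sieve, exp(c (ln n)^(1/3) (ln ln n)^(2/3)) with
c = (64/9)^(1/3).  This is only an editor's sketch in Python: it ignores the
o(1) term and all constant factors, so treat the numbers as orders of
magnitude, not as serious factoring estimates.

    import math

    def gnfs_log2_effort(bits):
        # log2 of exp(c * (ln n)^(1/3) * (ln ln n)^(2/3)) for an n of `bits` bits
        c = (64.0 / 9.0) ** (1.0 / 3.0)
        ln_n = bits * math.log(2)
        effort = c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0)
        return effort / math.log(2)

    e512, e1024 = gnfs_log2_effort(512), gnfs_log2_effort(1024)
    print("512-bit:  about 2^%.0f operations" % e512)    # roughly 2^64
    print("1024-bit: about 2^%.0f operations" % e1024)   # roughly 2^87
    print("gap:      about 2^%.0f" % (e1024 - e512))     # millions of times harder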

-----BEGIN PGP SIGNATURE-----
Version: PGP Personal Privacy 6.0.2
Comment: My PGP Page: http://Tom.McCune.net/PGP.htm

iQCVAwUBOAaY88MxrQ5/VTwtAQEJnAP+MRsgBa+3AfhJQHxZDaPFkqwKqQGXkeM1
11Nz5VatPd57UVTl2jVk835dTNM5vjIlsyHwL4Mhf6ILRPHdt5mLp65BfAcUkuaL
9ytrg8nukfemdmN7Jy1XXxelJ1yIT1pF1OgXaVV0NCfNOIITcnaaH463TSeTLiqp
HzAiGV86KRI=
=EAty
-----END PGP SIGNATURE-----

------------------------------

From: "rosi" <[EMAIL PROTECTED]>
Subject: Re: Breaking iterated knapsacks
Date: Thu, 14 Oct 1999 22:36:33 -0400

Richard Parker wrote in message
<3cuN3.467$[EMAIL PROTECTED]>...
>"rosi" <[EMAIL PROTECTED]> wrote:

[snip]
>
>I think the future significance of knapsack encryption will not be its
>value as a mechanism for protecting data, instead it will be the value
>of the many techniques that have been developed to break knapsack
>encryption.
>

Dear Richard,

   Could very well be true. A couple of 'observations'. I am not sure
there are 'many' techniques for breaking knapsack cryptosystems. Nothing
so far suggests that the key is easy to find (with exceptions, e.g. Shamir's
attack on basic MH). One always attacks the more vulnerable of the two,
the key or the ciphertext blocks. Ciphertext blocks seem easier, so efforts
may not have extended that far in the direction of key recovery. Or there
might be a slight chance that keys are 'hard' to recover. The rest is, to
me, one technique, namely lattice reduction and its equivalents with
'twists'.

   What is obvious to me is that none of those escaped the fate of being
convertible to an 'equivalent' zero-one set. Once that is done, one hardly
needs to look further. To bring back Orton's scheme once more, we see
clearly via Ritter's attack that, even though we seem to have a dense set,
the predominant weights are those from before the iterations. Furthermore,
as another way of looking at it, when the a[i] (I hope I am using the right
notation for the public weights) are 'binarized', the f[i]'s can be 'packed'
back into the 'binarized' weights (with some loss of precision). So in
essence there are not that many weights, and the density is, to me, just a
smoke screen. Fractions did not help. They would not in the general case,
and that has been shown, I believe. They did not help in this particular
case for the reasons just mentioned, IMHO.
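
For readers who have not seen the underlying construction, a toy sketch of
basic Merkle-Hellman may make 'weights' and 'density' concrete. This is only
an illustration (it is not Orton's scheme, the parameters are tiny, and basic
MH is of course broken); it is Python and uses nothing beyond the standard
library.

    import math, random

    def keygen(n=8):
        priv, total = [], 0
        for _ in range(n):
            w = total + random.randint(1, 10)   # superincreasing: w > sum so far
            priv.append(w)
            total += w
        q = total + random.randint(1, 10)       # modulus larger than the sum
        r = random.randrange(2, q)
        while math.gcd(r, q) != 1:
            r = random.randrange(2, q)
        pub = [(r * w) % q for w in priv]       # the public weights a[i]
        return pub, (priv, q, r)

    def encrypt(bits, pub):
        return sum(a for a, b in zip(pub, bits) if b)   # subset sum

    def decrypt(c, key):
        priv, q, r = key
        s = (c * pow(r, -1, q)) % q             # strip the modular disguise
        bits = []
        for w in reversed(priv):                # greedy works: superincreasing
            bits.append(1 if s >= w else 0)
            if s >= w:
                s -= w
        return list(reversed(bits))

    pub, key = keygen()
    msg = [1, 0, 1, 1, 0, 0, 1, 0]
    assert decrypt(encrypt(msg, pub), key) == msg
    # The "density" the lattice attacks care about: n / log2(max public weight).
    print("density =", len(pub) / math.log2(max(pub)))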

   All along, such vulnerabilities have been shown over and over again.
You could be right that they cannot be overcome, or, even worse, that
even when they are, there are other hidden ones that may make the
scheme intrinsically unsuitable for almost all crypto functionalities.
But that is just not yet definite.

   Same everywhere. No one nowadays will generate just any two primes
of about the same bit length and believe they give secure RSA. The
primes one can use fall into a narrower 'strip'. It is by throwing out
the easy cases (or classes of cases) that one constructs a case that
is 'hard' (assuming the hardest ones really are 'hard'). Yet leaving
the same vulnerability in can hardly be called an improvement. Such
cases may not say enough about the fate of the knapsack schemes. The
natural sequel is to find a way to avoid the soft spots we already know.

   The fundamental weaknesses have to be removed. That may not be
an easy task, or it may be quite easy. But at the same time, before we
have something that overcomes the weaknesses, we may go the other
way round and ask ourselves what it would be like if such weaknesses
were not there (e.g. if the weights constructed after the iterations
'carried about the same weight' as those before the iterations).

   Thank you very much Richard.
   --- (My Signature)



------------------------------

From: "King" <[EMAIL PROTECTED]>
Subject: des question
Date: Thu, 14 Oct 1999 22:31:56 -0500

Is triple DES simply running the message through DES rounds three times?
Why three?  And why and how much does it increase the security?  Thanks.
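
No answer appears in this digest, so a brief sketch may help. Triple DES
runs the full DES cipher three times, usually in the EDE order
(encrypt-decrypt-encrypt) with two or three independent keys. Three times
rather than two because double DES gains almost nothing against a
meet-in-the-middle attack; with three independent keys the effective
strength is about 112 bits (not 3 x 56), again because of meet-in-the-middle.
A minimal single-block illustration in Python, assuming the pycryptodome
package for the raw DES primitive (the keys and the use of ECB here are
purely illustrative):

    from Crypto.Cipher import DES   # pip install pycryptodome

    def ede_encrypt(block, k1, k2, k3):
        c = DES.new(k1, DES.MODE_ECB).encrypt(block)
        c = DES.new(k2, DES.MODE_ECB).decrypt(c)   # the middle step is a decryption
        return DES.new(k3, DES.MODE_ECB).encrypt(c)

    def ede_decrypt(block, k1, k2, k3):
        p = DES.new(k3, DES.MODE_ECB).decrypt(block)
        p = DES.new(k2, DES.MODE_ECB).encrypt(p)
        return DES.new(k1, DES.MODE_ECB).decrypt(p)

    k1, k2, k3 = b"key1key1", b"key2key2", b"key3key3"   # hypothetical 8-byte keys
    block = b"8 bytes!"                                  # DES works on 8-byte blocks
    assert ede_decrypt(ede_encrypt(block, k1, k2, k3), k1, k2, k3) == block
    # With k1 == k2 == k3 this collapses to single DES, which is why E-D-E
    # (rather than E-E-E) was chosen: it interoperates with plain DES.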



------------------------------

From: [EMAIL PROTECTED] ("Steve Myers")
Subject: Re: hos secure is RSA?
Date: 15 Oct 99 03:40:05 GMT


Thinker wrote in message <[EMAIL PROTECTED]>...
>I remember reading somewhere that RSA had been proven slightly insecure.  Is
>this even true, and if it is, where can I read up on this?
>
>Thanx,

Are you referring to how RSA's PKCS#1 can be efficiently cracked via a
chosen-ciphertext attack?

Steve
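
The reference is presumably to Bleichenbacher's 1998 chosen-ciphertext
attack on the PKCS#1 v1.5 encryption padding. The attack needs nothing more
than a server that reveals whether a submitted ciphertext decrypts to a
"conforming" block, i.e. one that starts 0x00 0x02, has at least eight
nonzero padding bytes, and then a 0x00 separator. A hypothetical sketch of
that check (the oracle bit) in Python:

    def pkcs1_v15_conforming(block: bytes) -> bool:
        # True iff the block looks like 00 02 <eight or more nonzero bytes> 00 ...
        if len(block) < 11 or block[0] != 0x00 or block[1] != 0x02:
            return False
        try:
            sep = block.index(0x00, 2)      # first zero byte after the padding
        except ValueError:
            return False
        return sep >= 10                    # i.e. at least 8 nonzero padding bytes

    # An attacker submits c' = c * s^e mod n for many chosen values of s and
    # watches which ones the server accepts as conforming; each accepted s
    # shrinks the interval known to contain the original plaintext, and on
    # the order of a million queries suffice against a 1024-bit modulus.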




------------------------------

From: "Thomas L" <[EMAIL PROTECTED]>
Subject: Re: CRYPTO/EUROCRYPT
Date: Thu, 14 Oct 1999 22:58:28 -0500

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

  Hi,

   You may want to look into ISBN 3-540-65069-5,
          "Advances in Cryptology 1981-1997".
   It contains all the proceedings of CRYPTO & EUROCRYPT from 1981 to 1997
        on CD (the book acts as an index/documentation overview).
    [You may want to skip Amazon; I went through Springer's site and it
      arrived within two weeks, with a nice postmark from WIEN lolol]

      I am not sure what the exchange rate is now, but it came in under
$100 US when I purchased it [that is overall, book + s&h].

    c++ya,
       xkey aka tom

Jonathan Katz <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
...
> Anyone selling back issues of CRYPTO/EUROCRYPT proceedings? Please
> contact me...

-----BEGIN PGP SIGNATURE-----
Version: PGPfreeware 6.5.1 for non-commercial use <http://www.pgp.com>

iQA/AwUBOAamYulGIzlj+01aEQJ42gCgz5NMB9EU1q+Q47oCIb9dPT2OTPgAn35J
5aay43OYeSsK33Y5ZnZRzb9h
=bLi8
-----END PGP SIGNATURE-----





------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: classifying algorithms
Date: Thu, 14 Oct 1999 22:51:32 -0600

In article <7u4oot$la6$[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:

> In article <[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] (wtshaw) wrote:
....
> >
> > The above using one implementation represent only 3 of 2^54
> > possible cipher texts.  For quick results I used the same
> > words hashed three different ways to make the keys.  The
> > maximum length of each output block
> > is 200 characters from a 65 character set. It is a variable
> > length block
> > cipher, as I see it, with different blocks from the same message
> > being entirely different problems.

The whole string is much less than the maximum allowed input block of
characters, 192.  Consider that a usual message is some odd length.  Since
the minimum block is 2 characters, anything left over is easily handled.  I
always cut a block at a space.
> 
>    Are these overlapping variable length blocks ? Here, I'm
>    wondering about orthogonality problems and how they relate
>    to entropy.

Each full block or fragment, using the example implementation, will
typically add 8 characters in length: 2 becomes 10, 192 becomes 200.
Randomness used in encryption is recoverable in decryption to aid in
unscrambling the data.  It's weird, but it works with bells on.
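
The segmentation rule described above (blocks of 2 to 192 characters, cut
at a space, each block growing by 8 characters on output) can be sketched on
its own; the following Python is a hypothetical illustration of just that
rule, not of wtshaw's cipher:

    def segment(message, max_block=192, min_block=2):
        blocks = []
        while message:
            if len(message) <= max_block:
                blocks.append(message)          # the leftover fragment
                break
            cut = message.rfind(" ", min_block, max_block + 1)
            if cut < 0:
                cut = max_block                 # no space available: hard cut
            blocks.append(message[:cut])
            message = message[cut:].lstrip(" ")
        return blocks

    text = "a usual message of some odd length " * 12
    for b in segment(text):
        print(len(b), "->", len(b) + 8)         # e.g. 2 -> 10, 192 -> 200
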
-- 
Truth lies in your path for you to stumble over, 
even if you think you can easily sidestep it.

------------------------------

From: "rosi" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 14 Oct 1999 23:54:41 -0400

Just a few light notes.

First, I think we are debating the good or bad of having more than one
algorithm as a standard. Ascend's emotions are quite secondary.

Further on that: Cisco's feelings are quite relevant if it really is
impractical to force more than one algorithm in hardware. If I were a
vendor, I would probably feel best, no matter what functionalities my
products provide, as long as they sell. :) OTOH, if it were that
impractical, NIST would have rejected the idea outright.

Second, IMHO, it is not 'to let people ...'. We are here helping make the
choice, and we are, in some sense, making the choice (of one option or
of more than one). No need to repeat what a standard is. But understanding
what it is can also save us time debating things like interoperability.

Lastly, any assumption (how unfortunate that we are here) can call for at
least equally 'valid' assumptions from the other side of the aisle.
'Necessarily weak' can be matched by 'necessarily strong'.

--- (My Signature)

Bruce Schneier wrote in message <[EMAIL PROTECTED]>...
>On Thu, 14 Oct 1999 12:33:11 +0100, "Brian Gladman"
><[EMAIL PROTECTED]> wrote:
>
>>
>>Roger Schlafly <[EMAIL PROTECTED]> wrote in message
>>news:7u41at$[EMAIL PROTECTED]...
>>> Sundial Services <[EMAIL PROTECTED]> wrote in message
>>> news:[EMAIL PROTECTED]...
>>
>>> Yes, that's right. Those who subscribe to the theory that
>>> it is better to use your own homebrew combination of
>>> ciphers can just use the AES finalists. Everyone else will
>>> just use the AES winner.
>>
>>The arguments for multiple AES winners cannot be dismissed so easily.
>>There are at least three reasons for wanting more than one winner: (1) to
>>provide a degree of choice in dedicated, closed applications, (2) to
>>provide a degree of diversity in open applications (a well established
>>practice), and (3) to meet requirements that benefit from the sequential
>>application of different encryption algorithms (again an established
>>practice).
>
>(3) doesn't make sense to me.  Whether there is one winner or several,
>those that want to cascade multiple algorithms will do so.  They do so
>now, and there's no problem.
>
>I disagree with (2) also, primarily from a hardware perspective.  I've
>gone to Ascend, Cisco, and other companies to discuss AES.  The first
>thing they are concerned about is how they will be able to fit the
>algorithms into their hardware.  Multiple algorithms means more chip
>real estate; they hate that.  In software it's no problem adding
>another algorithm, but it is in hardware.
>
>As to (1), why does it make sense to give people without the expertise
>to make a choice a choice?  If we, as the world's non-military
>cryptographers, cannot choose an algorithm, why should we expect
>others to?
>
>My primary worry is that a system that has a choice of algorithms
>will, operationally, be as secure as the weakest.  If that is the
>case, diversity is necessarily bad.  And we will always have a backup
>algorithm: triple-DES.  I see no benefit from NIST passing the buck
>and not making a decision.
>
>Bruce
>**********************************************************************
>Bruce Schneier, President, Counterpane Systems     Phone: 612-823-1098
>101 E Minnehaha Parkway, Minneapolis, MN  55419      Fax: 612-823-1590
>           Free crypto newsletter.  See:  http://www.counterpane.com



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: where to put the trust
Date: Thu, 14 Oct 1999 23:19:51 -0600

In article <7u59pu$347$[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:

> If a standard cryptographic primitive fails, then all systems that use
> it might be in deep trouble. Cryptography will be the basis of the
> information society of the future and if, after 50 years, somebody
> finds a way to break the AES with two known plaintexts and 1 second on
> a PC, or if there is a mathematical breakthrough with the problems we use
> as the basis of public-key cryptography, then, if unprepared, we would
> probably have a sudden, major, and global disruption on our hands.
...
>One good idea is not to depend too much
> on one primitive. This might be achieved, for example, by using
> multiple symmetric ciphers in series, or multiple key exchange
> algorithms in parallel.
> 
When DES first came out, and the Scientific American article was
delivered, I thought that I was going to read something earthshaking. 
But, as I had recently spent much effort on some similar ideas involved
in it, I broke out in one of the longest horselaughing events of my life.
When so many put so much into so little, you hardly know whether to laugh
or to cry.

In time it will become much more apparent that incestuous, recursive
ciphers have definite drawbacks, perhaps along the lines of what you are
alluding to.  I don't mind being rash.  It has taken a long time, lots of
years, for this to start to come out, and the joke remains ready to at
least spark snickers and smiles.
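
The "multiple symmetric ciphers in series" idea quoted above is easy enough
to illustrate: encrypt once with each cipher, under independent keys, so
that an attacker has to break both layers. A sketch in Python, assuming the
pycryptodome package; the choice of DES and Blowfish (and of CBC mode) is
purely illustrative:

    from Crypto.Cipher import DES, Blowfish
    from Crypto.Random import get_random_bytes
    from Crypto.Util.Padding import pad, unpad

    def cascade_encrypt(plaintext, k_des, k_bf):
        iv1, iv2 = get_random_bytes(8), get_random_bytes(8)
        stage1 = DES.new(k_des, DES.MODE_CBC, iv1).encrypt(pad(plaintext, 8))
        stage2 = Blowfish.new(k_bf, Blowfish.MODE_CBC, iv2).encrypt(stage1)
        return iv1 + iv2 + stage2

    def cascade_decrypt(blob, k_des, k_bf):
        iv1, iv2, body = blob[:8], blob[8:16], blob[16:]
        stage1 = Blowfish.new(k_bf, Blowfish.MODE_CBC, iv2).decrypt(body)
        return unpad(DES.new(k_des, DES.MODE_CBC, iv1).decrypt(stage1), 8)

    k_des, k_bf = get_random_bytes(8), get_random_bytes(16)
    msg = b"independent keys for each layer"
    assert cascade_decrypt(cascade_encrypt(msg, k_des, k_bf), k_des, k_bf) == msg
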
-- 
Truth lies in your path for you to stumble over, 
even if you think you can easily sidestep it.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: RSA Algorithm
Date: Thu, 14 Oct 1999 21:52:29 GMT

"Bo Dömstedt" wrote:
> How large is the largest number (2^n)-1 factored today?

If n is an integral power of 2, I can factor the dickens
out of it.
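
The quip rests on the identity x^2 - 1 = (x - 1)(x + 1): applied repeatedly
when n is a power of two, 2^n - 1 splits completely into the Fermat numbers
2^(2^i) + 1, for example 2^32 - 1 = 3 * 5 * 17 * 257 * 65537. A quick check
in Python:

    fermat = [2 ** (2 ** i) + 1 for i in range(5)]   # 3, 5, 17, 257, 65537
    product = 1
    for f in fermat:
        product *= f
    assert product == 2 ** 32 - 1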

------------------------------

From: "Roger Schlafly" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 14 Oct 1999 22:20:36 -0700

DJohn37050 <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Here are two slightly different scenarios:
> A) NIST chooses ONE AES algorithm.  A few years from now, it is discovered
> to be somewhat weaker than at first thought.  A few years after that, it is
> discovered to be much weaker.  Now what?

I suppose you could argue that this is what happened to DES.
Not bad at all. Those who don't think it is good enough
have switched to something better.

> B) NIST chooses a small number of AES algorithms.  For those systems
> without many limitations, they do all the algs.  For those systems that
> are constrained, they choose one.

This would make AES a failure, IMO. No one wants to do all the algs.
And when they don't, you give up interoperability.

>  Now an algorithm is found to be weaker than
> thought.

Which is much more likely than scenario A because more algs
are involved.

>  Everyone starts to migrate away from it to another of the AES algs.

Or maybe they lose confidence entirely in AES and use something else.

> For the systems that use multiple algs, this is a matter of negotiation
> of which to use, etc.

Yeah, adversaries will use the negotiation options to switch to the
weakest cipher.

> For systems with one alg, they switch over, over time.
>
> I know that I MUCH prefer the possibilities of scenario B.

Not me.




------------------------------

From: "Richard Parker" <[EMAIL PROTECTED]>
Subject: Re: hos secure is RSA?
Date: Fri, 15 Oct 1999 06:30:50 GMT

Thinker <[EMAIL PROTECTED]> wrote:
> I remember reading somewhere that RSA had been proven slightly insecure.  Is
> this even true, and if it is, where can I read up on this?


It is possible to take advantage of an RSA public-key cryptosystem's
design to mount a number of different attacks.  Among others, there
are the common modulus attack, the low public exponent attack, and the
low private exponent attack.  Also, it is possible to exploit features
of an RSA implementation (such as timing, power, or fault tolerance)
to attack the system.  For that matter, improvements in factoring
relative to factoring estimates at the time the system was designed
could be considered an attack.
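
As a concrete instance of the low public exponent attack mentioned above,
here is a toy version of the Hastad broadcast attack: the same message
encrypted with e = 3 under three different moduli can be recovered with the
Chinese Remainder Theorem and an integer cube root. The primes below are
tiny and purely illustrative.

    def crt(residues, moduli):
        # Chinese Remainder Theorem for pairwise coprime moduli.
        N = 1
        for n in moduli:
            N *= n
        x = 0
        for c, n in zip(residues, moduli):
            Ni = N // n
            x += c * Ni * pow(Ni, -1, n)
        return x % N

    def icbrt(x):
        # Integer cube root by binary search (avoids float rounding).
        lo, hi = 0, 1 << (x.bit_length() // 3 + 2)
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if mid ** 3 <= x:
                lo = mid
            else:
                hi = mid - 1
        return lo

    e = 3
    moduli = [101 * 113, 103 * 127, 107 * 131]     # three toy RSA moduli
    m = 9999                                       # m^3 < n1 * n2 * n3
    ciphertexts = [pow(m, e, n) for n in moduli]

    x = crt(ciphertexts, moduli)                   # x equals m^3 exactly
    assert icbrt(x) == m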

However, it is generally believed that it is possible to design and
implement an RSA system that is secure against these known attacks.

A good survey of the attacks on RSA is the following paper by Boneh:

  D. Boneh, "Twenty years of attacks on the RSA cryptosystem,"
  Notices of the American Mathematical Society, Vol. 46, No. 2, 1999,
  pp. 203-213.
  <http://crypto.stanford.edu/~dabo/papers/RSA-survey.pdf>

You might also want to take a look at RSA Laboratories' FAQ:
  <http://www.rsasecurity.com/rsalabs/faq/>

-Richard

------------------------------

From: [EMAIL PROTECTED] (Shyh-Tyan Liaw)
Subject: prime number problem
Date: 15 Oct 1999 06:45:28 GMT

Dear all,

   What I would like to ask is whether a number, say
a, is assumed to be a prime after "some primality
test". That is to say, in practice, is a number
assumed to be prime if it passes a lot of primality
tests? If not, is it chosen from some specific form,
such as 2^(2^m)-1? In short, how does one generate a
prime number in practice?

Thanks

Shyh-Tyan Liaw

------------------------------

From: "Roger Schlafly" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 14 Oct 1999 23:07:57 -0700

DJohn37050 <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> BTW, when I made my presentation to ANSI X9F1, the consensus by a large
> margin was to ask NIST to include "future resiliency" as an AES criterion.

What do the bankers mean by the term "future resiliency"? That
the AES winner not be broken in the future?

These are the same folks who want "strong" keys for RSA. Do
they also want strong keys for AES? <g>




------------------------------

From: "Richard Parker" <[EMAIL PROTECTED]>
Subject: Re: prime number problem
Date: Fri, 15 Oct 1999 07:12:53 GMT

[EMAIL PROTECTED] (Shyh-Tyan Liaw) wrote:
>    What I would like to ask is whether a number, say
> a, is assumed to be a prime after "some primality
> test". That is to say, in practice, is a number
> assumed to be prime if it passes a lot of primality
> tests? If not, is it chosen from some specific form,
> such as 2^(2^m)-1? In short, how does one generate a
> prime number in practice?

Shyh-Tyan Liaw,

A good online reference for questions about prime numbers is "The Prime
Pages" at <http://www.utm.edu/research/primes/>.

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Six out of six for Kerckhoffs
Date: 15 Oct 99 07:05:10 GMT

[EMAIL PROTECTED] wrote:
: And on

: http://www.ecn.ab.ca/~jsavard/mi0611.htm

: there is now a section, entitled "The Ideal Cipher", where I discuss
: Kerckhoffs' six desiderata for a cipher system.

And I've made a couple of modifications to that page to clarify where I
stand on some of the questions that have caused endless debate in this
newsgroup. At one point, in going through these dicta, I have acknowledged
something that I had failed to note properly before: the real value of
descending the hierarchy one-time-pad, symmetric encryption, public-key
cryptography lies in key management (as opposed to the benefit of going the
other way, which is greater theoretical confidence in the strength of one's
encryption).
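
The key-management point can be made concrete with a simple count: with
pairwise symmetric keys, n correspondents need n(n-1)/2 shared secrets (and
a one-time pad additionally needs pad material as long as all the traffic),
while with public-key cryptography each correspondent needs only one key
pair. A two-line illustration:

    n = 1000                                              # correspondents
    print("pairwise symmetric keys:", n * (n - 1) // 2)   # 499500 shared secrets
    print("public-key pairs:", n)                         # one key pair each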

John Savard

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
