Cryptography-Digest Digest #664, Volume #13       Fri, 9 Feb 01 17:13:00 EST

Contents:
  Re: Factoring (and not the Philippino :) (Tom St Denis)
  Re: Different cipher type ("Paul Pires")
  Re: Mod function (Jerry Coffin)
  Re: Different cipher type ("Paul Pires")
  Re: relative key strength private vs public key (Steve Portly)
  Re: relative key strength private vs public key (Thomas Kellar)
  Re: Factoring (and not the Philippino :) (DJohn37050)
  Re: Factoring (and not the Philippino :) (DJohn37050)
  Re: Different cipher type ("Douglas A. Gwyn")
  Re: Different cipher type ("Paul Pires")
  Re: relative key strength private vs public key ("Douglas A. Gwyn")
  Re: Encrypting Predictable Files ("Joseph Ashwood")
  Re: Factoring (and not the Philippino :) (John Savard)
  Re: relative key strength private vs public key ("Joseph Ashwood")
  Re: Factoring (and not the Philippino :) ("Douglas A. Gwyn")

----------------------------------------------------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Factoring (and not the Philippino :)
Date: Fri, 09 Feb 2001 18:58:17 GMT

In article <[EMAIL PROTECTED]>,
  "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> DJohn37050 wrote:
> > if e = 3 then p (and q) = 2 mod 3 which gives more info about the values
>
> I have some general thoughts about potential RSA cracking:
> (1) N is computed from p and q, e and d are computed via z.  It is
> often said that cracking an RSA encryption is equivalent to factoring
> N, but in practice one is faced with a known (N,e) and all that is
> needed for a crack is *some* d' (not necessarily the d maintained
> as a secret by the sender) that has the relevant inverse property,
> not p and q.  Is it a theorem that knowing (N,e,d) allows a fast
> recovery of p and q?  If not, then the notion that cracking RSA is as
> hard as factoring needs to be rethought.


The problem is that while there are other 'd' that will decrypt JUST the same
as the original, they all differ from it by a multiple of lcm(p-1,q-1).  I.e.
3m mod m = 0 just like m mod m = 0.  Thus (for a prime n) 3^(3(n-1)) mod n = 1
just like 3^(n-1) mod n = 1.
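A toy numeric check of this point (the parameters are illustrative, not from
the thread): every exponent congruent to d modulo lcm(p-1, q-1) decrypts
identically.

```python
from math import lcm

# Toy RSA parameters (illustrative only, far too small to be secure).
p, q = 61, 53
n = p * q                      # 3233
L = lcm(p - 1, q - 1)          # lcm(60, 52) = 780
e = 17
d = pow(e, -1, L)              # 413, the usual private exponent

m = 42                         # sample message
c = pow(m, e, n)               # encrypt

# Every d' = d + k*lcm(p-1, q-1) decrypts exactly like d.
for k in range(4):
    assert pow(c, d + k * L, n) == m
```

So an attacker who finds *some* working d' has, in effect, the same
capability as the key holder, which is what makes the question above
interesting.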

Tom


Sent via Deja.com
http://www.deja.com/

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: Different cipher type
Date: Fri, 9 Feb 2001 11:21:55 -0800


Douglas A. Gwyn <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]...
> Michael Brown wrote:
> > "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote ...
> > > Before spending much more time in this direction, read what Knuth
> > > had to say about his youthful attempt at a "super-random" generator
> > > (the Art of Computer Programming, Vol. 2: Seminumerical Algorithms).
> > Unfortunately, I can't get hold of this book (don't have it and nor do local
> > bookstores). Could you tell me what he had to say?
>
> It ought to be in any reasonable computer science library.
>
> Basically, he described an algorithm that tried to combine
> "at random" several methods of pseudorandom generation,
> with the idea that the result would be more random than the
> individual methods.  When he actually implemented it, right
> away he found that it tended to get stuck in very short
> cycles.  The moral was, methods of generating random numbers
> should not be designed at random..  Half of the book is
> concerned with a careful investigation of pseudorandom number
> generation.

Isn't this implicit in the context of working with a deterministic process?
There is a tension between mutually exclusive goals: decouple the behavior
of the machine from the content of its state enough that it is not predictable,
but don't lose the richness of the hard-won complexity of how the state
evolves. That probably doesn't make a lot of sense.

I guess I'm trying to describe an impression I have of how this works. To me,
it seems like the difficulty and art involved is in compromising and
complementing. Treating any routine purely as a means to one desired effect
is likely to fail, as that focus will result in the exposure of another problem.
Often you hear people pound on the basics of needing number theory
knowledge and an understanding of cryptanalysis to design good ciphers,
but an obvious point is missed. You need to be a good designer (in general)
also. In Mechanical Engineering, there is a distinction between Designers,
Engineers, Design Engineers and Professors. Maybe not at the University
level but definitely within the trade. Different skill sets and focus. The
problem with this being such a new and specialized (and generally secretive)
field is that we hear from the professors (No slight intended) often but don't
hear the war stories from the trenches or get to observe the practitioners in
action. Mentoring does go on but it is often hard to find and painful to
endure.

Paul






====== Posted via Newsfeeds.Com, Uncensored Usenet News ======
http://www.newsfeeds.com - The #1 Newsgroup Service in the World!
=======  Over 80,000 Newsgroups = 16 Different Servers! ======

------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: Mod function
Date: Fri, 9 Feb 2001 12:27:07 -0700

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
says...
> Jerry Coffin wrote:
> > Like most people who mention this patent, you're _grossly_ mis-
> > characterizing it
> 
> Are we talking about the same patent?  The one I had in mind
> was the one involved in a case against Apple.

It's barely possible that we're talking about different ones, but 
I've only ever heard of one patent that was much like this.  It's 
certainly widely used as an example of a horrible patent, but TTBOMK, 
primarily (if not exclusively) by people who've never really read it. 

-- 
    Later,
    Jerry.

The Universe is a figment of its own imagination.

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: Different cipher type
Date: Fri, 9 Feb 2001 11:47:08 -0800

Oh dear,

Proffessors? Profesors?  Soo sorry, drain bammage.

Professors!

Paul






------------------------------

From: Steve Portly <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Fri, 09 Feb 2001 14:52:08 -0500



Bob Silverman wrote:

> In article <[EMAIL PROTECTED]>,
>   Steve Portly <[EMAIL PROTECTED]> wrote:
> >
> <snip>
>
> > > > large prime number composites
> > >
> > > Huh?  What is a "prime number composite"?? Please explain.
> >
> > A composite of two numbers each of which is prime.
>
> May I suggest that if you want to discuss this subject that it is better
> not to invent new terminology, "on the fly"?  Why is it necessarily the
> case that your "prime number composite" only has 2 factors, for
> example?
>
> It is better to simply say "product of two primes".

Good food for thought.  Assuming a sieving technique is being employed to
factor candidate composites, whether they are the product of two primes or
more, I don't see a big jump in the difficulty when you increase the number
of primes used in the composite.  As a simplistic example, let's say we have
a sieve table of only four numbers: 2, 3, 5, and 7.  The time it takes to
factor 105 is not much different than the time it takes to factor 35.  Since
I am limited by the constraints of C++ to word128 values, my keys would need
to be less than 10000 bits in size.  In a practical sense, therefore, using
composites with many primes would be silly, since the constituent primes
would have to be smaller.

>
>
> > > > you may have noticed that the spacing between prime
> > > > numbers grows quite large.  Prime numbers only occur about one in a
> > > > hundred thousand numbers at current key sizes.
> > >
> > > Where did you get this piece of misinformation?  The average
> > > gap between primes for primes near a large integer n is about log n.
> > > For 512 bit primes, this yields an average gap of 512 log 2 ~ 360
> >
> > True: for spacing between primes to be greater than 100,000 you would
> > need to be near a very large prime, and these are not being used to
> > create keys as far as I know.
>
> This last sentence is not good English.  I do not understand what you
> are trying to say.  Could you rephrase?

I would have no knowledge of keys larger than 10000 bits in size since such
a key would exceed a word128 receiving field on my desktop computer.

>
>
> RSA keys are constructed (for 1024-bit keys) by finding two 512-bit
> primes at random.  This is easy and there are plenty of them.
> Primes near "100,000"  (as you put it) are not used in cryptography,
> except perhaps peripherally, since systems built from them have
> insufficient security.  Or perhaps you meant something else???
> > > > I am wondering if there might be some point in the
> > > > future (30 years down the road) in which prime numbers are not as
> > > > attractive an alternative for public key systems?  Any math theory
> > > > to prove or disprove this?
> > >
> > > 1) Alternative to what?
> >
> > Methods that do not require a continual expansion in the size of
> > primes.

>
> Your reply assumes facts not in evidence.  What continual expansion
> do you refer to?

Key size recommendations for public keys increase over the years with the
increasing factoring ability of available hardware.  128-bit public keys
were considered adequate at one time.

>
>
> >
> > >
> > > 2) How do you propose to do away with needing them?
> >
> > Sometime in the future perhaps they will remove the prime requirement
> > from ECC.
>
> Which prime requirement is that?  The requirement that the public
> base point have prime, or near-prime, order?  This can never be
> eliminated, otherwise one would have a "small subgroup" attack.

The curve arithmetic is performed modulo p, where p is either a large prime
number or large power of two.  If a table of large primes representing all
possible base points used by ECC were ever developed some time in the future
it would shorten the time it takes to break current keys.

>
> > > 4) You have failed to suggest a reason *why* you think they might
> > > become unattractive.
> >
> > If finding primes magically becomes much easier for organized crime
> > sometime in the future, then creating secure primes may be impossible
> > for consumer-owned machines.
>
> I'm not sure I understand this reasoning.  Finding primes is ALREADY
> easy, for you , for me, for organized crime, or for anyone else.
>
> And what do you consider "secure primes"?  You seem to be inventing
> terminology again. What does "secure" mean in this context?

The only point to be made is that "safe" key strength is relative.  A secure
prime that can be used to create a "safe" public key in reasonable time on
my desktop computer today will not be safe 30 years from now.

>
> Why do you believe that easily finding primes
> would make it hard for consumer owned machines??  What has one got to
> do with the other?

Perhaps I am being short-sighted.  Consumer-owned desktop machines in the
future will probably be able to handle larger keys in Word256-sized fields.
I was thinking that since sieving primes becomes increasingly easier as the
key size increases (since the distance between primes also increases), at
some point in the future the ability of a consumer-owned desktop machine to
create a "safe" prime might be in jeopardy.


>
>
> --
> Bob Silverman
> "You can lead a horse's ass to knowledge, but you can't make him think"
>
> Sent via Deja.com
> http://www.deja.com/


------------------------------

From: Thomas Kellar <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Fri, 9 Feb 2001 14:57:40 -0500


On Fri, 9 Feb 2001, Bob Silverman wrote:
> In article <[EMAIL PROTECTED]>,
>   Steve Portly <[EMAIL PROTECTED]> wrote:
> > > > large prime number composites
> > > Huh?  What is a "prime number composite"?? Please explain.
> > A composite of two numbers each of which is prime.
>
> May I suggest that if you want to discuss this subject that it is better
> not to invent new terminology, "on the fly"?  Why is it necessarily the
> case that your "prime number composite" only has 2 factors, for
> example?
> 
> It is better to simply say "product of two primes".

Some people refer to the product of two primes as a semiprime.
I know that I am diligently working on methods of factoring them.

Thomas
--
w8twk  Freelance Systems Programming  http://www.fsp.com
  Networking is our profession, factoring is our hobby


------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Date: 09 Feb 2001 20:04:22 GMT
Subject: Re: Factoring (and not the Philippino :)

It is known that IF you know any d that works, you can factor n in prob. poly
time; in other words, knowing such a d is effectively as good as factoring n.
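This refers to a standard reduction; a minimal probabilistic sketch of it
(toy parameters, and the function name is mine) writes e*d - 1 = 2^s * t and
hunts for a nontrivial square root of 1 mod n:

```python
import math
import random

def factor_with_d(n, e, d):
    """Recover a nontrivial factor of n from any valid (e, d) pair."""
    k = e * d - 1               # a multiple of lambda(n), hence even
    t, s = k, 0
    while t % 2 == 0:
        t //= 2
        s += 1
    while True:
        a = random.randrange(2, n - 1)
        x = pow(a, t, n)
        for _ in range(s):
            y = pow(x, 2, n)
            if y == 1 and x not in (1, n - 1):
                # x is a nontrivial square root of 1 mod n
                return math.gcd(x - 1, n)
            x = y
        # this a gave no split (happens < half the time); try another

p, q = 61, 53
n, e, d = p * q, 17, 413        # d = e^-1 mod lcm(p-1, q-1)
assert factor_with_d(n, e, d) in (p, q)
```

Each random base succeeds with probability at least 1/2, which is what makes
the running time probabilistic polynomial.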
Don Johnson

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Date: 09 Feb 2001 20:06:03 GMT
Subject: Re: Factoring (and not the Philippino :)

It is also true that RSAP <= IFP.  It is known that solving the IFP breaks
RSA, but just being able to decrypt a message may not require factoring the
modulus.  Most people believe the two problems are equivalent, but it has not
been proven, AFAIK.
Don Johnson

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Different cipher type
Date: Fri, 9 Feb 2001 19:58:05 GMT

Paul Pires wrote:
> It seems like the difficulty and art involved is in compromising and
> complementing.

Yes, trying to force this very interesting real-world problem
into some simple academic model tends to cause real problems
(which might not be evident looking solely at the model).

> but an obvious point is missed. You need to be a good designer (in general)
> also. In Mechanical Engineering, there is a distinction between Designers,
> Engineers, Design Engineers and Professors. Maybe not at the University
> level but definitely within the trade. Different skill sets and focus. The
> problem with this being such a new and specialized (and generally secretive)
> field is that we hear from the professors (No slight intended) often but don't
> hear the war stories from the trenches or get to observe the practitioners in
> action.

I've been fortunate enough to work in the trenches as well as
dealing with the generally-academic work that's published.
What you say is quite true -- the best job is done by applying
various skill sets at several levels.  Several of the most
important cryptanalytic methods were devised by "engineers",
and only later did the "professors" take over to organize
and extend our knowledge.  However, that is not to say that
one should expect blind groping around without sufficient
theoretical understanding to be very effective.  Ideally, a
team approach can be taken, where what one person is doing can
be evaluated by another who might be able to contribute some
insight.  This synergy is one of the reasons that a large
organization might make more progress than a similar number
of equally talented people working at scattered locations.

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: Different cipher type
Date: Fri, 9 Feb 2001 12:33:08 -0800


Douglas A. Gwyn <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]...
> Paul Pires wrote:
> > It seems like the difficulty and art involved is in compromising and
> > complementing.
>
> Yes, trying to force this very interesting real-world problem
> into some simple academic model tends to cause real problems
> (which might not be evident looking solely at the model).
>
> > but an obvious point is missed. You need to be a good designer (in general)
> > also. In Mechanical Engineering, there is a distinction between Designers,
> > Engineers, Design Engineers and Professors. Maybe not at the University
> > level but definitely within the trade. Different skill sets and focus. The
> > problem with this being such a new and specialized (and generally secretive)
> > field is that we hear from the professors (No slight intended) often but don't
> > hear the war stories from the trenches or get to observe the practitioners in
> > action.
>
> I've been fortunate enough to work in the trenches as well as
> dealing with the generally-academic work that's published.
> What you say is quite true -- the best job is done by applying
> various skill sets at several levels.  Several of the most
> important cryptanalytic methods were devised by "engineers",
> and only later did the "professors" take over to organize
> and extend our knowledge.  However, that is not to say that
> one should expect blind groping around without sufficient
> theoretical understanding to be very effective.  Ideally, a
> team approach can be taken, where what one person is doing can
> be evaluated by another who might be able to contribute some
> insight.  This synergy is one of the reasons that a large
> organization might make more progress than a similar number
> of equally talented people working at scattered locations.

Exactly. A lot of the "wisdom" I have borrowed from those around
me came from watching how they cope with the unexpected, or in
viewing their attitude toward a fact rather than the fact itself. I still
remember when an old salt told me the secret of mechanical design.

"F=Ma, and don't push a rope." This probably doesn't translate real
well in print, but if you saw the fiasco of mine that he was commenting
on, and the disgusted expression on his face, you'd bust a gut.

Thanks,

Paul





------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Fri, 9 Feb 2001 20:16:00 GMT

Steve Portly wrote:
> Good food for thought.  Assuming a sieving technique is being employed to
> factor composite prospects whether they be the product of two primes or
> more,  I don't see a big jump in the difficulty when you increase the number
> of primes used in the composite?

Unless you mean the "number field sieve" instead of "sieve of
Eratosthenes", a "sieving technique" might not be the method used
to factor a huge number.  However, it generally can be stated
(and is pretty obvious) that if you can find one prime factor
then you can find them all, because you divide out the factor
and the remaining factorization problem is easier than the
original.
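The divide-out-and-recurse observation can be sketched as follows (trial
division stands in for whatever factoring method is actually used, and the
function names are mine):

```python
def smallest_factor(n):
    """Trial division; a stand-in for any method that finds one factor."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n                    # n itself is prime

def full_factorization(n):
    factors = []
    while n > 1:
        f = smallest_factor(n)  # find one prime factor...
        factors.append(f)
        n //= f                 # ...divide it out; the cofactor is smaller
    return factors

assert full_factorization(2001) == [3, 23, 29]
```

Each division strictly shrinks the remaining problem, which is why finding
one factor is enough in principle to find them all.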

> Since I am limited by the constraints of C++ to word128 values ...
> ...
> Perhaps I am being short sighted.  Consumer owned desktop
> machines in the future will probably be able to handle larger
> keys in Word256 sized fields.

In practice one uses some "bignum" (arbitrary multiple-precision)
library, in which case the only limit is the amount of RAM.

> I was thinking that since sieving primes becomes increasingly easier as the
> key size increases (since the distance between  primes also increases),  at
> some point in the future the ability of a consumer owned desktop machine to
> create a "safe" prime might be in jeopardy.

I don't know of any direct method of proving that or even of
making it particularly plausible.  Bob S?

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Encrypting Predictable Files
Date: Fri, 9 Feb 2001 12:26:27 -0800

"Bryan Olson" <[EMAIL PROTECTED]> wrote in message
news:95vmhf$uuc$[EMAIL PROTECTED]...
> Joseph Ashwood wrote:
[Bryan Olson wrote: RC4 resists known-plaintext attacks]
[Joseph Ashwood wrote: Use an AONT to protect against plaintext substitution
(other comments removed)]
[Bryan Olson wrote: RC4 only addresses privacy, not authentication.  For
symmetric authentication use a MAC (_not_ a block cipher or AONT).]

I think you have rather substantially misunderstood the purpose of at least
one of the components. A cipher is indeed for privacy, a MAC is for
authentication, and an AONT is insurance for both. The AONT prevents any
attack on the cipher involved unless the post-AONT text is known; in some
cases this means knowing all the initial data (a deterministic AONT), or
breaking additional layers (a non-deterministic AONT). The three parts
address different issues. Ciphers are commonly used to address all of these
issues, and in fact a good block cipher in an appropriate chaining mode does
offer all three. However, RC4 is a strong chaining mode (although not
suitable for AONT consideration) that is commonly used with an
astonishingly weak block cipher, and so does not (even with the addition of
a MAC) offer what would be added by using an AONT (protection against
cryptanalytic attacks).
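For readers unfamiliar with the term, here is a deliberately simplified
sketch in the spirit of Rivest's package transform, using SHA-256 as the
only primitive. It is my own illustration, not the construction Joe has in
mind: altering or dropping any output block changes the recovered inner key,
so no part of the message can be read without all of the blocks.

```python
import hashlib
import secrets

BLOCK = 32  # SHA-256 output size in bytes

def _mask(key: bytes, i: int) -> bytes:
    """Per-block keystream derived from the inner key and block index."""
    return hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def aont_package(msg: bytes) -> list[bytes]:
    inner = secrets.token_bytes(BLOCK)              # random inner key
    blocks = [msg[i:i + BLOCK] for i in range(0, len(msg), BLOCK)]
    out = [bytes(a ^ b for a, b in zip(blk, _mask(inner, i)))
           for i, blk in enumerate(blocks)]
    digest = hashlib.sha256(b"".join(out)).digest()
    # tail block: inner key XOR hash of all masked blocks
    out.append(bytes(a ^ b for a, b in zip(inner, digest)))
    return out

def aont_unpackage(pkg: list[bytes]) -> bytes:
    *body, tail = pkg
    digest = hashlib.sha256(b"".join(body)).digest()
    inner = bytes(a ^ b for a, b in zip(tail, digest))
    return b"".join(bytes(a ^ b for a, b in zip(blk, _mask(inner, i)))
                    for i, blk in enumerate(body))

pkg = aont_package(b"attack at dawn")
assert aont_unpackage(pkg) == b"attack at dawn"
```

Note this sketch provides no secrecy by itself (the tail is derivable from
the package); in use the packaged output would still be encrypted, which is
exactly the layering being debated above.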
                    Joe





------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Factoring (and not the Philippino :)
Date: Fri, 09 Feb 2001 21:16:55 GMT

On Fri, 9 Feb 2001 17:37:22 +1300, "Michael Brown"
<[EMAIL PROTECTED]> wrote, in part:

>No flames, please, unless you
>actually _look_ at it :)

Well, I've looked at it, and one thing puzzles me.

Not all numbers are the product of two prime numbers. Some numbers
have many prime factors.

One can't automatically tell which form a number is in just by looking
at its last few digits, either.

So how can it be possible to prove, from the last few digits of a
number, what the last two digits of the numbers multiplied to make it
must be? I don't think that an algorithm that is like regular
multiplication, but working backwards, can be possible. But maybe I
haven't looked carefully enough, and you are working backwards in a
multiplication algorithm with different properties.
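A quick enumeration (my own illustration) backs up this objection: even the
last digit of a product fails to pin down the last digits of its factors.

```python
# All pairs of final digits (a, b), a <= b, whose product ends in 1.
endings = sorted({(a, b) for a in range(10) for b in range(a, 10)
                  if (a * b) % 10 == 1})
print(endings)   # -> [(1, 1), (3, 7), (9, 9)]
```

So a number ending in 1 might be the product of factors ending in 1 and 1,
3 and 7, or 9 and 9, and the ambiguity only grows with more trailing digits
and more factors.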

John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: relative key strength private vs public key
Date: Fri, 9 Feb 2001 13:09:14 -0800

Let me begin by restating what I think you are saying, so that I know if I
am answering the questions you actually have. Please correct me if I am
wrong, or unclear.

What I think you said:
Using a standard sieve technique (divide by 2, 3, 5, 7, 11, 13, ...
until a factor is found) to break a key means that having more than 2
factors in a number does not much increase security; your example was that
using this technique 35 and 105 take virtually the same time.

My response:
That is correct. However, because the size of the numbers in question makes
the search space on the order of 2^500, this technique is very far from being
realizable. The search time (even assuming you have the table of primes
already built) is exponential, so while searching for values < 32 bits using
this method can be useful, it very quickly becomes unusable. Instead, faster
methods for determining the prime factors have been discovered; the current
record is held by GNFS, which has a running time that is less than
exponential but still higher than polynomial. A consequence of many of these
more advanced factoring algorithms is that the number of factors is a
non-issue: factoring a general 512-bit number and factoring the product of
two 256-bit primes will take virtually the same effort. This leads to a very
substantial difference between your current perception and the state of the
art. In truth, very few people have actually seen a large instance of any
NFS-type system working.
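A small counter of division attempts (my own sketch) makes the trial-division
point concrete: the cost is set by the smallest prime factor, not by how many
factors there are, and a semiprime with two large factors is the worst case.

```python
def trial_divisions_until_factor(n):
    """Count divisions until the first factor of n is found."""
    count = 0
    d = 2
    while d * d <= n:
        count += 1
        if n % d == 0:
            return count, d     # (work done, factor found)
        d += 1
    return count, n             # n is prime

# 105 = 3*5*7 and 35 = 5*7 both hit a tiny factor almost immediately,
# so adding a third factor bought essentially no extra work.
assert trial_divisions_until_factor(105) == (2, 3)
assert trial_divisions_until_factor(35) == (4, 5)
```

For the advanced methods (QS, NFS) the running time instead depends on the
size of n itself, which is why the number of factors stops mattering.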

What I think you said:
I am restricted by the abilities of my compiler which limits me to 128-bit
numbers.

My response:
I would suggest you download either MIRACL or OpenSSL.  Both have very
good, very useful, and very fast implementations of large numbers.  They
will help you explore the larger reaches of numbers.  I quite often deal
with 1024-bit primes using the OpenSSL bignum implementation.

What I think you said:
The space between primes increases to 100,000 very quickly

My response:
That depends completely on what you consider quickly. As Silverman said,
the average distance from one prime to the next near n is approximately
log(n).  So to get an average gap of 100,000 between two primes you would
need to choose primes around 2^100000 at least (e^100000 is roughly
2^144000).
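An empirical check of the log(n) estimate, using a stdlib-only Miller-Rabin
test (the helper names are mine): near 2^32 the predicted average gap is
32*ln(2), about 22, nowhere near 100,000.

```python
import math
import random

def is_prime(n, rounds=20):
    """Miller-Rabin probable-prime test (stdlib only)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    t, s = n - 1, 0
    while t % 2 == 0:
        t //= 2
        s += 1
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), t, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def average_gap(start, count=50):
    """Average distance between `count` consecutive primes above `start`."""
    primes, n = [], start
    while len(primes) < count:
        if is_prime(n):
            primes.append(n)
        n += 1
    return (primes[-1] - primes[0]) / (count - 1)

gap = average_gap(2**32)
# predicted: 32 * ln(2) ~ 22.2; a sample of 50 primes lands close by
assert abs(gap - 32 * math.log(2)) < 15
```

Repeating this near larger powers of two shows the gap creeping up only
logarithmically, which is the substance of the response above.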

What I think you said:
At some point in the future prime numbers may no longer be attractive for
public key cryptography

My response:
I suspect the opposite will be true. If you look at more than just the
integers N, and instead move to include elliptic curves and other various
spaces, I think that in time you'll find more reliance on these numbers.
They are easy to find, easy to prove, easy to use, intuitive once the space
becomes obvious, and (at least where I went to school) we were taught about
them starting when we were 8. Additionally, they give a warm fuzzy feeling
about the security. I don't see them going away; in fact I think the
opposite will happen: they will become more prevalent.


What I think you said:
I expect that someone will find a public key encryption algorithm that does
not require the continual expansion of the size of the data used

My response:
I see no reason to expect that. One of the beauties of the public key
algorithms that have been found is that the key sizes can be chosen to offer
the security/speed tradeoff necessary for a given purpose.

What I think you said:
If someone ever builds a table of every prime used for ECC it will shorten
the time it takes to break current keys

My response:
There is no evidence to support this. That prime is a published part of the
cryptosystem. The only hidden value is a random number. If it were the case
that knowing that prime would compromise the security ECC would not be
secure.

What I think you said:
Generating large primes is costly, so I cannot generate a strong key on my
PC now that will be secure for the next 30 years

My response:
The generation of primes is actually rather cheap, and has been discussed
regularly here. It is not uncommon to be able to test 10 512-bit numbers for
primality a second, and the methods scale fairly reasonably. In 30 years you
will likely need at most a 4096-bit key. You can generate one using a good
BIGNUM package in a matter of a few minutes for an RSA key. For a DH key the
4096-bit prime can be chosen publicly (from a public list); from there you
simply choose your 4096-bit random number (it does not need to be prime) and
do an exponentiation for your public key, which takes less than 1 second.
ECC is a different beast: in 30 years you will probably need at most a
512-bit key, and again you can choose a field of someone else's choice (I'd
suggest Certicom's challenge field, because they're betting a million
dollars on it), generate your random number, and compute the public key,
again in less than a second. The cost of generating the keys is not as steep
as you seem to believe.

What I think you said:
Sieving for primes becomes easier as you get to larger sizes

My response:
Sieving for primes is not the right way to find them. The fastest way is to
generate a random number of the correct size (with the upper and lower bits
set to 1), test it for primality (there are various tests), and if it is
composite add 2 and repeat. You will find a 512-bit prime in under 400
tries, or a couple of seconds. To sieve to get a 512-bit prime would take
2^n/n work, which is ~2^500, far beyond what is possible.
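The generate-and-test procedure described above can be sketched with a
stdlib-only Miller-Rabin test (the function names are mine). Stepping by 2
from a random odd 512-bit start finds a probable prime after roughly
ln(2^512)/2, or about 177, odd candidates on average.

```python
import random

def is_probable_prime(n, rounds=16):
    """Miller-Rabin probable-prime test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19):
        if n % p == 0:
            return n == p
    t, s = n - 1, 0
    while t % 2 == 0:
        t //= 2
        s += 1
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), t, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits=512):
    """Random candidate with top and bottom bits set, then step by 2."""
    n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
    while not is_probable_prime(n):
        n += 2
    return n

p = random_prime(512)
```

On an ordinary machine this finishes in a couple of seconds, consistent with
the estimate above; a sieve over the whole 512-bit range is hopeless by
comparison.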
                                Joe



------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Factoring (and not the Philippino :)
Date: Fri, 9 Feb 2001 20:39:21 GMT

DJohn37050 wrote:
> ...

Thanks for the responses.

So far, it sounds like being able to find a d' would have
a big impact on computational complexity theory in general,
but an RSA crack that didn't produce a d' as a side effect
wouldn't.  Also that nobody seems to have a feasible idea
how the latter could be done.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
