Cryptography-Digest Digest #286, Volume #10      Tue, 21 Sep 99 13:13:03 EDT

Contents:
  Re: EAR Relaxed? Really? (fungus)
  Re: Comments on ECC (DJohn37050)
  UPDATE: 23rd Sep, UK Internet decryption and interception policy, SFS3.5 ("No Spam")
  Re: Okay "experts," how do you do it? (Patrick Juola)
  Re: Comments on ECC (Bob Silverman)
  Re: Second "_NSAKey" (Frank Gifford)
  Re: Yarrow: a problem -- am I imagining it? (Anton Stiglic)
  key exchanges (again ;-) (Peter Gunn)
  Re: Yarrow: a problem -- am I imagining it? (Anton Stiglic)
  Re: AES finalists AMD Athlon performance? (Tom St Denis)
  Re: Okay "experts," how do you do it? (Patrick Juola)
  Re: argument against randomness (Tim Tyler)
  Simple analytical tools (Dave Smith)
  Re: key exchanges (again ;-) (Anton Stiglic)
  Re: RSA encryption algorithm. (John Bailey)
  Re: Comments on ECC (Bob Silverman)
  Re: Comments on ECC (Bob Silverman)
  Re: key exchanges (again ;-) (Anton Stiglic)
  Re: Comments on ECC (Bob Silverman)
  Re: key exchanges (again ;-) (Peter Gunn)
  Re: Okay "experts," how do you do it? ("Douglas A. Gwyn")
  Re: argument against randomness ("Douglas A. Gwyn")
  Re: some information theory (Tim Tyler)

----------------------------------------------------------------------------

From: fungus <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: EAR Relaxed? Really?
Date: Tue, 21 Sep 1999 15:00:21 +0200



Greg wrote:
> 
> Did you ever ask yourself, "well if we cannot export it, why not
> set up a small shop overseas to import from?"  It is such a simple
> solution to the EAR, why hasn't anyone, including MS, gone that
> route and formed a standard for all America to use? 
> 

A lot of people have, and will continue to do so until the
regs are completely removed.



-- 
<\___/>
/ O O \
\_____/  FTB.

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Comments on ECC
Date: 21 Sep 1999 12:56:56 GMT

Certicom will support the NIST curves.

BTW, NIST did not support 40-bit DES, this was a compromise by export control
people in the gov't (not NIST) to allow something for export that is better
than 0-bit encryption, which is what it was before.  40-bits is better than
0-bits as at least it takes "some" work to attack.  

It is a mistake to see the government (or any big organization) as monolithic.
Rather, there are factions, each with its own point of view.  And the way a
government changes its views is through the political process, which is messy.
Don Johnson

------------------------------

From: "No Spam" <[EMAIL PROTECTED]>
Subject: UPDATE: 23rd Sep, UK Internet decryption and interception policy, SFS3.5
Date: Mon, 20 Sep 1999 13:37:19 +0100

PROGRAMME UPDATE - http://www.fipr.org/sfs35.html

Scrambling for Safety 3.5 - Thursday September 23 1999
======================================================

The Foundation for Information Policy Research, Privacy International, the
LSE Computer Security Research Centre, and ZDNet UK are jointly sponsoring
the fourth in the series of public conferences on cryptography policy,
e-commerce and Internet surveillance. This will be the second conference of
1999, and has been called in response to the exceptional circumstance of two
official DTI consultations in the same year, and the Home Office's recent
consultation on revising the Interception of Communications Act to cover the
Internet.

The connections between Home Office policy on interception and powers
proposed in Part III of the DTI's draft Electronic Communications Bill (see
FIPR Press Release) will be explored, as well as the legal framework for
establishing voluntary licensing of cryptography services, and recognition
of digital signatures.

09:15 - 13:45, Thursday 23 September 1999
Old Theatre, Main Building, London School of Economics,
Houghton St., London WC2

Admission is free of charge.

Registration: Send e-mail to

 [EMAIL PROTECTED]

...with "name, organisation" in body.

Telephone enquiries: 0171 354 2333

Programme
=========

09.25 Welcome - Caspar Bowden, FIPR

09:30 Introduction - William Heath (Kable)
Tim Pearson (Internet Service Providers Association)
Chris Binns, Alliance for Electronic Business: "Progress towards
self-regulation"

10:00 "Cryptography, privacy and information warfare"
Whitfield Diffie

10:30 "Why we needed further consultation"
Alan Duncan MP, Shadow spokesman on Trade and Industry

11:00 "Cryptography's central role in e-commerce policy"
Chris Sundt, CBI

11:30 "Law enforcement access to keys - legal and human rights issues"
Nicholas Bohm (Law Society)

12:00 "Crypto and security issues post-escrow"
Ross Anderson

12:15 Keynote:
Patricia Hewitt MP - Minister of State for E-Commerce at the DTI

12:35 Panel discussion - Chaired by Caspar Bowden, Director of FIPR

Jim Norton (Cabinet Office PIU)
Stephen Pride (DTI, Head of E-Communications Bill team)
Peter Sommer (Special Adviser, Trade and Industry Select Committee)
Clare Wardle (Post Office)
John Wadham (LIBERTY)
Jack Beatson QC, (Essex Court Chambers)
Stewart Baker (Steptoe & Johnson, ex-NSA General Counsel)

13:30 - close







------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: Okay "experts," how do you do it?
Date: 20 Sep 1999 10:57:34 -0400

In article <[EMAIL PROTECTED]>,
Sundial Services  <[EMAIL PROTECTED]> wrote:
>Douglas A. Gwyn wrote:
>> Right away, the human cryptanalyst
>> thinks of trying QWERTYUIOP, because he has a "feel" for how humans
>> behave.  It would be inordinately hard to duplicate that in a machine.
>
>
>I hear what you're saying, Mr. Gwyn, but I still think that most of the
>applicability of "intuition" has gone or should have gone by the wayside
>in the computer age.
>
>Cryptosystems are computer algorithms.  Nothing more or less.  And we
>should be able to describe the characteristics of what they must do to
>the plaintext, how they must depend upon the key and upon variance in
>the key.  There OUGHT to be an objective test-bed that we can plug these
>algorithms into, to test them.

Actually, it's much harder than you think to come up with an algorithm
for analyzing other algorithms.  In point of fact, that's one of the
few things that's easy to prove *impossible* in computer science --
for instance, there's (provably) no way to write a computer program
that will take another program [for the same machine] and determine
whether or not the program can enter into an infinite loop.  It's
similarly impossible to automatically decide whether or not a program
will generate any output at all.

Of course, some cases are easy (in both directions) and some special
cases are solvable.  But the general problem of algorithmically analyzing
algorithms is known to be impossible for all questions that don't
admit of a trivial solution.
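The impossibility argument can be sketched in a few lines of Python.
The `halts` oracle below is hypothetical (it is exactly the thing that
cannot be written); the point is the diagonal trick, not the code:

```python
def halts(program, argument):
    """Hypothetical halting oracle: True iff program(argument) would
    terminate.  Provably impossible to implement in general."""
    raise NotImplementedError

def paradox(program):
    """Do the opposite of whatever the oracle predicts about a
    program run on its own text."""
    if halts(program, program):
        while True:      # oracle said "halts" -> loop forever
            pass
    # oracle said "loops forever" -> halt immediately

# If halts(paradox, paradox) returned True, paradox(paradox) would
# loop; if it returned False, paradox(paradox) would halt.  Either
# answer is wrong, so no such oracle can exist -- and any general
# "does this algorithm have property X?" analyzer inherits the same
# barrier (Rice's theorem).
```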

        -kitten

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Tue, 21 Sep 1999 14:33:11 GMT

In article <Ox#UQI5A$GA.316@cpmsnbbsa03>,
  "Joseph Ashwood" <[EMAIL PROTECTED]> wrote:
> Just a small addendum.
>
> I will grant that ECC is stronger today than RSA is

Yes and no.  ECC is stronger if one ONLY considers CPU time as the
binding constraint. However, breaking RSA requires large SPACE, which
breaking ECC does not.

> , and I will grant that certain proofs regarding the relative ease
> of break between factoring and finite fields exist, or can be done.

Huh?   May I suggest that you proofread your posts?  This last
sentence  "Isn't English".  Please re-phrase.

And no such proofs are known (if I understand what you mean).  There
are no proofs on the lower bound complexity of either problem, or
on their relative difficulty. Known algorithms give upper bounds, but
there is no proof that these are the best possible methods.


>But I find no reason not to remain
> convinced that eventually factoring could very easily become
> sub-exponential

You are beginning to sound like a crank who hasn't done his/her
homework.  Factoring is *already* sub-exponential.

If you mean ECDL rather than factoring, might I ask: how much have you
studied the subject?  What qualifies you to have an opinion one way
or the other?


> , and that by relation finite fields could also easily become
> subexponential, I simply am in no place to speculate on _when_ that
> will occur.

Again you need to edit your posts.  What do you mean "by relation
finite fields...."  This isn't English.

What do you mean by the phrase "by relation"?

And DL over finite fields is *already* sub-exponential.

You really need to be careful about what you say or no one will
take you seriously.  Based *only* upon what you have written it
appears that you don't know anything about this subject or what you
are talking about.


--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: [EMAIL PROTECTED] (Frank Gifford)
Crossposted-To: talk.politics.crypto
Subject: Re: Second "_NSAKey"
Date: 21 Sep 1999 10:58:31 -0400

In article <7s72ht$ldm$[EMAIL PROTECTED]>, Greg  <[EMAIL PROTECTED]> wrote:
>
>> (Simple technique: NOP out the call that initializes the RNG in the
>> application of interest.  No "_NSAKEY" needed.  Ian Goldberg and I
>> implemented this for Netscape several years ago, and it was a trivial
>> 4-byte patch to the binary.)
>>
>> Personally, I don't find your scenario a terribly plausible
>> explanation for the existence of the "_NSAKEY".
>
>Nor does yours make any sense.  Both of these approaches would
>leave the cipher text unreadable at the other end.  The CPDLL
>would have to function normally, but the plain text is syphoned
>off and sent to a covert site.  This is the WHY you are looking for.

Actually, instead of sending extra bits out, it would be a bit easier to
cripple the session key generator.  That way, if you examine the output
stream, or encrypt and transfer files around, there aren't extra
bits which stand out.

I could believe that the key bits could be held somewhere else and then
piggybacked over to, say, someone's web page when you visited the web - but
that would require a fairly robust system which could tolerate things
like power failures and daylight savings time...    :)

-Giff

-- 
Too busy for a .sig

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Yarrow: a problem -- am I imagining it?
Date: Tue, 21 Sep 1999 11:01:47 -0400

>
>
> 1. Yes, that's true.
> 2. So what?
>

I guess my question would be: "what is Yarrow supposed to be?"

Is it supposed to be a PRNG?  If so, then there is a big problem: it
would be claiming to be something that it's not!  Or is it just a
randomness gatherer, by which I mean a device that is supposed to give
something like an RNG, which is very different from a PRNG.  With a
PRNG, you start with a random string and you stretch it to get a
larger random string (at a lesser cost than directly picking a random
string of that larger size); a randomness gatherer (I don't know what
better to call it, an entropy collector?) is used to provide that
initial random string (you may run a PRNG over it afterwards, or use
it directly to create crypto keys).

So the above question, I feel, is what is important!
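To make the PRNG/gatherer distinction concrete, a toy sketch (SHA-1 in
counter mode here is purely illustrative, not Yarrow's actual
construction; `os.urandom` stands in for the entropy gatherer):

```python
import hashlib
import os

def stretch(seed: bytes, nbytes: int) -> bytes:
    """Toy PRNG: deterministically expand a short random seed into a
    longer pseudorandom string by hashing seed || counter."""
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha1(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:nbytes]

# The "randomness gatherer" supplies the initial seed; the PRNG only
# stretches it.  All the real entropy comes from the gatherer.
seed = os.urandom(20)
keystream = stretch(seed, 64)
```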






------------------------------

From: Peter Gunn <[EMAIL PROTECTED]>
Subject: key exchanges (again ;-)
Date: Tue, 21 Sep 1999 16:03:13 +0100

Hi there,

my last attempt at an authenticating key exchange 'SNAKE' involved
an array of large safe primes indexed by values generated from
the shared weak secret.

(X,Y,R,S all large random ints, g is generator (prolly 3 or 4),
 m[] is array of large safe primes).

A->B: (g^X)%m[P], R
B->A: (g^Y)%m[P], S
A->B: E((g^XY)%m[P])[S]
B->A: E((g^XY)%m[P])[R]

Now, I was thinking about possible ways of reducing the storage space
required for a usable array of large safe primes.

My first idea is simply to store a table of prime gaps... 3 bytes is
enough to store the gap between two large safe 1024-bit primes, so a
table of, say, 10000 values would require ~30K to store. Seems workable,
though it will take a while to generate/check all those primes :-)
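The gap-table idea can be sketched at toy scale (trial division and
small safe primes stand in for real 1024-bit safe-prime generation;
the names are illustrative):

```python
def is_prime(n):
    """Trial division; fine for toy sizes only."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Safe primes p (p and (p-1)/2 both prime), toy-sized.
primes = [p for p in range(5, 2000)
          if is_prime(p) and is_prime((p - 1) // 2)]

base = primes[0]
gaps = [b - a for a, b in zip(primes, primes[1:])]  # store only these

def nth_prime(k):
    """Rebuild the k-th safe prime from the base plus summed gaps."""
    return base + sum(gaps[:k])

assert nth_prime(len(primes) - 1) == primes[-1]
```

At real sizes only the gaps need storing, each in a few bytes, and
every reconstructed value can be re-verified as a safe prime on load.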

My next idea is to have multiple concurrent DH exchanges
with a smaller modulus...

(basically split P={P1,P2,...}, X={X1,X2,...},Y={Y1,Y2,...})

A->B: (g^X1)%m[P1],(g^X2)%m[P2],..., R
B->A: (g^Y1)%m[P1],(g^Y2)%m[P2],..., S

shared key K=H((g^X1Y1)%m[P1],(g^X2Y2)%m[P2])

A->B: E(K)[S]     /** S encrypted with K **/
B->A: E(K)[R]     /** R encrypted with K **/

so, with an array of, say, 1024 large safe primes of 512 bits each,
I could combine 3 DH key exchanges to get the shared key K, giving
up to 2^30 possible values for P.

Multiple smaller modexps would seem faster to execute
than a single larger one.... seems too good to be true :-)
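A runnable toy version of the combined exchange (tiny moduli stand in
for the 512-bit safe primes, the R/S confirmation step is omitted, and
all names are illustrative):

```python
import hashlib
import secrets

# Toy moduli standing in for the m[] array of safe primes; g is the
# generator (real parameters would be 512-bit safe primes).
m = [23, 47, 59]
g = 5

def exchange(indices):
    """Run one toy DH exchange per selected modulus and hash the
    shared secrets together into a single key K."""
    shared = []
    for i in indices:
        p = m[i]
        x = secrets.randbelow(p - 2) + 1   # A's secret exponent
        y = secrets.randbelow(p - 2) + 1   # B's secret exponent
        A = pow(g, x, p)                   # A -> B: (g^X)%m[P]
        B = pow(g, y, p)                   # B -> A: (g^Y)%m[P]
        kA = pow(B, x, p)                  # A computes (g^XY)%m[P]
        kB = pow(A, y, p)                  # B computes the same value
        assert kA == kB
        shared.append(kA)
    blob = b"".join(s.to_bytes(8, "big") for s in shared)
    return hashlib.sha1(blob).digest()     # K = H(s1, s2, ...)

K = exchange([0, 1, 2])
```

One caveat worth flagging on the "too good to be true" feeling: an
attacker can attack each small exchange independently, so k exchanges
with n-bit moduli cost at most about k times an n-bit discrete log,
nowhere near the cost of a single kn-bit one.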

Comments please!

ttfn

PG.



------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Yarrow: a problem -- am I imagining it?
Date: Tue, 21 Sep 1999 10:54:48 -0400

>

SHA_1 is just a deterministic function; it would not help to
avoid the cycle.

You don't need to compute the inverse to use the cycle property...







------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: AES finalists AMD Athlon performance?
Date: Tue, 21 Sep 1999 14:42:12 GMT

In article <7s7si0$6ug$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:

> The only thing I can say is that ciphers like RC6 and MARS will
> receive a significant boost.  This is because the Athlon has a faster
> MUL instruction.  Rumor has it that it will only take 1 clock cycle.
> I'm not 100% sure on this, but I do know it is a bit faster.
> Otherwise, I'd say you'd receive your basic speed boost from the
> higher megahertz.

Well, the K6-2 for example has lower latency and faster instructions.
The multiplication is about 2 cycles, and most other instructions are 1
cycle (like the FPU, MMX and 3DNow! instructions) (plus latency).  My
K6-2 400MHz runs one particular test about 5 cycles faster per iteration
than my MII with the same code; it also runs about 6 times faster
overall (the MII ran at 233MHz).  This is a direct result of the better
architecture.

I would expect that most ciphers match or outperform regular Pentiums
on a K7 or K6-2.  The Cyrix chips however are on a par or slower
(but are much cheaper :) ).

I wonder why NIST has only considered the x86 for desktop testing?  I
personally would like to see a more objective test bed...

Tom


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: Okay "experts," how do you do it?
Date: 20 Sep 1999 11:00:29 -0400

In article <7s1s2d$6pn$[EMAIL PROTECTED]>,
David A Molnar  <[EMAIL PROTECTED]> wrote:
>jerome <[EMAIL PROTECTED]> wrote:
>
>> obviously maybe you are well known because you are good and the referee
>> may not be influenced by the reputation... hard to say.
>
>Aren't papers supposed to be refereed without knowledge of the authors'
>names?


Depends on the conference or journal.  And, of course, if you're sufficiently
paranoid, you just *MIGHT* feel that the reason your semicoherent paper
entitled Y SCOTT31.Q ROX AND AL OTHER CIPHERS BLO CHUNCZ was rejected
was because they figured out who wrote it and block-reject anything you
write.

        -kitten

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: argument against randomness
Reply-To: [EMAIL PROTECTED]
Date: Mon, 20 Sep 1999 15:10:03 GMT

Jerry Coffin <[EMAIL PROTECTED]> wrote:

: Now, that might imply some randomness somewhere behind it though, like 
: the randomness of radioactive decay being used to drive the most 
: accurate clocks we've invented...

I thought atomic clocks depended on vibrations in crystalline lattices
rather than radioactive decay.

What are you talking about?
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Sente, dame, dame, sabaki, komi, tesuji, atari.

------------------------------

From: Dave Smith <[EMAIL PROTECTED]>
Subject: Simple analytical tools
Date: Tue, 21 Sep 1999 16:50:33 +0100

OK, I read the (10) FAQs, but I can't believe that it is going to be
necessary for me to write my own set of simple tools to carry out
frequency analysis, coincidence analysis, shift/XORing, etc.  There
doesn't seem to be a mention of these in the FAQs, although admittedly
one of the web sites quoted wasn't working when I tried.

Surely such tools have been written hundreds of times already and should
be dumped somewhere for FTPing? Or am I wrong?

Oh well! Here goes with the coding.......
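For what it's worth, the basic tools are only a few lines each in a
modern scripting language; a minimal sketch (names are illustrative):

```python
from collections import Counter

def frequencies(text: str) -> Counter:
    """Letter-frequency count, the starting point for classical
    analysis of a monoalphabetic cipher."""
    return Counter(c for c in text.upper() if c.isalpha())

def index_of_coincidence(text: str) -> float:
    """Friedman's IC: about 0.066 for English plaintext, about 0.038
    for uniformly random letters."""
    counts = frequencies(text)
    n = sum(counts.values())
    if n < 2:
        return 0.0
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

def xor_shift(data: bytes, shift: int) -> bytes:
    """XOR a byte string with itself shifted -- a quick test for
    key-reuse / running-key depth."""
    return bytes(a ^ b for a, b in zip(data, data[shift:]))
```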
=============================================================================
Dave Smith         DAVE'S DISK DOCTOR SERVICE Ltd.       tel: +44 1892 835974
E:Mail: [EMAIL PROTECTED]         WEB Site: http://www.diskdoctor.co.uk/ 
LocoScript, PCW, CP/M, PC & MAC floppy disk salvages, transfers & conversions
All profits covenanted to charity. Nearly 160,000 GBPounds raised since 1989!

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: key exchanges (again ;-)
Date: Tue, 21 Sep 1999 11:18:08 -0400


I don't fully understand what your goal is.  It seems like you
are just trying to complicate the DH key exchange.
What is your goal?

as



------------------------------

From: [EMAIL PROTECTED] (John Bailey)
Subject: Re: RSA encryption algorithm.
Date: Tue, 21 Sep 1999 15:23:59 GMT

On 21 Sep 1999 05:57:31 GMT, Jeremy Botha <[EMAIL PROTECTED]>
wrote:

>Hi all.
>
>I'm currently trying to code a single-byte-at-a-time RSA encryptor /
>decryptor.  
(snip)
>Has anyone here tried anything in a similar vein?  If so, I'd appreciate
>some pointers on what I can do to stop the program misbehaving.  

http://www.frontiernet.net/~jmb184/interests/cryptography/old_trunk/
is a record of an attempt to do a similar thing.  It's C source code.
If you can follow its model, there is no reason it could not be
extended to indefinite lengths.  It uses crude but easy-to-understand
methods for inversion, exponentiation, etc.

John

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Tue, 21 Sep 1999 13:44:24 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (DJohn37050) wrote:
> Factoring and finite field discrete log are ALREADY subexponential.
> Do you perhaps mean Polynomial?
>
> ECC known best general method is fully exponential.  For black box
> groups, it has been proven (whatever that means) that solving is
> exponential.

A Black Box group is one which uses only string encodings of its
elements and uses NO arithmetic properties of the group other than
the fact that the elements form a group. It assumes that you need
an oracle to tell if  a = b * c,  given elements a,b,c.



> So it seems that ANY progress on attacking ECC will be based on
> structure of the group.

Correct.  Of course the group does have a fair amount of structure.
It is just that no one has found a way to exploit it.  It might indeed
be the case that there is no way. But no one really knows.


--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Tue, 21 Sep 1999 13:40:49 GMT

In article <Ox#UQI5A$GA.316@cpmsnbbsa03>,
  "Joseph Ashwood" <[EMAIL PROTECTED]> wrote:
> Just a small addendum.
>
> I will grant that ECC is stronger today than RSA is, and I will grant
> that certain proofs regarding the relative ease of break between
> factoring and finite fields exist, or can be done. But I find no
> reason not to remain convinced that eventually factoring could very
> easily become sub-exponential, and that by relation finite fields
> could also easily become subexponential

Uh,  There is some confusion here.

(1) Factoring is *already* sub-exponential.

(2) What do you mean by "finite fields could .... subexponential"?
Do you mean discrete logs over finite fields?  If so,  there is more
confusion, because they too have a sub-exponential time algorithm.

It is Discrete Logs on Elliptic Curves which currently lack a
sub-exponential time method.
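The distinction is easy to see numerically.  A back-of-envelope sketch
(the NFS constant is the commonly quoted heuristic one; this is an
asymptotic comparison, not a security analysis):

```python
from math import log

def ln_nfs_work(bits):
    """Natural log of the heuristic NFS cost L_N[1/3, (64/9)^(1/3)]
    for factoring a `bits`-bit modulus -- sub-exponential in log N."""
    lnN = bits * log(2)
    c = (64 / 9) ** (1 / 3)
    return c * lnN ** (1 / 3) * log(lnN) ** (2 / 3)

def ln_ecdl_work(bits):
    """Natural log of the generic O(sqrt(N)) cost of discrete log in
    an elliptic-curve group of order about 2**bits -- fully
    exponential."""
    return (bits / 2) * log(2)

# Express both as "bits of work":
rsa_1024 = ln_nfs_work(1024) / log(2)   # roughly 87
ec_160 = ln_ecdl_work(160) / log(2)     # exactly 80
```

This is why a 160-bit curve group is usually put alongside a 1024-bit
RSA modulus: sub-exponential growth in the modulus size versus square
root of the group order.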


> Of course you should also consider that this is coming from someone
> who remains convinced of NP-completeness

Huh?  What do you mean by "convinced of NP-Completeness".

NP-Completeness of WHAT???



Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: key exchanges (again ;-)
Date: Tue, 21 Sep 1999 11:44:24 -0400

> .
>
> EKE does this by encrypting the g^X and g^Y values,
> SPEKE uses a generator based on P (the weak shared
> secret), etc. I thought I'd try and create my own version.
>

sorry for deviating from the initial conversation, but are
there any refs concerning the description of EKE and
SPEKE.?

thanks



------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Tue, 21 Sep 1999 14:21:33 GMT

In article <[EMAIL PROTECTED]>,
  Medical Electronics Lab <[EMAIL PROTECTED]> wrote:

<snip>


> He may very well be correct.  But let's face it, after 200 years
> of looking, nobody has *found* a subexponential algorithm!

 It has only been in the last 10-12 years that mathematicians have
really looked at the problem. Not 200.   Please cite work on the
ECDL problem prior to (say) 1987. I'm  not flaming here.  I would
really like to know of such work.

Indeed, it has only been about 25 years since the concepts of
complexity were formalized. (mid-70s: Karp, Cook, et al.)


> Some of the smartest mathematicians in the world today are thinking
> about this problem, and they haven't found one yet either.

Actually, this isn't the whole truth.  Some of the best experts on
Elliptic Curves have indeed looked at the problem (Frey, Silverman,
Paulus, to name 3).  But others (Mazur, Wiles, Ribet, Katz, Deligne,
Tunnell, etc.)  have not *seemingly* been involved in the
algorithmic aspects of ECDL.  In fact, it seems that most of the
experts on the very deep aspects of the theory of EC have NOT been
involved in the algorithmic aspects. H. Lenstra has not said much
about what he has done either.

If someone believes I am wrong here, please correct me.  However,
I have talked directly with most of the people I cited and they
confirm what I have written.


> So what's the level of neglect?  If we spent US$10e9/year do you
> think we'd find a subexponential algorithm?
>
> When Adleman helped start the RSA method, the present factoring
> schemes did not exist.

Sort of. A sub-exponential time algorithm (The continued fraction
algorithm) was known at the time.  Both QS and NFS are based on CFRAC.
Yes, it is not as efficient as QS or NFS,  but the underlying
computations are the same for all 3. This fact is often overlooked.
Indeed, about 17-18 years ago, Sam Wagstaff Jr.  was designing/building
a machine to execute CFRAC.  It actually did produce some results.

Let me also quote John Brillhart: "Factoring used to be the hallmark
of an eccentric".

Until RSA, no one except a few (Fermat, Brillhart, Morrison, Pollard,
Lehmer, Shanks) had ever looked at inventing a factoring algorithm.



>  RSA was far more secure *algorithmicly* 20 years ago
> than it is today.

So was most everything else. And I dispute the "far" adjective.
CFRAC was sub-exponential even then.


>  ECC is more secure *today* than RSA was 20 years
> ago comparing them algorithmicly.

Mostly true.  20 years ago a sub-exponential time method was known
for factoring. No such method is known for EC today.  But no one
had looked at ECDL until 10-12 years ago, while factoring had
been looked at for a long time.

>
> So what's the problem?  History and ego mostly.  You can't blame a
> guy who's developed one of the most widely used crypto methods for
> attempting to stomp on the "newcomer".  It makes his stuff "obsolete".

I agree with the first part, but please explain your reasoning behind
the last sentence.  What makes RSA obsolete relative to ECDL?

ECDL does indeed currently allow smaller key sizes for "equivalent"
security (but breaking RSA has a constraint that breaking ECDL
does not: SPACE.  One has to be careful in comparing them).  ECDL
does do faster signatures; RSA does do faster signature verification.


> One of these days ECC will have competition in PK that will blow it
> away because a sub exponential algorithm will have been found.

No.  It will not be blown away, any more than NFS has blown away RSA.
It will simply require larger key sizes.  This will diminish its
advantage relative to RSA,  but it will still (probably) be a viable
alternative.

> for the next 10 to 20 years it will be the more important PK system
> since it uses fewer resources to encrypt and forces the attacker to
> use more resources to break than any other PK system.  Until a
> sub exponential algorithm *is* found, why not use it?

Actually,  one doesn't need sub-exponential.  An O(N^1/4) method
will have a deleterious effect as well.

One reason it is not as widely used is the expertise required to
implement it well.  It requires considerable knowledge in both math
and CS  to do it right.

>
> As attacks continue on RSA and factoring, it continuously gets
> algorithmicly "weaker".

No new attacks have been found in 10 years. It is just that we
have learned the techniques to make NFS efficient during that time.


>  There are attacks on ECC which have made
> it "weaker" in the same sense, the algorithms require slightly fewer
> resources today than they did 3 years ago to crack the same problem.

These really are not new attacks.  The algorithm is still O(N^1/2),
but the constant has been diminished.  The latter really only helps
a little.  The same is true of NFS.

> Now, let's add one more silly conjecture.  Suppose that in the
> next 5 years a really smart mathematician *proves* that no sub-
> exponential algorithm can be found for ECC.

This indeed would be a spectacular result!  However,  even an
O(N^1/4) method (still exponential) would partly remove its relative
advantage to RSA.  Such a method would double the required key-sizes.
And at 320 bits instead of 160,  EC now becomes very slightly slower
than RSA for signatures.  (But still requires less space for the
key)
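The key-size arithmetic behind that doubling, as a one-function sketch
(the names are illustrative): a generic O(N^1/2) attack on a 160-bit
group costs ~2^80 work, an O(N^1/4) attack would cut that to ~2^40, so
restoring 2^80 work means quadrupling N, i.e. doubling the bit length.

```python
def required_bits(security_bits, attack_exponent):
    """Group size in bits needed so that an attack costing
    N**attack_exponent still takes 2**security_bits work."""
    return int(security_bits / attack_exponent)

assert required_bits(80, 0.5) == 160    # generic sqrt attack today
assert required_bits(80, 0.25) == 320   # hypothetical N^(1/4) attack
```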

I regard such a proof as unlikely.  There really aren't ANY
significant lower bound results known today on ANY meaningful problem.
Indeed,  even the true lower complexity of multiplication isn't known
today.  We know pointer machines can multiply in linear time, and that
ordinary finite-state machines can do it in O(N log N log log N), but
we don't know if this is minimal.  (see Knuth for a discussion)

> How many companies will be kicking themselves for not switching
> sooner?

Availability of software and expertise for writing it is a
bottleneck.


--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: Peter Gunn <[EMAIL PROTECTED]>
Subject: Re: key exchanges (again ;-)
Date: Tue, 21 Sep 1999 16:50:01 +0100

Anton Stiglic wrote:

> > .
> >
> > EKE does this by encrypting the g^X and g^Y values,
> > SPEKE uses a generator based on P (the weak shared
> > secret), etc. I thought I'd try and create my own version.
> >
>
> sorry for deviating from the initial conversation, but are
> there any refs concerning the description of EKE and
> SPEKE.?

SPEKE info is at http://www.integritysciences.com/
SRP-3 info is at http://srp.stanford.edu/srp/

Don't have a good one for EKE, but D. Jablon's
site (the SPEKE one) has lots of info :-)

ttfn

PG.




------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Okay "experts," how do you do it?
Date: Tue, 21 Sep 1999 15:40:34 GMT

Anton Stiglic wrote:
> ... There is no way to say that we might be
> able to proove that FACTORING is difficult or not?

In one sense, factoring is easy.  You can try all possible
factors in sequence, or you can guess at random, until the
factors are found.
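"Easy" in that sense is just trial division; the catch is the amount
of work, not the idea.  A minimal sketch:

```python
def trial_factor(n):
    """Return the smallest nontrivial factor of n by trying candidate
    divisors in sequence.  Always succeeds, but can take up to about
    sqrt(n) steps -- hopeless for cryptographic-size moduli."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime
```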

The real issue for cryptosystem strength is the minimum
amount of computation *required* to solve an arbitrary
instance of the problem (as encountered in practice), to
attain a specified degree of likelihood of solving the problem.
Unfortunately, statistical tools tend to find the *average*
amount of work, not the *minimum required* amount of work.
And most "difficulty" calculations find the amount of work
in the *worst* case (for the analyst), not the *best* case.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: argument against randomness
Date: Tue, 21 Sep 1999 15:47:57 GMT

Tim Tyler wrote:
> I thought atomic clocks depended on vibrations in crystalline lattices
> rather than radioactive decay.

No.  Those are "crystal" oscillators.

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: some information theory
Reply-To: [EMAIL PROTECTED]
Date: Mon, 20 Sep 1999 16:07:06 GMT

SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:

: One should use compression that does not add information to the message
: that an attacker could use. One way to guarantee no information is added
: is to use a compression/decompression method that is "one to one".
: As I have stated many times it is easy to check for this property.
: And I have at my site the ONLY compression method that
: uses this method. But I am looking for others. IF you know of ANY
: let me know. [...]

I have looked for other such compressors... but without success.

I don't know if you're the first person to identify the need for
such a compression program, and to actually produce one, but
congratulations anyway ;-)
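The property being checked here is bijectivity: decompress(compress(x))
== x for every x *and* compress(decompress(y)) == y for every y.  A toy
stand-in "compressor" to illustrate the check (it is hypothetical, just
a reversible byte transform, and compresses nothing):

```python
def compress(data: bytes) -> bytes:
    # Toy bijection on byte strings; a real candidate goes here.
    return bytes(b ^ 0x55 for b in data)

def decompress(data: bytes) -> bytes:
    return bytes(b ^ 0x55 for b in data)

def is_one_to_one(samples) -> bool:
    """Check the round trip in BOTH directions on sample strings; a
    compressor that only survives one direction can add information
    an attacker could use."""
    return all(decompress(compress(s)) == s and
               compress(decompress(s)) == s for s in samples)

assert is_one_to_one([b"", b"hello", bytes(range(256))])
```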
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

crypt: home for the dead.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
