Cryptography-Digest Digest #537, Volume #9       Wed, 12 May 99 17:13:03 EDT

Contents:
  Re: Time stamping (complete) (Helger Lipmaa)
  a source for random number generation (Patricia Gibbons)
  Re: [Q] Are all encryption algorithms based on primes? (DJohn37050)
  Re: Random permutation ([EMAIL PROTECTED])
  Re: Crypto export limits ruled unconstitutional (Jerry Coffin)
  Re: [Q] Are all encryption algorithms based on primes? (Jerry Coffin)
  Re: Newbie:  Encrypting short strings (Matthias Bruestle)
  Re: True Randomness & The Law Of Large Numbers ([EMAIL PROTECTED])
  Layered Design (John Savard)
  Re: Crypto export limits ruled unconstitutional ([EMAIL PROTECTED])
  Re: Crypto export limits ruled unconstitutional (Serge Paccalin)
  An apology to Tom St. Denis; also, Terry Ritter is not very bright 
([EMAIL PROTECTED])
  DES analysis paper (Jayant Shukla)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Helger Lipmaa)
Subject: Re: Time stamping (complete)
Date: 12 May 1999 19:16:47 GMT

Jean Marc Dieu ([EMAIL PROTECTED]) wrote:
: Of course you need a third trusted party, whose clock is the one that will
: time stamp your document. My question in fact was: can you simultaneously
: time-stamp AND sign a document through a one-way hash function, using a TTP?

: Have a look at http://www.surety.com

The problem of connecting a digital document with an absolute time of its
creation is unsolvable for several reasons (starting with physics). All you
can do is get some relative stamping, where any issued time certificate
is provably one-way dependent (in one direction or the other) on every
other certificate. Haber & Stornetta pioneered this work, and it has been
continued by several working groups, including ours. (I guess this was more
an answer to another guy.)

For more you probably have to reword your question... like what do you mean
by time-stamping AND signing?

and have a look at http://home.cyber.ee/helger/crypto/link/timestamp.html
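The relative (linked) stamping idea above can be sketched in a few lines. This is a toy illustration only, not Surety's or Haber & Stornetta's actual protocol; the function name `link_certificate`, the genesis value, and the use of SHA-256 are all inventions of this sketch:

```python
import hashlib

def link_certificate(prev_cert, doc_hash):
    """Issue a toy 'relative' time certificate: each certificate is
    one-way dependent on the previous one, so the relative order of
    documents is provable without trusting any absolute clock."""
    return hashlib.sha256(prev_cert + doc_hash).digest()

# Build a chain over three documents, in stamping order.
certs = []
prev = b"\x00" * 32  # arbitrary genesis value for the sketch
for doc in (b"doc A", b"doc B", b"doc C"):
    prev = link_certificate(prev, hashlib.sha256(doc).digest())
    certs.append(prev)

# Anyone holding the documents can recompute the chain and verify
# that doc B was stamped after doc A and before doc C.
```

The point is that forging an earlier timestamp would require inverting the hash chain, which is exactly the one-way dependence described above.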

Helger
http://home.cyber.ee/helger

----

:> At 18:33 10/05/99 -0500, you wrote:
:> You could sign the document with a date stamp, hmm.. but then there's a
:> problem I want to know about.
:>
:> Where can you get such a time stamp?  I mean, looking at your computer and
:> it says 6:17pm then signing the message, how does the receiver KNOW it was
:> signed at 6:17pm?  Are there any time stamp authorities?

------------------------------

From: Patricia Gibbons <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp
Subject: a source for random number generation
Date: Wed, 12 May 1999 13:16:17 -0700
Reply-To: [EMAIL PROTECTED]

I've found some info on random number
generation at "Aware Electronics"
website, and thought this may be of
interest to readers. This is a 
radiation-decay method that outputs
either hex or ascii random text to 
a filename or to a printer:   

<http://www.aw-el.com/random.htm>
-- 

Patricia E. Gibbons
Acting Chief Communications Technician
City of San Jose - ITD/communications
.......................................
My Public Key is available at: 
<http://pgp5.ai.mit.edu/pks-commands.html>
Key ID: 0xEDECB44F
This key is RSA, NOT Diffie-Hellman !!

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: [Q] Are all encryption algorithms based on primes?
Date: 12 May 1999 19:50:47 GMT

ANSI X9.63 draft contains ECAES (Elliptic Curve Augmented Encryption Scheme),
based on the work of Bellare and Rogaway; it is not based on the difficulty
of factoring, but rather on the ECDLP.  This is a variant of ElGamal
encryption.
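For readers who haven't seen it, plain (multiplicative-group) ElGamal can be sketched as below; ECAES works over an elliptic-curve group instead, and the prime and generator here are illustrative values far too small for real security:

```python
import secrets

# Toy ElGamal over GF(p). p = 2^64 - 59 is prime, but 64 bits is
# only for illustration; g = 5 is an arbitrary group element.
p = 0xFFFFFFFFFFFFFFC5
g = 5

# Key generation: private x, public y = g^x mod p.
x = secrets.randbelow(p - 2) + 1
y = pow(g, x, p)

def encrypt(m, y):
    """Encrypt m (0 < m < p) under public key y."""
    k = secrets.randbelow(p - 2) + 1       # fresh ephemeral exponent
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(c1, c2, x):
    """Recover m: shared secret s = c1^x = y^k, then divide it out."""
    s = pow(c1, x, p)
    return (c2 * pow(s, p - 2, p)) % p     # Fermat inverse, p prime
```

The elliptic-curve variant replaces modular exponentiation with scalar multiplication of curve points; its security then rests on the ECDLP rather than discrete logs mod p.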
Don Johnson

------------------------------

Date: Wed, 12 May 1999 16:32:11 -0400
From: [EMAIL PROTECTED]
Subject: Re: Random permutation

Bryan Olson wrote:
> 
> This isn't going to be threaded in the right place.  My internet
> provider doesn't get all the posts, and Dejanews hasn't worked since
> they revised it a few days ago.
> 
> [EMAIL PROTECTED] wrote:
> > [EMAIL PROTECTED] wrote:
> 
> > > If efficient means we want to use the fewest calls to our
> > > random number generator, use the same idea as we used to generate
> > > a uniform choice in [0..n) given a uniform and independent bit
> > > source.  Suppose rand16() returns a random choice from [0..15].
> > >
> > >     endRange = 1
> > >     randInRange = 0
> > >     while true
> > >         while endRange < 16!
> > >             endRange = endRange * 16
> > >             randInRange = randInRange * 16 + rand16()
> > >         if randInRange < 16!
> > >             return randInRange
> > >         else
> > >             endRange = endRange - 16!
> > >             randInRange = randInRange - 16!
> > >
> > > This returns a uniform choice from [0, 16!-1], which
> > > we can map one-to-one to the possible permutations.  It's
> > > optimal in the expected number of calls to rand16().
> >
> > I question the characterization of this approach as efficient.  It
> > requires use of numbers well over 32-bits in length due to the
> > computation of 16 factorial.
> 
> True, but be fair.  I stated exactly in what sense it is efficient:
> it uses the fewest possible (average case) calls to rand16().
> 
> > It will work on a 64-bit machine, but not
> > on a 32-bit machine.  And even on a 64-bit machine it does not scale
> > well at all.
> 
> That's just silly - of course it works on 32-bit machines.  It
> just requires multiple precision arithmetic.  As for how it scales,
> a straightforward implementation takes time proportional to n^2 lg n
> where n is the degree of the permutation.

This algorithm is quite similar to the single-bit routine suggested by
hrubin.  So many of the comments in that thread apply here.

While the utilization of the fundamental source of random numbers looks
to be minimized, the overall algorithm suffers horribly as N increases.
While the time scales as you described, the space requirement also grows
with N: just holding N! takes on the order of N lg N bits.  (Wanted to
use an exclamation but it looked too much like factorial notation ;-)

Since the original post was aimed at efficiently selecting from the
range 0..N-1 in order to implement a classic in-place shuffle, it is
appropriate to consider avoiding multi-precision operations.

------------------------------

From: [EMAIL PROTECTED] (Jerry Coffin)
Crossposted-To: talk.politics.crypto
Subject: Re: Crypto export limits ruled unconstitutional
Date: Wed, 12 May 1999 14:38:32 -0600

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] says...

[ ... ] 

> Nobody who's read the tortuous (and torturous and potentially tortious)
> language used to translate algorithms into Patentese can doubt that
> some legal firms have a program to do this.

Having been involved, yes, I can.

> It's for a different
> purpose, allowing the Patent Office to pretend to believe that it
> describes a machine or process rather than an algorithm (which is
> not patentable),

Not true.  You might want to read _The Patent Wars_ for information 
about the patentability of algorithms, and the point at which it was 
ruled that there is no way to meaningfully separate an algorithm for 
a process from a machine carrying out that process.

Note that patent claims fall into a number of categories, some of 
which have specific restrictions on how they're applied...

> but the net effect is the same.  A case in point:
>  
>   13. The compression apparatus of claim 12 in which said transfer
>   means comprises means for transferring said data character signal
>   from said character register means to the lower significant positions
>   of said code register means, and  means for zeroing the remaining
>   high order positions of said code register means. 
> 
> I'm not making this up.

To give an example of what I was saying above, if this was changed 
from "compression apparatus" to "compression means", it might easily 
have _entirely_ different implications as to what it would or would 
not cover.  It's a bit hard to say in this case, since the rest of the 
claim uses "means for" often enough that it might still be considered 
a "means plus function" claim even though the overall statement is 
theoretically about an "apparatus" instead of a "means".

Despite appearing to be the most widely licensed patent in US history, 
this particular patent has never actually been tried in court.  It's 
getting close enough to expiration now that it looks like it probably 
never will be either.

Oh, it should also be pointed out that the example above is only one 
claim -- this particular patent has 180 MORE, most of them similarly 
written, and many of them considerably longer and more complex.  One 
thing to keep in mind is that no matter how complex a thing is being 
described, a patent claim is required to be written as a _single_ 
sentence.  It's truly amazing to read a single sentence that goes on 
for more than a page.  No, I'm not kidding either -- I _have_ seen 
such things.

------------------------------

From: [EMAIL PROTECTED] (Jerry Coffin)
Subject: Re: [Q] Are all encryption algorithms based on primes?
Date: Wed, 12 May 1999 14:38:30 -0600

In article <cbk_2.5740$[EMAIL PROTECTED]>, 
[EMAIL PROTECTED] says...
>   Are all encryption algorithms based on the fact that it is difficult to
> factor a number which is the product of two LARGE primes?

No.  RSA encryption is based on the difficulty of factoring the 
product of two large primes.  Other forms of public-key cryptography 
depend on discrete logarithms and elliptic curves.

Symmetric cryptography doesn't normally use anything even vaguely 
similar.

>   There other ways of encrypting data effectively WITHOUT using this prime
> technique, like XOR-ing the bits and reversing them, etc., but are these as
> effective? Many thanks in advance!

It depends on what you mean by "effective".  Public-key cryptography 
in general has the property of allowing you to publish a key so 
anybody can send you encoded messages, but without the matching 
private key, the messages can't be decoded.

If you don't need that basic property and don't mind anybody who can 
encrypt a message also being able to decrypt it, then you don't need 
to worry about PK cryptography at all.

If a situation does require PK cryptography, there are methods based 
on discrete logarithms and elliptic curves.  There's no certainty 
that easy ways of solving these kinds of problems won't be found, 
either.  At the present time, the most efficient methods of factoring 
large numbers are more or less isomorphic to working with discrete 
logarithms as well, but the versions dealing with discrete logarithms 
are less practical for a given key size.  TTBOMK, ECC requires 
entirely different approaches.

One thing to note is that even if the basic problems involved in ECC 
aren't harder than the other two, there does seem to be more research 
done on factoring than on elliptic curves, so factoring might be more 
likely to have easier solutions sooner.  OTOH, a lone, exceptionally 
brilliant researcher might find a real breakthrough in an area even if 
there are very few others researching the topic at all.

Note also that people have been studying factoring for 
centuries.  Most of the current methods are still based on a method 
discovered by Fermat in the mid-1600's.  The newer methods are clearly 
superior, but the improvements have been far smaller and less common 
than in many other areas that have received (in most cases) far less 
attention.
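Fermat's method, mentioned above, writes an odd n as a difference of squares, n = a^2 - b^2 = (a - b)(a + b). A minimal sketch (the function name is my own; this is the naive form, not the sieve-based descendants used today):

```python
import math

def fermat_factor(n):
    """Factor an odd composite n by Fermat's difference-of-squares
    method: search upward from ceil(sqrt(n)) for a with a*a - n a
    perfect square b*b, giving n = (a - b) * (a + b)."""
    a = math.isqrt(n)
    if a * a < n:
        a += 1
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:
            return a - b, a + b
        a += 1
```

The method is fast only when n's factors are close together; the modern quadratic and number field sieves keep the difference-of-squares idea but find the squares by combining many small relations.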

------------------------------

From: [EMAIL PROTECTED] (Matthias Bruestle)
Subject: Re: Newbie:  Encrypting short strings
Date: Wed, 12 May 1999 17:27:00 GMT

Mahlzeit


Jim Felling ([EMAIL PROTECTED]) wrote:
> A secure hash  will have the properties you look for --just pad with nulls out

> Steve K wrote:
> > I am looking for an algorithm that will be suitable for encrypting (and
> > decrypting) short strings around 3 to 10 characters in length.  I would like

He probably needs to decrypt these numbers.


Mahlzeit

endergone Zwiebeltuete

--
PGP: SIG:C379A331 ENC:F47FA83D      I LOVE MY PDP-11/34A, M70 and MicroVAXII!
-- 
Die Katzen brauchten Jahrtausende zum Domestizieren des Menschen.

------------------------------

Date: Wed, 12 May 1999 16:42:00 -0400
From: [EMAIL PROTECTED]
Subject: Re: True Randomness & The Law Of Large Numbers

Douglas A. Gwyn wrote:
> 
> "R. Knauer" wrote:
> > On Wed, 12 May 1999 06:07:42 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
> > wrote:
> > >Who cares why he says that, it's not relevant.
> > Are you saying that Feller does not know what he is talking about?
> 
> No, I'm saying that you don't.  You keep harping on this one point
> even though it has nothing to do with the test you were criticizing.
> 
> > >The Monobit Test checks the first moment of the distribution,
> > I think you need to demonstrate that formally.
> 
> I don't think so; it's obvious.
> 
> > I would not assume he is a bad salesman until he turned in several
> > poor performances. One bad year does not make a poor salesman.
> 
> To continue the analogy, your company could be out of business by the
> time you demonstrated to your satisfaction that Jones really was not
> performing adequately.
> 
> > >> ... How can Jone's poor earnings be a reflection on the typical
> > >> earnings of the vast majority of salesmen?
> > >Of course, it isn't -- it reflects only on Jones.
> > Then one "abnormal" sequence cannot reflect on the TRNG's
> > performance either.
> 
> That's a nonsequitur.  The analogy is
>         salesman in general ~ RNG in general
>         Jones ~ specific instance of a RNG
> So the abnormal sequence *does* reflect on the specific instance of
> a RNG.  (It says that the specific instance might be malfunctioning.)
> 
> > Nonsense. You can buffer and test - memory and disk space is cheap.
> 
> Not when, as is usually the case, the specific key is not known until
> the time of encryption.
> 
> > >The *reason* the methodology works is because the computed statistic
> > >is essentially an indicator of membership in a *class* of samples,
> > >and those classes have different relative sizes, thus differing
> > >likelihoods.
> > That is an expression of the central fallacy here. You believe that
> > one sample such as 20,000 bits taken by itself can be representative
> > of a complete distribution of samples. That is fallacious, as Feller
> > and others have pointed out.
> 
> What I said is not fallacious; it is central to statistics.
> Your incorrect statement of what I believe is what is fallacious.
> 
> > I have used the term "reasonable certainty". Likelihood is a fiction
> > from probability theory.
> 
> Oh, so now probability theory is fictitious, too?
> It is amazing how much smarter than everyone else you are.
> 
> > The alarms have to be based on something in the real world and not
> > some fallacious notion about what should be happening.
> 
> Of course *nothing* should be based on "some fallacious notion".
> But since there is nothing fallacious about statistical testing,
> this too isn't relevant.
> 
> > Just because a TRNG *should* output sequences with small bias does not
> > mean that any given sequence must be of low bias.
> 
> But it does mean that a very high degree of bias in a sample is
> evidence for the hypothesis that the generator is broken.  If you
> have substantially higher evidence from other sources *at the same
> time* that the generator is *not* broken, then that would offset
> the evidence from sampling.  But that is not the case for the
> FIPS-140 tests.
> 
> > My objection is that you cannot infer the properties of the TRNG from
> > one sequence, which is what the Monobit Test is attempting to do.
> 
> To repeat yet again a point that you continue to ignore, the FIPS-140
> Monobit Test exists only to raise a flag when there is a substantial
> likelihood that the generator has broken down and is not outputting
> a sufficiently secure key stream.  It must raise that flag immediately
> upon detection of the possible fault, with a latency of not more than
> 20,000 bits, that being deemed the maximum amount of broken key stream
> that could be tolerated.  It must not give false alarms more than once
> per 20,000,000,000 key bits, on average.
> 
> If you dispute that the test accomplishes the above, you should
> explain where it fails.

Now I simply have to take exception to that.  You are asking him to do
something that we have ample (overwhelming, even inundating) evidence
he is not capable of performing.

How very unfair!  But conclusive.

> Otherwise, it doesn't much matter whether
> you personally like the test, if it accomplishes what it is meant to.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Layered Design
Date: Wed, 12 May 1999 19:03:44 GMT

David Wagner's _boomerang attack_ inspired me to take another look at
the relationship between the "inner" and "outer" parts of block
ciphers versus rotor machines.

I had thought that the rule might have been the opposite in both
cases, which was confusing, but I saw I was mistaken. Weak in the
middle and strong on the outside is bad in both cases.

(Interesting question: can the boomerang attack lead to a variation of
the isomorph attack applicable to rotor machines in which the entrance
and exit rotors are both fast?)

This has led me to thinking that Quadibloc III could be improved by
changing the order in which its components are used.

First the two whitening layers, as before. Then some GoodStuff rounds
to thoroughly mask everything. Then the Quadibloc II standard rounds.
And the Mishmash rounds are on the inside - as the hardest to analyze,
but also dangerously subject to manipulation.

The idea is that as one proceeds from "outside" to "inside", one
should shift first from rounds that mask the data in the simplest way
to rounds that give diffusion in a straightforward way, and then
change to operations that are harder to analyze, but which could
create openings for analysis if they were on the outside, subject to
manipulation (e.g., data-dependent S-boxes).

Naturally, the designers of MARS may well have already pursued this
line of thinking quite far.

John Savard ( teneerf<- )
http://members.xoom.com/quadibloc/index.html

------------------------------

Date: Wed, 12 May 1999 16:45:06 -0400
From: [EMAIL PROTECTED]
Crossposted-To: talk.politics.crypto
Subject: Re: Crypto export limits ruled unconstitutional

Jim Gillogly wrote:
> 
> Mok-Kong Shen wrote [regarding translating C to English-like text]:
> > Whichever the direction, you have to provide rules of transformation.
> > That isn't apparent. I was asking about the specific software that
> > Bravo claimed to have used.
> 
> Uh, claimed?  No need to be offensive.
> 
> Try http://personal.sip.fi/~lm/c2txt2c/
> 
> Nobody who's read the tortuous (and torturous and potentially tortious)
> language used to translate algorithms into Patentese can doubt that
> some legal firms have a program to do this.  It's for a different
> purpose, allowing the Patent Office to pretend to believe that it
> describes a machine or process rather than an algorithm (which is
> not patentable), but the net effect is the same.  A case in point:
> 
>   13. The compression apparatus of claim 12 in which said transfer
>   means comprises means for transferring said data character signal
>   from said character register means to the lower significant positions
>   of said code register means, and  means for zeroing the remaining
>   high order positions of said code register means.
> 
> I'm not making this up.

It's modularity applied to legalese.

------------------------------

From: [EMAIL PROTECTED] (Serge Paccalin)
Crossposted-To: talk.politics.crypto
Subject: Re: Crypto export limits ruled unconstitutional
Date: Wed, 12 May 1999 22:37:15 +0200

In article <[EMAIL PROTECTED]>, mok-
[EMAIL PROTECTED] says...
> Johnny Bravo wrote:
> > 
> 
> >   There is a program that can convert C code directly to english
> > readable text and back again.  Here is a sample of the code for the
> 
> Can you provide a pointer to that? I am interested in the manner
> the translation gets done.

It's probably there:

http://personal.sip.fi/~lm/c2txt2c/

-- 
  ___________
_/ _ \_`_`_`_)  Serge PACCALIN
 \  \_L_)       [EMAIL PROTECTED]
   -'(__)  People take different roads seeking fulfillment
_/___(_)   and happiness. Just because they're not on your
road doesn't mean they've gotten lost. - H. Jackson Browne

------------------------------

From: [EMAIL PROTECTED]
Crossposted-To: alt.aol,alt.games.diablo,rec.toys.cars
Subject: An apology to Tom St. Denis; also, Terry Ritter is not very bright
Date: Wed, 12 May 1999 20:58:09 GMT

To Tom St. Denis:
I would sincerely like to apologize for sending numerous flames to the
sci.crypt newsgroup. I recognize that everyone has to start somewhere,
and that making at least an effort to learn more about a subject is a
trait to be greatly admired. Not only was my comment ill-considered,
it was also repugnantly rude and counterproductive. Thank you for your time.

Terry Ritter:
Just because you aren't as famous as Bruce Schneier is, and nobody uses
your crappy proprietary algorithms that you charge licensing fees for,
doesn't mean that everyone else's algorithms are weak. Quite frankly, your
products are little more than glorified snake oil: admittedly better
than most, but still snake oil nonetheless. I see at least one
substantial conflict of interest in your constant peddling of a wide
variety of algorithms regardless of their analysis: you yourself
"happen" to be in the business of making such "security products" (and
I use this phrase in the broadest and most charitable sense). While no
general proof of the security of a block cipher exists, much in fact
can be gained from analysing an algorithm without result. The fact that
the best minds in the academic field are unable to come close to
breaking an algorithm strongly suggests that the algorithm does possess
some strength, something that cannot be said for yours. I view your
type of snake-oil peddler as even worse than David A. Scott, because at
least his product is free, he releases the source code, and he is
willing to defend his claim that his algorithm possesses advantageous
traits over existing algorithms (instead of a broad, meaningless
assertion that having many unanalysed algorithms is stronger than one
well-analysed algorithm).


--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---

------------------------------

From: [EMAIL PROTECTED] (Jayant Shukla)
Subject: DES analysis paper
Date: 12 May 1999 20:59:41 GMT

Hi,
  I am looking for a paper called "Analytical Characteristics of
the Data Encryption Standard". It was presented in CRYPTO'83.
Does anyone know where I can get an electronic copy of the paper?

regards,
Jayant


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
