Cryptography-Digest Digest #490, Volume #14       Fri, 1 Jun 01 12:13:01 EDT

Contents:
  Re: A Newbie Question ("Tom St Denis")
  Re: Help with Comparison Of Complexity of Discrete Logs, Knapsack, and Large Primes 
("Tom St Denis")
  Re: A Newbie Question ("Robert J. Kolker")
  Entanglement Equation and Identity ("Osher Doctorow")
  Re: crypt education ("Douglas A. Gwyn")
  Re: National Security Nightmare? ("Douglas A. Gwyn")
  Re: RSA's new Factoring Challenges: $200,000 prize. ("Douglas A. Gwyn")
  Re: Medical data confidentiality on network comms (Roger Fleming)
  Re: Question about credit card number (Roger Fleming)
  Re: Unicity distance and compression for AES ("Douglas A. Gwyn")
  Re: Unicity distance and compression for AES ("Douglas A. Gwyn")
  Re: A Newbie Question ("Tom St Denis")
  Re: Unicity distance and compression for AES (SCOTT19U.ZIP_GUY)

----------------------------------------------------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: A Newbie Question
Date: Fri, 01 Jun 2001 13:10:05 GMT


"Robert J. Kolker" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> <apology>
> I have looked in the FAQ, however the questions
> I have do not seem to be there.
> </apology>
>
> Assume you have done a differential attack on a set
> of crypts and have gotten part of the key. However
> there are still a few thousand possible keys for which
> brute force is used.
>
> Now, you are particularly interested in breaking one
> crypt, and  you run a moderate number of key guesses
> against it.
>
> Doesn't one still have to * read * ( I mean with a pair of
> eyes and a brain) the resulting decrypts to see if they
> make sense?  Isn't this time consuming? Can a computer
> program speed up this phase? What sort of computer
> program?
>
> Thank you for your input,

Again, I answered this question yesterday (um, you really should read the
posts here).

Look for digraphs, etc... if you decrypt and it's ascii and has TZ or PM in
it, it's probably not the key.
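In code, that filter might look like the sketch below. The rare-digraph list
and the scanning of candidate decrypts are illustrative only, not tied to any
particular cipher:

```python
# Reject candidate keys whose trial decrypt contains digraphs that are
# very rare in English. A sketch only; in a real attack you would score
# against full digraph statistics, not a hand-picked set.

RARE_DIGRAPHS = {"TZ", "QJ", "ZX", "VQ", "PQ", "JQ"}

def looks_like_english(text: str) -> bool:
    """Return False if the text contains any very rare digraph."""
    upper = text.upper()
    return not any(d in upper for d in RARE_DIGRAPHS)

print(looks_like_english("XKTZQW PMVQ"))          # False (garbled decrypt)
print(looks_like_english("THE QUICK BROWN FOX"))  # True (plausible text)
```

In practice you would combine this with single-letter frequencies rather than
rely on a handful of hand-picked pairs.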

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Help with Comparison Of Complexity of Discrete Logs, Knapsack, and Large 
Primes
Date: Fri, 01 Jun 2001 13:14:09 GMT


"Bob Silverman" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> [EMAIL PROTECTED] (Merc42) wrote in message
news:<[EMAIL PROTECTED]>...
> > I am semi-new to cryptography and am currently in the middle of a
> > school project based on it.  I was wondering if anybody could give me
> > any advice in helping me with my project in which i hope to compare
> > the mathematical differences
>
> I'll be glad to help, but your meaning isn't clear.  What do you
> mean by "compare the mathematical differences"? They are  based
> on different hard problems.  What comparison did you have in mind?
>
>
>
> > in using discrete logs, the knapsack
> > (super increasing and non), and factoring large primes
>
> Factoring large primes is a waste of time.....
>
> > as a basis of
> > cryptographic security.  I was wondering if anybody knows any good
> > books on complexity theory
>
> For discrete logs and factoring, you need to learn about the number field
> sieve. Look up
>
> Lenstra & Lenstra  (eds)
> Development of the Number Field Sieve   Springer-Verlag

I would learn to walk before running: learn number theory and the QS first.
IIRC both discrete log and factoring are based on the same type of solution
(one is adding prime powers and the other is multiplying them).

I don't know a lot of the details, but the NFS is much more complex than the
QS.
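The shared core both sieves build on is the congruence-of-squares idea: find
x, y with x^2 = y^2 (mod n) and x != +-y, and gcd(x - y, n) is a nontrivial
factor. A toy illustration of that idea only (this is Fermat-style search,
not the QS itself, which collects such congruences from a factor base):

```python
from math import gcd, isqrt

def congruence_of_squares_factor(n: int):
    """Search upward from sqrt(n) for x with x*x - n a perfect square,
    giving x^2 = y^2 (mod n) and hence factors gcd(x - y, n) and
    gcd(x + y, n). Toy search only; the quadratic sieve finds such
    pairs far more cleverly."""
    x = isqrt(n) + 1
    while True:
        y2 = x * x - n
        y = isqrt(y2)
        if y * y == y2:
            return gcd(x - y, n), gcd(x + y, n)
        x += 1

print(congruence_of_squares_factor(8051))  # (83, 97), since 8051 = 83 * 97
```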

Tom



------------------------------

From: "Robert J. Kolker" <[EMAIL PROTECTED]>
Subject: Re: A Newbie Question
Date: Fri, 01 Jun 2001 09:41:19 -0400



Tom St Denis wrote:

>
> Look for digraphs, etc... if you decrypt and it's ascii and has TZ or PM in
> it, it's probably not the key.

Suppose the plain text was:

"I saw the movie ANTZ in the PM yesterday. I liked it."

However I accept your chiding for overlooking your post.

Bob Kolker



------------------------------

From: "Osher Doctorow" <[EMAIL PROTECTED]>
Subject: Entanglement Equation and Identity
Date: Fri, 1 Jun 2001 07:41:37 -0700

From: Osher Doctorow [EMAIL PROTECTED], Fri. June 1, 2001 5:59AM

In my recent contribution, I developed the Memory (M) Theory probability
correlation P(A<-->B) = P(AB) + P(A' B' ) for nonnegatively correlated
events A, B, and 1 - P(A<-->B) = P(AB' ) + P(A' B) for nonpositively
correlated events A, B.  For random variables X, Y, letting A = {X <= x},
which is to say the set of points w of the probability space for which
X(w) <= x, where x is an arbitrary value of X, we get
P(A<-->B) = P(X <= x, Y <= y) + P(X > x, Y > y), and
1 - P(A<-->B) = P(X <= x, Y > y) + P(X > x, Y <= y).  It therefore seems
reasonable to define C+(X,Y), the Nonnegative M
Theory correlation of X and Y (regardless of whether linear or nonlinear) as
P(A<-->B) in the above, and write it alternatively as P(X<-->Y) (which is
defined by the two sum expression).  Likewise, C-(X, Y), the nonpositive
correlation of X and Y (regardless of whether linear or nonlinear), is
defined as 1 - P(A<-->B) and is written alternatively as 1 - P(X<-->Y) or
P(X<--> -Y) since X and Y are in the opposite directions with respect to
inequalities.  I also pointed out that inequalities and subset inclusion
have been shown to be more fundamental than equations (equations are even
intersections of inequalities in the opposite direction for real valued
equations and a number of other important cases) in nonsmooth analysis
(Clarke et al 1998, etc.).
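The complementarity relied on below (C+ + C- = 1) is just the statement that
the four joint events AB, A'B', AB', A'B partition the probability space. A
quick numeric sketch, with the joint probabilities chosen arbitrarily for
illustration:

```python
# The four joint events AB, A'B', AB', A'B partition the space, so
# C+ = P(AB) + P(A'B') and C- = P(AB') + P(A'B) must sum to 1.
# Illustrative joint probabilities (any nonnegative values summing to 1 work):
p_ab, p_acbc, p_abc, p_acb = 0.5, 0.25, 0.125, 0.125

c_plus = p_ab + p_acbc    # P(A<-->B)
c_minus = p_abc + p_acb   # 1 - P(A<-->B)

print(c_plus, c_minus)    # 0.75 0.25
assert c_plus + c_minus == 1.0
```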

I then pointed out that entanglement is a subset of A<-->B in the above
scenario when the entangled objects behave similarly, and entanglement is a
subset of (A<-->B)', the complement of A<-->B (which is reflected in 1 -
P(A<-->B) = C-(X, Y) = P(X<--> - Y) ) when the entangled objects are
experimentally prepared to behave oppositely.  Both scenarios can be
experimentally created, and readers are referred to the entanglement
literature for that.

However, there are some interesting additional facts.   Both scenarios are
not simultaneously experimentally created, and therefore when A and B, the
entangled sets/events/objects (objects here are considered to be types of
sets/events, although one could probably create an abstract object
probability too), behave similarly or *identically*, the entanglement must
be a subset of A<-->B, and since P(A<-->B) + (1 - P(A<-->B)) = C+  + C- = 1,
we would presumably get P(A<-->B) = C+ = 1 and C- = 0.  I qualify this by
*presumably* because we are traveling on new ground here, though this should
not alarm the adventurous members of sci.crypt.  One thing that I have
learned in probability/statistics is that the slightest assumption may be
erroneous or at least subject to qualification, which was eventually learned
in Euclidean (relative to Non-Euclidean) geometry and a few other places,
although it seems to be occasionally overlooked by those accustomed to rush
to publish or perish.   For example, a common mistake of those *superficial*
probability experts outside of the mathematical specialty is the belief that
events of probability 0 are impossible or null sets and that events of
probability 1 are *certain to occur*.   If all random variables under
consideration have continuous probability distributions on a bounded volume
of Euclidean or common Non-Euclidean spaces, say R^n, any n-1 dimensional
event/set has probability 0 even though it undoubtedly is not null for
n >= 1.  The analogous result also holds for Lebesgue measure on bounded sets
(see any good text on real analysis, e.g., the pioneers Hewitt and Stromberg
(1965) Springer-Verlag: N.Y., in their Real and Abstract Analysis).

The *Positive Entanglement Equation*  C+ = 1 or P(X <-->Y) = 1 is a rather
remarkable equation.  In the form P(A<-->B) = 1, it is equivalent to P(AB) +
P(A' B') = 1, which is equivalent to P(AB' ) + P(A' B) = 0 where B' is the
complement of B (the part of the universe outside of B) and AB' is the
intersection of A with B'.   However, since probability is nonnegative, the
last equation is equivalent to P(AB' ) = 0 and P(A' B) = 0.   Now, if A, B
are n-dimensional sets not equal to the universe then the first equation
P(AB' ) = 0 says that the part of A outside B has probability 0, and an
analogous argument gives it Lebesgue measure 0.  Therefore, with probability
1, A is a subset of B.  However, the second equation P(A' B) = 0 says the
same with respect to the part of B outside of A, so with probability 1 and
Lebesgue measure 1, A = B.  They may still be unequal in some probability 0
lower dimensional sets/events.  Therefore, up to dimension n, A and B are
identical.

We therefore conclude that positively entangled events/sets/objects are
IDENTICAL up to the dimension of the relevant space.  This is a completely
different phase of matter/radiation or whatever from the usual one.   Yet
its main application appears to be much more subtle.  We are concerned here
with quantum cryptography and (for me at least) quantum computers.  However,
we know that there is one computer-like object that has been shown (by
Pribram and others) to have global, holographic, non-local properties, and
which is commonly subjectively described by *itself* as having a state of
global unity and awareness of *extended identity* or *consciousness*,
namely, the human brain.

If the human brain is an entangled computer, and I conjecture that it is,
then there is another subtle implication which may be revolutionary, namely,
the quantum world is directly observable and measurable by human observers.
We may quibble over whether measurement and perception (at least for well
trained scientists) differ, but there is certainly a common ground here.  I
am rather curious as to how the Uncertainty Principle would fare in this
situation.   I think that we may have reached the second phase which I have
hypothesized in sci.physics and elsewhere for so long, namely, the phase in
which the Uncertainty Principle does not hold.   It is one thing to say that
we cannot arbitrarily precisely measure position and speed of an object, but
it is quite another to say that we can directly observe and perceive the
object.

Before one falls into the trap of claiming that all this is a relic of the
macroscopic world which *obscures* the underlying quantum reality because of
its large *rough* scale, remember that there are already global processes in
quantum physics, and the wave associated with a particle also has global
properties.

We have also reached the level at which physics and psychology and biology
need to interact, namely, the human brain.  It is a remarkable laboratory
for entanglement and cryptography if I am correct, and research into it from
this direction should be accelerated both in speed and funding.

Osher Doctorow Ph.D.
Doctorow Consultants
Formerly (and still intermittently) California State Universities and
Community Colleges
(see fairly detailed abstracts of 55 of my papers at
http://www.logic.univie.ac.at, Institute for Logic of the University of
Vienna - select ABSTRACTS, then BY AUTHOR, then my name).



------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: crypt education
Date: Fri, 1 Jun 2001 13:08:02 GMT

We know where bin Laden is, but we have laws prohibiting assassination,
and his host nation refuses to extradite him.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: National Security Nightmare?
Date: Fri, 1 Jun 2001 14:11:04 GMT

David Wagner wrote:
> >actually our regulations
> >apply to intelligence received about US citizens from foreign
> >sources as well.
> That would be reassuring.  Do you have a reference to the text of those
> regulations?  If the policy gives protection to US citizens no matter
> who does the intercepting, I can't imagine a good national security
> reason to keep such a policy secret.  Am I missing something?

I guess so.  The Foreign Intelligence Surveillance Act of 1978
(50 USC sections 1801-1829), which is the primary source of
regulation in this area (along with the Constitution of the USA),
is certainly public, as are Executive Orders 12139 and 12333,
which give tasking and constraints for intelligence agencies,
and precursors to EO 12333 such as EO 11905, which was quite
explicit in prohibiting use of information obtained from foreign
agencies to surveil domestic activities of any US person.  It may
take legal training and experience to wade through such documents
and determine exactly how they apply to this issue; I merely
note that there is nothing in them that excepts information
obtained from governments of other nations.

>  Even if
> full disclosure of the relevant regulations would not entirely lay all
> fears to rest in one swoop, it seems it would be a nice way to bring
> something concrete to the public debate and to help convince outsiders
> that NSA is acting in good faith.
> In recent news interviews, the Director has expressed an interest in
> reassuring the public that the NSA is not up to anything nefarious,
> and this would appear to be a very simple step to further that goal at
> very little cost to the NSA.  Is there some cost to publicizing these
> regulations that I have overlooked?

I guess you haven't bothered to read
http://www.nsa.gov/releases/DIR_HPSCI_12APR.HTML
which is a transcript of DIRNSA's testimony to Congress on this issue.
The referenced documents are all available on the Web and can
be found with a simple Web search.

------------------------------

Crossposted-To: sci.math
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: RSA's new Factoring Challenges: $200,000 prize.
Date: Fri, 1 Jun 2001 14:24:24 GMT

Rob Warnock wrote:
> Michael Brown <[EMAIL PROTECTED]> wrote:
> | "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> | > The main unresolved question seems to be, how many operations
> | > can we expect for finding a typical N-bit prime factor?
> | If you are referring to my algorithm (hopefully :P), then there
> | are x^3 - x^2 - x boxes to complete, where x is the number of digits
> | in the maximum prime.
> So I'd be more concerned about Doug's question about "number of operations"
> than about the memory...

I concur.  What I wonder, not having studied this algorithm yet,
is the rate of growth of work-per-box as the problem size is
increased.  If, for example, it is subexponential, then this
could be a practical method of factoring very large products-
of-primes; perhaps NASA or some other possessor of a system
like the ones Rob mentioned would be happy to run the
remaining RSA factoring challenges against Michael's algorithm,
especially in exchange for a cut of the prize.

I for one would be happy to see factoring become clearly too
risky as a base for cryptosecurity.  It would make for some
excitement in this field, which has been rather dull lately.

------------------------------

Crossposted-To: comp.security.misc
From: [EMAIL PROTECTED] (Roger Fleming)
Subject: Re: Medical data confidentiality on network comms
Date: Fri, 01 Jun 2001 13:48:35 GMT

Lassi Hippeläinen <[EMAIL PROTECTED]> wrote:
>Roger Fleming wrote:
><...> 
>> Never disappear? I presume you are talking about emailing it, not posting it
>> on a newsgroup. Email will disappear from the net in 5 - 10s unless some
>> cracker intercepts and archives it for 30 years in the hope of one day
>> decrypting it to blackmail someone. Not completely impossible, but pretty
>> unlikely - especially if it is one of millions of encrypted emails, and has no
>> plaintext identifying it as especially interesting.
>
>It will remain in your mail server until you pick it up. In the mean

My mail server is not "on the public networks" in the sense that he meant, i.e.
accessible to anyone browsing around randomly looking for blackmail data. In 
any case, most folks check their email several times a day, or at least 
weekly; he was talking about this data remaining accessible _forever_, with 
context referring to periods greater than 30 years.

>time the admin of the server may have made a safe copy of its state just
>in case. If that safe copy isn't shredded properly... etc <insert here
>your favorite amount of paranoia>.

Yes, but that isn't "on the public networks" where it can be found by a random 
browser. We were discussing the fact that trusted insiders have all sorts of 
ways to divulge the information, either accidentally or deliberately, that 
have nothing to do with the encryption strength. He countered that information 
on the net has the disadvantage of being accessible to any joe blow, forever. 
That isn't true for email.

>Anyway, the header of the message contains lots of useful stuff, so it
>is easy to sort the messages to subgroups according to their potential
>amount of interest.
>
>-- Lassi

------------------------------

From: [EMAIL PROTECTED] (Roger Fleming)
Subject: Re: Question about credit card number
Date: Fri, 01 Jun 2001 13:56:16 GMT

[EMAIL PROTECTED] (Mark Borgerding) wrote:

[...]
>I imagine most web credit card systems store the number in a database
>that is accessible to the webserver.
>
>Instead, wouldn't it be better to give the webserver very limited
>capability to use the credit card.  i.e. After the user has added
>their credit card, it gets wisked away to a more secure machine than
>the webserver.  [...]

It makes so much sense that, in fact, most really large corporate sites use 
exactly this method. The industry jargon is "two tier" or "three tier" 
architectures. Two tier has separate web server and database. In the three 
tier architecture, a transaction service layer further separates the webserver 
and database.
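The point of the extra tier can be sketched in a few lines: the web tier
hands the card number to a more protected machine and keeps only an opaque
token. All names and interfaces below are illustrative, not any vendor's API:

```python
# Sketch of the tiered idea: the web server never stores card numbers,
# only opaque tokens issued by a separate, more protected service.
import secrets

class CardVault:
    """Stands in for the protected back-tier machine."""
    def __init__(self):
        self._store = {}

    def tokenize(self, card_number: str) -> str:
        token = secrets.token_hex(16)
        self._store[token] = card_number  # held only on this tier
        return token

    def charge(self, token: str, amount_cents: int) -> bool:
        # Only the vault can map the token back to a card number.
        return token in self._store and amount_cents > 0

vault = CardVault()
token = vault.tokenize("4111111111111111")
# The web tier stores `token`; a compromise of the web server
# exposes tokens, not card numbers.
print(vault.charge(token, 1999))  # True
```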

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Fri, 1 Jun 2001 14:32:53 GMT

Dennis Ritchie wrote:
> Still, if he's "the man", you might want to spring for the book;
> there's lots in it.

Another approach is to persuade a local library to obtain the book.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Fri, 1 Jun 2001 14:49:37 GMT

"SCOTT19U.ZIP_GUY" wrote:
> ... Does Shannon ever talk about the kind of compression? I
> know it has to be lossless. But doesn't he also imply that for a
> piece of cipher text each key test would have to lead back to a
> plain text, so that implies some sort of bijective compression?

There are all sorts of papers based on Shannon's information
theory published in places like IEEE Transactions on
Communications and IEEE Transactions on Information Theory.
It is not clear to me exactly what theorem it is that you
are seeking, or putting it another way, what assumptions and
conclusions you want to combine.
For example, why do you say you "know it has to be lossless"?
There are both lossy and lossless compression techniques;
generally one just specifies the degree of "noise" that is
tolerable in the decompressed output.
It is certainly not the case that for every cryptosystem
every key within some contiguous range has to be usable for
decryption; counterexamples are readily devised.  My guess
as to what you mean is: ?theorem?: cryptosecurity requires
that every key that works at all decrypts a valid ciphertext
to plaintext that conforms fairly closely to the source model.
I am sure from reading Shannon that he was well aware that
redundant information could in principle be encoded using
fewer bits (i.e. compressed).  I'm also sure that he never
proved the above ?theorem?, which is evidently not true in
general.  If you meant something else, please state it more
precisely.
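The compression connection in this thread comes straight from Shannon's
unicity-distance estimate U = H(K)/D, where H(K) is the key entropy in bits
and D the plaintext redundancy in bits per character; compression matters
because it shrinks D and so pushes U up. A back-of-envelope sketch, using the
usual textbook figure of roughly 3.2 bits/character of redundancy for English:

```python
# Shannon's estimate: unicity distance U = H(K) / D, where H(K) is key
# entropy (bits) and D the per-character redundancy of the plaintext.
# The 3.2 bits/char figure for English is the standard textbook estimate.

def unicity_distance(key_bits: float, redundancy_bits_per_char: float) -> float:
    return key_bits / redundancy_bits_per_char

# A 128-bit key against ordinary English text:
print(unicity_distance(128, 3.2))  # 40.0 characters of ciphertext
# Perfect compression drives D toward 0, so U grows without bound,
# which is why the thread keeps circling back to compression.
```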

------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: A Newbie Question
Date: Fri, 01 Jun 2001 15:19:58 GMT


"Robert J. Kolker" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> Tom St Denis wrote:
>
> >
> > Look for digraphs, etc... if you decrypt and it's ascii and has TZ or PM in
> > it, it's probably not the key.
>
> Suppose the plain text was:
>
> "I saw the movie ANTZ in the PM yesterday. I liked it."
>
> However I accept your chiding for overlooking your post.

You don't discount those keys, you just put them at the bottom of your list.

If you have, say, 200 bytes of ciphertext you should see something like 'e'
or 'th' occur more often than 'z' or 'pq'.
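That ranking can be a soft score instead of a hard veto, as sketched below.
The frequency table is truncated and the default weight for unlisted
characters is arbitrary; both are illustrative only:

```python
# Rank candidate decrypts by how English-like their letter frequencies
# look, rather than discarding any outright. Truncated frequency table
# (percent figures); a real attack would use full letter/digraph stats.
ENGLISH_FREQ = {"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "n": 6.7,
                "s": 6.3, "h": 6.1, "z": 0.07, "q": 0.10, "x": 0.15}

def english_score(text: str) -> float:
    """Higher score = more plausible English; unlisted chars get 1.0."""
    return sum(ENGLISH_FREQ.get(c, 1.0) for c in text.lower())

candidates = ["the season", "ztqxqzxtqz"]
ranked = sorted(candidates, key=english_score, reverse=True)
print(ranked[0])  # "the season" outranks the garbled decrypt
```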

Tom



------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unicity distance and compression for AES
Date: 1 Jun 2001 15:25:02 GMT

[EMAIL PROTECTED] (Douglas A. Gwyn) wrote in <[EMAIL PROTECTED]>:

>"SCOTT19U.ZIP_GUY" wrote:
>> ... Does Shannon ever talk about the kind of compression? I
>> know it has to be lossless. But doesn't he also imply that for a
>> piece of cipher text each key test would have to lead back to a
>> plain text, so that implies some sort of bijective compression?
>
>There are all sorts of papers based on Shannon's information
>theory published in places like IEEE Transactions on
>Communications and IEEE Transactions on Information Theory.
>It is not clear to me exactly what theorem it is that you
>are seeking, or putting it another way, what assumptions and
>conclusions you want to combine.
>For example, why do you say you "know it has to be lossless"?

   I meant when he talks about compression to increase the entropy
per bit. It has to be lossless. I didn't want someone to give
the flippant answer "just use lossless compression and drop headers",
which is the usual brain-dead response.

>There are both lossy and lossless compression techniques;
>generally one just specifies the degree of "noise" that is
>tolerable in the decompressed output.
>It is certainly not the case that for every cryptosystem
>every key within some contiguous range has to usable for
>decryption; counterexamples are readily devised.  My guess
>as to what you mean is: ?theorem?: cryptosecurity requires
>that every key that works at all decrypts a valid ciphertext
>to plaintext that conforms fairly closely to the source model.

   I agree with the above, sort of. If you have a cipher text,
ideally according to Shannon you would want every possible key
to decrypt and decompress to a unique plain text.  This implies
bijective compression. Yet the so-called experts seem too stupid
to notice this fact.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSIOM"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
