Cryptography-Digest Digest #869, Volume #8        Sat, 9 Jan 99 01:13:03 EST

Contents:
  Re: Help: a logical difficulty (Mike McCarty)
  Re: coNP=NP Made Easier? (Mike McCarty)
  Re: coNP=NP Made Easier? (Mike McCarty)
  Re: Birthday Attack calculations. (Fred Van Andel)
  Re: MAGENTA question... (Nicol So)
  Re: On leaving the 56-bit key length limitation ([EMAIL PROTECTED])
  Re: Birthday Attack calculations. (Fred Van Andel)
  Re: coNP=NP Made Easier? (Planar)
  Re: coNP=NP Made Easier? (Planar)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Mike McCarty)
Crossposted-To: sci.math
Subject: Re: Help: a logical difficulty
Date: 9 Jan 1999 03:16:34 GMT

Dictionary order is a complex and interesting topic, though not much
related to mathematics. For example, consider my name, McCarty, and that
of a friend of mine, MacDougal. Which of them comes first in dictionary
(or, equivalently, alphabetical) order? MINE!

Mike

In article <[EMAIL PROTECTED]>,
Nicol So  <[EMAIL PROTECTED]> wrote:
)Mark Ingram wrote:
)> 
)> Nicol So wrote:
)> 
)> > On the other hand, a finite binary string can be interpreted as encoding a
)> > natural number.  If we order all finite binary strings in lexicographical
)> > (dictionary) order, we can interpret the i-th string as encoding the integer i.
)> > It is easy to see that the set of all finite binary strings has the same
)> > cardinality as the set of natural numbers.
)> 
)> Dictionary order, you say.  So '0' comes before '00', which comes before '000', ...
)> .  What integer does the string '1' encode in this ordering?
)
)My carelessness, thanks for catching it.  I had a different ordering in
)mind (the "increasing order") when I was writing the above, but decided
)to just name a familiar ordering instead (thinking, mistakenly, that it
)wouldn't make a difference).  I thought I was using too many words
)already to explain something simple, and that explaining clearly what I
)meant by "increasing order" wouldn't make things more concise.
)
)Nicol


-- 
----
char *p="char *p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
This message made from 100% recycled bits.
I don't speak for Alcatel      <- They make me say that.

------------------------------

From: [EMAIL PROTECTED] (Mike McCarty)
Crossposted-To: sci.math,comp.theory
Subject: Re: coNP=NP Made Easier?
Date: 9 Jan 1999 02:44:28 GMT

In article <[EMAIL PROTECTED]>,
Matt Brubeck <[EMAIL PROTECTED]> wrote:
)In article <[EMAIL PROTECTED]>, rosi <[EMAIL PROTECTED]> wrote:
)
)>   Three things. First, you perhaps did not consider my other question.
)> That is if NDTM is real, NP problems can be solved in P time.
)
)Well, a nondeterministic Turing machine (NDTM) is a Turing machine which,
)at a given state and input, may have several potential transitions.
)Therefore it may not behave the same when given the same input more than
)once. The NDTM is just a theoretical model of computation; a "real" NDTM
)would be useless, since its output is unpredictable.

This isn't the definition *we* used.

)Given an instance of a decision problem, if the answer is "yes," an NDTM
)may or may not accept, but if the answer is "no," then it definitely does
)not accept. "NP time" means that there exists some NDTM for which there is

Oh, I see. You used the word "unpredictable" in a way I am not
accustomed to. I also had to reparse "Therefore it may not behave the same
when given the same input more than once." I thought that you were
saying "For every given input string the machine has the possibility to
produce a different result on successive runs."

[snip]

)In short: NDTMs are theoretical constructs only.

Interesting. I have used virtual NDTMs for solving real life problems.
In fact, I have written (and use) LEX and GREP type programs for
pattern matching which implement NDTMs.
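The set-of-states simulation behind such pattern matchers can be sketched briefly (an illustrative reconstruction, not code from the posts; the toy machine below recognizes the regular expression (a|b)*abb):

```python
# Simulate a nondeterministic machine deterministically by tracking the
# SET of states it could be in -- the technique LEX/GREP-style matchers
# use.  Runtime is O(len(text) * number_of_states): polynomial, with no
# backtracking.

# Illustrative NFA for (a|b)*abb: (state, symbol) -> possible next states.
NFA = {
    (0, 'a'): {0, 1},   # loop, or guess this 'a' begins the final "abb"
    (0, 'b'): {0},
    (1, 'b'): {2},
    (2, 'b'): {3},
}
START, ACCEPT = 0, 3

def matches(text):
    """Return True if the NFA can reach ACCEPT after consuming text."""
    states = {START}
    for ch in text:
        states = {t for s in states for t in NFA.get((s, ch), set())}
    return ACCEPT in states
```

Each input character is processed once and the "nondeterminism" costs only a set union per step, which is why a virtual NDTM of this kind is practical.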

[snip]

)Side-note: Quantum computing, which is still largely theoretical, may
)someday allow the construction of real devices which behave like
)non-deterministic models of computation, and solve NP-hard problems in
)polynomial time. However, this is still unclear; it also doesn't change
)the theoretical bases for questions about P, NP, and coNP, though it does
)change the practical consequences of the answers.

This is true. But OTOH, we *already* have hardware which can be
programmed to act as an NDTM. Quantum computers will simply be more
efficient (hopefully, anyway).

[snip]

)Existence proofs and examples of NDTMs can be found for well-known NP
)problems in algorithms and computational theory textbooks, references, and
)journals. I see no need to waste bandwidth and time by typing in big
)state/transition tables for NDTMs.


(To original poster:)

For a reference, see Aho and Ullman "Syntax Directed Language
Translation" (or something like that, the "Dragon" book). They give
examples of NDTMs used to recognize regular expressions of various
types. 

Mike
-- 
----
char *p="char *p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
This message made from 100% recycled bits.
I don't speak for Alcatel      <- They make me say that.

------------------------------

From: [EMAIL PROTECTED] (Mike McCarty)
Crossposted-To: sci.math,comp.theory
Subject: Re: coNP=NP Made Easier?
Date: 9 Jan 1999 02:49:20 GMT

In article <775em4$[EMAIL PROTECTED]>,
Steve Tate <[EMAIL PROTECTED]> wrote:
)rosi ([EMAIL PROTECTED]) wrote:
)> Planar wrote:
)
)> >       The complexity class P is the set of all decision problems
)> >       that are solvable in polynomial time BY DETERMINISTIC TURING
)> >       MACHINES.
)
)> Or are you saying that NDTM's can not solve NP problems in P time?
)> I hope you are not calling NDTM's pieces of crap. :) That would truly
)> offend a lot of people including their inventors. :)
)
)Ok, a lot of people have tried, but let me try to get at the problem
)here.
)
)Are you working under the assumption that NDTM's are actual, real
)machines?  That solve NP problems in polynomial time?
)
)If so, then you've gone way down the wrong path.  NDTMs are
)mathematical models.  They are convenient abstractions so that we can
)talk about things.  They do not exist beyond a definition on paper.

Well, DTMs are also mathematical models. They are also convenient
abstractions so that we can talk about things. They also do not exist
beyond a definition on paper.

)If NDTMs were real it would be a real boon to computing.  We could
)solve an awful lot of important problems that we can't solve now.  But
)that's simply not the case.

Virtual NDTMs are a real boon to computing. Most implementations of LEX
use them. They can be quite efficient. They can't run everything in the
time a true NDTM would, but many of the useful things NDTMs do, virtual
NDTMs can also do in polynomial time. Virtual NDTMs are not useless.

Mike
-- 
----
char *p="char *p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
This message made from 100% recycled bits.
I don't speak for Alcatel      <- They make me say that.

------------------------------

From: [EMAIL PROTECTED] (Fred Van Andel)
Subject: Re: Birthday Attack calculations.
Date: Sat, 09 Jan 1999 04:46:16 GMT
Reply-To: [EMAIL PROTECTED]

[EMAIL PROTECTED] (Gregory G Rose) wrote:

>Now, let's not get testy. When David has a good
>idea, we should encourage it. The way I
>interpreted his comment was that, instead of
>testing the cut-down algorithms, cut down the
>output of the real algorithm. Generate your
>256-bit hashes, split them into 32- or 64- bit
>chunks, and test *those* as if they had come from
>a smaller generator. Any collision problems in the
>larger hash, or any implementation glitch in
>scaling it up, should show up.

The smaller hash sizes are not from a 'cut-down' algorithm. The nature
of this algorithm is that it can create hashes of any desired size
(in 8-bit multiples, to be precise). Any given size is no more 'natural'
than any other size. If a smaller hash is collision resistant then I
would expect that a larger hash is also collision resistant. The only
real difference is that the small hashes are very easy to test; the
larger ones are more difficult.

I hope to have access to about 12 Pentium machines during evenings and
weekends for testing purposes; this will allow me to crunch quite a
few samples in search of collisions.

Fred Van Andel


------------------------------

From: Nicol So <[EMAIL PROTECTED]>
Subject: Re: MAGENTA question...
Date: Fri, 08 Jan 1999 23:45:26 -0500

John Savard wrote:

> 
> The basis of Magenta is an S-box with 8 bits of input and 8 bits of
> output, called f(x). f(255) = 0, and for other values of x, f(x) =
> alpha ^ x where alpha is an *unspecified* primitive element of GF(256)
> with the generating polynomial is x^8 + x^6 + x^5 + x^2 + 1.
> 
> That's my question - what is alpha, ...

I haven't read the Magenta document, but assuming your description above
is accurate, you should be able to determine alpha as f(1), since f(1) =
alpha ^ 1 = alpha.
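Assuming the description quoted above is accurate, the check is easy to script. Everything below is a sketch: the candidate alpha = 0x02 (the element x) is only a guess used for illustration, not a value taken from the Magenta document.

```python
# The generating polynomial x^8 + x^6 + x^5 + x^2 + 1 is the bit pattern
# 1_0110_0101, i.e. 0x165.
POLY = 0x165

def gf_mul(a, b):
    """Multiply two elements of GF(256) modulo POLY (shift-and-add)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:       # degree reached 8: reduce by the polynomial
            a ^= POLY
        b >>= 1
    return r

def order(a):
    """Multiplicative order of a; alpha is primitive iff order(alpha) == 255."""
    x, n = a, 1
    while x != 1:
        x = gf_mul(x, a)
        n += 1
    return n

def build_f(alpha):
    """f(x) = alpha^x for x < 255, f(255) = 0, as described above."""
    f, p = [0] * 256, 1
    for x in range(255):
        f[x] = p
        p = gf_mul(p, alpha)
    return f

# For any candidate alpha, f(1) = alpha by construction -- Nicol's point:
# reading one S-box entry recovers the unspecified primitive element.
```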

Nicol

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: On leaving the 56-bit key length limitation
Date: Sat, 09 Jan 1999 05:21:32 GMT

Ed Gerk wrote:
>   [EMAIL PROTECTED] wrote:
> >
> > Consider the situation in which the message space has several
> > plausible messages, but the conditional probabilities, given the
> > ciphertext, show that "Attack at dawn with 3000 men." has a
> > probability of 0.599999, and "Attack at dawn with 3006 men"
> > has a probability of 0.399999.  Using Shannon's formula for
> > entropy, I calculate the equivocation of the plaintext is
> > 0.97 bits.  Shannon defined the unicity distance as the number
> > of intercepted characters for which the equivocation in the
> > plaintext is very close to 0.  0.97 bits is not very close to
> > zero, therefore unicity has not been reached.
> >
> > I claimed that an attacker may get useful information from
> > the ciphertext even if he does not have enough ciphertext to
> > reach unicity.  You, Ed, stated that I was wrong.  The above
> > example is of course contrived, but nevertheless, the
> > conditional probabilities are possible.  The attacker does
> > not have enough ciphertext to reach unicity.  The attacker
> > does obtain a great deal of useful information from the
> > ciphertext.  My claim that you quoted above, "A cryptanalyst
> > may still get large amounts of useful information." is true.
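For reference, the 0.97-bit figure follows directly from Shannon's entropy formula applied to the two stated probabilities (a quick check, not part of the original exchange):

```python
# Equivocation H = -sum p*log2(p) over the plausible plaintexts.  The
# residual probability mass (~2e-6) contributes negligibly and is ignored.
from math import log2

probs = [0.599999, 0.399999]    # "3000 men" vs. "3006 men"
H = -sum(p * log2(p) for p in probs)
# H is about 0.97 bits: well above zero, so unicity has not been reached,
# yet the attacker already knows the attack comes at dawn.
```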

> Let us re-start here and recall that I used DES to exemplify some of my
> arguments.

Though not the one at issue here.

> I agree with you that you can define unicity in such a way as to make the
> phrases "Attack at dawn with 3000 men." and "Attack at dawn with 3006 men"
> still shorter than the length required by your definition of unicity.
>
> I also agree you could define pi to be 3.

I try to avoid proliferating new definitions, which is why I
used Shannon's.

> However, both would be useless exercises because they fail to reflect the
> reality behind the concepts. When Shannon defined unicity he was struggling
> to define a concept -- which goes far beyond a mere expression of it as n =
> H(K)/D.

H(K)/D is an approximation of unicity distance for a random cipher.
"Unicity", from the English "unique", is a property or state: the
state in which something is the only one of it's kind - in this case
the only plausible decryption.  "Unicity distance" is the expected
number of encrypted letters needed to bring the analyst to unique
decryption.

    ... the equivocation curve approaches zero rather sharply.  Thus
    we may, with but little ambiguity, speak of a point at which
    the solution becomes unique.  This number of letters will be
    called the unicity distance.  For a random cipher it is
    approximately H(K)/D.
    [C. E. Shannon 1949]

How do you know Shannon was struggling?  Did you ask him?

> Let me make my declaration precise. Unicity is (exactly in Shannon`s sense) a
> fundamental limitation to the least amount of plaintext which can be uniquely
> deciphered from the corresponding ciphertext -- given *unbounded* resources by
> the attacker. The attacker can even hire you ;-)

Yes you've written that several times, and even claimed it
is what Shannon actually wrote.  But though it's correct to a
first approximation, what I quoted above is Shannon's actual
definition.  What you quoted is what you, not Shannon,
actually wrote.

> Now, as you say that "Attack at dawn with 3000 men." and "Attack at dawn with
> 3006 men" cannot be decided by your system and, "therefore unicity has not
> been reached" -- then my conclusion is that you have left reality.
>
> Why? Because if you are using DES (as I used, so I assume that in order to
> discuss my results you would need to also use DES)

Sorry if it wasn't clear - the example is not about DES. I first
posted it as a counterexample to the claim that a cipher used
within its unicity distance cannot be attacked by exhaustive
search and is secure against any attacker.  I reposted it to
support my claim that even with ciphertext that does not cover
the unicity distance, an attacker may get useful information
from the ciphertext.

> then the unicity-5
> distance of DES is 3 letters as I argued -- in your case, "Att". Certainly
> your phrases are much longer than DES unicity-5, so I could break into your
> DES key with only those 3 letters and then unambiguously decipher your whole
> long message. This means that I could have uniquely chosen which message was
> correct.

> I affirm this is not possible with DES.

I grant that my example proves nothing about DES.  On the other
hand, I note that unlike my usage of "unicity distance" which
comes from Shannon and is established in the discipline, the
term "unicity-5" is of recent fabrication.  In fact, as
explained in another branch, the DES results you got from these
new concepts are mostly incorrect.

> And, one cannot just invent
> "features" to support e-mail discussions. This may be useful in rec.humor but
> not here --a technical discussion group.
>
> Now, in the case you tell me you did NOT use DES in your "example" then I say
> that is firstly a lack of basic method -- since you cannot compare apples with
> speedboats -- also perhaps useful in rec.humor but not here.

The question I was considering is whether it is possible for
an analyst who lacks ciphertext covering a system's unicity
distance to gain useful information from the ciphertext.  I
had written that this is possible, and you had said I was
wrong.  I granted,

> > The above
> > example is of course contrived, but nevertheless, the
> > conditional probabilities are possible.

I do not know if the probabilities are possible for DES. I don't
know how they could apply to speedboats.  What's at issue here are
"Secrecy Systems", also defined by Shannon.  Obviously we could
construct a secrecy system and ciphertext that induces the given
probabilities.  What does that tell us?  It proves that providing
less ciphertext than required for unicity does not imply that an
attacker cannot get useful information from the ciphertext.

> Otherwise, it is just so simple -- define pi to be 3 and next thing you know
> you can start to criticize much more people, not just me.

My criticism has been entirely directed to the content of your
posts.  Do you recall writing,

: Comments?

Your posted work has been full of errors, so my comments have
not been favorable.  If you don't want to know, don't ask.

--Bryan

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (Fred Van Andel)
Subject: Re: Birthday Attack calculations.
Date: Sat, 09 Jan 1999 05:44:15 GMT
Reply-To: [EMAIL PROTECTED]

[EMAIL PROTECTED] wrote:
>  [EMAIL PROTECTED] (Gregory G Rose) wrote:

>> Now, let's not get testy. When David has a good
>> idea, we should encourage it. The way I
>> interpreted his comment was that, instead of
>> testing the cut-down algorithms, cut down the
>> output of the real algorithm. Generate your
>> 256-bit hashes, split them into 32- or 64- bit
>> chunks, and test *those* as if they had come from
>> a smaller generator. Any collision problems in the
>> larger hash, or any implementation glitch in
>> scaling it up, should show up.
>>
>> regards,
>> Greg.
>> --
>
>NO it is not even close to my thoughts
>
>  I thought it was a very good reply on my part; sorry
>you were not capable of understanding it. So here
>it is in different words. It was previously stated
>that a cut-down version of the method should be tested,
>apparently short enough to be able to get meaningful data
>based on the probability of collisions. But the thing was
>worded that he should never test the FULL LENGTH
>case since the collision probability was virtually
>ZERO for any real number of tries one could make
>with the FULL LENGTH case. But if the FULL LENGTH
>case is what one is programming the stupid hash
>function for, it would be FUCKING stupid not to run
>a few cases just to be DAMN SURE none occurred. People
>can fuck up code you know.
>
>David Scott
>
Unfortunately, running even one test covering a tiny part of the range
of a 256-bit hash would require far more disk space than I have access
to, and would have virtually zero chance of finding a collision.  Even
testing a mere 2^32 hashes, compared with the roughly 2^128 required for
a 50% collision of a 256-bit hash, would require 128 gigabytes of disk
space to store all of the calculated hashes. I only wish that I had that
much. 
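The arithmetic behind those figures is easy to verify (a back-of-the-envelope sketch; the 1.1774 constant is the standard sqrt(2 ln 2) birthday-bound factor):

```python
# Storage check: 2^32 hashes at 32 bytes (256 bits) each.
from math import sqrt

hashes = 2 ** 32
bytes_needed = hashes * 32            # 256-bit hash = 32 bytes
# bytes_needed equals 128 * 2^30, i.e. 128 GiB, matching the post.

# Birthday bound: about 1.1774 * sqrt(N) samples give a 50% collision
# chance among N possible outputs.  For a 256-bit hash that is roughly
# 2^128 samples, so 2^32 trials have essentially zero chance of success.
def fifty_percent_point(bits):
    return 1.1774 * sqrt(2.0 ** bits)
```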

I will carry the testing as far as I can and still generate
statistically significant numbers. There is no point expending all my
computing power trying to force a collision of one relatively large
hash, because the result would be a matter of dumb luck and completely
meaningless.  But by testing millions of tiny hashes and tens of
thousands of 40- or 48-bit hashes I can generate some meaningful
numbers.

If the hash is so fatally flawed that I could routinely find a
collision at much lower than the normal values then I would have found
this out at the smaller hash sizes.  The algorithm is scalable; there
is no real difference between a small hash and a large hash except for
the obvious.  Therefore I expect the properties of the smaller sizes
to carry over to the larger sizes.

Fred van Andel

------------------------------

From: Planar <[EMAIL PROTECTED]>
Crossposted-To: sci.math,comp.theory
Subject: Re: coNP=NP Made Easier?
Date: 9 Jan 1999 05:42:10 GMT

>Planar wrote:
>> Aha.  Now we know where the problem is.  The correct definition of P
             ^^
OK, this "we" is just me.  And maybe also anyone who has a clue.


>From: rosi <[EMAIL PROTECTED]>

>   Who? Who is/are this 'we'? Sure this 'we' know(s)?


>> would be more like this:
                 ^^^^
>
>   'like'? You do mathematics in this 'like' manner?

Well, there's an infinite number of equivalent definitions of P, so
you can choose any one you like (that's where 'like' comes in), as
long as you make sure it's equivalent to the one everybody uses.


>>       The complexity class P is the set of all decision problems
>>       that are solvable in polynomial time BY DETERMINISTIC TURING
>>       MACHINES.
>> 
>
>   Where did you get that? Where did you quote from? From one of your
>own volumes?
>   Interesting typo or omission by M. et. al.

I got it from my memory.  I guess if you look up the definition of
"solvable" in M. et al., you'll find their definition is in fact
equivalent to mine, if less explicit.


>   Cryptographic stuff depends on P!=NP for its security. Now, even
>NDTM's are real, P!=NP. But what security? Profoundly Confused. OR are
>you saying that RSA is 'super NP', in other words, even NP problems
>are solvable in P time (by those real NDTM's), RSA is still secure?

Are you saying that RSA's security relies on P != NP ?  If you do,
then you are wrong.  Breaking RSA is not known to be as hard as
factoring, and factoring is not known to be NP-complete.  You have
*two* open problems to solve, here.

Moreover, RSA's security doesn't even have to depend on exponential
stuff to be (practically) secure.  If factoring turns out to be
polynomial (i.e. theoretically trivial) but the polynomial has degree
1000, RSA is still very secure.
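A quick bit of arithmetic makes the point concrete (my illustration, not from the original post):

```python
# "Polynomial time" with a huge exponent is still utterly impractical.
# Even for an input of size n = 2, an n^1000-step algorithm needs 2^1000
# steps -- a number with 302 decimal digits, far beyond any conceivable
# amount of computation.
steps = 2 ** 1000
digits = len(str(steps))    # 302
```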


>Or are you saying that NDTM's can not solve NP problems in P time?
>I hope you are not calling NDTM's pieces of crap. :) That would truly
>offend a lot of people including their inventors. :)

Of course, they can.  That's the definition of NP.  Moreover, I don't
think their inventors are still around to be offended.


>   I guess the people who talk about the dependency of crypto efforts
>on P!=NP must not know what they are talking about.

Indeed.  (at least in the case of RSA)


>> >   I here also commit my understanding of the definition. As long as
>> >the problem is solvable in polynomial time, be via DTM or NDTM, it is
>> >in P.
>> 
>> And this is equivalent (for people who know the correct definition of
>> P and NP) to claiming P = NP.
>
>   I guess you are also a complexity theory super guru. Show us the
>equivalence, which would be your daily cup of tea.

OK.  Here is the proof for the difficult half of the equivalence.  The
other half is left as an exercise.

THEOREM
  Assume every problem solvable by an NDTM is in P.  Then P = NP.

PROOF
Take any problem in NP.  It is solvable by an NDTM (by the definition
of NP).  Thus it is in P (by the assumption).
Thus NP is a subset of P.

Take any problem in P.  It is solvable by a DTM (by the definition of
P).  This DTM is also an NDTM (by the definitions of NDTM and DTM).
Thus the problem is solvable by an NDTM, so it is in NP (by the
definition of NP).
Thus P is a subset of NP.

We have: NP \subseteq P \subseteq NP
thus: P = NP.  QED             (I've always wanted to write this line :-)


>   The quote did not say the extra words you gave. I do not have the
>habit of putting my stuff in other people's mouths.
[...]
>   [Ilias] agrees with me;

You sure you're not putting words in Ilias' mouth ?

-- 
Planar

------------------------------

From: Planar <[EMAIL PROTECTED]>
Crossposted-To: sci.math,comp.theory
Subject: Re: coNP=NP Made Easier?
Date: 9 Jan 1999 05:52:08 GMT

>From: rosi <[EMAIL PROTECTED]>

>That is if NDTM is real, NP problems can be solved in P time.

What do you mean, real ?  An NDTM is a mathematical object.  It is no
more (and no less) real than a natural number, an inaccessible cardinal
or even a real number.


>   Assuming NDTM is real, whether P equals NP or not, as to complexity
>theory, I think experts in complexity theory can say better than I do.

This is hardly expert-level stuff.  I wouldn't speak up if it was.


>   2. You were a bit loose as well. You said:
>         NP is obviously a superset of P ...
>I think we need a proof. If you meant that it is your belief, it can be
>listened to as another story.

Duh.  Since every DTM is an NDTM, you only need to look at the
definitions of P and NP to prove that NP is a superset of P.


>Why no one so far has posted a single NDTM that solves (in a hypothetical
>way) the NP problems? Well, bluffing that it somehow solves them can
>deprive me from saying: when BluffBluff(S, i", n) < f(n), after f(n) we
>can conclude i" is not a subset sum of S?

Oh boy.  You really shouldn't wave this red cape.  I'll explain why in
a follow-up to this message, but not cross-posted to sci.crypt, as it
would be irrelevant there.

-- 
Planar                               (sci.crypt removed from note followups)

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
