Cryptography-Digest Digest #488, Volume #12 Sat, 19 Aug 00 22:13:00 EDT
Contents:
Re: Breaking Simple XOR Encryption ("Douglas A. Gwyn")
Re: Breaking Simple XOR Encryption ("Douglas A. Gwyn")
Re: Looking for a DES or RSA chip with write-only key. ("Douglas A. Gwyn")
Re: DES: Say it or spell it? (Newbie question) ("Douglas A. Gwyn")
Re: 215 Hz five-qubit quantum processor ("Douglas A. Gwyn")
Re: OTP using BBS generator? (Mok-Kong Shen)
Re: How Many? (Mok-Kong Shen)
Re: How Many? ([EMAIL PROTECTED])
Re: 1-time pad is not secure... ("Douglas A. Gwyn")
Re: OTP using BBS generator? ("Douglas A. Gwyn")
Re: How Many? ("John A. Malley")
Re: How Many? ("Douglas A. Gwyn")
Re: OTP using BBS generator? (Mok-Kong Shen)
Re: How Many? ("Douglas A. Gwyn")
Re: How strong is this algorithm? (Jeff Davies)
Re: How strong is this algorithm? (Jeff Davies)
----------------------------------------------------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Breaking Simple XOR Encryption
Date: Sat, 19 Aug 2000 20:36:01 -0400
> >Peter wrote:
> >> I would appreciate an explanation of the attack that is
> >> used against simple XOR "encryption" schemes.
> Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> >The first thing to understand is that XOR is *not* an
> >encryption scheme, nor a class of such schemes; it is
Matthew Skala wrote:
> I understand the phrase "simple XOR" to refer to one specific algorithm:
> the one where you take the key as a string of bytes, repeat it enough
> times to make it the length of the message, and XOR the two together.
That's a periodic additive binary cipher.
The use of XOR (noncarrying binary addition)
is not the most essential feature and should
not be dominant in a name for the system.
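For concreteness, the system Skala describes can be sketched in a few
lines of Python (an illustrative toy, not a secure cipher):

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeat the key to the length of the message and add (XOR) it
    # byte-wise.  XOR is self-inverse, so the same function both
    # encrypts and decrypts: c = p ^ k implies p = c ^ k.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

ct = xor_cipher(b"ATTACK AT DAWN", b"KEY")
print(xor_cipher(ct, b"KEY"))  # b'ATTACK AT DAWN'
```

The periodicity (the key length) is exactly what classical methods
such as the Kasiski examination and the index of coincidence exploit.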
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Breaking Simple XOR Encryption
Date: Sat, 19 Aug 2000 20:55:32 -0400
Peter wrote:
> Douglas, you said ,"For learning cryptanalysis in general, there are
> better texts than Schneier's.". Could you please recommend some such
> texts.
The US Army course mentioned in another response by JPeschel is a
fairly good one. My all-time favorite is the Military
Cryptanalytics series by Callimahos and Friedman (actually
Friedman wrote the older Military Cryptanalysis series from
which much of Callimahos's rewrite was drawn), available from
Aegean Park Press. Since these predate computer-oriented
cryptography, they don't specifically address modern ciphers,
but many of the techniques and basic principles apply just as
strongly to the cryptanalysis of modern ciphers as to the old
classics. And it is easier to learn when you can actually
*solve* realistic examples, which might not be feasible for
some modern ciphers. The "Zendian problem" (in Callimahos'
MilCryp Part 2 Vol. 2) is an excellent practical exercise in
cryptanalysis (as well as traffic analysis); it can be tackled
without aid of a computer, although it is easier with one.
Other good texts are Helen Gaines' "Cryptanalysis" and Abraham
Sinkov's "Elementary Cryptanalysis: A Mathematical Approach".
Whatever book(s) you choose, be sure to work the exercises --
it is truly impossible to learn everything important in
cryptanalysis, especially how to judge which methods to use
under which circumstances, without first-hand experience.
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Looking for a DES or RSA chip with write-only key.
Date: Sat, 19 Aug 2000 20:59:33 -0400
Benjamin Goldberg wrote:
> Obviously the sarcasm passed over your head. Lookup write-only
> memory in the New Hacker's Dictionary, ...
Paul Rubin used the term correctly. If the NHD says otherwise,
it is mistaken. I was around for the original Signetics WOM
spoof spec sheet, but WOM is real and has uses (as we just saw).
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: DES: Say it or spell it? (Newbie question)
Date: Sat, 19 Aug 2000 21:08:02 -0400
Kristopher Johnson wrote:
> How do you pronounce "NASA"? How about "AWACS"?
Like everybody else.
When acronyms get very long (around 4 characters seems
to be the threshold), they are usually chosen so that
they *can* be pronounced, in the expectation that they
will be. In the case of HMMWV, not much thought seems
to have gone into this, but no one wanted to use a lot
of syllables to replace the old easy "Jeep", so it
instantly became Hum-Vee.
How do you pronounce DEA? FBI? CIA? KY-12? AN/TLQ-14?
Spelling-out is the usual rule for short acronyms even
when one can devise a phonetic pronunciation.
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Crossposted-To: comp.arch
Subject: Re: 215 Hz five-qubit quantum processor
Date: Sat, 19 Aug 2000 21:15:06 -0400
Maynard Handley wrote:
> I would contend that what's interesting about Godel is more that certain
> things ARE true, but can't be PROVED true. The standard example is always,
> of course, Goldbach's conjecture ...
Actually I don't think that is a good example, because our intuition
is that its proof should be feasible (if it's indeed true). The
kind of "unprovable truths" we are sure are Godelian are ones that
are carefully engineered to be self-referential in a suitable coding,
but they aren't very interesting compared to statements like
Goldbach's conjecture. Using Godel incompleteness as a possible
reason why we haven't yet proved some possible theorem seems to be
a cop-out (a lame excuse).
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: OTP using BBS generator?
Date: Sun, 20 Aug 2000 03:30:23 +0200
"Trevor L. Jackson, III" wrote:
>
> We know better than to test for the randomness of a sequence (as opposed to a
> source), but do we know better than to test for the orderliness of a sequence rather
> than a source?
>
> In principle, one can rely upon the source to produce the output it is designed to
> produce. But when one encouters an artifact, even an expected one, the urge to
> apply quality control to the output becomes intense.
In statistical tests, one tests in essence for the
presence of a certain amount of pattern (orderliness).
If that amount is below a threshold, the sequence is
deemed sufficiently random (with respect to the type of
patterns examined).
A test yields only one value (result) per sequence, the
test statistic. If we let the source generate a large
number of such sequences, we can see whether the
distribution of these values corresponds sufficiently
well to that of an ideal random source. If we do the
same for all statistical tests available, we can indeed
get a quite good feeling for the quality of the source,
in my humble view.
On the other hand, one can subject a single sequence
that is actually used in the encryption process to the
various tests. If all tests are passed, then, on the
assumption that the opponent doesn't have more/better
tests, we can say that he can't discern any non-randomness
(patterns) in it (since we can't) and hence the sequence
is o.k. It is my humble opinion that with ANY new
source this procedure should always be applied until
(hopefully) one collects a sufficient amount of good
experience with it, such that one could with confidence
dispense with the tests in order to save computing
resources.
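As a concrete (and cautionary) illustration, here is a sketch of the
simplest such test, the monobit frequency test; note that a perfectly
patterned alternating sequence passes it, since each test detects
only its own kind of orderliness:

```python
import math

def monobit_statistic(bits):
    # Under the fair-source hypothesis, s = |#ones - #zeros| / sqrt(n)
    # is approximately half-normal; a large s means the bit counts
    # are too unbalanced to be plausible.
    n = len(bits)
    return abs(2 * sum(bits) - n) / math.sqrt(n)

def passes_monobit(bits, threshold=3.0):
    return monobit_statistic(bits) < threshold

print(passes_monobit([1] * 10_000))    # False: all ones, wildly biased
print(passes_monobit([1, 0] * 5_000))  # True: balanced yet fully patterned
```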
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 03:30:15 +0200
Future Beacon wrote:
>
> I don't know whether to call them principles or elements. How many
> are essential to encryption? I see transposition as a special case
> of substitution. Both binary operations and transformations could
> be viewed as look-up tables. I seem to view every element of
> encryption as look-up table formation, look-up table use or random
> number sources.
>
> Does anybody view any other principle or element as essential to
> encryption?
>
> If they are few in number, it might be instructive to study them
> rather than studying a compendium of techniques that each contain
> more than one of them.
A look-up table is a mapping, in other words giving for
each argument the function value. Any one-to-one mapping
can be implemented as a look-up table. A substitution
is such a mapping and so can be implemented this way.
If you consider for a given transposition all possible
cases, i.e. 0 and 1 for each of the bit positions, then
you could (explicitly) compute the underlying
one-to-one mapping, thus regarding it as a substitution
which can then be implemented as a look-up table. However,
it is conceptually simpler (more convenient) to consider
transposition as such and perform the movements of the
bits instead of implementing it with a look-up table,
which could be very large if the transposition involves
many bits.
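As a tiny illustration of the size issue, here is a 4-bit
transposition tabulated as a 16-entry substitution (illustrative
Python; for n bits the table needs 2^n entries):

```python
PERM = (2, 0, 3, 1)   # output bit i is taken from input position PERM[i]

def transpose(x: int) -> int:
    # Apply the bit transposition directly, by moving bits.
    return sum(((x >> PERM[i]) & 1) << i for i in range(4))

# The same one-to-one mapping, made explicit as a look-up table.
TABLE = [transpose(x) for x in range(16)]

print(len(set(TABLE)) == 16)   # True: the mapping is one-to-one
```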
If some steps of an algorithm are given in form of
equations (actually assignments in the sense of CS), one
can also consider that to be a one-to-one mapping
and hence a substitution and implementable as a
look-up table. It's however not advantageous for the
same reason as above.
If one takes homophones into account, then one gets a
one-to-many (invertible) mapping. This is certainly the
most general concept. However, the more generality you
get, the more you lose in specific details. An analogy:
Suppose we consider sine, cosine etc. We can employ
the general term 'trigonometric function'. But then we
lose in discourse the specific information pertaining
to the individual cases, don't we? We could go a step
further and consider 'real function' (as against complex
valued function) but then we lose even more useful
details. Do you see my point?
I don't think that you wanted to consider such things as
block chaining. But if one considers the whole message,
then the plaintext is mapped one-to-one to the ciphertext
even with block chaining. Now one really has a huge
substitution which is in general much too big for
implementation as a look-up table. Even if one succeeds
in doing that, the idea/effect of block chaining that is
actually at work is no longer clearly discernible,
which I believe is disadvantageous for studying the
algorithm.
In my humble view, the various concepts/formalisms that
have (naturally) evolved in crypto from ancient times
till the present reflect what are probably the
conceptually best and implementationally most suggestive
(meaningful) entities for the state of the art. One
can certainly not exclude that more would be desirable
with further advancements of the state of the art.
M. K. Shen
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: How Many?
Date: Sun, 20 Aug 2000 01:09:12 GMT
In article <Pine.LNX.4.21L2.0008191803050.28806-
[EMAIL PROTECTED]>,
Future Beacon <[EMAIL PROTECTED]> wrote:
>
>
> I don't know whether to call them principles or elements. How many
> are essential to encryption? I see transposition as a special case
> of substitution. Both binary operations and transformations could
> be viewed as look-up tables. I seem to view every element of
> encryption as look-up table formation, look-up table use or random
> number sources.
>
> Does anybody view any other principle or element as essential to
> encryption?
>
> If they are few in number, it might be instructive to study them
> rather than studying a compendium of techniques that each contain
> more than one of them.
What you are thinking about are called constructs or primitives, such
as substitutions and linear transforms, which are the only two real
primitives anyway.
Almost all other things can be made from either of the two primitives.
Rotations, for example, are just (n + log2(n))-by-n substitutions....
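One way to read that count: a data-dependent rotation of an n-bit
word by a log2(n)-bit amount maps n + log2(n) input bits to n output
bits, so it can be tabulated as one big substitution (a sketch with
n = 8):

```python
N, SHIFT_BITS = 8, 3   # word size n, and log2(n) bits of rotation amount

def rotl(x: int, r: int) -> int:
    # Rotate an N-bit word left by r positions.
    return ((x << r) | (x >> (N - r))) & ((1 << N) - 1)

# The (n + log2 n)-bit-in, n-bit-out substitution table.
TABLE = {(x, r): rotl(x, r) for x in range(1 << N)
                            for r in range(1 << SHIFT_BITS)}
print(len(TABLE))   # 2048 = 2**(8 + 3) entries
```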
Tom
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: 1-time pad is not secure...
Date: Sat, 19 Aug 2000 21:28:24 -0400
Tim Tyler wrote:
> : A fundamental unknowability is not the same as an accidental lack of
> : knowledge about something that is in principle knowable.
> I agree with this as well - although it is not known to be relevant to
> the case in question.
It is the crux of the matter, because it results in:
> : The difference in character leads to different mathematical properties.
> : That difference has been empirically confirmed, so we *know* that the
> : underlying randomness is irreducible. [...]
> No we don't. Empirical confirmation has only gone so far. Equally well,
> 19th century observation would have "confirmed" that Brownian motion
> was random. Any attempt to state that observations suggested that no
> lower-level deterministic processes were responsible for Brownian
> motion would - in that instance - have been mistaken.
Actually I could argue about Brownian motion too, but it's a red
herring. The discussion was about the *qualitative* difference
between fundamental quantum randomness and ordinary statistical
randomness. The relevant experiments set up a binary outcome:
Go or No Go. If the underlying randomness were of the ordinary
type, a Go would result; if the underlying randomness could not
be of the ordinary type, a No Go would result. The results were
invariably No Go, i.e. ordinary statistical randomness (resulting
from incomplete information about detailed lower-level behavior)
was definitely ruled out. This means that no amount of future
refinement *can* determine any more detailed description.
> : That is not an assumption, it is a conclusion.
> *Your* conclusion, and one which does not appear to follow from the known
> information.
I already said that I was not giving the complete development that
would be necessary to draw the conclusion. You have to study QM
for that.
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: OTP using BBS generator?
Date: Sat, 19 Aug 2000 21:49:17 -0400
Mok-Kong Shen wrote:
> In statistical tests, one tests in essence the presence of
> certain amount of patterns (orderliness). If that amount
> is lower than a threshold, it means the sequence is
> sufficiently random (with respect to the type of patterns
> examined).
As has been observed before, it is not the *data* that might
be random, but the *process* that generates it. It is thus
better to consider the property of "randomness" as meaning
agreement with a specific source model. Then the test merely
measures the degree of deviation from the model behavior.
Some small fraction of the time, even a perfectly functioning
true random generator *will* produce a highly patterned
stretch of output data. When it does, the statistical test
will flag the deviation; such occasional misdiagnosis is
unavoidable. There is a trade-off between minimizing the
chance that a valid source is rejected, and minimizing the
chance that a malfunctioning source is accepted. Tests
should be designed to make an appropriate trade-off.
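A quick simulation of that false-alarm rate, using an assumed
3-sigma bit-count test applied to a genuinely fair source
(illustrative Python):

```python
import random

rng = random.Random(12345)
trials = 10_000
# Flag any 100-bit block whose count of ones strays 3 sigma from 50.
flagged = sum(
    abs(sum(rng.getrandbits(1) for _ in range(100)) - 50) >= 15
    for _ in range(trials)
)
# Even a perfect source occasionally strays 3 sigma from the mean;
# tightening the threshold lowers this rate but lets more defective
# sources pass, which is the trade-off described above.
print(0 < flagged < trials * 0.02)   # True: small but nonzero
```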
> A test gives for one sequence only one value (result), the
> test statistic. If we let the source generate a large
> amount of such sequences, we can see whether the distribution
> of these values sufficiently well correspond to that of an
> ideal random source.
That is merely a different ("batched") kind of single test.
> If we do the same for all statistical tests available, we
> can have indeed a quite good feeling of the quality of the
> source in my humble view.
"All" tests available is an essentially unbounded number.
It would be best to use tests designed specially for
detecting the most likely kinds of deviation from the model,
which for the most part requires analyzing the structure of
the generator and its likely failure modes.
This entire business is a well-established branch of
statistics, and there are professionals who are paid to
set up such tests. It's better to employ an expert than to
try to "wing it" and end up with unwarranted confidence in
a defective system due to poor test design.
------------------------------
From: "John A. Malley" <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sat, 19 Aug 2000 18:53:57 -0700
Future Beacon wrote:
>
> I don't know whether to call them principles or elements. How many
> are essential to encryption? I see transposition as a special case
> of substitution.
A transposition alters dependencies between consecutive symbols in the
plaintext. Transposition helps mask correlations between plaintext
symbols so these correlations attenuate in the ciphertext. It's a
permutation on the positions of the plaintext. It doesn't care which
plaintext symbol sits at the position being
permuted. Transposition preserves the frequencies of occurrence of
plaintext symbols in the ciphertext. And that helps cryptanalysis.
A substitution maps plaintext symbols to ciphertext symbols. A
substitution can be a permutation on the alphabet of plaintext symbols.
For simple monoalphabetic substitution the same plaintext symbol gets
replaced with the same ciphertext symbol no matter where it occurs in
the plaintext - but that's not true for a periodic transposition. The
frequency of occurrence of a plaintext symbol doesn't match the
frequency of occurrence for that symbol in the ciphertext. But in
general, substitutions preserve correlations between plaintext symbols
so these correlations appear between ciphertext symbols. And that helps
cryptanalysis. ( I acknowledge polygram substitutions, polyalphabetic
substitutions and aperiodic substitutions make cryptanalysis progressively
harder, but attacks on these systems take advantage of correlations in
ciphertext.)
Combining these two cryptographic primitives together is the heart and
magic of good cryptography. Transposition alters the order of plaintext
symbols. Substitution alters the symbols used to represent the
plaintext. Successive transposition (also called permutation) and
substitution on the same plaintext block, a round, is at the core of
DES and other block ciphers.
> Both binary operations and transformations could
> be viewed as look-up tables. I seem to view every element of
> encryption as look-up table formation, look-up table use or random
> number sources.
These look like implementations of math or logic functions. As such one
can implement permutations and substitutions.
>
> Does anybody view any other principle or element as essential to
> encryption?
Well, algorithmically transposition and substitution seem to cover
cryptography. Even public key ciphers. For example, take RSA. (Out on a
limb here, asbestos suit ON. Check.)
Look at M^e mod n = C as C = (M + K) mod n, where K is a secret key
with value K = (C - M) mod n. M is substituted by a new number, C, via
an additive substitution modulo n. To decrypt, subtract K from the
ciphertext C modulo n: M = C^d mod n = (C - K) mod n. The mathematical
operations of exponentiation to e to encrypt and to d to decrypt could
be done exactly the same with a secret key K added to M modulo n and
subtracted from C modulo n.
So yes, implementation here is done with modulo arithmetic, but we can
describe RSA as a substitution cipher taking plaintext represented as a
numerical value on a field and substituting a new value on that same
field for it.
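Whatever one makes of the additive reframing, the substitution view
itself is easy to verify: for valid RSA parameters, exponentiation by
e is a one-to-one substitution on the residues mod n. A toy check
with illustrative parameters (far too small for real use):

```python
p, q = 5, 11
n, phi = p * q, (p - 1) * (q - 1)      # n = 55, phi = 40
e = 3                                   # gcd(3, 40) = 1
d = pow(e, -1, phi)                     # 27, since 3 * 27 = 81 = 1 mod 40

# Encryption is a fixed substitution table on {0, ..., n-1} ...
table = [pow(m, e, n) for m in range(n)]
print(sorted(table) == list(range(n)))  # True: a bijection
# ... and exponentiation by d inverts it entry by entry.
print(all(pow(table[m], d, n) == m for m in range(n)))  # True
```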
John A. Malley
[EMAIL PROTECTED]
>
> If they are few in number, it might be instructive to study them
> rather than studying a compendium of techniques that each contain
> more than one of them.
>
> Jim Trek
> Future Beacon Technology
> http://eznet.net/~progress
> [EMAIL PROTECTED]
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sat, 19 Aug 2000 21:57:03 -0400
[EMAIL PROTECTED] wrote:
> What you are thinking about are called constructs or primitives. Such
> as substitutions and linear transforms which are the only two real
> primitives anyways.
That's nonsense. One could equally well (or better) say that
the only "real" primitive is NAND, since the entire digital
system could be constructed using just that one operator.
However, this sort of microanalytical approach is doomed,
because in most cases the real *meaning* can only be dealt
with at higher levels of structure. One should work at an
appropriate level for best effect.
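The NAND claim is easy to check mechanically; a small sketch building
the other operators from NAND alone:

```python
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    t = nand(a, b)                      # the classic 4-gate XOR
    return nand(nand(a, t), nand(b, t))

ok = all(and_(a, b) == (a & b) and or_(a, b) == (a | b)
         and xor_(a, b) == (a ^ b) and not_(a) == (1 - a)
         for a in (0, 1) for b in (0, 1))
print(ok)   # True: NAND is functionally complete for these operators
```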
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: OTP using BBS generator?
Date: Sun, 20 Aug 2000 04:13:45 +0200
Tim Tyler wrote:
>
> I for one don't see how the nature of the BBS modulus might help prevent
> cycles in the low bits.
I suppose you mean short cycles. BBS's paper has a section
devoted to special types of moduli where the cycle length
of the output of the congruence relation can be computed
from the modulus. BBS states only that the cycle length
of the LSB must divide the first-mentioned cycle length,
but leaves the exact relationship between the two as an
open question. Thus it seems that one can currently say
very little about the cycle length of the LSB (even for
the special types of moduli), which conforms to what you
wrote above. But the proof of the main theorem about the
security of BBS (under the QRA assumption and with
proper interpretation of the claim of the theorem)
doesn't involve the cycle length of the LSB (as far as I
can see). So I suppose one could say that BBS's
unnecessary investigation of cycle length in their paper
has caused confusion that led to much discussion in this
thread.
On the other hand, the cycle length of the LSB of BBS is
certainly of interest as a statistical feature, if one
accepts my humble viewpoint that all bit sequences used
in serious applications should conservatively pass all
statistical tests available/feasible to the communication
partners.
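A tiny numerical illustration of the divisibility relation (a toy
Blum modulus, far too small to say anything about security; the
quadratic-residue seed here lies directly on the cycle):

```python
p, q = 7, 11            # both primes congruent to 3 mod 4
M = p * q               # Blum modulus 77
x = 9                   # 3^2 mod 77, a quadratic residue seed

# Iterate x -> x^2 mod M and record the cycle of states.
states, seen = [], set()
while x not in seen:
    seen.add(x)
    states.append(x)
    x = x * x % M
state_cycle = len(states)

# Smallest period of the LSB output stream over that cycle.
bits = [s & 1 for s in states]
lsb_cycle = next(d for d in range(1, state_cycle + 1)
                 if state_cycle % d == 0
                 and all(bits[i] == bits[i % d] for i in range(state_cycle)))
print(state_cycle, lsb_cycle)   # 4 4: the LSB period divides the state period
```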
M. K. Shen
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: How Many?
Date: Sat, 19 Aug 2000 22:02:50 -0400
"John A. Malley" wrote:
> ... Transposition preserves the frequencies of occurrence of
> plaintext symbols in the ciphertext. And that helps cryptanalysis.
Well, no, in general transposition hinders cryptanalysis.
Would you rather cryptanalyze system A or system T*A where
T is an unknown transposition?
It is true that transposition has no further effect on the
uniliteral frequencies (letters being a name for the units
of the transposition), and in cryptanalysis that property
can sometimes be exploited, but the problem as a whole is
usually made harder by the transposition.
Transposition can be *thought of* as substitution in a
very large space, but that isn't a *useful* way of looking
at things.
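Both points, frequencies preserved yet the problem scrambled, are
visible in a toy columnar transposition (illustrative Python):

```python
from collections import Counter

def columnar(text: str, order: list) -> str:
    # Write the text row-wise into len(order) columns, then read the
    # columns back out in the order given by the key.
    cols = len(order)
    rows = [text[i:i + cols] for i in range(0, len(text), cols)]
    return "".join("".join(row[c] for row in rows if c < len(row))
                   for c in order)

pt = "ATTACKATDAWNSTOP"
ct = columnar(pt, [2, 0, 3, 1])
print(ct != pt)                    # True: order destroyed
print(Counter(ct) == Counter(pt))  # True: letter frequencies intact
```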
------------------------------
From: Jeff Davies <[EMAIL PROTECTED]>
Subject: Re: How strong is this algorithm?
Date: Sun, 20 Aug 2000 03:17:48 +0100
Scott Fluhrer wrote:
> Jeff Davies <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > I have an idea for a non-profit electronic cash system (a bank)
> > for which I have devised I think a new algorithm.
> Why? What's wrong with 3DES, Serpent or AES? Seriously: the banking
> industry currently uses (I believe) 3DES, Serpent was designed by someone in
> the banking industry (Ross Anderson), and AES (which might turn out to be
> Serpent) is considered "good enough" for the US government uses. Why should
> I trust my money to a bank that uses home-brew crypto?
>
The idea is that all IP is free (in the GNU public licence way),
including the crypto. Are 3DES, Serpent or AES free?
> If each key is used to transmit a single message, it looks safe, if rather
> crude. If it is used to transmit multiple messages (and that includes
> multiple messages from the same "transaction"), then it looks easy enough to
> break (or at least, give the attacker enough idea about the messages to know
> which bits to manipulate). And, not putting in any sort of MAC allows bored
> attackers to have fun flipping random bits, and seeing what happens...
>
I did miss out the fact (I thought it was obvious, I suppose) that a secret
is sent inside the encrypted part of the message, and all fields are
checksummed. The bits, randomly dispersed in the background noise of the
larger packet, are also randomly inverted by the key.
What does MAC mean?
Maybe I don't need to say it, but the keys are hashed with a phrase so that
they aren't totally open to someone opening your snailmail etc.
> So, if someone manages to grab the smartcode out of the mailbox, he will be
> able to impersonate you, and withdraw all your funds from the bank?
>
As far as I can tell, this impersonation is easy in any case where your
client is compromised, regardless of the security system employed,
i.e. just about any Microsoft OS.
> Oh, and what do you do if someone's a bit busy, and needs to exchange more
> than 2048 messages with the bank in a month?
>
Then you get the "platinum service" of 4096 keys per card. Or perhaps, like
a chequebook, you get another sent out when you're 50% through the last one
(or when you order one).
> Mobile phones? You have got to be joking -- the huge packets this
> encryption requires are totally impractical there (as well as the difficulty
> of giving the phone the keys...)
>
Huge packets? 1536 bytes??? This is 1.5 seconds on a 9600 baud link. Mobile
phones are soon to have 2 Mbit per second links!! (0.005 seconds).
> --
> poncho
Jeff Davies
------------------------------
From: Jeff Davies <[EMAIL PROTECTED]>
Subject: Re: How strong is this algorithm?
Date: Sun, 20 Aug 2000 03:22:57 +0100
Lyalc wrote:
> Problem 1) who pays for the smartcards, and the necessary readers, and
> hint, bond investments are unlikely to fund this expenditure
> Reality 1) banking and payments systems already do this - it's called ATM
> transactions, just credited to different recipients (a bank account instead
> of a cash dispenser)
>
This is not a problem.
> It's trivial to generate single use symmetric keys that automatically
> synchronise in each message, regardless of reception order, and without
> requiring new master keys, or PK overheads. Some schema for this can
> provide forward and backward message security as well i.e. breaking a single
> message does not lead to easier access to previous or subsequent messages.
>
Yes, but the problem to be solved here is the fairly regular compromise of
clients due to lax security in the most popular operating systems, not
compromise of the packet in flight.
Having throw-away keys at either end, sent by snail mail, is no more of a
logistical problem than checkbooks were. If one is stolen, get another, and
the old one is cancelled.
Thanks for your input, it is very valuable for me.
Jeff Davies
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************