Cryptography-Digest Digest #452, Volume #10      Tue, 26 Oct 99 16:13:03 EDT

Contents:
  Re: This compression argument must end now (Tim Tyler)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (wtshaw)
  Re: Weakness in the Rijndael key schedule? (solidash)
  Re: This compression argument must end now (Tim Tyler)
  Re: Twofish Description Improved (John Savard)
  Re: Keyed permutation ("Adam Durana")
  Re: Modern secret writing (wtshaw)
  Re: Note on Feistel Ciphers (John Savard)
  Re: Biometric Keys are Possible (Alex MacPherson)
  Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
  Re: Strict Avalanche Criteria in SBox design (Alex MacPherson)

----------------------------------------------------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: This compression argument must end now
Reply-To: [EMAIL PROTECTED]
Date: Tue, 26 Oct 1999 16:23:23 GMT

Geoffrey T. Falk <gtf[@]cirp.org> wrote:
: In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:

:>David Scott has recently invented perhaps the first compression system
:>suitable for cryptography known to man.  Since *all* other known
:>compression routines potentially weaken your encypherment - and add
:>"unnecessary" information to the file - I believe this event should not be
:>underestimated.

: That is quite an overstatement. David's system might be suitable for
: cryptography, if it compresses the data well. The much-hyped "one-on-one"
: property does not guarantee this at all.

Indeed it does not.

: For example, here is my favourite compression algorithm: Copy the input
: file to the output file. This algorithm is "one-on-one." But it is
: worthless for cryptography.

In what sense is this a "compression" program?  It totally fails to
compress every possible target set.
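To pin down what is actually being argued over, here is a minimal sketch
(mine, not anyone's production code) of the "one-on-one" property:
compression and decompression must be mutually inverse maps on the set of
*all* byte strings, so that every file is the valid compression of exactly
one message.  The null compressor above passes the test - which shows the
property alone guarantees nothing about compression ratio.

from itertools import product

def is_one_on_one(compress, decompress, max_len=2):
    """Exhaustively check mutual inverses on all short byte strings."""
    for n in range(max_len + 1):
        for t in product(range(256), repeat=n):
            s = bytes(t)
            if decompress(compress(s)) != s:   # every file round-trips
                return False
            if compress(decompress(s)) != s:   # every file is *some* output
                return False
    return True

identity = lambda data: data                   # the "null compressor"
assert is_one_on_one(identity, identity)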

: I understand that David uses adaptive Huffman coding, which is not
: a very good compression scheme by itself. Adaptive arithmetic coding
: is a bit better. But neither will give you very good compression by
: itself, because they only look at the frequency count of symbols in
: portions of the input file.

I don't think you can criticise the idea of one-on-one compression by
saying the only currently known non-trivial implementation is inefficient.

All the best compressors (in terms of raw compression ratio) are
one-on-one.

:>At the moment we have very few "one-on-one" compressors.

: To the contrary, there are any number of "one-on-one" compressors.
: My null compressor (above) is the simplest. Do you want more examples?

Some examples that actually did some compression and were at all useful
might help.  Best of all would be something that performed better than
David's Huffman code on a range of "typical" files.

: In short, there is no reason to think that this "one-on-one" property
: by itself is helpful in any way. Much more is needed.

The *lack* of the one-on-one property introduces new security problems
which were not present in the plaintext.  One-on-one compression /always/
eliminates these, so yes by itself it can be helpful compared to other
types of compression.
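As a concrete illustration (my own sketch, with zlib standing in for any
non-one-on-one compressor): headers and checksums hand the attacker a free
wrong-key filter, which is exactly the problem the one-on-one property
removes.

import os, zlib

def plausible_decryption(candidate: bytes) -> bool:
    """With zlib-compressed plaintext, bytes produced by a wrong key
    almost never form a well-formed zlib stream, so candidate keys can
    be rejected automatically."""
    try:
        zlib.decompress(candidate)
        return True
    except zlib.error:
        return False

assert plausible_decryption(zlib.compress(b"some plaintext"))
assert not plausible_decryption(os.urandom(64))  # rejected almost surely

With a one-on-one compressor no such test exists: every candidate
decryption decompresses to *some* message, and the attacker is thrown back
on judging the plausibility of the messages themselves.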

If you're saying it may not necessarily help secure the data: it will
probably do so roughly to the extent that it compresses.

: The quality of the compression makes vastly more difference to the
: difficulty of attacking the system.

Given that attacks based on plaintext regularity and attacks based
on regularities introduced by bad compression both seem quite possible,
I wonder how you might go about saying one characteristic is more
important than the other.

Poor compression is likely to introduce the same regularities regardless
of whether you're compressing executables, object-based drawings, or
plaintext.  If a range of filetypes is being sent, poor compression
may be expected to be more insecure than no compression at all.

Given this, I don't see how it can be claimed that Huffman compression
is worse than using a good compressor - let alone "vastly" worse.

The security problems introduced are different.  Which is more important
depends on factors like what is being sent.

: This is because the attacker must decrypt the entire message anyway
: before he can test if it is a valid compressed file.

Nope.  Not necessarily.  I /keep/ saying that there are other security
problems that apply when you have only a tiny fragment of a file.

A non-one-on-one compressor systematically introduces information
into the compressed file.  This regularity can potentially be targeted
by cryptanalysts.
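A concrete example of such systematically introduced information (my
illustration; gzip stands in for any conventional compressor):

import gzip

# Every gzip stream begins with the fixed bytes 1f 8b 08, so a ciphertext
# fragment covering the start of the file hands the analyst three bytes
# of known plaintext for free - before any frequency analysis begins.
samples = [gzip.compress(m) for m in (b"attack at dawn", b"hello", b"")]
assert all(s[:3] == b"\x1f\x8b\x08" for s in samples)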

Despite my drumming on this point, others /continue/ to talk as though the
*only* attack prevented is one where you try to decompress an unencrypted
file.  This is false - the poor compression *will* introduce other
regularities - unless it is attached to a hardware source of random
numbers.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Get a head, stay ahead.

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Tue, 26 Oct 1999 11:09:11 -0600

In article <[EMAIL PROTECTED]>, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

> wtshaw wrote:
> 
> > The recipient generates all the keys as needed, and tries one at a time
> > with an appropriate algorithm on the ciphertext.  The system should lock
> > to the one with the most appropriate recovered plaintext - likely the
> > only one that makes sense.
> 
> This becomes trivial if the initial message contains a standard plaintext.
> 
From time to time I have put in code that looks for text in a decryption. 
Simple GIGO is adequate to warn that text was not encrypted.  If spaces are
encrypted one way or the other, it is simple to get an average word length,
perhaps its variance, or any of many linguistic tests as well.

As you say, putting a standard plaintext at the beginning makes algorithm
verification reasonable.  
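A minimal sketch (mine, not the code referred to above) of this kind of
linguistic test - a real system would combine several such statistics
before "locking" onto an algorithm and key:

def looks_like_text(candidate: bytes) -> bool:
    """Score a trial decryption: mostly printable characters, with a
    plausible average word length for natural language."""
    try:
        s = candidate.decode("ascii")
    except UnicodeDecodeError:
        return False
    if not s:
        return False
    printable = sum(c.isprintable() or c.isspace() for c in s)
    if printable / len(s) < 0.95:
        return False
    words = s.split()
    if not words:
        return False
    avg = sum(len(w) for w in words) / len(words)
    return 2.0 <= avg <= 9.0   # English averages roughly 4-5 letters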
> >
> > To change cryptosystems, just do it; at the other end, detection falls
> > out of lock as the current system fails to recover reasonable output,
> > goes back to checking the various algorithms, finds the new system, and
> > locks on until unlocked.
> >
-- 
Sometime ago, Gates bought lots of shares of Apple.
My preference is to keep the Worm out of the Apple.

------------------------------

From: solidash <[EMAIL PROTECTED]>
Subject: Re: Weakness in the Rijndael key schedule?
Date: Tue, 26 Oct 1999 07:39:01 +0200

Hello Mr Scotty

So I guess your gospel is that "all block ciphers have backdoors if they are
used in CBC, CFB or ECB mode because that weakens the cipher"?  Is that
right?  Well, maybe we should all go home and pack up...

P.S.  Has anyone cryptanalysed your cipher?


P.S.  Mathew, why don't you send your findings to the guys who wrote
Rijndael?  It would be interesting to get feedback from them...


"SCOTT19U.ZIP_GUY" schrieb:

> In article <7usskb$v75$[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
> >A comment on the key schedule for Rijndael.
> >
> >Hello sci.crypt,
> >
> >I believe I have found a weakness in the key schedule for Rijndael.  I
> >turn to this august body for confirmation or harsh refutation of my
> >theory.  A quick note on my credentials: I am only an amateur
> >cryptographer, but a long-time professional computer scientist.  This
> >post is a follow-up to the 'Curious Keys in Rijndael' post.
> >
> >Abstract
> >
> >In this paper the block cipher Rijndael is analyzed.  Rijndael is
> >submitted as a candidate for the Advanced Encryption Standard.  The
> >cipher has variable key and block length.  This paper focuses on the key
> >length of 128 bits with the block length of 128 bits.  The Rijndael
> >cipher is an iterated ten-round block cipher for this case.  Here it is
> >shown that the Cipher Key can be recovered if an attacker has the
> >'exclusive or' of several Round Keys.  Moreover, one byte of key recovery
> >can be mounted on Rijndael for certain key relationships and strange
> >chosen ciphertexts.  Extensions of this attack exist but appear to be
> >impractical.
> >
> >This note is just the beginning of the full paper I am working on.  If
> >it turns out that I goofed up then this posting will save me the work:-)
> >
> >I will use the notation adopted in 'The Rijndael Block Cipher' by Joan
> >Daemen and Vincent Rijmen. To understand the attack, you will have to
> >read 'The Rijndael Block Cipher.'
> >
> >
>
>   I am not sure you understand that AES is not to be used for data that the
> United States government wants to be secure. It is for data that the US wants
> everyone else to use so the NSA can read the mail. You should not bother to
> show weakness in the cipher till the AES weak cipher is adopted for use. Then
> blow it all to hell.  If your attack is strong enough the NSA may have to use
> another of their trojan crypto methods in the contest. We amateurs should wait
> until the phony crypto gods bless the final candidate.
>
> David A. Scott
> --
>                     SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
>                     http://www.jim.com/jamesd/Kong/scott19u.zip
>                     http://members.xoom.com/ecil/index.htm
>                     NOTE EMAIL address is for SPAMERS
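Setting the flame war aside, the quoted abstract turns on how Round Keys
relate to the Cipher Key.  A related structural fact - well known, and not
Mathew's attack itself - is that the AES-128 (Rijndael) key schedule is
invertible: one complete round key determines the whole Cipher Key, so any
attack that recovers round-key material recovers everything.  A
self-contained sketch (mine, for illustration only):

def gmul(a, b):
    """Multiply in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1 (0x11B)."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return p

def sbox_byte(x):
    """AES S-box: multiplicative inverse followed by the affine map."""
    inv = next(y for y in range(256) if gmul(x, y) == 1) if x else 0
    out = 0
    for i in range(8):
        bit = ((inv >> i) ^ (inv >> ((i + 4) % 8)) ^ (inv >> ((i + 5) % 8))
               ^ (inv >> ((i + 6) % 8)) ^ (inv >> ((i + 7) % 8))
               ^ (0x63 >> i)) & 1
        out |= bit << i
    return out

SBOX = [sbox_byte(x) for x in range(256)]
RCON = [0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80, 0x1B, 0x36]

def g(word, rcon):
    """RotWord, then SubWord, then add the round constant."""
    b = [SBOX[v] for v in word[1:] + word[:1]]
    return [b[0] ^ rcon] + b[1:]

def xor(u, v):
    return [a ^ b for a, b in zip(u, v)]

def expand(key):
    """AES-128 key expansion: 16-byte key -> 44 four-byte words."""
    w = [list(key[4 * i:4 * i + 4]) for i in range(4)]
    for i in range(4, 44):
        t = g(w[i - 1], RCON[i // 4 - 1]) if i % 4 == 0 else w[i - 1]
        w.append(xor(w[i - 4], t))
    return w

def key_from_last_round_key(rk):
    """Run the schedule backwards from words W[40..43] to W[0..3]."""
    w = {40 + i: list(rk[i]) for i in range(4)}
    for i in range(43, 3, -1):
        t = g(w[i - 1], RCON[i // 4 - 1]) if i % 4 == 0 else w[i - 1]
        w[i - 4] = xor(w[i], t)
    return bytes(sum((w[i] for i in range(4)), []))

key = bytes(range(16))
assert key_from_last_round_key(expand(key)[40:44]) == key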


------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: This compression argument must end now
Reply-To: [EMAIL PROTECTED]
Date: Tue, 26 Oct 1999 16:31:26 GMT

Steven Alexander <[EMAIL PROTECTED]> wrote:

: Used correctly, certain types of compression (not .zip!) can increase the
: difficulty of a ciphertext-only attack.

Yes.

: However, it will not make a known plaintext attack any more difficult.

No.  This is false.  This is the /third/ time someone has raised this
point - so perhaps it should get into the one-on-one FAQ.

A known-plaintext attack on a cypher can /still/ be made more difficult,
due to the fact that the enemy has less cyphertext available to him.

Imagine an EOR-with-PRNG stream cypher.  A known plaintext attack on this
boils down to predicting the PRNG from X bytes of its output.  If X is
made smaller, the crack is harder.  QED.
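To make the arithmetic of that claim visible, a toy sketch (mine; the PRNG
here is deliberately worthless):

import random

def xor_stream(data: bytes, keystream) -> bytes:
    """EOR each data byte with the next keystream byte."""
    return bytes(d ^ next(keystream) for d in data)

def keystream_from_crib(plaintext: bytes, ciphertext: bytes) -> bytes:
    # known plaintext exposes keystream bytes, one per byte sent
    return bytes(p ^ c for p, c in zip(plaintext, ciphertext))

prng = random.Random(42)
ks = iter(lambda: prng.randrange(256), None)   # "infinite" keystream
ciphertext = xor_stream(b"attack at dawn", ks)
exposed = keystream_from_crib(b"attack at dawn", ciphertext)

The attacker must predict the generator from the bytes in "exposed";
compressing before encryption shrinks the ciphertext, and therefore
shrinks that sample.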

: If either the original or compressed plaintext is known, a brute-force
: attack is not any more difficult.

This is closer to the truth.  In *principle*, use of a one-way compressor
in a broadcast system - where only the decompressor is published and the
compressor itself is kept secret - could mean that owning the plaintext
does not help produce the compressed text.

Under such circumstances you would have to decompress each message
you decrypted, which would increase the time for the break by some
constant factor.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

I'm back.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Twofish Description Improved
Date: Tue, 26 Oct 1999 17:17:37 GMT

[EMAIL PROTECTED] wrote, in part:

>When you describe Kerberos, you may want to include something I read,
>though I know not where: one suggestion to improve its security
>was to use SPEKE for the initial authentication step.

I'll probably be keeping the description simple and introductory. But
*somewhere* on my pages, although I don't quite know where is most
appropriate, I will have to include a discussion of EKE, since it is
an important basic idea. (I did run into a paper on "Distributed
Authentication" - adding public-key methods to Kerberos - once again the
other day, but I was looking for something completely different...)
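For readers unfamiliar with EKE, here is a rough sketch of the basic idea
(mine, with toy parameters and a bare XOR mask standing in for a real
cipher - not a secure implementation): both sides encrypt their
Diffie-Hellman public values under the shared password, denying an
eavesdropper anything recognizable to test password guesses against.

import hashlib, secrets

P, G = 2**127 - 1, 3           # toy group parameters, illustration only

def mask(password: str, value: int) -> int:
    """'Encrypt' a DH public value under the password."""
    pad = int.from_bytes(hashlib.sha256(password.encode()).digest()[:16],
                         "big")
    return value ^ pad

unmask = mask                  # XOR masking is its own inverse

pw = "shared password"
a, b = secrets.randbelow(P), secrets.randbelow(P)
msg_a = mask(pw, pow(G, a, P))         # what actually crosses the wire
msg_b = mask(pw, pow(G, b, P))
key_a = pow(unmask(pw, msg_b), a, P)   # each side unmasks, finishes DH
key_b = pow(unmask(pw, msg_a), b, P)
assert key_a == key_b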

John Savard ( teneerf<- )
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: "Adam Durana" <[EMAIL PROTECTED]>
Subject: Re: Keyed permutation
Date: Tue, 26 Oct 1999 13:20:36 -0400

Yeah, this is totally linear.  But it's a permutation, not a cipher.  All it
is supposed to do is change the arrangement of the bits based on the key
provided.  And sure, it can be used in rounds; I just made that page to
explain the basic idea.  One more thing: I did not bring this to the
newsgroup to boast that it was some sort of security breakthrough.  I
thought it was neat and I thought some of you might think the same thing.
Perhaps it will spark some new idea in your head and set you on the path to
designing something new.

- Adam

Hideo Shimizu <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> I think a one-round keyed permutation is weak against chosen-plaintext
> attack, or has other weak points, because there is no
> diffusion.
>
> Hideo Shimizu
> TAO, Japan
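Since the page itself isn't reproduced in the digest, the following is only
my guess at the general shape of such a construction: a key-seeded shuffle
of bit positions - a keyed transposition, linear and diffusion-free, as
noted above.

import random

def keyed_bit_permutation(block: int, key: int, nbits: int = 64) -> int:
    """Transpose the bits of block in a key-dependent order."""
    positions = list(range(nbits))
    random.Random(key).shuffle(positions)    # key determines the order
    out = 0
    for dst, src in enumerate(positions):
        out |= ((block >> src) & 1) << dst
    return out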



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Modern secret writing
Date: Tue, 26 Oct 1999 11:37:41 -0600

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

> The bureaucrats wanting to suppress the same being exploited
> by criminals would be forced to install sophisticated equipment
> at all the customs and minutely examine every surface of materials 
> carried in the luggage of travellers at the resolution of 
> one-thousandth the size of a pinhead. I predict that this will 
> entail a big boom for the industry that manufactures such special 
> detecting equipment and that current crypto regulations and
> the crypto clauses in the Wassenaar Arrangement probably have
> to be proportionately tightened up in order to be effective at all.
> 
Thus the bureaucrats who attempt invasive monitoring and turn all people
into suspects are violating the rights of all who are trampled in the
process.  Government cannot be the thought-police, or concerned with fine
tuning everyone's lives; it is absurd to envision complete control, which
is the idea behind repression in the first place...numerous noxious
examples deleted.  

The hard lesson is that individuals collectively will tend to do what is
in their own self-interests, with or without approval by elite hall
monitors.  And, the means of getting away with things will always surpass
the ability to detect them.  

It is better to work with the good intentions of people to really be
concerned about each other than to try to push them around wantonly. They
should not be pushed to do or not do things by strange requests which
violate their widespread understanding of what is good for them. 
Governments exist to serve people with their approval and consent.
-- 
Sometime ago, Gates bought lots of shares of Apple.
My preference is to keep the Worm out of the Apple.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Note on Feistel Ciphers
Date: Tue, 26 Oct 1999 17:28:51 GMT

[EMAIL PROTECTED] wrote, in part:

>The constant separation does not exist.  Feistel
>rounds can generate the alternating group, as DES
>rounds are known to do.  (I don't have the ref but
>I recall that the result is due to Coppersmith.)

I may have misinterpreted you here. On this reading, I think you mean
A(2^64), and not A(64) acting on bit positions, as I had originally
thought:

and this means that if you are allowed to perform a very large and
variable number of DES rounds, with an appropriate choice of subkeys,
you can simulate one-half of the possible codebooks on 64-bit blocks.

That does mean that DES is not a group - at least, not in any harmful
way - but since to emulate a *particular* even codebook, one may need
millions of DES rounds, it is not at all clear to me that it says all
_that_ much about the particular question you are using that result to
address. It certainly does say that Feistel rounds aren't confined to
an inescapable narrow space of transformations, but while
that is encouraging, it is no more a proof of cryptographic strength
than good statistical properties for a PRNG prove its strength.
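The parity fact behind "alternating group, not symmetric group" can even be
checked by brute force at toy scale.  A sketch (mine): every one-round
Feistel map on blocks of at least 4 bits is an even permutation, whatever
the round function, so compositions of rounds can generate at most A(2^64),
never the full symmetric group.

import random

def feistel_round(x, f, half):
    """One Feistel round on a 2*half-bit block: (L,R) -> (R, L^f(R))."""
    mask = (1 << half) - 1
    l, r = x >> half, x & mask
    return (r << half) | (l ^ f(r))

def is_even(perm):
    """Parity via cycle decomposition: a k-cycle is k-1 transpositions."""
    seen, swaps = [False] * len(perm), 0
    for start in range(len(perm)):
        j, length = start, 0
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        swaps += max(length - 1, 0)
    return swaps % 2 == 0

half = 3                                  # 6-bit blocks, 64 entries
for _ in range(50):
    table = [random.randrange(1 << half) for _ in range(1 << half)]
    perm = [feistel_round(x, table.__getitem__, half)
            for x in range(1 << (2 * half))]
    assert is_even(perm)                  # holds for *any* round function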

John Savard ( teneerf<- )
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: Alex MacPherson <[EMAIL PROTECTED]>
Subject: Re: Biometric Keys are Possible
Date: Tue, 26 Oct 1999 14:03:33 -0500

Sorry if this was already brought up in this thread, but this is the first
message I have seen on this topic.  There are biometric products available
that use a proprietary algorithm to make biometric measurements of a
keyboard user's individual typing rhythm.  No additional scanners for
fingerprints, voice prints, etc. are required.

If anyone is interested, information can be found at:
http://www.biopassword.com/biopassword/products.htm
--
Alex MacPherson
Department of Mathematics and Computer Science
Royal Military College of Canada
email: [EMAIL PROTECTED]




------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Tue, 26 Oct 1999 19:23:53 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] () wrote:
>Tim Tyler ([EMAIL PROTECTED]) wrote:
>: However, let's not get this out of proportion.  Your objection only
>: applies to one particular form of one-on-one compression.  Further,
>: it only applies when a certain alignment problem occurs.  Most computer
>: files are only stored to the nearest byte anyway, so I have difficulty
>: in seeing your objection as much of a practical problem.  I doubt even
>: embedded applications frequently try to encrypt files which are not
>: 8-bit aligned.
>
>The point is that compression fairly naturally produces files that are an
>arbitrary number of bits in length, whether or not it is one-to-one. At
>first, I wasn't sure such a technique was practical. I may have
>misunderstood the posts in which David A. Scott first discussed this idea,
>but eventually I found that I could at least approximate one-to-one
>compression. But if I try to obtain byte alignment, which I thought Mr.
>Scott was claiming he was able to obtain, the price of keeping one-to-one
>compression is bias in the last byte.
    Obviously you don't read my posts.  The bias of the last byte exists in
your mind.  Take any binary file and replace the last byte with all 256
possibilities.  Each of the 256 files will uncompress to 256 different
files - so where is this bias?
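That check can be stated as a little test harness (my phrasing of the test
being described, not Mr. Scott's code):

def last_byte_test(decompress, compressed: bytes) -> bool:
    """All 256 values of the final byte must decompress to 256 distinct
    files; any collision would mean bias in the last byte."""
    variants = [compressed[:-1] + bytes([b]) for b in range(256)]
    return len({decompress(v) for v in variants}) == 256

# The trivial identity "compressor" passes, as any one-to-one scheme must:
assert last_byte_test(lambda d: d, b"any compressed file")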
   However, since you don't understand the concept of one-to-one
compression, I can see where one could have your views if one refused to do
more than a shallow look at the problem.  The real reason you think there
is a bias is that, in the first version, you are confused as to how it
relates to a stream of compressed bits that doesn't end on a nice 8-bit
boundary.  And yes, where it differs I am adding 1 to 7 zeros or dropping
1 to 8 ones; yet I answer the problem above.  But if you would take the
time to look at the focused adaptive Huffman compression, the ending game
is far different - just follow my site at
http://members.xoom.com/ecil/compress.htm
but be careful: it may require you to think a little.
      John, I can see that you are confused, because the dropping and
adding of zeros and ones by itself would most likely not lead to a
one-to-one compression for the adaptive Huffman compression in the general
case.  Much care had to be used in the selection of which paths are one and
zero in the tree.  Note that the way ones and zeroes are assigned in the
tree is quite critical.

>
>Of course, the objection is trivial, but what I gained by going
>"one-to-one" was also trivial, since it, too, only concerned the *last
>symbol in the message*...so another part of my point is that if you are
>going to be so fussy as to use one-to-one compression, then you should
>take into account this trivial objection - because it equals in magnitude
>the rationale behind the compression method to begin with!

   I think you are still lacking insight into what is going on.  The way a
file ends is about more than just the last byte; it is also about how the
bits are assigned in the tree itself, or one ends up with a
non-one-to-one compression.  But at least you are looking, and that
is more than most have been willing to do.




David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: Alex MacPherson <[EMAIL PROTECTED]>
Subject: Re: Strict Avalanche Criteria in SBox design
Date: Tue, 26 Oct 1999 14:25:25 -0500

> Also: any suggestions/URLs or tips for designing SBoxes
> would be appreciated.

Hi Glenn,

    There are a few good articles on the topic from the Queen's
University Cryptography and Data Security Lab down in the "Papers"
section at:

http://saturn.ee.queensu.ca:8000/SPN/
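For a self-contained starting point, the criterion itself is easy to test
by exhaustion on a small SBox.  A sketch (mine, not from those papers):

def sac_counts(sbox, n_in, n_out):
    """counts[i][j] = how often flipping input bit i flips output bit j.
    The Strict Avalanche Criterion asks for 2**(n_in - 1) everywhere,
    i.e. each output bit flips with probability exactly 1/2."""
    counts = [[0] * n_out for _ in range(n_in)]
    for x in range(1 << n_in):
        for i in range(n_in):
            diff = sbox[x] ^ sbox[x ^ (1 << i)]
            for j in range(n_out):
                counts[i][j] += (diff >> j) & 1
    return counts

# The 3-bit identity SBox fails badly: flipping input bit i always flips
# output bit i and never any other, instead of 4 times out of 8 each.
print(sac_counts(list(range(8)), 3, 3))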

--
Alex MacPherson
Department of Mathematics and Computer Science
Royal Military College of Canada
email: [EMAIL PROTECTED]


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
