Cryptography-Digest Digest #472, Volume #10      Sat, 30 Oct 99 15:13:04 EDT

Contents:
  Re: Compression: A ? for David Scott (SCOTT19U.ZIP_GUY)
  Re: HELP !! El Gamal for JDK 1.1 ([EMAIL PROTECTED])
  OAP-L3: I think I've worked out what's wrong. (Paul Crowley)
  Re: Bruce Schneier's Crypto Comments on Slashdot (David Crick)
  Re: ComCryption (Mok-Kong Shen)
  Re: ComCryption (Mok-Kong Shen)
  Re: Build your own one-on-one compressor (Tim Tyler)
  Re: Bruce Schneier's Crypto Comments on Slashdot (DJohn37050)
  Re: Build your own one-on-one compressor (Tim Tyler)
  Re: Preventing a User from Extracting information from an Executable (Chad Hurwitz)
  Re: Compression: A ? for David Scott (Tim Tyler)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Compression: A ? for David Scott
Date: Sat, 30 Oct 1999 13:27:48 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>On Fri, 29 Oct 1999 16:59:24 GMT, [EMAIL PROTECTED]
>(SCOTT19U.ZIP_GUY) wrote:
>
>>In article <7vc81f$128$[EMAIL PROTECTED]>, "Tim Wood"
> <[EMAIL PROTECTED]> wrote:
>>
>>>True. But it does depend on your definition of bad (low
>>>compression/predictable structure).
>>    My definition of bad is a compressor that adds information
>>to a file as it compresses.
>
>I used good/bad to quantify a compressor in the traditional sense,
>that is, the better the compressor, the higher the compression ratio.
>As this reduces the amount of space a given amount of information
>occupies, it reduces the occurrences of patterns, an overall reduction
>of information per byte.
   And that is just the problem. For most purposes the goal of compression
is only to save space. But if one is using compression before encryption, and
if one's goal is to keep a file secure, then one must take a deeper look
at what the compression is actually doing. Take, for example, the original adaptive
Huffman compressor I modified; I made very few changes to it. With mine,
no information about the compression method is given away after encryption,
yet the original one is such that, even when compressing a long random file, it
leaves enough information in the file that there may be only one valid key
that yields a file which can be a solution to the decompression. Yet both would
appear, by most statistical tests, to have compressed to a random file.

>
>Measuring by compression ratio is objective.  Your definition of
>adding information sounds subjective.  If we try to compress a file
>compressed by pkzip - a "bad" compressor by your definition, we'll
>find it doesn't compress much, if any.  This could be taken as an
>objective measurement of a lack of information added, overall.  A few
>bytes in the header?  Sure.  Patterns throughout the file?  Doesn't
>seem likely.
    It may not seem likely but then you have to think a bit and
analyse the problem since in this case things are not as you think
they are.
>
>If the concern is that known data at the beginning of the plaintext
>could ease analysis, then why not deal with the header, rather than
>trying to reinvent compression?  If the compression isn't as efficient
>in a traditional sense, and it doesn't add random data, then it's
>either going to add or leave patterns throughout the file that, IMHO,
>could also be used to aid in analysis.
>
>>>
>>>>If the problem is only that a
>>>>good compression program (i.e. pkzip) adds known information to the
>>>>header (as exist in many file formats), why wouldn't moving the
>>>header
>>>>to the end of the plaintext, and running in a feedback mode
>>>>effectively eliminate or greatly reduce the problem?
>>>
>>>Running in a feedback mode would reduce the problem - however it would
>>>be better to eliminate the problem (i.e. find a cure not treat the
>>>symptoms) by removing all added structure from the post-compression
>>>plaintext.
>>    And one way to do that is to use a one to one compression scheme.
>
>But are we only talking about the structure of the header?  From what
>I can tell from looking at a couple of zip's, there is a header of
>maybe 80 or so bytes that have a distinct pattern, and a trailer of
>about 60 bytes that also has a pattern (looks like name and checksum),
>but the body of the data looks pretty patternless.  How could it not
>be, otherwise it would still be compressible?  
   Look at my example of the mods to adaptive Huffman compression
and you will see that what seems to the eye pretty patternless is
not patternless to the compressor.
>
>By "one to one", I assume you're speaking of is a compression
>algorithm where any data will decompress to something, not a
>compression scheme with a one to one compression ratio, or a symmetric
>compression algorithm.  I haven't studied it, but it would seem to me
>that many, if not most, compression/decompression functions will
>decompress any data to "something", other than the header and footer
>of the file.  
   By one to one I mean that for any file X, Compress( Decompress( X ) ) = X,
while for any file Y, Decompress( Compress( Y ) ) = Y.
Most people consider only the second property, but with encryption you need to
consider both.
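
To make the distinction concrete, here is a minimal sketch; the Coder
interface and method names are hypothetical stand-ins for whatever compressor
is being tested, not references to any actual program.

import java.util.Arrays;

// Minimal sketch of the two round-trip properties described above.
public class OneToOneCheck {

    // Hypothetical stand-in for the compressor under test.
    interface Coder {
        byte[] compress(byte[] in);
        byte[] decompress(byte[] in);
    }

    // The usual requirement: decompressing a compressed file restores it.
    static boolean decompressUndoesCompress(byte[] y, Coder c) {
        return Arrays.equals(c.decompress(c.compress(y)), y);
    }

    // The extra requirement for use before encryption: every file is exactly
    // the compression of its own decompression, so a wrong-key decryption
    // never stands out as an "impossible" compressor output.
    static boolean compressUndoesDecompress(byte[] x, Coder c) {
        return Arrays.equals(c.compress(c.decompress(x)), x);
    }
}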
[snip]

>Ok, even if we accept that, I'd still contend that the subset of files
>that have the most general application for encryption are those that
>have large, well defined, known patterns, and are highly
>compressible - namely databases and documents.  Multimedia files
>(movie, sound, photo), which are often in an already compressed format
>often have a very well defined header.

  If your goal is to send files that are already in a compressed format that
is not one to one, my one-pass compression technique would do nothing
to aid you, since it would only increase the length of the file by a small
amount, and the attacker would only have to look at the first few bytes to
check whether a trial decryption is correct.
  However, if you used the two passes of compression, one in each direction,
then even with weak AES-type encryption schemes the attacker would
be forced to decrypt the whole file and make one complete pass through
the decompression before he could even see what the first few bytes in the
file are.
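
One plausible reading of the two-pass idea, as a hedged sketch only (not
necessarily the exact scheme meant above): the second pass runs over the
reversed output of the first, so the start of the plaintext cannot be
recovered until the entire outer layer has been undone.  java.util.zip.Deflater
is used purely as a stand-in compressor here; it is not a one-to-one coder.

import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;

// Sketch of the "two passes in opposite directions" idea.  Deflater is only
// a stand-in compressor, and reversing the intermediate output is one
// plausible reading of "both directions": the first bytes of the plaintext
// then depend on the far end of the final stream, so a whole-file pass is
// needed before they can be checked.
public class TwoPassSketch {

    static byte[] deflate(byte[] in) {
        Deflater d = new Deflater();
        d.setInput(in);
        d.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        while (!d.finished()) {
            out.write(buf, 0, d.deflate(buf));
        }
        return out.toByteArray();
    }

    static byte[] reverse(byte[] in) {
        byte[] out = new byte[in.length];
        for (int i = 0; i < in.length; i++) {
            out[i] = in[in.length - 1 - i];
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] plain = "the first few bytes of this text".getBytes();
        // Pass 1 forward, then pass 2 over the reversed intermediate result.
        byte[] wrapped = deflate(reverse(deflate(plain)));
        System.out.println(wrapped.length);
    }
}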




David A. Scott
--

SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
                    
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm

Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm

Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm

**NOTE EMAIL address is for SPAMERS***

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: HELP !! El Gamal for JDK 1.1
Date: Sat, 30 Oct 1999 13:04:35 GMT

I remember looking at the Java API a couple of months ago and I noticed
that it had a very nice math API for large numbers.  If JDK 1.1.x has
this same API, you could use that to do the actual calculations.  You
may have to write your own routines for random number generation and for
converting text to a number.  This is assuming you don't want to
purchase or download any other Java toolkits.
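
For what it's worth, here is a rough sketch of what the raw El Gamal
arithmetic could look like using only java.math.BigInteger and
java.security.SecureRandom, both of which exist in JDK 1.1.  The key sizes,
the g = 2 generator, and the class name are illustrative placeholders; a
real system needs properly generated parameters and message encoding/padding.

import java.math.BigInteger;
import java.security.SecureRandom;

// Bare-bones El Gamal arithmetic with java.math.BigInteger (JDK 1.1).
// Illustrative only: the prime/generator choice is a placeholder and there
// is no message encoding, padding, or parameter validation.
public class ElGamalSketch {
    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();

        // Toy parameters; a real system needs a carefully chosen prime and generator.
        BigInteger p = new BigInteger(512, 64, rnd);   // probable prime
        BigInteger g = BigInteger.valueOf(2);          // placeholder generator

        // Private key x, public key y = g^x mod p.
        BigInteger x = new BigInteger(p.bitLength() - 2, rnd);
        BigInteger y = g.modPow(x, p);

        // Encrypt a message m (must be < p) with a fresh per-message secret k.
        BigInteger m = new BigInteger("123456789");
        BigInteger k = new BigInteger(p.bitLength() - 2, rnd);
        BigInteger a = g.modPow(k, p);                    // g^k mod p
        BigInteger b = m.multiply(y.modPow(k, p)).mod(p); // m * y^k mod p

        // Decrypt: m = b * (a^x)^(-1) mod p.
        BigInteger recovered = b.multiply(a.modPow(x, p).modInverse(p)).mod(p);
        System.out.println(recovered.equals(m));          // should print "true"
    }
}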

csybrandy.

In article <[EMAIL PROTECTED]>,
  Jan Goyvaerts <[EMAIL PROTECTED]> wrote:
> Hello everybody !
>
> I need to use the El Gamal algorithm for one of our (urgent) Java projects.
>
> I saw some JDK 1.2 standard functions are quite useful for this.
> Unfortunately we're stuck in a JDK 1.1 environment. I am looking for any
> kind of help. Classes, documents, anything! As long as I can use them
> on JDK 1.1.
>
> All help and clues will be greatly appreciated !!
>
> Jan.
>
>


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Paul Crowley <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: OAP-L3: I think I've worked out what's wrong.
Date: 30 Oct 1999 12:45:06 +0100

I think I've worked out what's wrong here.

People have berated Szopa for making inflated claims, or for not
understanding security, and they seem justified.  But to my mind, the
bigger mistake he makes is that he doesn't seem to understand
the relationship between himself and us:

He's the salesman.  We're the customers.

This is still true even though the software is gratis.  If some aspect 
of what the software offers doesn't please us, ultimately it isn't our 
job to do a damn thing about it; it's *his* job to try and make us
happy, and to persuade us to look again.  We say "give us clear
descriptions of the algorithms on your Web pages"; he doesn't have to
do it, but he doesn't get to browbeat us for wanting it, because we're 
the customers.

I'm an amateur cryptologist myself, and I've bent over backwards to
try and meet all the standards that the smart people on this newsgroup 
want to set: clear descriptions of the algorithm on my Web pages (I've 
had some nice compliments on the clarity here), *public domain* sample 
source code implementations in C (two such implementations: one
readable, the other optimised), and justifications of why I think the
world needs yet another cipher.  There are also no patent encumbrances.

With all this, I've found it hard to attract much in the way of
cryptanalytic attention, and frankly I chirrup with delight whenever I 
learn that someone new has taken the time to look at it.  After all,
there's no a priori reason why anyone should feel obliged to look at
any amateur designer's latest ideas; it's my job to persuade them that 
I've got something worth seeing.

Szopa is in the rare and fortunate position amongst sales people that
his potential customers are making explicit things he can do to
improve their interest.  If he doesn't want to take that advice, he
can hardly be surprised that he doesn't get many "sales".
-- 
  __
\/ o\ [EMAIL PROTECTED]     Got a Linux strategy? \ /
/\__/ Paul Crowley  http://www.hedonism.demon.co.uk/paul/ /~\

(PS: if you *are* interested in my cipher, a new revision is out soon)

------------------------------

From: David Crick <[EMAIL PROTECTED]>
Subject: Re: Bruce Schneier's Crypto Comments on Slashdot
Date: Sat, 30 Oct 1999 15:36:01 +0100

Bill Lynch wrote:
> 
> http://slashdot.org/interviews/99/10/29/0832246.shtml

I thought Bruce's comments on "giving away" crypto were good:

"It is impossible to make money selling a cryptographic algorithm."

"There are free encryption algorithms all over the place: triple-DES,
Blowfish, CAST, most of the AES submissions." [notably the finalists
Rijndael, Serpent and Twofish - DC]

"If I patented [Blowfish] and charged for it, it would be much less
widely used. IDEA is a good example of this. IDEA could have been
everywhere; for a while it was the only trusted DES replacement. But
it was patented, and there were licensing rules. As a result, IDEA
is barely anywhere. SEAL is a great-looking stream cipher. But
because IBM has a patent on it, no one uses it."

   David.

-- 
+-------------------------------------------------------------------+
| David Crick  [EMAIL PROTECTED]  http://members.tripod.com/vidcad/ |
| Damon Hill WC96 Tribute: http://www.geocities.com/MotorCity/4236/ |
| M. Brundle Quotes: http://members.tripod.com/~vidcad/martin_b.htm |
| ICQ#: 46605825  PGP Public Keys: RSA 0x22D5C7A9 DH/DSS 0xBE63D7C7 |
+-------------------------------------------------------------------+

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: ComCryption
Date: Sat, 30 Oct 1999 17:45:26 +0200

SCOTT19U.ZIP_GUY wrote:
> 
> Mok-Kong Shen<[EMAIL PROTECTED]> wrote:
> >John Savard wrote:
> >>
> >> One "Dr. Richard Crandall of Apple Computer" is credited with
> >> suggesting, at a conference in 1998, an encryption idea called
> >> "ComCryption", where a key of n bits is used to select one of 2^n
> >> compression algorithms...since the result of compression appears
> >> random, the cipher might be secure.
> >>
> >> I'll have to admit I don't think this is a particularly good idea.
> >> It's a fruitful original thought, which one can play with in some more
> >> pedestrian ways, such as Huffman coding with random, keyed code
> >> assignment as a simple substitution without easily visible boundaries.
> >
> >It's certainly not a good idea for 'compression', but not a bad idea
> >for encryption in my humble view. Compression is the technique being
> >'borrowed' here for pursuing another purpose, namely encryption. It
> >is the large number of potentially possible compression schemes that
> >thwarts the analyst. This is an example of application of the principle
> >of variability.
> >
> >M. K. Shen
> 
>     But don't you think one has to be very careful which compression schemes
> are used, so that by inspection one can't immediately tell which method was
> used?

No. One can e.g. tell the analyst that one uses adaptive Huffman 
(without telling him how to obtain the initial frequency distribution, 
of course).
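
As a hedged sketch of what keying the initial frequency distribution might
look like: the 256-entry byte table and the seeded java.util.Random below are
illustrative assumptions only, not a description of any particular coder.

import java.util.Random;

// Sketch: derive a secret starting frequency table for an adaptive Huffman
// coder from a key, so that knowing "adaptive Huffman is used" does not tell
// an analyst the coder's initial state.  A real design would use a proper
// keyed generator instead of java.util.Random.
public class KeyedInitialFrequencies {

    static int[] initialFrequencies(long key) {
        Random prng = new Random(key);        // stand-in for a keyed PRNG
        int[] freq = new int[256];            // one counter per byte value
        for (int i = 0; i < 256; i++) {
            freq[i] = 1 + prng.nextInt(1000); // non-zero, so every symbol stays codable
        }
        return freq;
    }

    public static void main(String[] args) {
        int[] f = initialFrequencies(12345L);
        System.out.println(f[0] + " " + f[255]);
    }
}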

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: ComCryption
Date: Sat, 30 Oct 1999 17:46:02 +0200

[EMAIL PROTECTED] wrote:
> 
> Mok-Kong Shen ([EMAIL PROTECTED]) wrote:
> : It's certainly not a good idea for 'compression', but not a bad idea
> : for encryption in my humble view. Compression is the technique being
> : 'borrowed' here for pursuing another purpose, namely encryption. It
> : is the large number of potentially possible compression schemes that
> : thwarts the analyst. This is an example of application of the principle
> : of variability.
> 
> While I would have nothing against it as a technique to further frustrate
> a cryptanalyst, trying to use it alone without "real" encryption
> afterwards is, in *my* humble opinion, a stupid idea.

I agree with you, as far as the current (publicly known) state
of the art in compression techniques is concerned. I mean that it
can't be excluded that the 'boundary' between compression and
encryption is a fluid one. (On the other hand, we don't yet have much
detailed information about the ComCryption that is presently being
discussed, do we? Of course, before such details are available we can't
have any trust in such algorithms, which must continue to be regarded
as snake oil, independent of whether they in fact are.)


> Also, while it may be a "stupid idea" to just go ahead and implement it as
> it stands and expect high security, that doesn't mean it isn't still a
> fruitful idea, a source of inspiration. With some extra features added, or
> used for a special purpose, or as a starting point for something
> different, it could still be helpful. An idea can be thought-provoking
> without being immediately very useful in its original form.

This is a good point.

M. K. Shen

------------------------------

Crossposted-To: comp.compression
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Build your own one-on-one compressor
Reply-To: [EMAIL PROTECTED]
Date: Sat, 30 Oct 1999 16:39:23 GMT

In sci.crypt Mok-Kong Shen <[EMAIL PROTECTED]> wrote:

:DS> Are you trying to say that you don't think my adaptive Huffman
:DS> compressor actually compresses and decompresses each file in a
:DS> one-to-one way? What are you attempting to say?

: Basically I intended to say the same that is now contained in the
: answer to Tim Tyler. Please read that. You were giving agreement
: to his original post. I am of the opinion that that's shifting
: the problem from the bits level to (the larger) bytes level,
: without in principle solving the one-to-one problem.

You're saying my compression scheme would /fail/ to be one-on-one?

Surely not!

As you /can't/ mean this, what is the (unsolved?) "one-on-one problem" of
which you speak?
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

The cigarette does the smoking - you're just the sucker.

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Bruce Schneier's Crypto Comments on Slashdot
Date: 30 Oct 1999 17:05:16 GMT

I have met a few NSA reps at standards meetings and all of them were top-notch
technically and I appreciated the contributions they made.
Don Johnson

------------------------------

Crossposted-To: comp.compression
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Build your own one-on-one compressor
Reply-To: [EMAIL PROTECTED]
Date: Sat, 30 Oct 1999 16:47:32 GMT

In sci.crypt Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: SCOTT19U.ZIP_GUY wrote:
:> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
:> >Tim Tyler wrote:
:> >> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:

:> >> : If I understand correctly, you are now on compression not operating
:> >> : on groups of 8 bits but on group of bytes.
:> >>
:> >> This is a strange way of putting it.  There is no possibility of a
:> >> non-byte aligned end-of-file, if you work with byte-aligned symbol
:> >> strings.
:> >>
:> >> Of course my method is not *confined* to the use of byte-confined
:> >> symbol strings - but your strings *must* differ from their replacements
:> >> by a multiple of 8 bits or you will introduce the type of file-ending
:> >> problems that only David Scott knows how to solve ;-)
:> >

:> >Let me explain what I meant. You have certain symbols; let's
:> >for convenience identify these with A, B, C, etc. (they could in fact
:> >be anything you define through grouping any number of bits). Now your
:> >dictionary does translations. An example could be that 'ABCD' is
:> >translated to 'HG'. This is the compression direction. On decompression
:> >it goes backward to translate 'HG' to 'ABCD'. Am I o.k. till here?
:> >Now I denote the side with 'ABCD' above side1 and the side with 'HG'
:> >side2. So on compression one matches the source string with entries
:> >of side1 and replaces the found entries with the corresponding
:> >entries of side2. On decompression one reverses that. Now side1,
:> >if correctly constructed, should be able to process any given source
:> >input and translate that to the target output. (I like to note
:> >however that it needs some care for ensuring this property, if your
:> >symbols are arbitrarily defined.) Let's say that in a concrete case
:> >one has XYZ....MPQABCD compressed to UV.....ERHG. Suppose now I
:> >change the compressed string to UV.....ERHN (HG is changed to HN).

[snip DAS's reply]

: I was making an example of a 'wrong file' such as one that is obtained
: with decryption using a wrong key. For simplification, I assume
: it is wrong only in one symbol. (This justifies the 'change'.)
: If side2 of the dictionary does not happen to have the entry 'HN',
: then one has the same problem that one discussed in the context
: of your one-to-one problem concerning the ending bits of a wrong
: file.

There /is/ no problem corresponding to a data stream ending in mid-byte,
in the method I proposed.

This is because all the individual operations of the method maintain the
size of the file as a multiple of eight bits.

Consequently, maintaining the one-on-one property is fairly easy.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Tend to the molehills and the mountains will look after themselves.

------------------------------

From: [EMAIL PROTECTED] (Chad Hurwitz)
Subject: Re: Preventing a User from Extracting information from an Executable
Date: 30 Oct 1999 10:06:14 -0700

In article <[EMAIL PROTECTED]>,
jerome <[EMAIL PROTECTED]> wrote:
>On 29 Oct 1999 13:14:23 -0700, Chad Hurwitz wrote:
>>
>>Is there a proven way that makes it difficult for a user to see what is
>>stored
>>with in an executable they are running?
>
>No way to do it (proven or not) on a system like unix. Somebody
>can modify the OS to interrupt your executable and read the key.
>
>The only way to do it would be to prevent the OS from handling 
>interruptions(IRQ timer, hd etc...) when your process is in 
>the critical section. By the way, you obviously can't do any
>system call in this section. This critical section can't
>be too long, some devices don't like to have 'interruptions 
>in the air'.
>
>Doing that is likely to trigger some 'strange' behaviours
>if the OS hasn't been designed to handle this kind of 
>situation.


Thanks for the responses!  I do agree it's a pretty much impossible problem.

Hmm, that is a very reasonable idea, to disable interrupts so a debugger couldn't
break within the code building the key from the encrypted/spread memory
in the program.  You could stop interrupts for everything that you could,
besides the unstoppable (non-maskable) ones, which I believe would stop any
debugger from breaking in.

The code wouldn't have to be large at all; just enough to gather the key
elements, encrypt the string, erase the built key from memory, and then
re-enable interrupts.

I guess a hacker could probably see that they couldn't debug/break right
before the encryption algorithm runs, and then notice that I was disabling
interrupts around that specific section, and then they would have to
replicate the key generation code AND all the inputs/references used to build
the key.  That is possible, but harder than just copying the key from
memory before the encryption is called; especially if you hide the references
the generation routine uses, possibly even modifying them so they wouldn't
be the same each time the program is run.  Still, the hacker could modify the
string before the encryption section is executed, to encrypt the string they
wanted, even if they can't figure out what the key is.


The other suggestion was to use multi-layered P-code, which I don't know
anything about; does anyone have any URL references?
-- 
/-------------------------------------------------------------------\
| spam if you can, spam if you dare, spam if you must, i won't care |
| spam is futile,  spam is free,  spam is filtered,  so i won't see |
\-------------------------------------------------------------------/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compression: A ? for David Scott
Reply-To: [EMAIL PROTECTED]
Date: Sat, 30 Oct 1999 17:02:46 GMT

Tom <[EMAIL PROTECTED]> wrote:

: I just don't get it at all.

:-(

: Seems to me the purpose of compression, as far as [crypto] goes, is to
: reduce the size of the plaintext, and reduce the occurrence of
: patterns in the plaintext.  These two things would tend to reduce
: the effectiveness of both known plaintext and chosen plaintext attacks.

Yes - but that's only half of the story.  In /addition/ to removing
patterns in the plaintext, the compressor should avoid adding patterns
of its own.

: So - if the compression is bad, it doesn't work as well for affecting
: these attacks, all else being equal.

Yes, "all else being equal".

: If the problem is only that a good compression program (i.e. pkzip) adds
: known information to the header (as exist in many file formats), why
: wouldn't moving the header to the end of the plaintext, and running in a
: feedback mode effectively eliminate or greatly reduce the problem?

Moving any header to the end of the file might help /slightly/.

We're trying to totally eliminate the problem, though - not
shuffle it under the rug.

: Additionally, why wouldn't running a bad compression program be potentially
: as bad as no compression, as it could lead to predictable patterns within
: the body of the plaintext?

Because it could be even worse? ;-)

: What if the "compression" program is actually designed to allow/insert
: patterns specifically to provide someone with a known plaintext
: attack?

It sounds like the enemy has infiltrated your intelligence unit ;-)
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

The cat that ate the ball of wool just had mittens.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
