Cryptography-Digest Digest #502, Volume #14       Sun, 3 Jun 01 11:13:00 EDT

Contents:
  Re: Jacobian projective coordinates ("Anthony Mulcahy")
  Re: Fast 8-bit mults on smartcards (Robert Harley)
  Re: unpredictable random number generator ? (Tim Tyler)
  Re: Random number generation. (Tim Tyler)
  Re: James Felling:  Sorry to break your bubble (Tim Tyler)
  Re: Unicity distance and compression for AES (Tim Tyler)
  Re: Unicity distance and compression for AES (Tim Tyler)
  Re: Unicity distance and compression for AES ("Tom St Denis")
  Re: Unicity distance and compression for AES (Tim Tyler)
  Making n => 1 bit functions ("Tom St Denis")
  Re: Unicity distance and compression for AES ("Tom St Denis")
  Re: Unicity distance and compression for AES (Tim Tyler)
  Re: Unicity distance and compression for AES (Tim Tyler)
  Re: Question about credit card number (Anne & Lynn Wheeler)
  Re: Dynamic Transposition Revisited Again (long) ([EMAIL PROTECTED])
  UK legislation on cryptographic products ("demon news")
  UK legislation on cryptographic products ("demon news")
  UK legislation regarding cryptography ("demon news")
  Re: Making n => 1 bit functions ("demon news")
  Re: Unicity distance and compression for AES (SCOTT19U.ZIP_GUY)

----------------------------------------------------------------------------

From: "Anthony Mulcahy" <[EMAIL PROTECTED]>
Subject: Re: Jacobian projective coordinates
Date: Sun, 3 Jun 2001 19:32:08 +0900

"himanee" <[EMAIL PROTECTED]> wrote
> Hello,
>
> In recent snapshots of the openssl ecc code, in a comment, I came across the
> term 'Jacobian projective coordinates' and conversion between Jacobian
> projective coordinates and affine coordinates.
>
> I have the following *mathematics* question. At least, I would like to
> request a reference.
>
> I know about the usual projective coordinates and affine coordinates, and
> the "usual" relationship between them, viz.
> x = X/Z,
> y = Y/Z,
> where (x, y) are affine coordinates, (X, Y, Z) are projective coordinates,
> and we have assumed that Z is not zero.
>
> I can imagine the following. A projective plane is the set of all straight
> lines through the origin in 3-space. Instead of straight lines through the
> origin, one can think of some curves through the origin in 3-space
> (corresponding to the transformations x = X/(Z^2), y = Y/(Z^3)). Is that the
> idea behind Jacobian projective coordinates?
>
> Projective plane means affine plane + points (and lines) at infinity.
>
> Is there some kind of a 'Jacobian projective plane' or something? If so, is
> there something corresponding to the above paragraph? If not, what is the
> idea behind Jacobian projective coordinates?
>
> What is the mathematical significance of the transformations :
> x = X/(Z^2)
> y = Y/(Z^3) ?
>
> Also, can someone give me reference/s for Jacobian projective coordinates?
>
> Thanks in advance.
>
> Best wishes,
> Himanee

Jacobian projective representation is referred to in "Elliptic Curves in
Cryptography" by I.F. Blake, G. Seroussi and N.P. Smart where it is also
called Weighted Projective representation. The curve equation is given as:

Y^2 = X^3 + aXZ^4 + bZ^6

There aren't many mathematical details, but the references that are quoted
look like they might be what you're looking for. I haven't read them,
however, so I can't say for sure:

D.V. Chudnovsky and G.V. Chudnovsky. Sequences of numbers generated by
addition in formal groups and new primality and factorization tests. Adv. in
Appl. Math., 7, 1987.

H. Cohen, A. Miyaji and T. Ono. Efficient elliptic curve exponentiation
using mixed coordinates. Advances in Cryptology, ASIACRYPT 98.
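For concreteness, here is a minimal sketch of the conversions in question
over a prime field, in Python. The modulus and the coordinate values are
placeholders of my own, not anything from the references above:

```python
# Sketch of the Jacobian <-> affine conversion under discussion, over a
# prime field GF(p).  The modulus below is an arbitrary example choice.

P = 2**255 - 19  # example prime (assumption, not from the thread)

def to_jacobian(x, y):
    # Trivial embedding: choose Z = 1.
    return (x, y, 1)

def to_affine(X, Y, Z, p=P):
    # x = X / Z^2, y = Y / Z^3 (mod p); Z must be nonzero.
    zinv = pow(Z, -1, p)        # modular inverse (Python 3.8+)
    zinv2 = zinv * zinv % p
    return (X * zinv2 % p, Y * zinv2 * zinv % p)
```

The "weighted" terminology comes from the fact that (X, Y, Z) and
(t^2*X, t^3*Y, t*Z) represent the same affine point for any nonzero t:
the coordinates carry weights 2, 3 and 1.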

Anthony Mulcahy


------------------------------

From: Robert Harley <[EMAIL PROTECTED]>
Subject: Re: Fast 8-bit mults on smartcards
Date: 03 Jun 2001 12:43:25 +0200


[EMAIL PROTECTED] (Mark Wooding) writes:
>[...]  I presume you mean F_{2^8} represented as F_2(x) with x a root of
>a degree-8 monic irreducible polynomial p(x) \in F_2[x].  I've sometimes
>seen that written F_2[x]/(p(x)) although the denominator has the wrong form.

What's wrong with it?  The parentheses around p(x) indicate "the ideal
generated by".  Seems fine to me.
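As an aside, arithmetic in F_2[x]/(p(x)) is easy to sketch in code. Here is
a small Python example using the AES polynomial x^8 + x^4 + x^3 + x + 1 as
one particular choice of degree-8 irreducible p(x); nothing in the thread
fixes this choice:

```python
# Multiplication in F_{2^8} = F_2[x]/(p(x)), bytes as polynomials over F_2.

def gf256_mul(a, b, poly=0x11B):
    # Schoolbook shift-and-add multiplication, reducing mod p(x)
    # whenever the intermediate degree reaches 8.
    r = 0
    for _ in range(8):
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= poly
    return r
```

For example, gf256_mul(0x53, 0xCA) returns 0x01 in this field, the inverse
pair used as the worked example in the AES specification.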


Bye,
  Rob
     .-.                [EMAIL PROTECTED]                .-.
    /   \           .-.         +33 6 7271 2496         .-.           /   \
   /     \         /   \       .-.     _     .-.       /   \         /     \
  /       \       /     \     /   \   / \   /   \     /     \       /       \
 /         \     /       \   /     `-'   `-'     \   /       \     /         \
            \   /         `-'     ArgoTech        `-'         \   /
             `-'               http://argote.ch/               `-'

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: unpredictable random number generator ?
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:16:37 GMT

Sam Yorko <[EMAIL PROTECTED]> wrote:

: I've been fond of:
: http://www.lavarand.com/

http://lavarand.sgi.com/
http://www.lavarand.com/
http://www.lavarnd.org/

Someone seems to be rather excited about those lava lamps...
-- 
__________
 |im |yler  http://rockz.co.uk/  http://alife.co.uk/  http://atoms.org.uk/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Random number generation.
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:24:39 GMT

Benjamin Johnston <[EMAIL PROTECTED]> wrote:

: I've been considering the need for generating random numbers in my "toy"
: crypto project.

: One approach I was considering is amassing a heap of semi-random data like
: time stamps and key strokes (ie. as suggested in "all the books")...

: But, then I need to turn this data into actual "random" data (well, close
: enough for cryptographic purposes).

: To do this, I figured there are two approaches;

: 1. have plenty of data and then hash, say, 1Kb (or whatever I estimate is
: necessary given the entropy of the semi-random data) into 160 bits at a
: time (and not use the same 1Kb of data, again, for any other purpose).

: 2. have plenty of data, and prepend an integer to all of the data, and
: hash the whole lot to 160 bits. If I need more "random" data, I just
: increment the integer and hash it all again.

: I think approach 2 is easier, and it means that I don't have to worry
: about "wasting" the semi-random data if I start discarding the hashed
: values. And I figure it should be just as secure as approach 1, if we
: assume that SHA, for example, is "good" - so the two hash values should be
: completely unrelated.

: So, what I'm asking, is:

: 1. is my approach (ie. the second one) silly?

The second approach is fine - at least, as far as security goes, you will
have a hard job doing better than it.

However, it may not be terribly speedy.  Whether you can use it may depend
on how important that is.
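For what it's worth, approach 2 fits in a few lines of Python. SHA-1 stands
in for the poster's generic 160-bit hash; the pool contents and block count
below are placeholders:

```python
# Approach 2: one entropy pool, hashed with an incrementing counter
# prepended; each counter value yields a fresh 160-bit block.

import hashlib

def prng_stream(pool: bytes, nblocks: int) -> bytes:
    out = []
    for counter in range(nblocks):
        # block_i = SHA-1(counter || pool)
        h = hashlib.sha1(counter.to_bytes(8, 'big') + pool)
        out.append(h.digest())
    return b''.join(out)
```

The output is deterministic given the pool, so the pool itself must stay
secret and must contain enough entropy to begin with.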
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

Crossposted-To: alt.hacker,talk.politics.crypto
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: James Felling:  Sorry to break your bubble
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:32:15 GMT

In sci.crypt Anthony Stephen Szopa <[EMAIL PROTECTED]> wrote:

: [...] I sure hope everyone is listening.

: Array number:   0     0 1 2 3 4 5 6 7 8 9 
: Array number:   1     0 1 2 3 4 5 6 7 9 8 
: Array number:   2     0 1 2 3 4 5 6 8 7 9 
: Array number:   3     0 1 2 3 4 5 6 8 9 7 

[...]

: Array number:   101     0 1 2 3 4 9 5 8 7 6 
: Array number:   102     0 1 2 3 4 9 6 5 7 8 
: Array number:   103     0 1 2 3 4 9 6 5 8 7 
: Array number:   104     0 1 2 3 4 9 6 7 5 8 [...]

...if so, you just bored them to tears :-(
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:44:28 GMT

[EMAIL PROTECTED] wrote:
: "SCOTT19U.ZIP_GUY" wrote:

:>  Unicity distance is a concept introduced by Shannon that is valid even
:> today. It has to do with the amount of ciphertext that needs to be
:> examined in a ciphertext-only attack to determine the key and
:> the rest of a secret message.
: [snip]

: Unicity distance can be interpreted as the smallest number of
: ciphertext letters which must be intercepted in order to
: expect a unique, *meaningful* decipherment. It
: is a function of both keyspace entropy and the redundancy of the
: "language". Language redundancy arises from the fact that not
: all possible messages are meaningful. 

: I don't see how compression makes a difference. Compression reduces
: the redundancy (a message in a language that contains no redundancy
: cannot be compressed), but since the redundancy is a property
: of the *language* then not all decompressions will be meaningful.

: I don't think you can get around this. It seems to me that all you're
: doing is adding another step, i.e. decrypt and then decompress to
: determine
: whether or not the message is meaningful, rather than just decrypting as
: the case would be if no compression were used.

: Simply put, redundancy is a feature of the language. You can't change
: the redundancy without changing the language. Without changing the
: redundancy you can't change the unicity distance (assuming no
: change in the entropy of the keyspace).

: Am I overlooking something?

Yes you are.  Unicity distance is a function of the redundancy of the
inputs to the encryption algorithm, not that of the original message.

If the redundancy in the input to the encryption algorithm is
decreased (e.g. via compression), then the unicity distance
increases - a computationally unbounded adversary needs more
cyphertext before he can uniquely determine a plaintext.

Note that there may be no upper bound on the degree by which
compression can increase the unicity distance.  It can transform
a system with a unicity distance of 128 bytes to one where
the unicity distance is effectively infinite.
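Shannon's estimate makes this concrete: U = H(K) / D, where H(K) is the
keyspace entropy and D the per-symbol redundancy of the cipher's input.
A tiny sketch (the redundancy figures in the comment are common textbook
estimates, not exact values):

```python
import math

def unicity_distance(key_bits: float, redundancy_per_symbol: float):
    # Shannon's estimate U = H(K) / D.  As D -> 0 (ideal compression),
    # U -> infinity: no amount of ciphertext pins down the plaintext.
    if redundancy_per_symbol == 0:
        return math.inf
    return key_bits / redundancy_per_symbol

# English ASCII: roughly 8 - 1.5 = 6.5 bits of redundancy per byte,
# using the oft-quoted ~1.5 bits/char entropy estimate for English.
```

With a 128-bit key that gives about 20 bytes of ciphertext for raw English,
and ever larger values as compression drives the redundancy down.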
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:47:48 GMT

Tom St Denis <[EMAIL PROTECTED]> wrote:

: The original reply does make sense.  You are not switching languages
: just the representation.  I.e I will swap A with B, B with C, C with
: D, etc... The words look different but it's basically the same
: language.

Uh - compression before encryption increases the unicity distance.
Surely you are not claiming otherwise...?
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Sun, 03 Jun 2001 11:58:44 GMT


"Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> : The original reply does make sense.  You are not switching languages
> : just the representation.  I.e I will swap A with B, B with C, C with
> : D, etc... The words look different but it's basically the same
> : language.
>
> Uh - compression before encryption increases the unicity distance.
> Surely you are not claiming otherwise...?

Not really.  Think about it.  You take a 100 byte message and pack it into
16 bytes (just an example).  Now I try all the keys to decrypt the 16 bytes.
For some of the keys the material will actually decompress, and for those I
can still check for the biases in English.

Think of compression in this case as just transposing the alphabet to an
isomorphic alphabet (i.e. equivalent).  You're not making the original
language less biased, you're just changing its representation and adding a
layer.

Tom



------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 11:52:34 GMT

[EMAIL PROTECTED] wrote:

: Then again, compression would seem to reduce the number of ciphertext
: characters required for a unique, meaningful decompressed decipherment
: so maybe it *reduces* the unicity distance, which is a benefit to the
: cryptanalyst.

: Is this right?

Not at all.  Lots of correct looking decrypts *hinders* the cryptanalyst -
since he has no idea which one is the real message.

Similarly, the best place to hide a tree is in a forest.

The cryptanalyst's task is easiest when there's only one correct looking
decrypt, and all the others look like random garbage.

Compression increases the unicity distance, by decreasing the redundancy
in the inputs to the cypher.
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Making n => 1 bit functions
Date: Sun, 03 Jun 2001 12:04:53 GMT

I got bored last night so I tried out some methods for making n => 1 bit
functions like those in Haval, etc.

My idea was to make up random terms of the form (a00x0)(a01x1)(...) +
(a10x0)(a11x1)(...) + ..., where x is the input and aij is some random
series of vectors (well, the a_i are the vectors).

So I made the program generate these, removing duplicate terms and
complementary ones (I allow NOT gates), and I have found some 4 => 1 bit
functions such as

(I used A instead of X in my C code output)
(A0&A1&A2)^~(A1&A2)^~(A0&A2&A3)^(A0)^~(A2)^(A1&A2&A3)^~(A0&A1)^~(A0&A1&A3)^0

(latex code)
$Y = X_{0}X_{1}X_{2}  \oplus  \lnot X_{1}X_{2}  \oplus  \lnot
X_{0}X_{2}X_{3}  \oplus X_{0}  \oplus  \lnot X_{2}  \oplus X_{1}X_{2}X_{3}
\oplus  \lnot X_{0}X_{1}  \oplus  \lnot X_{0}X_{1}X_{3} \oplus 0 $

This 4 => 1 function has no linear bias over 1/4, and the absolute sum of
all the linear biases is 48 (i.e. on average each one is 3).  It is also
0/1 balanced and satisfies SAC.
The source is on my website (misc source section) as
http://tomstdenis.home.dhs.org/src/func.c

I realize that in Haval they used a different approach.  They made a bent
function and used nonlinear variants afaik... which seems a bit more
elegant...
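A quick way to sanity-check claims like those above is to enumerate all 16
inputs of a 4 => 1 function and compute its Walsh spectrum. Here is a
sketch in Python of the posted function, writing ~x on a single bit as
(1 ^ x):

```python
def f(a0, a1, a2, a3):
    # The posted 4 => 1 function, term by term.
    return ((a0 & a1 & a2) ^ (1 ^ (a1 & a2)) ^ (1 ^ (a0 & a2 & a3))
            ^ a0 ^ (1 ^ a2) ^ (a1 & a2 & a3)
            ^ (1 ^ (a0 & a1)) ^ (1 ^ (a0 & a1 & a3)))

def truth_table():
    # Input x encodes (a0, a1, a2, a3) in bits 0..3.
    return [f(x & 1, (x >> 1) & 1, (x >> 2) & 1, (x >> 3) & 1)
            for x in range(16)]

def walsh(tt):
    # W(m) = sum over x of (-1)^(f(x) XOR parity(m & x)).
    # |W(m)| <= 8 for all nonzero m means no linear bias over 1/4.
    n = len(tt)
    return [sum((-1) ** (tt[x] ^ bin(m & x).count('1') % 2)
                for x in range(n)) for m in range(n)]
```

Running this confirms the function is 0/1 balanced (eight ones in the
truth table, i.e. W(0) = 0); by Parseval the squared Walsh values always
sum to 2^(2n) = 256.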

Any comments on my method?
--
Tom St Denis
---
http://tomstdenis.home.dhs.org



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Sun, 03 Jun 2001 12:08:38 GMT


"Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> [EMAIL PROTECTED] wrote:
>
> : Then again, compression would seem to reduce the number of ciphertext
> : characters required for a unique, meaningful decompressed decipherment
> : so maybe it *reduces* the unicity distance, which is a benefit to the
> : cryptanalyst.
>
> : Is this right?
>
> Not at all.  Lots of correct looking decrypts *hinders* the cryptanalyst -
> since he has no idea which one is the real message.
>
> Similarly, the best place to hide a tree is in a forest.
>
> The cryptanalyst's task is easiest when there's only one correct looking
> decrypt, and all the others look like random garbage.
>
> Compression increases the unicity distance, by decreasing the redundancy
> in the inputs to the cypher.

Not really.  Again, think of this from an attacker's POV.

I'm going to guess a key, I'm going to decrypt, I'm going to try to
decompress, I'm going to check for English

vs

I'm going to guess a key, I'm going to decrypt, I'm going to check for
English

You've just added a step in there.  I'm still quite capable of doing the
rest of the attack.  Remember that most compression codecs are
deterministic, so the transformation doesn't change the meaning of the
message, just its format.
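The two attack loops are easy to sketch. The toy single-byte XOR "cipher"
and the zlib compressor below are stand-ins chosen for brevity, not
anything from the thread; note in passing that with a non-bijective
compressor like zlib, most wrong keys fail to decompress at all:

```python
import zlib

def toy_decrypt(ct: bytes, key: int) -> bytes:
    # Toy single-byte XOR "cipher" standing in for a real one.
    # XOR is self-inverse, so this also serves as toy_encrypt.
    return bytes(c ^ key for c in ct)

def brute_force(ct: bytes):
    # Guess a key, decrypt, try to decompress, keep whatever survives.
    hits = []
    for key in range(256):
        pt = toy_decrypt(ct, key)
        try:
            msg = zlib.decompress(pt)   # the extra step Tom mentions
        except zlib.error:
            continue                    # wrong key: won't even decompress
        hits.append((key, msg))
    return hits
```

Whether that decompression failure helps or hurts the attacker is exactly
the point in dispute in this thread.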
> --
> __________
>  |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/



------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 12:00:29 GMT

Matt Timmermans <[EMAIL PROTECTED]> wrote:

: I'm actually not a fan of the compression-before-encryption thing [...]

Yet you are the author of a program (BICOM) that compresses and then
encrypts - what gives? ;-)
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 12:12:50 GMT

SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:

: If you have a ciphertext, then ideally, according to Shannon, you would
: want every possible key to decrypt and decompress to a unique plaintext.
: This implies bijective compression. Yet the so-called experts seem too
: stupid to notice this fact.

That's not /quite/ what Shannon's perfect secrecy implies:

Perfect secrecy means that the cyphertext contains no information about
the plaintext.  This can happen if one cyphertext maps to one plaintext -
but can also happen if exactly ten cyphertexts map to every plaintext.

I don't think perfect secrecy logically implies 1-1 compression.
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

Subject: Re: Question about credit card number
Reply-To: Anne & Lynn Wheeler <[EMAIL PROTECTED]>
From: Anne & Lynn Wheeler <[EMAIL PROTECTED]>
Date: Sun, 03 Jun 2001 12:31:18 GMT

[EMAIL PROTECTED] (Roger Fleming) writes:

> To be fair, most people indeed cannot remember 8 digit PINs. But they could 
> use passphrases instead, or issue X.509 certs, or at least put in a long delay 
> (and report to security) every 3 errors. All of these, however, require a 
> little work to transfer onto the web from a PIN based system on stateful 
> machines, and work means eroding the bottom line.

... or support x9.59 

http://www.garlic.com/~lynn/


-- 
Anne & Lynn Wheeler   | [EMAIL PROTECTED] -  http://www.garlic.com/~lynn/ 

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Dynamic Transposition Revisited Again (long)
Date: 3 Jun 2001 13:59:38 GMT

This is the result of running the sci.crypt FAQ (all 10 
sections) through a byte/bit counter.  (NB: DOS format, i.e. CRLFs at 
the end of lines.) 

Maximum bits in file = 1060616 (132577 bytes)
     Actual bits set =  467377 (44.067%)
               Bit 7 =       0 ( 0.000%)
               Bit 6 =   94724 (71.448%)
               Bit 5 =  120046 (90.548%)
               Bit 4 =   38573 (29.095%)
               Bit 3 =   46584 (35.137%)
               Bit 2 =   59031 (44.526%)
               Bit 1 =   46759 (35.269%)
               Bit 0 =   61660 (46.509%)

Since the bias is not too bad for ASCII text, why not (pseudorandomly) 
invert half the bits?  Assuming large blocks, this should correct 
the bias.
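For reference, a byte/bit counter of the sort used above is only a few
lines of Python (the percentages in the table are these counts divided by
the number of bytes):

```python
# Count, for each bit position 0..7, how many bytes in the data have
# that bit set, plus the overall number of set bits.

def bit_counts(data: bytes):
    total_set = 0
    per_bit = [0] * 8
    for b in data:
        for i in range(8):
            if b & (1 << i):
                per_bit[i] += 1
                total_set += 1
    return total_set, per_bit
```

On plain ASCII input, per_bit[7] comes out zero, matching the "Bit 7 = 0"
row above.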

------------------------------

From: "demon news" <[EMAIL PROTECTED]>
Subject: UK legislation on cryptographic products
Date: Sun, 3 Jun 2001 15:20:41 +0100

Hello!

    Could anybody let me know what the current legislation is for
cryptographic products in the UK?

    I have just finished the development of a cryptography research project,
which includes the implementation of strong symmetric and asymmetric
encryption engines, and I would like to include a section on the current
legislation.

    I have gone through some of the documentation contained in the
Electronic Communications Act (2000), but I cannot make sense of half
of the points expressed in it.

Thanks very much in advance.
Alex.






------------------------------

From: "demon news" <[EMAIL PROTECTED]>
Subject: UK legislation regarding cryptography
Date: Sun, 3 Jun 2001 15:22:32 +0100

Hello!

    Could anybody let me know what the current legislation is for
cryptographic products in the UK?

    I have just finished the development of a cryptography research project,
which includes the implementation of strong symmetric and asymmetric
encryption engines, and I would like to include a section on the current
legislation.

    I have gone through some of the documentation contained in the
Electronic Communications Act (2000), but I cannot make sense of half
of the points expressed in it.

Thanks very much in advance.
Alex.






------------------------------

From: "demon news" <[EMAIL PROTECTED]>
Subject: Re: Making n => 1 bit functions
Date: Sun, 3 Jun 2001 15:22:59 +0100


"Tom St Denis" <[EMAIL PROTECTED]> wrote in message
news:FJpS6.15541$[EMAIL PROTECTED]...
> I got bored last night so I tried out some methods for making n=>1 bit
> functions like those in Haval, etc..
> [...]


------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unicity distance and compression for AES
Date: 3 Jun 2001 14:52:08 GMT

[EMAIL PROTECTED] (Tim Tyler) wrote in <[EMAIL PROTECTED]>:

>That's not /quite/ what Shannon's perfect secrecy implies:
>
>Perfect secrecy means that the cyphertext contains no information about
>the plaintext.  This can happen if one cyphertext maps to one plaintext -
>but can also happen if exactly ten cyphertexts map to every plaintext.
   I'd have to look it up, but there is a difference between ideal and
perfect. I thought that in a perfect system every key tested has to lead
to a unique input message, so the bijection is among the set of three: the
key, the plaintext and the message. But of course many ciphertexts map to
the same input - just change the key.
 
>
>I don't think perfect secrecy logically implies 1-1 compression.

  I think you're right. The compressor would not always have to be
bijective to work. A simple example: take a bijective compressor and
add the letters "PK" to the front of the text, then encrypt the message
to get a ciphertext. Now examine what happens if we test every key.
If every key leads to a file with "PK" in front of it, then the overall
system could still be a bijective compression-encryption system even
though the compressor was not bijective. But you realize strange
capabilities would have to be added to the encryptor to get a system
like this. For one, the encryption part of the system itself would not
be bijective, since each ciphertext would have to decrypt to something
with "PK" as the first two characters.

  But it is clear that, for a given possible ciphertext, one can form the
set of possible inputs to the encryptor portion: the set created by
encrypting every message and then, for each ciphertext, testing every
key to get the input form of the data to the encryptor. Many such
messages will be the result of different key/ciphertext combinations,
and many will be repeated. But once you have this third set, it is clear
that any member of it, when decompressed and recompressed, has to come
back to itself. If it does not, then information about the plaintext is
gained. This feature is lacking in most compressors. So while you're
right that it does not strictly require true bijection in the sense I
use, it does require, if not bijection, a very strange relationship
between the compressor and the encryptor portion of the system.

  However, if the encryptor portion is bijective from some input
form to binary output files, the compressor would have to be bijective
from its input set to the form the encryptor expects as input.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSIOM"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
