Cryptography-Digest Digest #200, Volume #10       Wed, 8 Sep 99 13:13:04 EDT

Contents:
  Re: Different Encryption Algorithms (John Savard)
  Re: Linear congruential generator (LCG)
  Re: compression and encryption (Patrick Juola)
  Re: THE NSAKEY (jerome)
  Re: Linear congruential generator (LCG)
  Re: compression and encryption (SCOTT19U.ZIP_GUY)
  Re: Confused about public key encryption (DJohn37050)
  Re: NSAKEY as an upgrade key  (Was: NSA and MS windows) ("Trevor Jackson, III")
  Re: Different Encryption Algorithms (Anton Stiglic)
  Re: GnuPG 1.0 released
  Re: Hash of a file as key (Anton Stiglic)
  Re: GnuPG 1.0 released (JPeschel)
  Re: NSA and MS windows (SCOTT19U.ZIP_GUY)
  Re: Hash of a file as key ("Richard Parker")
  Re: MUM III (3 Way Matrix Uninvertable Message) (Tom St Denis)
  Re: arguement against randomness (Tim Tyler)
  Re: THE NSAKEY (SCOTT19U.ZIP_GUY)
  Re: Random and pseudo-random numbers
  Re: NSA and MS windows (Patrick Juola)
  Re: Linear congruential generator (LCG) (Tim Tyler)
  Re: Random and pseudo-random numbers (Tim Tyler)
  Re: compression and encryption (Tom St Denis)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Different Encryption Algorithms
Date: Wed, 08 Sep 1999 15:41:39 GMT

"entropy" <[EMAIL PROTECTED]> wrote, in part:

>I'm doing a high school research paper on different encryption algorithms,
>such as CAST, IDEA, blowfish, RCx, DES, etc.   Could anyone point me to
>informative web sites pertaining to the differences between these encryption
>methods?

IDEA, DES, RC6, and Blowfish are described on my web page, if that
helps.

John Savard ( teneerf<- )
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Linear congruential generator (LCG)
Date: 8 Sep 99 14:24:24 GMT

David Goodenough ([EMAIL PROTECTED]) wrote:
: However, on the subject of LCGs, I seem to remember that a certain
: operating system once used an LCG as its "random number generator"
: (sic), that had a strong tendency for the lower bits to go in a
: repeating cycle.  Is this just a property of that particular poor
: choice of a and b, or is this a problem with all LCGs?

Yes, it is a problem with all of them (when the modulus is a power of 2).
Each bit depends only on itself and the less significant ones, and thus
the last n bits always cycle through at most 2^n states.
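A quick numerical check of this claim; the multiplier and increment below are the classic C-library values, used purely for illustration:

```python
# Check the claim: for an LCG with modulus 2^31, the low n bits of the
# output cycle with period at most 2^n (here it is exactly 2^n).

def lcg(seed, a=1103515245, c=12345, m=2 ** 31):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def low_bit_period(n_bits, samples=4096):
    """Smallest p such that the low n_bits of the stream repeat every p steps."""
    g = lcg(1)
    bits = [next(g) % (2 ** n_bits) for _ in range(samples)]
    for p in range(1, samples):
        if all(bits[i] == bits[i + p] for i in range(samples - p)):
            return p

for n in range(1, 6):
    print(n, low_bit_period(n))   # prints 2^n: the low bits never see the high ones
```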

John Savard

------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: compression and encryption
Date: 8 Sep 1999 11:14:49 -0400

In article <7r5jp2$[EMAIL PROTECTED]>,
Shaun Wilde <[EMAIL PROTECTED]> wrote:
>
>should I compress my data before or after encryption? (binary data - with
>possibly repeated blocks i.e .exe etc)
>
>1) If I compress before encryption the final data block is small.
>2) If I compress after encryption the data block is much larger (hardly any
>saving as the encryption removes any repetitiveness
>that exists in the original data.)
>
>From the above I would say go for the 1st option, however I have a concern
>and it is as follows.
>
>If someone was trying to break the encryption all they would have to do is
>
>a) try a key
>b) try to decompress
>    if decompression works - no errors - then the odds are that they have
>broken the code
>    else repeat
>
>Which would lead to an automated attack, whereas the second approach would,
>in my opinion, require a more
>interactive approach - as you would need to know what sort of data exists in
>the original to know whether you
>have decrypted successfully.

Six of one, half a dozen of the other.  Lots of headerless compression
methods exist, which would imply that any file can be decompressed.
Similarly, "what sort of data exists" is a sufficiently general question,
to which sufficiently general answers are known, that automated
attacks are extremely practical.

In general, the result of an incorrect decryption with a good algorithm
is almost always "indistinguishable from random."  All I need is
a good test for randomness; if a candidate decryption fails that test,
then I'm on to something -- and this applies irrespective of whether my
plaintext is English, German, .EXE files, or financial records....

I'd recommend precompression on the can't-hurt-may-help theory.
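The automated attack being discussed can be sketched directly.  Everything below is illustrative: a two-byte repeating XOR "keystream" stands in for a real cipher, and zlib stands in for the compressor.

```python
# Sketch: a candidate key is rejected mechanically when the "decrypted"
# bytes fail to decompress.  The XOR "cipher" is a toy, not a real cipher.
import zlib

def toy_xor(data, key):
    # XOR with a repeating key; encryption and decryption are the same op.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"attack at dawn " * 40
key = b"\x42\x17"
ciphertext = toy_xor(zlib.compress(plaintext), key)

hits = []
for k0 in range(256):
    for k1 in range(256):
        guess = bytes([k0, k1])
        try:
            hits.append((guess, zlib.decompress(toy_xor(ciphertext, guess))))
        except zlib.error:
            pass  # wrong keys almost always fail to parse as a zlib stream
print(len(hits))  # the surviving guesses; typically only the right key
```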

        -kitten



------------------------------

From: [EMAIL PROTECTED] (jerome)
Subject: Re: THE NSAKEY
Date: 8 Sep 1999 14:50:56 GMT

On Wed, 08 Sep 1999 11:43:36 GMT, Tom St Denis wrote:
>
>You might not believe this but David Wagner is a smart, talented person.  He
>is not 'attached' to Bruce as you might think, he has done work with Rivest
>and others as well.  I think this thread is way out of line.
>

http://www.counterpane.com/cpaneinfo.html lists d.wagner as part of 
counterpane's personnel, and b.schneier is the president of counterpane.

i don't take a position in this debate; i simply point out that cryptography 
is a small world and it isn't exactly fair to say that an employee 
and his president aren't 'attached'.


------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Linear congruential generator (LCG)
Date: 8 Sep 99 14:27:47 GMT

Tony Wingo ([EMAIL PROTECTED]) wrote:
: If n is prime, the low bits behave much better.

On the other hand, if n is a power of 3, the low bits still behave "well",
but the last few digits, if the number is represented in base 3, behave
badly.

A prime modulus only spreads around the long-period nature of the first
bit of the result to cover the whole number; it doesn't make the whole
number composed of independent long-period pieces. Thus, the visible
weakness of a power-of-2 modulus generator is masked, but not resolved, by
a prime modulus.

John Savard

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: compression and encryption
Date: Wed, 08 Sep 1999 16:55:32 GMT

In article <7r5uh9$6l0$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Patrick Juola) 
wrote:
>In article <7r5jp2$[EMAIL PROTECTED]>,
>Shaun Wilde <[EMAIL PROTECTED]> wrote:
>>
>>should I compress my data before or after encryption? (binary data - with
>>possibly repeated blocks i.e .exe etc)
>>
>>1) If I compress before encryption the final data block is small.
>>2) If I compress after encryption the data block is much larger (hardly any
>>saving as the encryption removes any repetitiveness
>>that exists in the original data.)
>>
>>From the above I would say go for the 1st option, however I have a concern
>>and it is as follows.
>>
>>If someone was trying to break the encryption all they would have to do is
>>
>>a) try a key
>>b) try to decompress
>>    if decompression works - no errors - then the odds are that they have
>>broken the code
>>    else repeat
>>
>>Which would lead to an automated attack, whereas the second approach would,
>>in my opinion, require a more
>>interactive approach - as you would need to know what sort of data exists in
>>the original to know whether you
>>have decrypted successfully.
>
>Six of one, half a dozen of another.  Lots of headerless compression
>methods exist, which would imply that any file can be compressed.

   Actually this is what I first thought. But it is not true. However, you can
test the compression/decompression routines quite easily yourself to see
whether they have flaws that make it easier for an attacker to break.
I am assuming that the method you select has the ability to compress 
a file and decompress it back to the same file, or you would not even be
considering it.
 But to check the reverse you must create several random binary test
files of various lengths. Check that when you decompress these and then
recompress, you get back the starting file. This is an easy test.
If Mr Patrick Juola is so damn certain many such compression methods are
common, maybe he could actually name one such method. But then
again he is the type to spout "facts" without even thinking. I have
looked for such "headerless" methods and have not found any. That
is why I made my own compression routine at 
http://members.xoom.com/ecil/compress.htm and you can feel
free to test it.
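That round-trip test is easy to run against a stock compressor; here is a sketch with zlib standing in as the compressor under test (the file sizes are arbitrary):

```python
# The round-trip test described above, run against zlib: treat random
# byte strings as "compressed" files, and see whether decompressing and
# then recompressing gives back the starting bytes.
import os
import zlib

def survives_round_trip(blob):
    try:
        return zlib.compress(zlib.decompress(blob)) == blob
    except zlib.error:
        return False  # not even a valid compressed stream

random_files = [os.urandom(n) for n in (16, 64, 256, 1024)]
print([survives_round_trip(b) for b in random_files])   # all False for zlib

# A blob that really came out of the compressor does survive:
print(survives_round_trip(zlib.compress(b"some real data")))   # True
```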


>Similarly, "what sort of data exists" is a sufficiently general question
>to which sufficiently general answers are known that automated
>attacks are extremely practical.
>
>In general, the result of an incorrect decryption with a good algorithm
>are almost always "indistinguishable from random."  All I need is
>a good test for randomness, and if the test fails, then I'm on to
>something -- this applies irrespective of whether my plaintext is
>English, German, .EXE files, or financial records....
>
>I'd recommend precompression on the can't-hurt-may-help theory.
>
>        -kitten
>
>


David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Confused about public key encryption
Date: 08 Sep 1999 15:58:11 GMT

EC methods in P1363 are based on EC analogs of DL methods, not RSA methods. 
There is an EC version of RSA, but I hear it does not have any advantage over
regular RSA.
Don Johnson

------------------------------

Date: Wed, 08 Sep 1999 11:58:37 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: NSAKEY as an upgrade key  (Was: NSA and MS windows)

[EMAIL PROTECTED] wrote:

> Thomas J. Boschloo ([EMAIL PROTECTED]) wrote:
> : Microsoft's explanation "Why is a backup key needed?" is bogus (they
> : claim it would be needed for when the building in which it is kept is
> : destroyed by a natural disaster, LOL).
>
> Well, while keeping two copies of the key would solve that, two copies of
> the same secret key won't help if one key is _compromised_. For that, a
> second key, to which the corresponding secret key is stored _elsewhere_,
> would serve a useful backup function.

This only makes sense if there is a revocation mechanism for the primary
key.  Do you see such a mechanism?


------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Different Encryption Algorithms
Date: Wed, 08 Sep 1999 11:13:29 -0400

Kostadin Bajalcaliev wrote:

> Hello
>
> Usually people do not publish comparisons of algorithms, you can easily find

There are a couple of benchmark comparisons; here is just one:

http://security.nsj.co.jp/products/cstv6/benchmark.html


------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: GnuPG 1.0 released
Date: 8 Sep 99 14:21:41 GMT

JPeschel ([EMAIL PROTECTED]) wrote:
:  [EMAIL PROTECTED] () writes in part:

: >but PGP is not approved for export.

: John, I believe NAI got PGP export approval a couple years ago,
: shortly after it acquired PGP, Inc.

I should have been more complete; unless I'm badly mistaken, full PGP
couldn't get export approval from the U.S. because the keys are too long.
But NAI may have had permission to market foreign-written versions.

John Savard

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Hash of a file as key
Date: Wed, 08 Sep 1999 10:54:35 -0400

Gary wrote:

> Would using the hash of a file (just before its symmetric encryption with
> the session date and time as salt) as a session key be a bad idea?

If an attacker can ever predict that file, he has a great advantage.
Why not just go for a key generated by a pseudo-random device?
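The two options side by side, as a sketch (the file contents and salt are made up; SHA-256 and the OS entropy source stand in for whatever primitives are actually used):

```python
# Gary's file-hash key (with a date/time salt) next to Anton's suggestion
# of a key drawn straight from a random source.
import hashlib
import secrets

def key_from_file(file_bytes, salt):
    # Deterministic: anyone who can guess the file and the salt gets the key.
    return hashlib.sha256(salt + file_bytes).digest()

def random_key():
    # 256 bits from the OS entropy source; nothing for an attacker to guess.
    return secrets.token_bytes(32)

k1 = key_from_file(b"the file being encrypted", b"1999-09-08 10:54:35")
k2 = random_key()
print(len(k1), len(k2))   # two 32-byte keys
```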

Anton


------------------------------

From: [EMAIL PROTECTED] (JPeschel)
Subject: Re: GnuPG 1.0 released
Date: 08 Sep 1999 12:21:18 GMT

 [EMAIL PROTECTED] () writes in part:

>but PGP is not approved for export.

John, I believe NAI got PGP export approval a couple years ago,
shortly after it acquired PGP, Inc.

Joe

__________________________________________

Joe Peschel 
D.O.E. SysWorks                                 
http://members.aol.com/jpeschel/index.htm
__________________________________________


------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: NSA and MS windows
Date: Wed, 08 Sep 1999 16:59:32 GMT

In article <[EMAIL PROTECTED]>, Jim Russell <[EMAIL PROTECTED]> wrote:
>BillU> I do not think you understand cryptography.
> 
>DavidW> No, not nearly as well as I'd like to...
>
>Bill, you should note that David is being unduly modest here.  Perhaps
>you should drop by counterpane.com, and find out who the creators of the
>Twofish algorithm are.
>
>Jim Russell
>LockStar, Inc.

 But Jim, how do we know that the actual creators of Twofish are in
Minnesota? What about possible input from people in the Fort Meade area.



David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: "Richard Parker" <[EMAIL PROTECTED]>
Subject: Re: Hash of a file as key
Date: Wed, 08 Sep 1999 15:37:59 GMT

"John Savard" <[EMAIL PROTECTED]> wrote:
> One technique:
> 
> - divide a file into a hash-length piece and the rest,
> - hash the rest, XOR the result with the first piece
> - encrypt the rest with the XORed first piece and hash as the key
> - use RSA to encrypt the first piece, also the session key for the rest,
> plus as much of the rest as will fit
>
> While I think it's optimal, it seems to be covered by a patent.

Both your Hash/Encrypt construction and Mihir Bellare's CBC-MAC/CBC
construction in particular are very similar to the construction used by
Ross Anderson and Eli Biham for the BEAR cipher.  Unfortunately,
the technical report in which BEAR appears was published ten months
after Bellare applied for his patent, so it appears that it can't be
used as an example of prior art.
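A toy rendering of the first three quoted steps (the RSA step is omitted; SHA-256 and a hash-counter keystream are illustrative stand-ins for the real hash and cipher):

```python
# Split off a hash-length first piece, hash the rest, XOR that hash into
# the first piece, and use the result as the key for encrypting the rest.
import hashlib

def keystream_xor(data, key):
    # Toy cipher: XOR data against a SHA-256-in-counter-mode keystream.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

H = 32  # hash length in bytes

def wrap(file_bytes):
    first, rest = file_bytes[:H], file_bytes[H:]
    key = bytes(a ^ b for a, b in zip(first, hashlib.sha256(rest).digest()))
    return key, keystream_xor(rest, key)   # 'key' is what would go under RSA

key, ct = wrap(b"A" * H + b"the rest of the file")
rest = keystream_xor(ct, key)                       # undo the toy cipher
first = bytes(a ^ b for a, b in zip(key, hashlib.sha256(rest).digest()))
print(first == b"A" * H)   # the original first piece comes back
```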

-Richard

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: MUM III (3 Way Matrix Uninvertable Message)
Date: Wed, 08 Sep 1999 16:10:03 GMT

In article <Z_sB3.6547$C7.149312@wards>,
  "Gary" <[EMAIL PROTECTED]> wrote:
> MUM III (3 Way Matrix Uninvertable Message)
>
> How would this be cracked?
>
> Alice has a message which she turns into an uninvertable square matrix, M.
> She picks a random invertable square matrix, A.
>
> Alice sends Bob the product of these 2 matrices, AM.
>
> Bob generates a random invertable square matrix, B.
>
> Bob sends Alice the product AMB.
> Alice then sends (Inverse of A)AMB=IMB=MB, where I is the identity matrix.
> Bob now has MB(Inverse of B)=MI=M.
>
> Gary.

Quite simply, no encryption algorithm can be made solely out of one-way
functions at M (or, more specifically, a non-invertible function at M).

For example, with C = F(M): if P != F'(C), then no matter what you do it
won't decrypt.

I think you have to rethink the algorithm a bit.

Note:  non-invertible functions can be used in ciphers, but not as the sole
mechanism.  Think of RSA with composite moduli.... won't work well, will it?

Tom
--
damn windows... new PGP key!!!
http://people.goplay.com/tomstdenis/key.pgp
(this time I have a backup of the secret key)


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: arguement against randomness
Reply-To: [EMAIL PROTECTED]
Date: Wed, 8 Sep 1999 16:16:13 GMT

Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:

:> Whether the analogy holds true depends on to what extent the universe
:> is fundamentally a cellular automaton - i.e. to what extent Fredkin's
:> "Digital Mechanics" holds.

: Good; then we're perfectly safe.

Not according to Digital Physics.  See http://cvm.msu.edu/~dobrzele/dp/
for an example of how this could be the case.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

        <--------- The information went data way ----------->

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: THE NSAKEY
Date: Wed, 08 Sep 1999 17:16:54 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (jerome) wrote:
>On Wed, 08 Sep 1999 11:43:36 GMT, Tom St Denis wrote:
>>
>>You might not believe this but David Wagner is a smart, talented person.  He
>>is not 'attached' to Bruce as you might think, he has done work with Rivest
>>and others as well.  I think this thread is way out of line.
>>
>
>http://www.counterpane.com/cpaneinfo.html lists d.wagner as part of 
>counterpane's personnel, and b.schneier is the president of counterpane.
>
>i don't take a position in this debate; i simply point out that cryptography 
>is a small world and it isn't exactly fair to say that an employee 
>and his president aren't 'attached'.
>

 Yes, if David Wagner showed too much independent thought he might
be out of a job. And from what people have told me, Bruce needs a lot
of stroking and does not accept valid criticism very well. So David
Wagner is just playing it smart. This may also be a valid reason why
he has attacked my method so fiercely, but then, when push comes
to shove, he says it is too difficult for him to actually read the source
code and follow what it does.
 It is my understanding that for some reason Bruce hates my guts
though we have never met.





David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Random and pseudo-random numbers
Date: 8 Sep 99 14:19:13 GMT

Eric Lee Green ([EMAIL PROTECTED]) wrote:
: In any event, other than my initial state I don't have to be
: "truly" random (and I seriously question whether you can get such a
: thing, even measuring cosmic rays with a geiger counter doesn't produce
: truly random numbers), I need to be unpredictable. Which is almost, but
: not quite, the same thing 

Well, that is my point. You do need an initial state that contains at
least as much true randomness as one of the keys you're generating.

Then, if you generate random numbers by a good cryptosecure method - and
you should use something elaborate, slower than regular encryption - that
would indeed be adequate, if not perfect (IMO; I think others will be
more demanding).

But if your initial randomness is smaller than the keys you generate, then
a brute-force search over the smaller pool of possibilities for the initial
randomness will do you in. To make 1024-bit keys, you *need* 1024 bits of
true randomness to start from, and 2048 bits is better. (Even then, because
the generator is deterministic from then on, the internal state of the PRNG
is as sensitive as any key, and is therefore an additional vulnerability to
physical compromise.)
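The brute-force-the-seed point can be sketched directly: below, a nominally 1024-bit key is expanded from only 16 bits of true randomness, so the attacker just enumerates the 2^16 possible seeds.  The hash-counter generator is a toy stand-in, not a recommendation.

```python
# Recover a "1024-bit" key whose only real entropy is a 2-byte seed.
import hashlib
import os

def prng_key(seed, nbytes=128):
    # Toy deterministic PRNG: SHA-256 of seed || counter, concatenated.
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])

true_seed = os.urandom(2)           # only 16 bits of real entropy
victim_key = prng_key(true_seed)    # nominally a 1024-bit key

recovered = next(
    k for s in range(2 ** 16)
    if (k := prng_key(s.to_bytes(2, "big"))) == victim_key
)
print(recovered == victim_key)   # True: 65536 guesses, not 2^1024
```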

John Savard

------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: NSA and MS windows
Date: 8 Sep 1999 12:38:39 -0400

In article <7r615f$2hu4$[EMAIL PROTECTED]>,
SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>In article <[EMAIL PROTECTED]>, Jim Russell <[EMAIL PROTECTED]> 
>wrote:
>>BillU> I do not think you understand cryptography.
>> 
>>DavidW> No, not nearly as well as I'd like to...
>>
>>Bill, you should note that David is being unduly modest here.  Perhaps
>>you should drop by counterpane.com, and find out who the creators of the
>>Twofish algorithm are.
>>
>>Jim Russell
>>LockStar, Inc.
>
> But Jim, how do we know that the actual creators of Twofish are in Minnesota?

They aren't.  At least one of them is (was) a student of mine in
Colorado.

Which says absolutely nothing about whether Mr. Wagner (should it be
Dr. Wagner?) "understand[s] cryptography."  His being part of the
Twofish team suggests that if he doesn't understand cryptography, he
must make a *hell* of a good cup of coffee. 8-)

        -kitten

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Linear congruential generator (LCG)
Reply-To: [EMAIL PROTECTED]
Date: Wed, 8 Sep 1999 16:08:42 GMT

David Goodenough <[EMAIL PROTECTED]> wrote:

: However, on the subject of LCGs, I seem to remember that a certain
: operating system once used an LCG as its "random number generator"
: (sic), that had a strong tendency for the lower bits to go in a
: repeating cycle.

Java does this.  The java.util.Random class uses an LCG and picks
a modulus which /happens/ to be a power of two (despite the warning
in Knuth's TAOCPv2 concerning doing this).

To see the results, look at the Java applet at:
http://www.alife.co.uk/nonrandom/

: Is this just a property of that particular poor
: choice of a and b, or is this a problem with all LCGs?

Picking a power of two for 'n' caused the problem in Sun's case.

However, LCGs are rather poor RNGs at the best of times, partly
due to lack of signal propagation from the high bits to the
lower ones.
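java.util.Random's underlying LCG can be re-implemented in a few lines to see the effect; the constants are the ones documented for that class, everything else here is illustrative:

```python
# java.util.Random's core recurrence: state' = (A*state + C) mod 2^48.
# The low bits of the 48-bit state cycle quickly: the low 3 bits repeat
# every 8 steps, no matter what seed is chosen.
A, C, M = 0x5DEECE66D, 0xB, 2 ** 48

def java_states(seed, count):
    s = (seed ^ A) % M           # Java scrambles the seed this way
    for _ in range(count):
        s = (A * s + C) % M
        yield s

low3 = [s % 8 for s in java_states(42, 24)]
print(low3)   # the same 8 values, three times over
```

(Java's `next(bits)` returns only the top bits of this state, which masks but does not remove the short cycles in the low bits.)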
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

'i' before 'e', except in "pleiotropy".

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Random and pseudo-random numbers
Reply-To: [EMAIL PROTECTED]
Date: Wed, 8 Sep 1999 16:00:00 GMT

[EMAIL PROTECTED] wrote:

: To make 1024-bit keys, you *need* 1024 bits of
: true randomness to start from, and 2048 bits is better.

If you want 1024 bit keys - and you have 1024 bits of true randomness -
what more could you possibly ask for?
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

If it's God's will, who gets the money?

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: compression and encryption
Date: Wed, 08 Sep 1999 16:20:50 GMT

In article <7r5oln$1qhq$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY) wrote:
> In article <7r5jp2$[EMAIL PROTECTED]>, "Shaun Wilde" 
><[EMAIL PROTECTED]> wrote:
> >
> >should I compress my data before or after encryption? (binary data - with
> >possibly repeated blocks i.e .exe etc)
> >
> >1) If I compress before encryption the final data block is small.
> >2) If I compress after encryption the data block is much larger (hardly any
> >saving as the encryption removes any repetitiveness
> >that exists in the original data.)
> >
> >From the above I would say go for the 1st option, however I have a concern
> >and it is as follows.
> >
> >If someone was trying to break the encryption all they would have to do is
> >
> >a) try a key
> >b) try to decompress
> >    if decompression works - no errors - then the odds are that they have
> >broken the code
>      This is true if you use most compression methods. But if you use
> a "one to one" compressor, any file can be the compressed result of
> another file. Therefore all files that could result from guessing a wrong key
> would be uncompressable. See http://members.xoom.com/ecil/compress.htm
> If you are like me you may have wondered why PGP was not designed with
> this type of compression. I feel that a weak compressor can be used as
> a back door to help with the breaking of encryption.

Technically DEFLATE can decompress any data but will not produce anything
usable if the input stream is invalid.  By your definition DEFLATE is
'one-to-one'.  I don't see how this helps.... oh well.

>      The second approach is far worse since the enemy would have to uncompress
> only once.

So what?

>  Yes you have the right to be worried. Most books put out by the
> experts fail to cover this topic. It is most likely not covered on purpose.
> If you notice, Mr Bruce or Wagner will not even touch the topic since
> it is a likely back door to such methods as PGP. And people such as
> them are afraid to make trouble for the NSA.

What are you talking about?  Are you at all lucid today?

PGP uses DEFLATE to achieve HIGHER compression ratios (you know, the point of
compression).  They don't use simple statistical coders because they perform
far worse.

Right after you find a practical break of PGP I will swear allegiance to your
weird clan of thinking.  You think that only your ideas are valid, but a lot
of what you say is moot.

AONE for example is not needed if you are smart enough to use a MAC or sign
the message.  Thus integrity can be ensured as long as the signature or MAC
(the cipher) is not broken.

One-to-one compression, while supported by DEFLATE, is utterly useless because
it doesn't enhance security any (it does help the ratio though).  At any rate
I can simply decrypt/decompress to verify ASCII.... it doesn't help much.
Plus, Huffman-coded files are rarely 'good random data', so correlations
could be found or used in the binary (compressed) stream if one knows the
input language.  For example 'e' will be near the top of the tree... etc...
etc..

BTW, stop making fun of Mr Wagner and Mr Schneier; I kinda admire them for
their brains, talents and know-how.  You most likely mock them because you
have too much time on your hands....  Why not be creative for once instead
of destructive?

Tom
--
damn windows... new PGP key!!!
http://people.goplay.com/tomstdenis/key.pgp
(this time I have a backup of the secret key)


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
