Cryptography-Digest Digest #698

2001-02-16 Thread Digestifier

Cryptography-Digest Digest #698, Volume #13  Fri, 16 Feb 01 14:13:01 EST

Contents:
  My encryption system. (Keill_Randor)
  Re: Triple-DES MAC (Paul Schlyter)
  Re: Triple-DES MAC (DJohn37050)
  Re: Triple-DES MAC ("Christian Schwarz")
  Re: Ciphile Software:  Why .EXE files so large ("CMan")
  Re: National Security Nightmare? ("CMan")
  Re: Big Numbers in C/C++ (JCA)
  Re: Super strong crypto (Steve Portly)
  Re: Ciphile Software:  Why .EXE files so large (phil hunt)
  Re: "RSA vs. One-time-pad" or "the perfect enryption" ([EMAIL PROTECTED])
  MD5 - VB and ASP code available free ("Phil Fresle")
  Re: Digital signature w/o original document (Mike Rosing)
  Re: /dev/random under Linux (Mike Rosing)
  Re: My encryption system. (Quisquater)
  Reverse encoding question (Paul Starzetz)
  Re: Reverse encoding question ("Henrick Hellström")



From: Keill_Randor [EMAIL PROTECTED]
Subject: My encryption system.
Date: Fri, 16 Feb 2001 13:28:28 +

It seems that you are all ignoring my challenge.. Oh well...

The encryption system I have (in response to other posts on this board)
IS different and a generation ahead of any other encryption system in the world
 that I know of.  Also (in response to other posts), a one-time pad IS NOT
the best system possible.  Mine is.  To understand why that is the case, you
 have to understand exactly what data encryption is about in the first place.
This seems to be the number one problem with everyone at the minute:
no-one understands the problem, so no wonder nobody (apart from myself)
 has actually solved it.

All data encryption is about is changing one piece of information into another, in
such a way as to allow you a) to get it back later, and b) to stop any
'unauthorised' people finding out what it originally was.  The ULTIMATE
solution, therefore, is to split a piece of information into two or more
 EXISTING (innocuous) pieces of information that CANNOT INDIVIDUALLY BE PROVEN
 TO BE ENCRYPTED.

My system at its best can do this (though I have no doubt that it will be
very difficult).

The by-product of this is being able to turn ANY piece of information into ANY
other piece of information, which again makes it uncrackable.  (And completely
screws up a lot of laws I know about.)

At its best (if splitting it into two or more existing pieces isn't possible),
 my system can do a:

Compound, non-repeating, multiple solution, multiple key, multiple algorithm,
multiple dimension, multiple depth, variable size encrypt, with multiple phase
 and multiple direction encoding, and (optional) multiple variable ciphers

Trust me, if I encrypted something with all of this attached, then NO-ONE would
 ever crack OR solve it, without knowing EVERYTHING about it.


Still looking for a job.  (Any offers???). (I cannot drive though, and I am
 currently broke...).

(P.S. If no-one else has what I have, does that make me King Cryppie???).

Darren Tomlyn
[EMAIL PROTECTED]
___
Submitted via WebNewsReader of http://www.interbulletin.com


--

From: [EMAIL PROTECTED] (Paul Schlyter)
Subject: Re: Triple-DES MAC
Date: 16 Feb 2001 14:28:53 +0100

In article 96iror$g7f$02$[EMAIL PROTECTED],
Christian Schwarz [EMAIL PROTECTED] wrote:
 
 hello,
 i've to calculate a MAC using Triple-DES algorithm for memory card
 
 sorry, i forgot to mention that the used Triple-DES algorithm is working in
 CBC mode.
 
Well, then it's even easier: just do a regular CBC encryption of your
plaintext, and discard all blocks except the last one, which becomes
your MAC.
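For readers who want to see the shape of that construction, here is a minimal sketch of CBC-MAC in Python. The block cipher is a deliberately fake stand-in (real code would use DES or two-key triple DES from a crypto library), and the zero-padding convention and function names are my illustrative assumptions, not part of the original post.

```python
# Sketch of the CBC-MAC idea described above: CBC-encrypt the message
# and keep only the final ciphertext block as the MAC.

BLOCK = 8  # DES block size in bytes

def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # NOT a real cipher -- a keyed stand-in for DES/3DES so the
    # chaining structure is visible without a crypto library.
    return bytes((b + key[i % len(key)]) % 256 for i, b in enumerate(block))

def cbc_mac(key: bytes, message: bytes) -> bytes:
    # Zero-pad to a whole number of blocks (one common convention;
    # real protocols specify their padding explicitly).
    if len(message) % BLOCK:
        message += b"\x00" * (BLOCK - len(message) % BLOCK)
    state = b"\x00" * BLOCK  # zero IV, as usual for CBC-MAC
    for i in range(0, len(message), BLOCK):
        block = message[i:i + BLOCK]
        # XOR the running state into the block, then encrypt: plain CBC.
        state = toy_block_cipher(key, bytes(a ^ b for a, b in zip(state, block)))
    return state  # the last ciphertext block is the MAC
```

In practice the final block is often truncated further (a later post in this thread suggests keeping the leftmost 32 bits), and the padding and truncation rules have to be agreed on by both sides.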
 
-- 

Paul Schlyter,  Swedish Amateur Astronomer's Society (SAAF)
Grev Turegatan 40,  S-114 38 Stockholm,  SWEDEN
e-mail:  pausch at saaf dot se   or   paul.schlyter at ausys dot se
WWW:     http://hotel04.ausys.se/pausch   http://welcome.to/pausch

--

From: [EMAIL PROTECTED] (DJohn37050)
Date: 16 Feb 2001 15:37:25 GMT
Subject: Re: Triple-DES MAC

Truncate to leftmost 32 bits.  Else there is a chaining text attack (not on the
key).
Don Johnson

--

From: "Christian Schwarz" [EMAIL PROTECTED]
Subject: Re: Triple-DES MAC
Date: Fri, 16 Feb 2001 16:47:04 +0100

First of all, thanks for your response, but I have some additional questions.

How long are DES-encrypted blocks? 64 bits?

I guess there are a lot of sample implementations. Let's say I encrypt the same
data using the same key with different implementations: is the output equal? If
not, is there a reference implementation available which should be used?

I have a DES library written by Eric Young. There's an "inner triple cbc" and
an "outer triple cbc" mentioned. What's the difference, and which one
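The distinction the post asks about can be sketched as follows, under the usual reading: "outer" CBC runs a single chaining loop around the whole E-D-E operation, while "inner" CBC chains each of the three DES passes separately. Outer CBC is generally the interoperable choice. The toy_des function below is a stand-in, not real DES, and the function names are mine; only the chaining structure is the point.

```python
BLOCK = 8  # DES block size in bytes

def toy_des(key: bytes, block: bytes, decrypt: bool = False) -> bytes:
    # NOT real DES -- an invertible keyed stand-in for illustration only.
    shift = sum(key) % 256
    if decrypt:
        return bytes((b - shift) % 256 for b in block)
    return bytes((b + shift) % 256 for b in block)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def outer_3des_cbc(k1, k2, k3, iv, blocks):
    # One CBC chain around the complete E-D-E operation per block.
    out, prev = [], iv
    for blk in blocks:
        c = toy_des(k3, toy_des(k2, toy_des(k1, xor(prev, blk)), decrypt=True))
        out.append(c)
        prev = c
    return out

def inner_3des_cbc(k1, k2, k3, iv, blocks):
    # Three separate CBC chains, one per DES pass.
    prev1 = prev2 = prev3 = iv
    out = []
    for blk in blocks:
        c1 = toy_des(k1, xor(prev1, blk)); prev1 = c1
        c2 = toy_des(k2, xor(prev2, c1), decrypt=True); prev2 = c2
        c3 = toy_des(k3, xor(prev3, c2)); prev3 = c3
        out.append(c3)
    return out
```

The two modes produce different ciphertext for the same key material and input, so implementations must agree on which is in use.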

Cryptography-Digest Digest #698

2000-09-17 Thread Digestifier

Cryptography-Digest Digest #698, Volume #12  Sun, 17 Sep 00 09:13:00 EDT

Contents:
  Re: RC4: Tradeoff key/initialization vector size? (Guy Macon)
  Re: RC4: Tradeoff key/initialization vector size? (Paul Rubin)
  Dangers of using same public key for encryption and signatures? (Paul Rubin)
  Re: Lossless compression defeats watermarks (Niklas Frykholm)
  Re: Capability of memorizing passwords (Chris Rutter)
  Re: RC4: Tradeoff key/initialization vector size? (Guy Macon)
  Re: Dangers of using same public key for encryption and signatures? ("Brian Gladman")
  Re: Double Encryption Illegal? (Mok-Kong Shen)
  On secret Huffman compression (Mok-Kong Shen)
  Re: Double Encryption Illegal? (Mok-Kong Shen)
  Re: Tying Up Loose Ends - Correction (Mok-Kong Shen)



From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: RC4: Tradeoff key/initialization vector size?
Date: 17 Sep 2000 11:35:34 GMT

Tom St Denis wrote:

  David Crick [EMAIL PROTECTED] wrote:

 From the CipherSaber[-1] documentation (http://ciphersaber.gurus.com)

   The user key is a text string, rather than a hex value, because
   humans are more likely to be able to memorize a text string with
   sufficient entropy. To leave room for the initialization vector,
   the length of the user key must be less than 246 bytes. To insure
   adequate mixing of the initialization vector and user key, we
   recommend you select a user key of 54 bytes or less.

I would strongly recommend against using ASCII text as the key for
RC4.  You should really hash it first.


I believe that the implementation of RC4 described on the web
page [ http://ciphersaber.gurus.com ] is secure without any
such hashing.  Ciphersaber has withstood a lot of analysis
and attacks so far.

The reason I reference Ciphersaber instead of RC4 is that
the Ciphersaber implementation of RC4 (ARCFOUR, really - none of
us has proof that what we are looking at is really RC4)
has a standard set of decisions concerning such mundane
details as whether the key is ASCII, how big the initialization
vector should be, etc., that have withstood a lot of scrutiny.
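Those "standard decisions" are small enough to sketch. Assuming the published CipherSaber-1 description (a ten-byte random IV prepended to the ciphertext, with RC4 keyed by passphrase-plus-IV), a minimal illustration looks like this; the function names are mine and this is unaudited sketch code, not a vetted implementation.

```python
import os

def rc4_stream(key: bytes, n: int) -> bytes:
    # Standard ARCFOUR key schedule (KSA)...
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # ...followed by the output generator (PRGA), producing n keystream bytes.
    out = bytearray()
    i = j = 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def ciphersaber_encrypt(passphrase: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(10)  # CipherSaber-1: ten random IV bytes, sent in the clear
    stream = rc4_stream(passphrase + iv, len(plaintext))
    return iv + bytes(p ^ k for p, k in zip(plaintext, stream))

def ciphersaber_decrypt(passphrase: bytes, ciphertext: bytes) -> bytes:
    iv, body = ciphertext[:10], ciphertext[10:]
    stream = rc4_stream(passphrase + iv, len(body))
    return bytes(c ^ k for c, k in zip(body, stream))
```

Note how the IV does the mixing work the quoted documentation alludes to: the keystream depends on both the memorized passphrase and the per-message IV, so reusing the passphrase across messages does not reuse keystream.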


--

From: Paul Rubin [EMAIL PROTECTED]
Subject: Re: RC4: Tradeoff key/initialization vector size?
Date: 17 Sep 2000 04:48:21 -0700

[EMAIL PROTECTED] (Guy Macon) writes:
 I believe that the implementation of RC4 described on the web
 page [ http://ciphersaber.gurus.com ] is secure without any
 such hashing.  Ciphersaber has withstood a lot of analysis
 and attacks so far.

Are you kidding?

--

From: Paul Rubin [EMAIL PROTECTED]
Subject: Dangers of using same public key for encryption and signatures?
Date: 17 Sep 2000 04:56:46 -0700

Current practice seems to prefer using two separate keys, though some
systems (PGP 2.x, and effectively SSL) use the same public key for
both encryption and authentication.  I have an application where space
for keys is quite scarce.  I'd like to use the same key (point on
elliptic curve) for both encryption and signing (El-Gamal / ECDSA).
What kind of trouble am I asking for, aside from the "FBI attack"
(they make you turn over your decryption key so they can read
something, and that means they can also sign your name to stuff)?

Also, how long do my keys need to be to satisfy the paranoids in this
crowd?  Assume I'm using some constant (shared) curve over GF(p) for
some large p.  Is 140 bits enough?  How about 170?  Robert Harley has
been breaking ECDL over GF(2^n) for n=112 or so, IIRC.  But those are
easier than GF(p) curves.

thanks

--

From: [EMAIL PROTECTED] (Niklas Frykholm)
Subject: Re: Lossless compression defeats watermarks
Date: 15 Sep 2000 07:34:39 GMT

In article 8ps1ov$rt4$[EMAIL PROTECTED], Matthew Skala wrote:
It seems to me that this should be obvious, but my impression is that most
people don't quite realize it, so I'd just like to point it out:

If a watermarking scheme works perfectly (in the sense of being
imperceptible by humans) and a lossy compression scheme works perfectly
(in the sense of maximizing compression without harming perceptual
quality) then compressing and decompressing a signal will have the effect
of removing the watermark.

That's perfectly true, and I think it's recognized now by (some of the)
people in the watermarking business. (Anyone else getting the feeling
that the people who do watermarking are more often Image Processing than
Security experts?) 

See for example "A review of watermarking principles and practices" by
Miller, Cox, Linnartz & Kalker --- it's available online; just search on
Google --- pages 6--7, which mention just this problem. A watermark has
to change the perceived content. Hopefully the change is so small that it
will not be noticed.

// Niklas

--

From: Chris Rutter [EMAIL PROTECTED]
Subject: Re: Capability of memorizi

Cryptography-Digest Digest #698

2000-05-03 Thread Digestifier

Cryptography-Digest Digest #698, Volume #11   Wed, 3 May 00 17:13:01 EDT

Contents:
  Re: Sunday Times 30/4/2000: "MI5 builds new centre to read e-mails on  (John M Collins)
  Re: Janet and John learn about bits (was Re: Problems with OAP-L3) (James Felling)
  Re: sci.crypt think will be AES? (Jerry Coffin)
  Re: quantum crypto breakthru? ("Leo Sgouros")
  Re: factor large composite ("Dann Corbit")
  Re: RC6 as a Feistel Cipher (Anton Stiglic)
  Re: Cascading Crypto Attack (Mok-Kong Shen)
  Re: quantum crypto breakthru? (Anton Stiglic)
  Re: Interleaving for block encryption (Mok-Kong Shen)
  Re: factor large composite (Diet NSA)
  Re: RC6 as a Feistel Cipher (John Myre)
  Re: quantum crypto breakthru? (Diet NSA)
  Re: factor large composite ("Dann Corbit")



Date: Wed, 03 May 2000 19:09:58 +
From: John M Collins [EMAIL PROTECTED]
Crossposted-To: 
uk.media.newspapers,uk.legal,alt.security.pgp,alt.privacy,uk.politics.parliament,uk.politics.crime,talk.politics.crypto,alt.ph.uk,alt.conspiracy.spy,alt.politics.uk
Subject: Re: Sunday Times 30/4/2000: "MI5 builds new centre to read e-mails on 

JimD wrote:

 Here we go again! Nobody gives any consideration to the enormous
 task of monitoring all EMail.

 Agreed a dictionary computer will look for keywords, but first it
 has to have access to all the traffic...which will have to be
 stored somewhere for most of the time.

 The sifted information has, eventually, to be looked at by a
 (slightly) human. 0.5% of it would take all week to plough through.

Some people spice up all their emails with juicy phrases to send such
sniffers into overdrive all the time. The "Zippy the Pinhead" stuff in GNU
Emacs can do the trick.

--
John Collins([EMAIL PROTECTED])
5 The Reeds, Welwyn Garden City, Herts, AL7 3BN
Tel/fax: 01707 883174   Work: 01707 886110
Personal Web Site:  http://www.jmc.xisl.com




--

From: James Felling [EMAIL PROTECTED]
Subject: Re: Janet and John learn about bits (was Re: Problems with OAP-L3)
Date: Wed, 03 May 2000 14:16:48 -0500



Anthony Stephen Szopa wrote:

 James Felling wrote:
 
  Anthony Stephen Szopa wrote:
 
   Tom St Denis wrote:
   
Richard Heathfield wrote:
 unsigned char num[] = { 0x16, 0x30, 0x47, 0x91 }; /* binary coded
 decimal (almost!) - wastes 6 combinations per nybble */

 as opposed to

 unsigned char num[] = { 0xF8, 0xCA, 0x97 }; which is clearly more
 efficient, as it uses all the bits available to it.

 So perhaps we're in violent agreement?
   
No, since not all combinations of 3-byte values are possible, you are
still wasting space.  That was my point.
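The two arrays in this exchange encode the same eight decimal digits, 16304791, so the space comparison can be made concrete: BCD packs one digit per nybble into 4 bytes, while the plain binary value fits in 3. A short sketch (variable names are mine):

```python
digits = "16304791"

# BCD: one decimal digit per 4-bit nybble, two digits per byte -> 4 bytes,
# but each nybble wastes the 6 unused patterns 0xA..0xF.
bcd = bytes(int(digits[i]) * 16 + int(digits[i + 1]) for i in range(0, 8, 2))

# Packed binary: 16304791 < 2**24, so the whole value fits in 3 bytes.
packed = int(digits).to_bytes(3, "big")
```

Running this reproduces exactly the two arrays quoted above: `bcd` is 16 30 47 91 and `packed` is F8 CA 97. The counter-point in the reply still stands, though: if not every 3-byte pattern is a legal value, even the packed form carries some slack.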
   

   If we have two cryptography applications, one of which uses its memory
   efficiently, runs on my PII/400 at an acceptable speed, and offers me
   reliable security, and the other which doesn't use its memory
   efficiently, runs on my 400 MHz box at a speed which even its author
   says is far too slow, and is based on source code which has not been
   published and therefore has not had the chance to be validated by the
   cryptographic community - thus making its security untrustworthy - which
   application do you think anyone with a brain will buy?
 
  Or just use.  Why do you have to buy good crypto programs?

 I agree entirely. Just roll your own...

  If you have enough time on your hands you can even write your own.

 Ah, I don't have enough time on my hands. But I'm trying to write my own
 anyway g. Unfortunately, I'm too inexperienced in cryptanalysis to
 perform serious cryptanalytic attacks on my own code, let alone other
 people's. (I've cracked a couple of 'unbreakable' algorithms presented
 to me by other would-be cryptographers, but these were only 'kid-sister
 unbreakable', of course.)
   
Well it's one thing to take already developed and analyzed algorithms
and stick it together, and it's another thing *entirely* to invent your
own ciphers at the same time.  If you want a 5kb file crypto program
just take RC4 and a hash (say md2) and write a small program (I have
done it more than once :)).
   
 
  Mr Szopa has some thinking to do about making his algorithm(s) not only
  public but efficient.
 

 Possibly, but that's not his main problem. He has some really serious
 thinking to do about his ability to deal with fellow professionals in a
 professional way. It seems that anyone who dares take issue with him is
 instantly killfiled - in a mysterious and magical process which allows
 Mr Szopa to read their posts anyway, presumably so that he can killfile
 them again, and again, and again.

 When he learns to talk to grown-ups as if they are grown-ups, I suspect
  

Cryptography-Digest Digest #698

1999-06-11 Thread Digestifier

Cryptography-Digest Digest #698, Volume #9   Fri, 11 Jun 99 17:13:02 EDT

Contents:
  Re: Cracking DES (Terry Ritter)
  Re: cant have your cake and eat it too (John Savard)
  Re: Cracking DES (Patrick Juola)
  MD5 test data (Tony Lezard)
  Re: Fw: The Mathematics of Public-Key Cryptography, June 12 - June 17, 1999 ("Mike Murray")
  Re: Cracking DES (David Wagner)
  Re: cant have your cake and eat it too (David Wagner)
  Re: Slide Attack on Scott19u.zip (David Wagner)
  differential cryptanalysis ([EMAIL PROTECTED])
  Re: Cracking DES (John Savard)
  Re: huffman code length (Andras Erdei)
  Re: One Time Pad (Steve Rush)
  Re: cant have your cake and eat it too (David Wagner)
  Re: Cracking DES (John Savard)
  Re: Cracking DES (Patrick Juola)
  Re: Cracking DES (Eric Norman)



From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Cracking DES
Date: Fri, 11 Jun 1999 18:33:12 GMT


On Fri, 11 Jun 1999 11:07:47 GMT, in 7jqqm0$cut$[EMAIL PROTECTED], in
sci.crypt [EMAIL PROTECTED] wrote:

Terry Ritter wrote:
[...]
 So for n ciphers, we have n times the attack effort, and 1/n reward
 for any success, which makes the reward/effort ratio 1/n**2.  This is
 not linear, and that should make the attack business somewhat more
 discouraging.

No.  First the simple error: "n times the effort" is the
effort to attack all n ciphers;  1/n reward is for breaking
one of them (more on that below).  We expect the number of
ciphers broken to increase with the number of ciphers
attacked, and thus the ratio stated is not the reward/effort
ratio.

A reward/effort ratio of 1/n**2 is the ideal which we can only
approach, assuming that cryptanalysts can identify our weak ciphers so
we will not use those.  The better our cryptanalysts are, the more we
can approach that value.  And if we cannot approach that value, our
cryptanalysts are not keeping up with the opponents.  
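The arithmetic behind the quoted 1/n**2 figure is simple enough to restate in a few lines. This is only a restatement of the argument above, under its stated assumptions: attack effort scales linearly with the number of ciphers, and traffic (hence reward) is split evenly among them.

```python
def reward_effort_ratio(n: int) -> float:
    # With n ciphers deployed, the attacker must attack all n for full
    # coverage (n units of effort), while breaking any one cipher exposes
    # only 1/n of the traffic (1/n of the reward).
    effort = n
    reward = 1.0 / n
    return reward / effort  # = 1 / n**2
```

So doubling the number of ciphers quarters the ratio under these assumptions; the dispute in this thread is over whether the reward really scales with the volume of exposed traffic rather than with its intelligence value.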


Two other points indicate the assumptions are false.  First,
the reward is proportional to the intelligence value of the
recovered plaintext and not the volume of plaintext.  Five
percent of the traffic carries much more than five percent
of the value, since real traffic is redundant.  

Where does this come from?  Cite please.  Under what assumptions?

The next time you want to read a 200-page book, rip it apart and
reconstruct the writing from any random 10 pages.  So now "reading a
book" takes only 5% of the time it did.  That should be a boon for
busy executives everywhere!

If we consider the essence of a book to be the main plot, you might
get lucky, although even that seems dicey.  But if the essence of the
book is the experience of immersion in a different situation, you can
not get that from 10 pages.  End of story.  

If cryptography protects customer orders, we might get one of those,
and thus know the customer name and the contents of one order.  If all
orders are alike, we have the universe.  But if all orders were alike,
the message would be "I'll have my usual" and if we expose that, we
may not know much more than we did.  

If cryptography protects love letters, breaking a cipher may mean that
we get a love letter.  If the opponents can and do pay the n**2 rate,
they eventually will expose one.  But any enterprise which cannot
tolerate even one exposure must question the use of cryptography
itself, since cryptography simply does not have provable strength.  


Second, an
intelligent attacker would first try to determine what ciphers
his methods have a good chance of breaking, and concentrate
on those.  Developing differential cryptanalysis took years;
determining whether it applies to a new cipher takes hours or
sometimes minutes.

But even if an analysis technique "applies," it still must be applied.
Knowing that something "applies" does not mean that success is
available; it just means that it *might* be, with a lot of effort,
expertise, and luck.  But effort and expertise are limited resources.
That is what the n**2 term means.  If one is willing to pay the
freight, one gets the result.  

Then we have the issue of what an "attack" really means.  Differential
Cryptanalysis was developed for DES.  In that sense, DES is "broken."
And yet, in practice, we are very unlikely to ever have the data
needed to perform such an attack.  So we have an attack which does
apply and is fully known and it still does not help us expose messages
under that cipher.  How has that helped the opponents?


I know Terry Ritter disagrees, but I find that as the number
of ciphers an enterprise uses increases, the attackers
reward/effort ratio goes _up_.  He picks off the low hanging
fruit.  I grant that the ratio goes down if the attacker needs
all the encrypted information, but gaining a significant
portion of the intelligence value gets easier.

Cite it.  

---
Terry Ritter   [EMAIL PROTECTED]   htt