Cryptography-Digest Digest #585, Volume #14 Mon, 11 Jun 01 09:13:01 EDT
Contents:
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG (Tim Tyler)
Re: Unicity distance and compression for AES (Tim Tyler)
Re: Unicity distance and compression for AES (Tim Tyler)
Re: cubing modulo 2^w - 1 as a design primitive? ("Arne Baltin")
Re: National Security Nightmare? ([EMAIL PROTECTED])
Re: Def'n of bijection (Phil Carmody)
Re: National Security Nightmare? (JPeschel)
Re: National Security Nightmare? (Mok-Kong Shen)
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY (Mok-Kong Shen)
Re: Unicity distance and compression for AES (Mok-Kong Shen)
Re: Def'n of bijection (Mok-Kong Shen)
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG (John Savard)
Re: National Security Nightmare? (SCOTT19U.ZIP_GUY)
Re: best encryption? (SCOTT19U.ZIP_GUY)
Re: National Security Nightmare? (JPeschel)
Re: Best, Strongest Algorithm (gone from any reasonable topic) (Phil Carmody)
Re: Def'n of bijection ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 10:07:37 GMT
John Savard <[EMAIL PROTECTED]> wrote:
: On Sun, 10 Jun 2001 16:01:26 -0700, "John A. Malley" wrote:
:>A cipher with perfect secrecy for the finite set M requires as many
:>cryptograms as messages. Let the finite set {c1, c2, c3, c4} = E, the
:>set of cryptograms.
: You are correct that perfect secrecy is attainable for messages of
: different lengths without a need for padding. However, one can explain
: that in a brief fashion.
: We know that it is possible to have perfect secrecy using the
: conventional one-time-pad if all messages have the same length.
: Let us then consider all messages of length from 1 bit to N bits.
: The number of such messages is 2 + 4 + 8 + 16 + ... + 2^N, which
: number is 2^(N+1) - 2.
: Thus, let us convert every message of length N bits or less to a
: message N+1 bits in length by this rule: pad the message on the _left_
: with zero or more zeroes, followed by a one, as required to achieve
: a length of N+1 bits.
: These messages shall then correspond to integers, the smallest integer
: being 2 (the message "0" as padded in this system) and the largest
: being 2^(N+1)-1.
: Subtract 2 from the message, and then apply a one-time-pad containing
: a key value from 0 to 2^(N+1)-3 to the message using addition modulo
: 2^(N+1)-2.
Getting random numbers in that range from the pad by using a sophisticated
procedure that is not guaranteed to terminate...? ;-)
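John's construction can be sketched in a few lines of Python (the function
names are mine, and `secrets.randbelow` plays the role of the pad; fittingly
for Tim's quip, `randbelow` does use rejection sampling internally for a
non-power-of-two range):

```python
import secrets

def pad_to_int(msg_bits: str, N: int) -> int:
    # Prepend a '1' marker: read as an (N+1)-bit string with leading
    # zeroes, the result is an integer in [2, 2^(N+1) - 1].
    assert 1 <= len(msg_bits) <= N
    return int('1' + msg_bits, 2)

def unpad(x: int) -> str:
    return bin(x)[2:][1:]  # drop '0b', then drop the leading '1' marker

def encrypt(msg_bits: str, N: int, key: int) -> int:
    M = 2 ** (N + 1) - 2                  # number of distinct messages
    return (pad_to_int(msg_bits, N) - 2 + key) % M

def decrypt(c: int, N: int, key: int) -> str:
    M = 2 ** (N + 1) - 2
    return unpad((c - key) % M + 2)

N = 8
key = secrets.randbelow(2 ** (N + 1) - 2)  # uniform key over Z_M
assert decrypt(encrypt('1011', N, key), N, key) == '1011'
```

Since the key is uniform over all 2^(N+1) - 2 residues, every ciphertext
value is equally likely for every message of length 1 to N bits, which is
the point about perfect secrecy without padding to a fixed visible length.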
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 10:19:28 GMT
[EMAIL PROTECTED] wrote:
: Tim Tyler wrote:
:> [EMAIL PROTECTED] wrote:
:> : I concur that a compression algorithm that compressed messages
:> : which are meaningful in the original language, and expanded messages
:> : which are meaningless in the original language, would increase unicity
:> : distance if only compressed messages were used as inputs to the
:> : encryption algorithm.
:>
:> Ah good. There's no need for me to reply to your original message, then
:> ;-)
:>
:> : It seems that one would have to demonstrate that only meaningful
:> : messages are compressed in order to say that compression increases
:> : unicity distance.
:>
:> No, that would be an error, also perpetrated by other participants here.
:> It is *not* necessary to show that *only* meaningful messages are
:> compressed in order to say that compression increases unicity distance.
:> All that is necessary is for meaningful messages to be compressed. It
:> matters not-in-the-slightest that loads of other junk is compressed as
:> well.
:>
:> : What isn't clear to me is how a compression algorithm can be intelligent
:> : enough to distinguish "meaningful" from "meaningless" inputs (although
:> : it would be easier if the compression algorithm knew the input language).
:>
:> Compression algorithms need to do no such thing in order
:> for the unicity distance to be increased.
:>
:> All they really need to do is compress plausible-looking messages
:> on average - and face it - if they didn't do that, it would be hard
:> to justify calling them compressors in the context of the target data.
: Someone is going to have to do a better job of explaining this before I
: can buy it.
: The explanation should be simple:
: n_o = H(k)/d where n_o = unicity distance, H(k) is keyspace entropy
: and d = redundancy = r_o - r_n and r_o = r_n if all possible
: messages are meaningful.
: Explain how compression effectively reduces the redundancy (i.e. even if
: I am able to decompress decryptions before determining whether or not
: the key is spurious) if both meaningful and meaningless messages are
: compressed. H(k) is assumed constant so the only way to increase n_o is
: to decrease d . So to show that compression increases unicity distance,
: one has to show that compression effectively reduces the redundancy d.
That's what compression *does*. It makes files shorter, increasing
the entropy per bit (entropy remains the same, number of bits in
message decreases). When entropy-per-bit goes up, redundancy goes down.
: I don't see how compression can reduce d unless it filters out
: meaningless messages.
I'm not sure what "filter out" might mean in the context of lossless
compression - but assuming it means "makes longer" it logically must
do that in order to compress the target set.
If meaningful files get shorter, other files must get longer, by the
counting theorem.
What it *doesn't* need to do is filter out *all* meaningless messages.
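Tim's claim can be put in numbers: shorter files at constant entropy mean
lower redundancy per symbol, and by the formula quoted above a larger
unicity distance. The figures below are illustrative estimates of mine,
not measurements from the thread:

```python
# Shannon's unicity distance is U = H(K) / D, where H(K) is the keyspace
# entropy and D the redundancy per ciphertext symbol.

def unicity_distance(key_bits: float, redundancy_per_symbol: float) -> float:
    return key_bits / redundancy_per_symbol

H_K = 128.0                     # keyspace entropy, e.g. a 128-bit key

# Plain English ASCII: 8 bits/char carrying roughly 1.5 bits/char of
# information, so D is about 6.5 bits per character.
D_plain = 8.0 - 1.5
print(unicity_distance(H_K, D_plain))       # ~20 characters

# After ideal 4x compression each output byte carries four characters'
# worth of information (~6 bits), so D drops to ~2 bits per byte.
D_compressed = 8.0 - 4 * 1.5
print(unicity_distance(H_K, D_compressed))  # 64 bytes, ~256 plaintext chars
```

The compressor never has to recognize meaning; it only has to pack more
information into fewer symbols on average for D to fall and U to rise.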
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 10:24:25 GMT
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Tom St Denis <[EMAIL PROTECTED]> wrote:
:> : This is not true. In fact it's just the opposite. Any good codec makes a
:> : few files smaller.
:>
:> You err. Most codecs make an infinite set of files smaller.
: A compressor appropriate for a given application should,
: on average, compress the files of that application to
: smaller sizes. One certainly needn't care about files
: that don't belong to the application.
Most codecs can deal with unboundedly long inputs.
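The counting argument behind "an infinite set of files gets smaller, so
another set must get larger" can be checked exhaustively at toy scale. The
"compressor" below is invented for illustration (any injection on a finite
set of strings behaves the same way): it hands the shortest codewords to a
chosen "meaningful" subset, and the totals show the savings are exactly
balanced by growth elsewhere.

```python
from itertools import product

def all_strings(n: int):
    """Every bit string of length 1..n."""
    return [''.join(bits) for L in range(1, n + 1)
            for bits in product('01', repeat=L)]

n = 6
universe = all_strings(n)

# A made-up lossless "compressor": a permutation of the universe that
# assigns the shortest codewords to the "meaningful" inputs (here, the
# all-zero strings), mimicking a compressor tuned to that set.
meaningful = [s for s in universe if set(s) == {'0'}]
rest = [s for s in universe if set(s) != {'0'}]
code = dict(zip(meaningful + rest, sorted(universe, key=len)))

assert len(set(code.values())) == len(universe)  # injective => lossless
shrunk = sum(len(code[s]) < len(s) for s in universe)
grew = sum(len(code[s]) > len(s) for s in universe)
saved = sum(len(s) - len(code[s]) for s in universe)
print(shrunk, grew, saved)  # saved is exactly 0: every bit shaved off a
                            # "meaningful" string is paid for elsewhere
```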
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: "Arne Baltin" <[EMAIL PROTECTED]>
Subject: Re: cubing modulo 2^w - 1 as a design primitive?
Date: Mon, 11 Jun 2001 12:46:00 +0200
"Mok-Kong Shen" <[EMAIL PROTECTED]> schrieb im Newsbeitrag
news:[EMAIL PROTECTED]...
>
>
> Tom St Denis wrote:
> >
> > "Mok-Kong Shen" <[EMAIL PROTECTED]> wrote:
> > > you posted on Fri, 08 Jun 2001 21:24:35 +0200:
> > >
> > > I find often the biggest problem with math papers/discussions
> > > is the lack of a good language to discuss it in. For example,
> > > my book on Group Theory I got (From Dover) only has 13 words
> > > in the entire text. The rest is vague human egyptian art work
> > > that future archeologists will look at and say "this means
> > > fire, and that's water, and ...".
> > >
> > > This can obviously never be true, or else there would be an
> > > immense scandal about the publisher Dover that has a good
> > > name and whose scientific books have always been of good
> > > quality, even though to a large part outdated. (BTW, I am
> > > myself in possession of a Dover book on group theory!)
> >
> > I never said the book is bad. I said it's bad to learn from. It's not a
> > good text IMHO. Koblitz's "Course in Number ..." is a good text because it
> > involves english :-)
>
> What are you talking about here in view of the quote I gave
> about your earlier post above?? Read once again your own
> words that you had written!!
>
> M. K. Shen
>
------------------------------
Subject: Re: National Security Nightmare?
From: [EMAIL PROTECTED]
Date: 11 Jun 2001 06:56:57 -0400
[EMAIL PROTECTED] (JPeschel) writes:
> Mok-Kong Shen [EMAIL PROTECTED] writes:
>>
>> In France I heard that there is a national institute that decides
>> authoritatively on language issues of French. Is there a similar
>> one for the English world?
>
> Yes. They told me you should listen to me and Len, er, Len and
> me... I mean, uh, Len and I...
``to Len and me''. The test is to leave out ``Len'': you get ``listen to
me.'' It doesn't change when you put Len back in. (``to me and Len'' is
also correct, but it's considered unmannerly to put ``me'' or ``I'' first
in such constructions.)
Len.
--
Frugal Tip #54:
Invent an embarrassing new kind of underwear, then convince stupid people
to buy it by telling them that Hollywood celebrities wear it.
------------------------------
From: Phil Carmody <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Date: Mon, 11 Jun 2001 11:13:18 GMT
[EMAIL PROTECTED] wrote:
>
> Tim Tyler <[EMAIL PROTECTED]> writes:
> > [EMAIL PROTECTED] wrote:
> >: Tim Tyler <[EMAIL PROTECTED]> writes:
> >:
> >:> The results are here: The messages get a bit shorter.
> >
> > Incidentally, chopping out a huge section from the middle of a post,
> > juxtaposing widely separated text on the same line while including no
> > editorial marks to show what you've done in order to produce the
> > impression that I /had/ no worthwhile results [...]
>
> You had no worthwhile results. The upshot of your ``experiment''
> was the earth-shattering conclusion that ``the messages get a bit
> shorter.'' For some reason you saw a need to prove that--but don't
That was not Tim's conclusion as I read it, but then again I didn't
mangle Tim's text.
His conclusion was that there are more 256-bit arbitrary bicom outputs
that are the image of plausible English text than there are 256-bit
arbitrary bitstrings which represent ascii (and that are plausible
English text).
If you think about what compression is trying to achieve, yes, it's
obvious, but it is an independently noteworthy fact.
If you didn't like Tim's handwaving (and his 'factor of 5' example),
then you could have provided a real example which indicates where you
think he's wrong perhaps?
> see a need to prove that BICOM improves secrecy for real messages.
>
> > ...no longer feel obliged to stoop to your level in order to help
> > correct your misconceptions.
>
> No need for hard feelings. You can't prove that normal messages are
> any more likely to yield false decrypts after being compressed with
> BICOM. You can't prove it because it isn't true--but if name-calling
> makes you feel better about it, feel free.
Can you prove it makes things worse? Can you prove it leaves things
unchanged?
Claiming someone else's statement is false because he hasn't yet proved
it is a hollow argument. Unless you disprove the claim it's just a case
of gainsaying, which leads to the most tedious usenet threads.
However, you've misquoted and mis-summarised, which may lead the
casual observer to believe that you misunderstood what was
being said.
I'd like to see Tim's proof; I'm sure there is one.
Phil
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Date: 11 Jun 2001 11:18:47 GMT
Subject: Re: National Security Nightmare?
[EMAIL PROTECTED] writes:
>[EMAIL PROTECTED] (JPeschel) writes:
>> Yes. They told me you should listen to me and Len, er, Len and
>> me... I mean, uh, Len and I...
>
>``to Len and me''. The test is to leave out ``Len'': you get ``listen to
>me.'' It doesn't change when you put Len back in. (``to me and Len'' is
>also correct, but it's considered unmannerly to put ``me'' or ``I'' first
>in such constructions.)
Sheesh -- talk about being a killjoy. :-)
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: National Security Nightmare?
Date: Mon, 11 Jun 2001 13:44:09 +0200
JPeschel wrote:
>
> Mok-Kong Shen [EMAIL PROTECTED] writes:
>
> >In France I heard that there is a national institute
> >that decides authoritatively on language issues of French.
> >Is there a similar one for the English world?
>
> Yes. They told me you should listen to me and Len, er, Len and
> me... I mean, uh, Len and I...
That sounds rather like religion: 'The Holy Spirit revealed
to me .....'.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY
Date: Mon, 11 Jun 2001 13:53:10 +0200
Tim Tyler wrote:
>
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> : Tim Tyler wrote:
> :> John A. Malley <[EMAIL PROTECTED]> wrote:
> :> :> > Tim Tyler wrote:
> :> :> > > M.K. Shen wrote:
>
> :> :> > > : My memory of Shannon's paper is no good, but I don't think that he
> :> :> > > : considered the length of the messages.
> :> :> > >
> :> :> > > I don't think it was mentioned either - all the messages were
> :> :> > > the same length in the system in question.
> :>
> :> : Just a comment - the messages in a finite set do NOT need to be of the
> :> : same length for the cipher to achieve perfect secrecy. [...]
> :>
> :> ...but they *do* if one is using an OTP to encrypt them.
> :>
> :> Apologies if the fact that an OTP was intended was not clear from the
> :> context.
>
> : Opinions seem to differ here. So let me once again ask:
> : Has Shannon proved the perfect security of the conventional
> : OTP (for messages of finite but varying length) or not?
> : I'd like to know the result clearly and unambiguously. Thanks.
>
> I thought John Malley at least was fairly clear and unambiguous in writing:
>
> ``3) WHY ENCIPHERING A FINITE SET OF MESSAGES BY XORING RANDOM BINARY
> STRINGS AS LONG AS THE MESSAGES DOES *NOT* GUARANTEE PERFECT SECRECY''
>
> ...though maybe a qualification about there not being 2^n messages all of
> the same length needs to be tacked onto that headline.
>
> It appears that Shannon only mentioned the OTP while dealing with
> the case of infinite streams of data.
>
> You say "opinions seem to differ here". Who disagrees at this stage?
> Are you referring to Tom St Denis?
No. In the quote above John Malley wrote: '....set do NOT
need ....' but you wrote: '....but they *do* ....'. That's
why I wrote that opinions seem to differ.
May I repeat: Has Shannon's writing (alone) fully established
the perfect security of the conventional OTP? My
interpretation of what you wrote would be 'no'. Is that
the case? If so, we need a complete, rigorous formal proof
(perhaps based on Shannon's work) to establish the perfect
security of the conventional OTP, or else a similarly
rigorous formal proof of the opposite case.
Thanks.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Mon, 11 Jun 2001 13:58:04 +0200
Tim Tyler wrote:
>
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> : Tim Tyler wrote:
> :> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> :> : This is not true. In fact it's just the opposite. Any good codec makes a
> :> : few files smaller.
> :>
> :> You err. Most codecs make an infinite set of files smaller.
>
> : A compressor appropriate for a given application should,
> : on average, compress the files of that application to
> : smaller sizes. One certainly needn't care about files
> : that don't belong to the application.
>
> Most codecs can deal with unboundedly long inputs.
In all practical cases a safe upper bound on the input
length can be given. Whether infinite input could be
treated isn't of any practical significance, I suppose.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Date: Mon, 11 Jun 2001 14:04:55 +0200
Phil Carmody wrote:
>
> [EMAIL PROTECTED] wrote:
> >
> > Tim Tyler <[EMAIL PROTECTED]> writes:
> > > [EMAIL PROTECTED] wrote:
> > >: Tim Tyler <[EMAIL PROTECTED]> writes:
> > >:
> > >:> The results are here: The messages get a bit shorter.
> > >
> > > Incidentally, chopping out a huge section from the middle of a post,
> > > juxtaposing widely separated text on the same line while including no
> > > editorial marks to show what you've done in order to produce the
> > > impression that I /had/ no worthwhile results [...]
> >
> > You had no worthwhile results. The upshot of your ``experiment''
> > was the earth-shattering conclusion that ``the messages get a bit
> > shorter.'' For some reason you saw a need to prove that--but don't
>
> That was not Tim's conclusion as I read it, but then again I didn't
> mangle Tim's text.
> His conclusion was that there are more 256-bit arbitrary bicom outputs
> that are the image of plausible English text than there are 256-bit
> arbitrary bitstrings which represent ascii (and that are plausible
> English text).
> If you think about what compression is trying to achieve, yes, it's
> obvious, but it is an independently noteworthy fact.
May I remark on what I figured out after asking Tim Tyler
some dumb questions. BICOM incorporates AES, and AES
uses 128- or 256-bit keys. The numbers involved in the
mapping issue come from the length of the keys.
M. K. Shen
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
Date: Mon, 11 Jun 2001 12:33:28 GMT
On Mon, 11 Jun 2001 10:07:37 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote, in
part:
>Getting random numbers in that range from the pad by using a sophisticated
>procedure that is not guaranteed to terminate...? ;-)
I'm assuming one's pad _contains_ random numbers in that range.
(Throwing away pads containing 1111...1110 and 1111...1111 certainly
will work as well, but I ignored such details. I did note, as an
alternative, that one could instead keep those pads and deal
appropriately with the two sums - now from XOR - that don't shorten.)
John Savard
http://home.ecn.ab.ca/~jsavard/frhome.htm
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: National Security Nightmare?
Date: 11 Jun 2001 12:40:09 GMT
[EMAIL PROTECTED] (JPeschel) wrote in
<[EMAIL PROTECTED]>:
>Mok-Kong Shen [EMAIL PROTECTED] writes:
>
>>In France I heard that there is a national institute
>>that decides authoritatively on language issues of French.
>>Is there a similar one for the English world?
>
>Yes. They told me you should listen to me and Len, er, Len and
>me... I mean, uh, Len and I...
>
Actually, since English is already so messed up, they wrote
me and told me that my style was more suited to the future
of English. One of the first things they plan to do is to
change all spellings of "their" to "there", so that it would
become more like the spoken language.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged or
something..
No I'm not paranoid. You all think I'm paranoid, don't you!
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: best encryption?
Date: 11 Jun 2001 12:28:22 GMT
[EMAIL PROTECTED] (Dirk Heidenreich) wrote in
<9g1rnk$t4o$04$[EMAIL PROTECTED]>:
>Hello,
>
>I am not used to any security programs, but I am interested in how
>to keep my data safe. I am looking for a very good program; which one
>would you suggest? What is objectively the best?
>Thanks for your help.
>
Since you ask: if it's for your own file security, I would
use my scott16u or scott19u. If you have faith in AES, I would
use BICOM.
David A. Scott
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Date: 11 Jun 2001 12:40:18 GMT
Subject: Re: National Security Nightmare?
Mok-Kong Shen [EMAIL PROTECTED] writes:
>JPeschel wrote:
>>
>> Mok-Kong Shen [EMAIL PROTECTED] writes:
>>
>> >In France I heard that there is a national institute
>> >that decides authoritatively on language issues of French.
>> >Is there a similar one for the English world?
>>
>> Yes. They told me you should listen to me and Len, er, Len and
>> me... I mean, uh, Len and I...
>
>That sounds rather like religion: 'The Holy Spirit revealed
>to me .....'.
That's because I was kidding. :-)
Was your question about a national language
institute for English a serious one?
If it was, the Modern Language Association
probably comes closest to the answer.
If you just want to improve your English
prose, you need to consult a few good
handbooks, and read good writing. You
won't find, for the most part, examples
of good writing on Usenet.
I find Fowler's book well-nigh useless,
Perrin's reference is a good, but thick
one, and Strunk and White's slim book
a bible of sorts.
Read White's collected prose, a few of
Bellow's essays, Updike's fiction and
prose, and The New Yorker magazine.
Then write every day.
(But not here!)
Now, revise and re-write. Then do it again.
If you start writing as a profession, you'll
find that every periodical has its own style
sheet, one in addition to the MLA style book,
or the Chicago Manual of Style, or the AP Style
Book.
To paraphrase Schneier: Writing is harder
than it looks.
Good luck.
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: Phil Carmody <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: Mon, 11 Jun 2001 12:52:23 GMT
[EMAIL PROTECTED] wrote:
> Anyway, you did a neat job of ignoring the points I made, which
> address the heart of your problems. To wit:
>
> 1. Number of plausible messages in the space of all messages of size 2510
> bytes or less. This can be estimated reasonably (and conservatively) at
> 2^5000. Since the space of all binary files up to 2510 bytes has size
> around 2^20000, the probability of a random file up to 2510 bytes being
> plausibly mistaken for an English message is something like 1 in 2^15000,
> or effectively zero.
>
> 2. Density of binary files up to 2510 bytes whose preimage under BICOM
> is ``plausible'': I have no idea. I'm willing to bet, however, that
> the odds of a BICOM preimage being ``plausible'' are essentially
> zero. Can you prove that the probability of a random 2510-byte-or-less
> file having a plausible decompression is larger than 1 in 2^20000?
>
> (If you can even prove, for random files of size 1024 bytes or less, that
> plausible BICOM preimages are likelier than 1 in 2^1000, then I'll keep
> you in beer for the next six months.)
Hand-waving heuristics suggest to me that if the mean compression factor
for plausible plaintext processed by BICOM is 15, then the density
would increase from 1 in 2^15000 to 1 in 2^1000. If you are so sure that
the density decreases, set your wager point at 2^14999, and then maybe
someone will take you up on the challenge.
Phil
------------------------------
Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 11 Jun 2001 09:03:58 -0400
Phil Carmody <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED] wrote:
>>
>> [Tim Tyler] had no worthwhile results. The upshot of your
>> ``experiment'' was the earth-shattering conclusion that ``the messages
>> get a bit shorter.'' For some reason you saw a need to prove that...
>
> His conclusion was that there are more 256-bit arbitrary bicom outputs
> that are the image of plausible English text than there are 256-bit
> arbitrary bitstrings which represent ascii (and that are plausible
> English text).
True: there are almost certainly ``more''. What does ``more'' mean,
though?
That depends on the message length. For messages short enough to have
BICOM compressions no larger than the key, it is *possible* that BICOM
plus some cipher gives perfect secrecy. Under the same conditions, OTP
*provably* gives perfect secrecy--so we're crazy to go with ``maybe''
when ``definitely'' is available. (Recall: the reason we don't use OTP
for everything is the key distribution problem. If Mr. Tyler has solved
that problem, then he has made all other discussion of crypto moot.)
Furthermore:
(1) If the (compressed) messages are larger than the key, this purported
``benefit'' of compression falls off exponentially. Once messages reach
the size of normal emails (or usenet posts), the ``shorter unicity
distance'' has no practical effect whatsoever.
(2) The above-cited benefit for small messages has nothing to do with
bijectivity, and everything to do with compression. The benefit is
maximized by choosing the best compression algorithm available, without
any regard to bijectivity.
Which is why I said, ``For 'large' messages--where 'large' is no bigger
than 1KB or so--the only benefit of bijective compression is increased
work for trial decryption.''
>> No need for hard feelings. You can't prove that normal messages are
>> any more likely to yield false decrypts after being compressed with
>> BICOM...
>
> Can you prove it makes things worse? Can you prove it leaves things
> unchanged?
Please do not shift the burden of proof. The guy who says, ``BICOM
improves security because X'' is obligated to prove it. Nobody is
obligated to disprove it. The burden of proof rests with the claimant.
However, I gave extremely strong arguments *suggesting* that the
situation is virtually unchanged by using BICOM for normal-sized
messages. They are fairly convincing, and could probably be converted
into a proof without much trouble. (At issue is the definition of
``plausible'' messages. Which is context dependent: in the case of
diplomatic or military messages, the space of ``plausible'' messages
is vanishingly tiny compared to the space of ``English'' messages.)
> Claiming someone else's statement is false because he hasn't yet proved
> it is a hollow argument.
On the contrary. *Everyone's* statement is presumed false until proven.
That's the scientific method, I'm afraid. In a court of Law, Mr. Tyler
must be presumed ``innocent''. In any scientific discipline, Mr. Tyler
must be presumed ``incorrect'' until he proves otherwise. You're
shifting the burden of proof.
> I'd like to see Tim's proof; I'm sure there is one.
So would I, but I'm equally sure there is none. If he had one, he would
not have resorted to the Parable of the Fish in the Aquarium.
Len.
--
What gives you the idea that the Putnam problems are written carefully?
-- Dan Bernstein
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************