Cryptography-Digest Digest #837, Volume #9        Wed, 7 Jul 99 02:13:03 EDT

Contents:
  Re: The One-Time Pad Paradox (William Tanksley)
  Re: Keeping File Formats Safe (fungus)
  Re: I don't trust my sysadmin (Vernon Schryver)
  Re: The One-Time Pad Paradox (Jim Gillogly)
  Re: DES-NULL attack (S.T.L.)
  Re: I don't trust my sysadmin ([EMAIL PROTECTED])
  extending a hash ([EMAIL PROTECTED])
  Re: Summary of 2 threads on legal ways of exporting strong crypto (Isaac)
  Re: How to find the period of a sequence (Terry Ritter)
  Re: I don't trust my sysadmin (Tramm Hudson)
  Re: I don't trust my sysadmin (Greg Ofiesh)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (William Tanksley)
Subject: Re: The One-Time Pad Paradox
Reply-To: [EMAIL PROTECTED]
Date: Tue, 06 Jul 1999 23:27:41 GMT

On Mon, 05 Jul 1999 16:07:00 +0200, Dr.Gunter Abend wrote:
>[EMAIL PROTECTED] wrote:

>> If I _intend_ to filter keypads, but because my filter criteria
>> are fairly wide, never expect to _actually_ filter a pad, then
>> I've given the adversary who is aware of my policy some information
>> about all of the messages I've sent.

>> The same issue applies to ciphertext filtration.

>   [Example of vanishingly small probability snipped.]

>> So, my filtration has weakened my security by an amount that
>> rounds to zero.

>I agree completely: _How_much_ the cipher has been weakened,
>depends on the probability of _actual_ filtering.

So I don't filter, and therefore get truly zero weakness.  :-)

You're wrong, though -- the fact that you're filtering patterns can itself
be a useful leak.

>I proposed:
>Discard the ciphertext (and repeat the encryption) if it
>contains more than 10% letter triplets with at least one
>vowel. Random texts would contain about 0.4% letter triplets
>of this kind. It is *very* unlikely that a long text would
>contain more than 10%.

The probability can be computed, and the margin can also be computed.
(I.e. how unlikely is "*very*".)  Given enough ciphertext (a silly
supposition), the adversary can determine the probability that filtering
was applied to them.
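
That tail probability is easy to estimate numerically.  Here is a minimal
Monte Carlo sketch (my own illustration, not from the thread), assuming one
plausible reading of the criterion: a "letter triplet" is a window of three
consecutive bytes that are all ASCII letters and include at least one vowel.
Under that reading the expected fraction for uniformly random bytes comes out
near 0.4%, consistent with the figure quoted above, and the 10% threshold is
essentially never reached:

    import random

    VOWELS  = frozenset(b"aeiouAEIOU")
    LETTERS = frozenset(bytes(range(ord('A'), ord('Z') + 1)) +
                        bytes(range(ord('a'), ord('z') + 1)))

    def triplet_fraction(data):
        """Fraction of overlapping 3-byte windows that are all ASCII
        letters and contain at least one vowel."""
        hits = sum(1 for i in range(len(data) - 2)
                   if all(b in LETTERS for b in data[i:i+3])
                   and any(b in VOWELS for b in data[i:i+3]))
        return hits / (len(data) - 2)

    random.seed(1)
    trials, length, rejected, total = 2000, 1000, 0, 0.0
    for _ in range(trials):
        ct = bytes(random.getrandbits(8) for _ in range(length))
        f = triplet_fraction(ct)
        total += f
        rejected += (f > 0.10)
    print("mean triplet fraction: %.4f, rejected at 10%%: %d of %d"
          % (total / trials, rejected, trials))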

>The degradation of the secrecy due to
>this modification could be calculated. I myself don't know
>how to do this calculation, thus I ask the specialists here
>in this newsgroup whether they can give me an estimation.

I don't know either.  I could figure it out if it were worth enough to me.

If you're using OTP, it's worth enough to your enemy.  Don't bother.

>Ciao,   Gunter

-- 
-William "Billy" Tanksley

------------------------------

From: fungus <[EMAIL PROTECTED]>
Subject: Re: Keeping File Formats Safe
Date: Wed, 07 Jul 1999 02:37:29 +0200



Kile Mornay wrote:
> 
> [EMAIL PROTECTED] (Bradley Yearwood) wrote:
> 
> >I look forward to the day when standards of auditability and fiduciary
> >responsibility finally recognize the risks of opaque proprietary data
> >formats.  No prudent person should allow their vital business data to be
> >taken and held hostage.
> 
> Agreed, but this sounds like a case where the author is supplying his own
> data and doesn't want it grabbed and used elsewhere. For example, a
> dictionary program that supplies definitions and pronunciations of words
> could benefit from protection like this.

There's no guaranteed way to protect the files from hackers. The
only thing to do is to obfuscate the decryption procedure as
much as possible. Any "standard" cipher will do the trick as far
as encryption is concerned (even DES). The algorithm chosen won't
be the weakest link in the chain.


-- 
<\___/>
/ O O \
\_____/  FTB.

------------------------------

From: [EMAIL PROTECTED] (Vernon Schryver)
Subject: Re: I don't trust my sysadmin
Date: 6 Jul 1999 16:46:32 -0600

>In article <[EMAIL PROTECTED]>,
>David N. Murray <[EMAIL PROTECTED]> wrote:

> ...
>>My sysadmin is *not* allowed into my payroll database,
>>however, I need to run a job every night against the 
>>payroll database (let's say a program that e-mail's 
>>employee review notifications to managers).

As stated, it's hopeless.


In article <[EMAIL PROTECTED]>,
Terje Mathisen  <[EMAIL PROTECTED]> wrote:

] ...
]The only real solution (AFAIK) to this problem is to use
]application-level strong crypto on all the data in your database; this
]way the server and its disks and backup tapes become pretty much
]worthless for anyone without the proper keys.

Or anyone who has physical access to the computer and so can modify a
driver to occasionally check the process table for your program, and if
it's present, find your physical pages and copy interesting bits.  In this
age of open system source, it is easy for an enemy among the operators to
modify the system to subvert absolutely any mechanism you can imagine that ever
has the cleartext available to it for an instant, or means to generate
cleartext, such as a key.  Think about old /dev/kmem, new process
filesystems, and the common mechanisms for forcing the system to crash
and dump all of memory to disk for later analysis of a system bug (or
capture of your supersecret data).  Then there is the utility of SIGQUIT
and other mechanisms for making your application drop a core file (e.g.
patching your application to contain an illegal instruction).  You are
safe from the operators only if your application can compute blindly on
the cyphertext to produce other cyphertext, and only if the computing
itself would tell your enemies nothing.


In article <7ltksr$a79$[EMAIL PROTECTED]>,
Patrick Juola <[EMAIL PROTECTED]> wrote:

> ...
>Think of it this way -- what's to prevent the sysadmin from pulling
>the disk, physically, out of your machine, byte-by-byte copying it
>on a PC box and then going over the data at his leisure?
>
>There *are* operating systems available that prevent this; they
>tend to be sold to the DoD and almost no one else.

No, those don't work either.  I've seen a commercial UNIX system certified
by the U.S. government, and I don't mean the easy stuff but full MAC
(mandatory access controls), capabilities, and so forth.  Therein lies a
long sad tale of woe starting ~10 years ago involving hordes of pointy
haired managers, rising executives (a special, particularly dangerous
breed of the pointy haired), and salescritters.  Realsoonnow banks and
businesses in general were going to (10 years ago) stop using ordinary,
insecure UNIX (sic) and switch to government grade real security.  So lots
of hooks, ifdefs, stuff, features, bugs, and other wonderful stuff were
added all over the system.  Periodically thereafter the company security
experts, none of whom have the time or inclination to really read and
understand the source, tell the government security inspectors "sure, we
checked this pile of 10,000 lines of spaghetti in the Gbyte source tree.
It is supposed to have only these interfaces and since we've added these
1000 poorly structured, hard to read, uncommented lines to guard those
interfaces in this pile, the whole 10K pile is safe."  The government
experts peer at it, make token demands for changes to prove to everyone
that they're on the ball, and stamp it certified, and move on to the next
pile.  I used to rag the company security experts by pointing out that I
could add covert channels to code that I regularly worked on that they'd
never notice, or even be able to find if they knew they existed.  About
their only response was that I wouldn't because I had no reason to.

To find obscure bugs in my code, I've been shipped to secure sites.  There
the consultants employed to make the systems do whatever the government
wanted would walk me past the armed guards into the machine room, type
the magic passwords to turn off the MAC and other nonsense, and we would
sit down to diagnose the problems.  Think about that; those who guard
their systems with guns are forced to trust whoever is operating their
computers.

We've all heard about the old days when special operators would arrive to
throw out the normal operators and take over machine rooms to do things
with stacks of tapes.  I've been told about systems distinctly larger than
most "diskless workstations" that were diskless or had removable disks
kept in vaults when not in use.  I suspect that's how the professionally
paranoid deal with not wanting to trust all operators all of the time.
-- 


Vernon Schryver    [EMAIL PROTECTED]

------------------------------

From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: The One-Time Pad Paradox
Date: Tue, 06 Jul 1999 16:56:13 -0700

William Tanksley wrote:
> 
> On Mon, 05 Jul 1999 15:51:35 +0200, Dr.Gunter Abend wrote:
> >Jim Gillogly wrote:
> 
> >In order to avoid this kind of leak of the OTP technique
> >you should not apply it to ASCII texts, but compress them
> >beforehand. Usually, the bit frequencies in compressed files
> >are fairly uniform.

Attribution alert: I didn't write the above.  You didn't trim enough.

> Always compress before you encrypt (unless your data is random, in which
> case you don't need to encrypt ;-).
> 
> But seriously, if you're compressing, you're not going to be able to leak
> useful info.  Even when you DO get a string of seven zero bytes, only junk
> will show through, even to the most dedicated psychic.
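
[A minimal sketch of the compress-then-encrypt order being discussed -- my
own illustration, not from the thread; os.urandom here merely stands in for
a genuinely pre-shared one-time pad:]

    import os, zlib

    def otp_encrypt(plaintext):
        """Compress first, then XOR with a fresh pad of the same length."""
        compressed = zlib.compress(plaintext)
        pad = os.urandom(len(compressed))   # stand-in for a pre-shared pad
        ciphertext = bytes(c ^ k for c, k in zip(compressed, pad))
        return ciphertext, pad

    def otp_decrypt(ciphertext, pad):
        return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, pad)))

    msg = b"attack at dawn, attack at dawn, attack at dawn"
    ct, pad = otp_encrypt(msg)
    assert otp_decrypt(ct, pad) == msg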

-- 
        Jim Gillogly
        Highday, 13 Afterlithe S.R. 1999, 23:54
        12.19.6.6.1, 12 Imix 9 Tzec, Fourth Lord of Night

------------------------------

From: [EMAIL PROTECTED] (S.T.L.)
Subject: Re: DES-NULL attack
Date: 07 Jul 1999 00:20:32 GMT

<<I disagree:  the fact that people work with gigabytes all the time makes it
much easier to _visualize_ a billion gigabytes.  An Exabyte is more likely to
just be another huge number.>>

Well then, why not say a trillion megabytes? A quadrillion kilobytes? Maybe
just a quintillion bytes. Oooh, I know. 8 quintillion bits. Aha, jackpot.
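
Checking the arithmetic (decimal SI prefixes assumed; a quick sketch of mine):

    exabyte = 10**18                      # 1 EB, in bytes
    assert exabyte == 10**9  * 10**9      # a billion gigabytes
    assert exabyte == 10**12 * 10**6      # a trillion megabytes
    assert exabyte == 10**15 * 10**3      # a quadrillion kilobytes
    assert exabyte * 8 == 8 * 10**18      # 8 quintillion bits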

Moo-Cow-ID: 57  Moo-Cow-Message: my

-*---*-------
S.T.L.  ===> [EMAIL PROTECTED] <===  BLOCK RELEASED!    2^6972593 - 1 IS PRIME!
Quotations:  http://quote.cjb.net  Main website:  http://137.tsx.org    MOO!
"Xihribz! Peymwsiz xihribz! Qssetv cse bqy qiftrz!"  e^(i*Pi)+1=0   F00FC7C8
E-mail block is gone. It will return if I'm bombed again. I don't care, it's
an easy fix. Address is correct as is. The courtesy of giving correct E-mail
addresses makes up for having to delete junk which gets through anyway. Join
the Great Internet Mersenne Prime Search at http://entropia.com/ips/  Now my
.sig is shorter and contains 3395 bits of entropy up to the next line's end:
-*---*-------

Card-holding member of the Dark Legion of Cantorians, the Holy Order of the
Catenary, the Great SRian Conspiracy, the Triple-Sigma Club, the Union of
Quantum Mechanics, the Polycarbonate Syndicate, the Roll-Your-Own Crypto
Alliance, and People for the Ethical Treatment of Digital Tierran Organisms
Avid watcher of "World's Most Terrifying Causality Violations", "When Kaons
Decay: World's Most Amazing CP Symmetry Breaking Caught On [Magnetic] Tape",
"World's Scariest Warp Accidents", "World's Most Energetic Cosmic Rays", and
"When Tidal Forces Attack: Caught on Tape"
Patiently awaiting the launch of Gravity Probe B and the discovery of M39
Physics Commandment #12: The Weak Force Is Carried By W+, W-, and Z0 bosons.

------------------------------

Date: Tue, 06 Jul 1999 08:27:03 -0400
From: [EMAIL PROTECTED]
Subject: Re: I don't trust my sysadmin

Vernon Schryver wrote:
> No, those don't work either.  I've seen a commercial UNIX system certified
> by the U.S. government, and I don't mean the easy stuff but full MAC
> (mandatory access controls), capabilities, and so forth.  Therein lies a
> long sad tale of woe starting ~10 years ago involving hordes of pointy
> haired managers, rising executives (a special, particularly dangerous
> breed of the pointy haired), and salescritters.

It started much further back than that.  Consider that Un:x was created
by programmers frustrated with the silliness of developing Multics,
which was to be all things to all people and secure too.  These problems
are endemic because they are a natural result of the inevitable arrogance
of large organizations.  The members of the org can see that they have
more capabilities than a small org, but cannot see that they have more
problems as well.  So they become arrogant.  And in that environment
stupidity rises to the top in accordance with the Peter Principle.

------------------------------

From: [EMAIL PROTECTED]
Subject: extending a hash
Date: Tue, 06 Jul 1999 16:03:00 -0700

Is this a good idea to extend a sha-1 hash to 320 bits:

1st 160 bits=sha(even bytes of plaintext) ^ sha(all plaintext)
2nd 160 bits=sha(odd bytes of plaintext)  ^ sha(all plaintext)

Is the entropy 320 bits?
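
For concreteness, here is a direct transcription of the proposed construction
(a sketch only, not a claim about its strength; the function names are mine):

    import hashlib

    def sha1(data):
        return hashlib.sha1(data).digest()

    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def extended_hash_320(plaintext):
        """320-bit digest built as described above."""
        whole = sha1(plaintext)
        first  = xor(sha1(plaintext[0::2]), whole)   # even-offset bytes
        second = xor(sha1(plaintext[1::2]), whole)   # odd-offset bytes
        return first + second

    print(extended_hash_320(b"example message").hex())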

------------------------------

From: [EMAIL PROTECTED] (Isaac)
Crossposted-To: talk.politics.crypto
Subject: Re: Summary of 2 threads on legal ways of exporting strong crypto
Date: 7 Jul 1999 01:44:54 GMT

On Tue, 06 Jul 1999 17:47:09 -0400, Paul Koning <[EMAIL PROTECTED]> wrote:
>
>Of course you're in Germany so you don't have to deal with any
>of these annoying regulations.  But please, do the US readers of
>this a favor and stop spouting nonsense that might get them in
>trouble if they are foolish enough to believe you.
>

I wish I'd read your message before bothering with mine.  This
sums things up rather nicely.

Isaac

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: How to find the period of a sequence
Date: Wed, 07 Jul 1999 04:39:01 GMT


On Tue, 06 Jul 1999 17:37:39 -0400, in <[EMAIL PROTECTED]>,
in sci.crypt Paul Koning <[EMAIL PROTECTED]> wrote:

>Brian McKeever wrote:
>> 
>> Well, it depends...  You don't say what information you are given, i.e.
>> whether you have a generator and you want to know its period (based on
>> repeated inner state), or someone else has a generator and you want to know
>> its period (based on repeated output).  I assume you mean the first case.
>> For this, there is a clever technique I picked up (don't recall where I read
>> it, but IIRC it was in a paper by Terry Ritter), where you have two
>> generators G1 and G2 initialized with the same internal state, but each
>> time you clock G1 you clock G2 twice.  Then you wait for their initial
>> states to match again.  In order to get the period of the loop they are in,
>> start a counter, and stop it when they match a third time.
>
>That's the way to do it if you have a system that gets into a loop after
>some unknown amount of non-loop.
>
>When I learned that algorithm, in the early 80's, it was attributed
>to E.W.Dijkstra.

My second edition of Knuth II §3.1.6(b) on p. 7 attributes it to R. W.
Floyd, but I don't have a better reference.  Mentioned in:

   http://www.io.com/~ritter/ARTS/CRNG2ART.HTM#Sect6.2
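
For reference, a minimal sketch of that two-speed technique (Floyd's cycle
finder), my own illustration, assuming the generator is exposed as a
state-update function f:

    def find_cycle(f, x0):
        """Return (mu, lam): the tail length and the period of the
        sequence x0, f(x0), f(f(x0)), ...  Clock the "hare" twice for
        every step of the "tortoise" until their states match."""
        tortoise, hare = f(x0), f(f(x0))
        while tortoise != hare:
            tortoise, hare = f(tortoise), f(f(hare))

        # Walk one pointer back to x0; advancing both in lockstep finds
        # the start of the loop, mu steps in.
        mu, tortoise = 0, x0
        while tortoise != hare:
            tortoise, hare = f(tortoise), f(hare)
            mu += 1

        # Count steps around the loop until the states match again.
        lam, hare = 1, f(tortoise)
        while tortoise != hare:
            hare = f(hare)
            lam += 1
        return mu, lam

    # Example: a small non-invertible map, so the sequence has a tail.
    f = lambda x: (x * x + 1) % 255
    print(find_cycle(f, 3))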

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Tramm Hudson)
Subject: Re: I don't trust my sysadmin
Date: 6 Jul 1999 22:12:40 -0600

[posted and cc'd to cited author]

Vernon Schryver <[EMAIL PROTECTED]> wrote:
[snip a good discussion of the failings of secure systems]

> I've been told about systems distinctly larger than
> most "diskless workstations" that were diskless or had removable disks
> kept in vaults when not in use.  I suspect that's how the professionally
> paranoid deal with not wanting to trust all operators all of the time.

The Intel Paragon and Teraflops (ASCI Red) both have the ability to switch
between classified and unclassified operations.  The disks are the "hard"
components to switch and in all cases cannot be reused across security
boundaries.

On the Teraflops there is a physical disconnect between the compute
and disk cabinets.  The machine is structured with two "heads" of 
IO nodes and a collection of diskless compute nodes in between.  One
side contains classified disks, the other unclassified.  In order to
change between operational modes the IO cabinets from the other protection
domain must have an air gap from the current one.  Hence the name
of the machine -- "janus".

Professionally paranoid?  Maybe...  But we do run Linux.

Tramm
(not at all in an official capacity)
-- 
  o   [EMAIL PROTECTED]                 [EMAIL PROTECTED]   O___|   
 /|\  http://www.swcp.com/~hudson/          H 505.323.38.81   /\  \_  
 <<   KC5RNF @ N5YYF.NM.AMPR.ORG            W 505.284.24.32   \ \/\_\  
  0                                                            U \_  | 

------------------------------

From: Greg Ofiesh <[EMAIL PROTECTED]>
Subject: Re: I don't trust my sysadmin
Date: Wed, 07 Jul 1999 05:52:18 GMT


> How do I store the uname/password to make it as difficult as
> possible for the sysadmin to retrieve?  My basic assumption
> is that if I encrypt the password, I have to decrypt it to
> present it to the DBMS.  That means that the key, algorithm,
> and ciphertext are all in the same place, right?  Isn't that
> a Bad Thing?
>
> Any suggestions would be welcome.


If this is work, then I would either get the sysadmin fired or I would
quit.  This is a ridiculous situation to live or work under.

If this is personal ISP, then I would find another ISP or make myself
an ISP.  Then again, a cheap solution may be found in another post to
your questions.


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
