Cryptography-Digest Digest #767, Volume #13      Wed, 28 Feb 01 21:13:01 EST

Contents:
  Re: Keystroke recorder (nemo outis)
  Re: What is the probability that an md5sum of a group of md5sums will
    be the same? (Steve Roberts)
  Re: What is the probability that an md5sum of a group of md5sums will
    be the same? ("Sam Simpson")
  Re: what is the use for MAC (Message Authentication Code), as there
    can be digital signature? (Anton Stiglic)
  Re: DH Key Agreement in SSL (Anton Stiglic)
  Re: Keystroke recorder (William Hugh Murray)
  Re: what is the use for MAC (Message Authentication Code), as there
    can be digital signature? ("david Hopkins")
  Re: encryption and information theory (Benjamin Goldberg)
  Re: philosophical question? (Benjamin Goldberg)
  Re: philosophical question? (wtshaw)
  HPRNG (Benjamin Goldberg)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (nemo outis)
Subject: Re: Keystroke recorder
Date: Wed, 28 Feb 2001 22:39:19 GMT

Software keyloggers include: skin98 (and later), keykey, winwhatwhere, PC 
Activity monitor, and many others.

A good (software) keylogger *detector* is PC Investigator (aka Hookprotect).

One example of an inline hardware keylogger that goes on the end of the cable 
rather than inside the keyboard is at:

http://www.keyghost.com/

Again, there are many others.

Regards,



In article <[EMAIL PROTECTED]>, Alberto 
<[EMAIL PROTECTED]> wrote:
>It seems that the easiest way to access encrypted data is to gain
>access to the target computer and install such a device.
>
>Have you ever seen one of them? What does it look like? How can you
>defend yourself against this kind of attack?
>
>Thanks.
>Alberto
>
>

------------------------------

From: [EMAIL PROTECTED] (Steve Roberts)
Crossposted-To: sci.math
Subject: Re: What is the probability that an md5sum of a group of md5sums will be the same?
Date: Wed, 28 Feb 2001 23:07:26 GMT

jtnews <[EMAIL PROTECTED]> wrote:

>Given:
>
>  Files: 1 to N
>
>  A program takes files 1 to N and generates
>  an array of N md5sums S[1..N].
>
>  An md5sum is then generated on array S.
>
>  What is the probability that the md5sum
>  generated on array S will be the same
>  if only one of the files 1 to N
>  is changed?
>
>Does anyone have a clue on how to proceed
>with such a calculation?

Yeah, an MD5 digest is 128 bits and essentially random.

So the chance that your new digest is the same as the old one, or as
any given digest, is 1 in 2^128.

However, there is a far greater (but still tiny) chance that two
digests from a large population are the same as each other.  That's
simply because you can do more comparisons: for example, if you
compare the N digests of the N files against each other.  But you'd
still need billions of files (N > 2^30, say) to have a hope of getting
two digests the same.
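
Here's a rough Python sketch of the setup you describe, in case you
want to experiment (the file contents below are just placeholders):

  # Hash each "file", hash the array of digests, and observe that
  # changing a single file changes the top-level md5sum.
  import hashlib

  def md5_of_md5s(contents):
      """MD5 of the concatenated MD5 digests of each item."""
      digests = [hashlib.md5(c).digest() for c in contents]
      return hashlib.md5(b"".join(digests)).hexdigest()

  files = [b"file one", b"file two", b"file three"]
  print(md5_of_md5s(files))

  files[1] = b"file two, modified"   # change only one file
  print(md5_of_md5s(files))          # the top-level md5sum changes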

Steve


------------------------------

From: "Sam Simpson" <[EMAIL PROTECTED]>
Crossposted-To: sci.math
Subject: Re: What is the probability that an md5sum of a group of md5sums will be the same?
Date: Wed, 28 Feb 2001 23:17:09 -0000

Steve Roberts <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> jtnews <[EMAIL PROTECTED]> wrote:
<SNIP>

> However, there is a far greater (but still tiny) chance that two
> digests from a large population are the same as each other.  That's
> simply because you can do more comparisons: for example, if you
> compare the N digests of the N files against each other.  But you'd
> still need billions of files (N > 2^30, say) to have a hope of getting
> two digests the same.

I'd expect it to be somewhere nearer 2^64 files - it's the birthday paradox
of a 128-bit hash function.
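
A rough back-of-the-envelope check in Python, using the usual birthday
approximation P(collision) ~ 1 - exp(-n(n-1)/2^129) for an ideal
128-bit hash (an approximation only, and it assumes MD5 behaves like a
random function):

  # Approximate probability of at least one collision among n random
  # 128-bit digests.
  import math

  def collision_probability(n, bits=128):
      return -math.expm1(-n * (n - 1) / 2.0 ** (bits + 1))

  for n in (2**30, 2**60, 2**64, 2**66):
      print("n = 2^%d: P ~ %.3g"
            % (round(math.log2(n)), collision_probability(n)))

The probability only becomes appreciable around n = 2^64.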

--
Regards,

Sam
http://www.scramdisk.clara.net/




------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: what is the use for MAC (Message Authentication Code), as there can be digital signature?
Date: Wed, 28 Feb 2001 18:33:09 -0500

david Hopkins wrote:
> 
> Why use a MAC (Message Authentication Code),
> when there can be a digital signature?
> 
> thanks

Because MACs are typically much faster to compute.
It's the same kind of tradeoff as between symmetric
encryption schemes and public-key encryption schemes.
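
For example, a MAC over a message costs only a couple of hash
computations with a shared secret key.  A rough Python sketch (the key
and message are just placeholders):

  # Compute and verify an HMAC-SHA1 tag under a shared secret key.
  import hmac, hashlib

  key = b"shared secret key"
  message = b"the message to authenticate"

  tag = hmac.new(key, message, hashlib.sha1).digest()

  # The receiver, who holds the same key, recomputes and compares.
  expected = hmac.new(key, message, hashlib.sha1).digest()
  print(hmac.compare_digest(tag, expected))   # True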

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: DH Key Agreement in SSL
Date: Wed, 28 Feb 2001 18:48:18 -0500

You typically use what is called a key derivation function,
which has two purposes: get keys of the correct size and
scramble the bits of the shared secret in order to break
any algebraic structure.

Take a look at
http://www.ietf.org/internet-drafts/draft-ietf-secsh-transport-09.txt

the IETF draft for SSH2.

After the key agreement, you end up with two values, K and H
(K is the DH shared secret, H is a hash over several of the
exchanged values).  In the draft they write the following:

5.2.  Output from Key Exchange

The key exchange produces two values: a shared secret K, and an exchange
hash H.  Encryption and authentication keys are derived from these. The
exchange hash H from the first key exchange is additionally used as the
session identifier, which is a unique identifier for this connection.
It is used by authentication methods as a part of the data that is
signed as a proof of possession of a private key.  Once computed, the
session identifier is not changed, even if keys are later re-exchanged.

Each key exchange method specifies a hash function that is used in the
key exchange.  The same hash algorithm MUST be used in key derivation.
Here, we'll call it HASH.

Encryption keys MUST be computed as HASH of a known value and K as
follows:

o  Initial IV client to server: HASH(K || H || "A" || session_id) (Here
   K is encoded as mpint and "A" as byte and session_id as raw data.
   "A" means the single character A, ASCII 65).

o  Initial IV server to client: HASH(K || H || "B" || session_id)

o  Encryption key client to server: HASH(K || H || "C" || session_id)

o  Encryption key server to client: HASH(K || H || "D" || session_id)


o  Integrity key client to server: HASH(K || H || "E" || session_id)

o  Integrity key server to client: HASH(K || H || "F" || session_id)

Key data MUST be taken from the beginning of the hash output. 128 bits
(16 bytes) SHOULD be used for algorithms with variable-length keys.  For
other algorithms, as many bytes as are needed are taken from the
beginning of the hash value. If the key length is longer than the output
of the HASH, the key is extended by computing HASH of the concatenation
of K and H and the entire key so far, and appending the resulting bytes
(as many as HASH generates) to the key.  This process is repeated until
enough key material is available; the key is taken from the beginning of
this value.  In other words:

  K1 = HASH(K || H || X || session_id)   (X is e.g. "A")
  K2 = HASH(K || H || K1)
  K3 = HASH(K || H || K1 || K2)
  ...
  key = K1 || K2 || K3 || ...

This process will lose entropy if the amount of entropy in K is larger
than the internal state size of HASH.
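
Here's a rough Python sketch of that derivation, assuming HASH is
SHA-1 and ignoring the mpint encoding of K for brevity (so it's an
illustration, not an interoperable implementation):

  # SSH2-style key derivation: key = K1 || K2 || ... as in the draft.
  # K is taken here as raw bytes; a real implementation must encode it
  # as an mpint first.
  import hashlib

  def derive_key(K, H, letter, session_id, key_len):
      k = hashlib.sha1(K + H + letter + session_id).digest()  # K1
      while len(k) < key_len:
          k += hashlib.sha1(K + H + k).digest()                # K2, K3, ...
      return k[:key_len]

  # Made-up example values:
  K = b"\x01" * 128                            # stand-in DH shared secret
  H = hashlib.sha1(b"exchange data").digest()  # stand-in exchange hash
  session_id = H                               # first exchange: id == H
  iv_c2s  = derive_key(K, H, b"A", session_id, 16)
  key_c2s = derive_key(K, H, b"C", session_id, 24)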

-- Anton


Frédéric Donnat wrote:
> 
> Hi
> 
> I'd like to know how I might use the secret generated with a DH key
> agreement in SSL.
> In SSL, DH key agreement is used to generate a shared secret and this
> secret is used as a pre-master secret, but the sizes don't match, so
> I'd like to know whether the most significant or the least significant
> bytes are used!
> 
> If someone can help me!
> 
> Best Regards Fred

------------------------------

From: William Hugh Murray <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Keystroke recorder
Date: Thu, 01 Mar 2001 00:19:32 GMT

Alberto wrote:

> It seems that the easiest way to access encrypted data is to gain
> access to the target computer and install such a device.

The best place to see encrypted data is at the end points, before it is
encrypted or after it is decrypted.  Running a program of my choosing,
implementing my intent, on the target's system with the target's
privileges is a good way to do that.  If you are persistently connected
to the Internet, such a program is almost as good as being you.  That is
why the FBI is asserting the right to break into your home or office for
the purpose of installing such software.  It should be noted, to their
credit, that in the cases where they admit to having already done this,
they have obtained a warrant.  Notice that this is very different from a
warrant that is served on you and that you know about.  They have also
suggested that they do not really need a warrant.

> Have you ever seen one of them? What does it look like? How can you
> defend yourself against this kind of attack?
>
> Thanks.
> Alberto

Both Back Orifice and SubSeven not only permit the monitoring of your
keystrokes, they facilitate doing it in real time.  They are not
impossible to detect, but they are far from obvious.  Simply running
them under another name would make them less obvious, but they could be
far more covert than they are.  If one wanted to be really covert, one
would insert this functionality into a copy of the operating system and
run it under the operating system's identity, but simply running it as
an extension or a common application would be good enough.  If the FBI
breaks into your house or office, installing these is trivial, as is
monitoring you when you are on the Internet.  Getting you to install it
yourself is as easy as saying, "Click here to install the latest version
of RealPlayer", though not so accurately aimed.  I guarantee you Hanssen
would click on any mail attachment whose name was in the Cyrillic
alphabet.  Obviously, the FBI is disingenuous when it asserts that the
use of strong crypto results in perfect security for terrorists, drug
dealers, spies, hackers, and pornographers.

One protects oneself from this kind of attack with great care, not to
say difficulty.  One should run one's system in a stand-alone
configuration.  One must exercise strict control over the content of
one's system.  Single-user, single-application systems help.  If you are
a Russian spy, you should put your system in that kind of configuration
when dealing with your communications to your handlers.  One should
reconcile what is running on one's system with what one thinks one is
running.  One should be very careful not to take the bait.  One should
run only software obtained from trusted sources in trusted packaging.
One should run one's system in a burglar-resistant, burglary-evident
environment.



------------------------------

From: "david Hopkins" <[EMAIL PROTECTED]>
Subject: Re: what is the use for MAC (Message Authentication Code), as there can be digital signature?
Date: Thu, 01 Mar 2001 00:27:20 GMT

Thank you.
So, at present, is it still useful?
(Making a digital signature takes less than 2 seconds at present.)
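
A rough timing sketch gives an idea of how large the gap still is
(RSA-2048 signing via the third-party Python "cryptography" package
versus HMAC-SHA256 from the standard library; the absolute numbers
depend entirely on the machine):

  # Time 100 RSA-2048 signatures against 100 HMAC-SHA256 tags.
  import hmac, hashlib, time
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import rsa, padding

  message = b"x" * 1024
  key = b"shared secret"
  priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)

  t0 = time.perf_counter()
  for _ in range(100):
      priv.sign(message, padding.PKCS1v15(), hashes.SHA256())
  t1 = time.perf_counter()
  for _ in range(100):
      hmac.new(key, message, hashlib.sha256).digest()
  t2 = time.perf_counter()

  print("RSA sign: %.3f ms each" % ((t1 - t0) * 10))
  print("HMAC:     %.4f ms each" % ((t2 - t1) * 10))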

"Anton Stiglic" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> david Hopkins wrote:
> >
> > Why use a MAC (Message Authentication Code),
> > when there can be a digital signature?
> >
> > thanks
>
> Because MACs are typically much faster to compute.
> It's the same kind of tradeoff as between symmetric
> encryption schemes and public-key encryption schemes.



------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: encryption and information theory
Date: Thu, 01 Mar 2001 01:30:12 GMT

Mok-Kong Shen wrote:
> 
> Benjamin Goldberg wrote:
> >
> > Mok-Kong Shen wrote:
> > >
> > > John Savard wrote:
> > > >
> > > [snip]
> > > >
> > > > More precisely: if the message contains N bits of information,
> > > > and occupies M bits of bandwidth, and the key is K bits long, the
> > > > entropy of the encrypted message is N+K bits, *or* M bits,
> > > > whichever is less.
> > > >
> > > > In the case of RSA encryption, given that you know the public
> > > > key, no increase of entropy takes place.
> > >
> > > In the sense of crypto, entropy is related to the difficulty
> > > for the opponent to decrypt, I suppose. How does one explain
> > > that a key enhances entropy in the symmetric case but not in
> > > the asymmetric case, as you stated above? Thanks.
> >
> > Because RSA encryption is a known transformation which has an
> > inverse.
> >
> > Also, you can 'break' RSA encryption [by factoring] without even
> > having a message to look at.
> >
> > With symmetric encryption, breaking the system [by brute force]
> > effectively consists of subtracting the entropy of the plaintext from
> > the entropy of the ciphertext, producing the entropy of the key.
> 
> From the point of view of the opponent, I don't see
> any 'inherent' difference between breaking, say, RSA and
> AES. Both require 'efforts'.

Yes, but entropy is a measure of randomness/unpredictability/
information, not of how hard it is to break.

> I think that one has in the asymmetric case to consider the
> 'entropy' of the private key. For, while in the symmetric
> case one has a single key, in the asymmetric case it is
> the ensemble of the public and the private key that
> constitutes the 'key' that we have to consider in the
> current context.

Consider the case when we are encrypting purely with PK, rather than
using PK to encrypt a symmetric key.  Is there any information that is
added to the message during encryption that the attacker does not know?
No.  The only information added is that of the public key, which the
attacker does know.

Consider symmetric encryption.  Is there information added to the
message during encryption which the attacker does not know?  YES, the
secret key is added, and the attacker does not know the secret key.

Thus, no randomness, no unpredictability, no unknown information, no
entropy is added in the case of PK encryption.

The fact that work is required to find the method for removing the
public-key information does not mean that the public key has any
entropy.

Consider any fixed length string that has been produced by a TRNG.  Does
the string have entropy?  Not after you know it.  Information sources
produce entropy, but known pieces of information do not contain entropy.

We can predict (before they're seen) that the first X bits of data
produced by a TRNG will have X bits of entropy.  However, once those X
bits have been seen, they have no entropy.

This might be strange, but entropy is a measure of information.  If
someone tells you something you already know, his message contains no
information -- thus no entropy.  Once you've seen [once you know] a
piece of data, then that data has no more entropy (unless you forget
it).
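
To put a number on it, the usual measure is Shannon entropy,
H = -sum(p * log2(p)), taken over the possible outputs of a source.  A
piece of data you already know is a source with a single outcome of
probability 1, so H = 0.  A rough sketch:

  # Shannon entropy of a distribution, in bits.
  import math

  def entropy_bits(probabilities):
      return -sum(p * math.log2(p) for p in probabilities if p > 0)

  print(entropy_bits([1.0]))            # already-known data: 0.0 bits
  print(entropy_bits([0.5, 0.5]))       # one fair coin flip: 1.0 bit
  print(entropy_bits([1 / 16.0] * 16))  # uniform over 16 values: 4.0 bits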

-- 
The difference between theory and practice is that in theory, theory and
practice are identical, but in practice, they are not.

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt.random-numbers,de.sci.informatik.misc,sci.math
Subject: Re: philosophical question?
Date: Thu, 01 Mar 2001 01:48:37 GMT

Johannes H Andersen wrote:
> 
> Simon Johnson wrote:
> >
> > Dirk Van de moortel <[EMAIL PROTECTED]>
> > wrote in message
> > news:9g5n6.35596$[EMAIL PROTECTED]...
> > > "Peter Osborne" <[EMAIL PROTECTED]> wrote in message
> > > news:[EMAIL PROTECTED]...
> > > > Hi there!
> > > >
> > > > RANDOMNESS / RANDOM NUMBERS
> > > >
> > > > Maybe that point is not that simple at all, maybe it concerns
> > > > too many topics like statistics, math, cryptanalysis and even
> > > > religion...
> > > >
> > > > As I have dealt with cryptography and HRNG circuits, I often ask
> > > > myself:
> > > >
> > > > Is randomness a kind of information ?
> > > > Is it the highest density of information (that we are not able
> > > > to understand)?
> > > > Is it merely the opposite of information?
> > > >
> > > > Can there be a fundamental difference between pseudo-randomness
> > > > and real randomness (e.g. generated by radioactive decay or
> > > > thermal noise), especially under these aspects mentioned above?
> > >
> > > Not so philosophical: I think, if I remember well, that
> > > information can be defined as something that provides an answer to
> > > a Yes-No question.
> > > I don't think randomness can do this.
> >
> > Randomness is the same as unpredictability. When unpredictability is
> > at its maximum, information content is also at its maximum. To
> > demonstrate this, think of compression. If I compressed this text,
> > the information per character in the compressed document would
> > clearly be greater than if it were not compressed. Yet on visual
> > inspection of the compressed data, it appears more random.
> >
> > Simon.
> 
> Quite true. If the compressed data appear to have some structure, then
> this structure can be exploited to compress the data even more. At
> the final stage, the compressed data appear completely random.
> 
> However, the unpredictability of randomness looks like a paradox. E.g.
> pseudo-random numbers are entirely predictable. Also, if a truly
> randomly generated sequence is stored, then the next time it is used
> it is predictable?

Hmm.  "The Tao which can be spoken of is not the true Tao."

> However, unpredictability is somewhat subjective, depending on the
> knowledge of the system. Only quantum unpredictability is fundamental,
> or so we have been told by the great masters. Perhaps randomness can't
> be completely defined, yet we know it when we see it.

Randomness has the Tao-nature?

-- 
The difference between theory and practice is that in theory, theory and
practice are identical, but in practice, they are not.

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Crossposted-To: sci.crypt.random-numbers,de.sci.informatik.misc,sci.math
Subject: Re: philosophical question?
Date: Wed, 28 Feb 2001 19:39:03 -0600

In article <[EMAIL PROTECTED]>, Mike Rosing
<[EMAIL PROTECTED]> wrote:


> Yup.  As Douglas pointed out, PRNGs use a small set of data compared
> with HRNGs, which use the whole universe (now that's philosophy!).
> Unpredictability comes from not knowing, which may be related to
> Heisenberg.  You gain meaning and predictability by understanding the
> origins of signals.  Things appear "random" when they are
> unpredictable, i.e. we don't know what they mean or how they happened.
> 
Randomness: To err is human;  pseudorandomness: To air can be caused by beans.

Computer randomness is to a hardware glitch as computer pseudorandomness
is to a software bug.

Machine randomness is to a squeak as machine pseudorandomness is to
weaving lace; oil is to good design as bad programming is to ugliness.
-- 
Better to pardon hundreds of guilty people than execute one
that is innocent.

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: HPRNG
Date: Thu, 01 Mar 2001 02:06:32 GMT

Here's an idea for a TRNG, which uses components that are used in
quantum cryptography.  Get a device which produces one photon at a time
and send it through a polarizer.  Follow this with a second polarizer
at a 45-degree angle from the first.  Photons will go through the first
100% of the time, and through the second exactly 50% of the time.
Measure photon/no-photon as your bit of randomness.  I call the system
a Heisenberg Random Number Generator, or HRNG.  Bits might be slightly
biased, if the polarizers aren't exactly 45 degrees apart, but they
should not be correlated in any way, shape, or form.
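
A quick software simulation of that setup (a sketch only: it uses a
PRNG to stand in for the physics, and the angle error is a made-up
parameter), based on Malus's law that a polarized photon passes a
second polarizer at angle theta with probability cos(theta)^2:

  # theta = 45 degrees gives unbiased bits; a small angle error shows
  # the bias mentioned above.
  import math, random

  def hrng_bits(n, angle_error_deg=0.0):
      theta = math.radians(45.0 + angle_error_deg)
      p_pass = math.cos(theta) ** 2
      return [1 if random.random() < p_pass else 0 for _ in range(n)]

  print(sum(hrng_bits(100000)) / 100000.0)       # ~0.50
  print(sum(hrng_bits(100000, 5.0)) / 100000.0)  # biased, ~0.41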

-- 
I used to drive a Heisenbergmobile, but every time I looked at the
speedometer, I got lost.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
