Re: [PGP]: PGP 6.5.2 Random Number Generator (RNG) support

2000-02-02 Thread Martin Minow

lcs Mixmaster Remailer wrote:

> As for the concerns about back doors, the best reference on
> the design of the RNG remains cryptography.com's analysis at
> http://www.cryptography.com/intelRNG.pdf.

The one problem I have with the RNG, based on my reading of the
analysis, is that programmers cannot access the "raw" bitstream,
only the stream after the "digital post-processing" that converts
the bitstream into a stream of balanced 1 and 0 bits.
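
[A minimal illustration of what such post-processing can look like: the
classic von Neumann corrector reads raw, possibly biased bits in pairs and
emits an output bit only when the two bits differ. This sketch is not the
circuit Intel uses; get_raw_bit() is a hypothetical stand-in for the raw
hardware bitstream.]

/* Illustrative von Neumann corrector, not Intel's actual post-processing.
 * get_raw_bit() is a hypothetical, possibly biased raw source. */
#include <stdio.h>
#include <stdlib.h>

static int get_raw_bit(void)
{
    return rand() & 1;          /* simulated raw bit; a real source may be biased */
}

static int get_whitened_bit(void)
{
    for (;;) {
        int a = get_raw_bit();
        int b = get_raw_bit();
        if (a != b)
            return a;           /* an unequal pair yields one balanced bit */
        /* equal pairs ("00" or "11") are discarded */
    }
}

int main(void)
{
    int i;
    for (i = 0; i < 64; i++)
        putchar(get_whitened_bit() ? '1' : '0');
    putchar('\n');
    return 0;
}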
 
> And as pointed out before, this level of paranoia is ultimately self
> defeating, as Intel could just as easily put back doors into its CPU.

Also, there are much better places to leak information, including
keyboard and monitor designs that radiate detectable signals (the
"Tempest" problem).

Martin Minow
[EMAIL PROTECTED]



Re: DeCSS Court Hearing Report

2000-01-04 Thread Martin Minow

Here's my translation of the law Eivind Eklund describes. Note that,
while I am a fluent speaker of Swedish, Norwegian is a similar, but
not identical language. This is a really quick translation and should
not be relied upon for judicial opinion. It's free, and worth what
you paid for it.

§ 39i It is permitted to make copies of a computer program's code and
  translate the code's form when this is a requirement for obtaining
  the information that is necessary to provide functional compatibility between
  an independently developed computer program and other programs, where

  a) The operation is performed by a person who has the right to use a
     copy of the computer program, or for the benefit of a person who
     has this right. [Sorry, sloppy: I think this means that an employee
     or consultant can do the work, not only the "person who has the right."]
  b) The information that is necessary to achieve functional compatibility
     has not previously been readily available to those named in section a, and
  c) The operation is limited to those parts of the original program that
     are required to achieve functional compatibility.

The information obtained under the first paragraph may not
  a) Be used for purposes other than to provide functional compatibility
     with the independently developed computer program,
  b) Be given to others, except when this is necessary to provide functional
     compatibility with the independently developed computer program, or
  c) Be used for development, production, or marketing of a computer program
     that significantly duplicates the form, or in some other fashion damages
     the copyright of the program.

These paragraphs cannot be revoked by contract.

-


Eivind Eklund wrote:
> 
> On Mon, Jan 03, 2000 at 11:46:48AM -0800, bram wrote:
> > On Wed, 29 Dec 1999, Lucky Green wrote:
> >
> > > 1. CSS was reverse engineered from Xing's DVD player.
> > > 2. Xing's player requires the user to click on a button accepting a license
> > > agreement prohibiting reverse engineering.
> > > 3. Reverse engineering could not have been performed without accepting this
> > > license agreement.
> >
> > This may be reiterating the obvious, but isn't (3) just plain wrong?
> 
> Yes, but there are other errors, too.  The license agreement was not
> valid for the person doing the reverse engineering, because he is a
> norwegian, and this license agreement is in violation of paragraph 39
> section i of the norwegian copyright law ("Åndsverkloven"), active
> from june 30th, 1995.  That's available from the following URL (in
> norwegian) http://www.lovdata.no/all/tl-19610512-002-029.html#39i
> 
> This section specifically allows reverse engineering to get
> information for product interoperability as long as you have a legal
> copy to start from, and cannot be revoked by a license.
> 
> Oh, and AFAIK, click- and shrinkwrap-licenses aren't considered valid
> here, either.
> 
> Eivind.

-- 
Please reply to 



Re: DVD DeCSS Docs

2000-01-03 Thread Martin Minow

The DVD plaintiff's request to the court includes, in part,

  "DVD CCA makes this Application for the issuance of a Temporary Restraining
  Order ... enjoining Defendants ... from making any further use of ...
  or "linking" to other web sites which disclose, distribute or "link" to,
  any proprietary information ..."

Note that linking to a site that links to a site containing their
proprietary information is included in their request. With just a little
extra effort ("links to a site which links to a site which links to a site,
which links to a site."), they might have made the entire Internet a defendant.

As Pierrot would say, "Woo Hoo!"

Martin Minow
[EMAIL PROTECTED]



Encryption Rules Change Delayed.

1999-12-14 Thread Martin Minow

In its Tuesday edition, the New York Times reports that the
new encryption rules will not be released this week, as
originally planned.

http://www.nytimes.com/library/tech/99/12/cyber/capital/14capital.html

"The delay gives officials another month to address the widespread
criticism that greeted a draft of the regulations when they were
released just before Thanksgiving. Critics said the draft fell far
short of White House promises to lift restrictions on the retail
data-scrambling products used to keep computer data and online
communications like e-mail private."

---

Those of us who have been following the Bernstein case will recall
that the government successfully requested a delay in the appeal to
the full 9th Circuit (to mid-March as I recall) as the new regulations
would affect the appeal process.

Martin Minow
[EMAIL PROTECTED]



Siemens German Digital Signature Chip Hacked

1999-12-01 Thread Martin Minow

The Register <http://www.theregister.co.uk> reports that the Siemens
Digital Signature Chip used for cashless payments (and recently ratified
for use by the European Union) was disassembled. According to The Register's
sources, "the knowledge gained has already been used to get hold of Telesec
private keys.

Martin Minow
[EMAIL PROTECTED]

-- 
Please reply to <mailto:[EMAIL PROTECTED]>



Re: draft regulations?

1999-11-24 Thread Martin Minow


Russell Nelson wrote:
> ...  You also have to (somehow) prevent users from
> Cuba, Iran, Iraq, Libya, North Korea, Sudan and Syria from downloading
> the code. 

OK, how am I going to do that (rhetorical question)? My web server is the
module distributed with every recent MacOS system (i.e., all those millions
of iMacs and iBooks). It's a Control Panel (TSR in DOS-speak) called
"Web Sharing". As far as I know, it has no mechanism for preventing
certain domains from accessing a local web page. Of course, I could
put up a link that says "click here if you aren't a terrorist", but
I rather doubt that this will satisfy the regulations.

Martin Minow
[EMAIL PROTECTED]



New Scientist Article on Do-it-yourself Eavesdropping

1999-11-08 Thread Martin Minow

<http://www.newscientist.com/ns/19991106/newsstory6.html>

"SOFTWARE that allows a computer to receive radio signals could make
spying on
 other computers all too simple, according to two scientists at the
University of
 Cambridge. Such are the dangers that they are patenting countermeasures
that
 computer manufacturers can take to foil any electronic eavesdroppers. "

[I note that there are several commercially-available shortwave radios
that are
designed to be controlled by computer, and the technology is well within
the
capability of a reasonably skilled hobbiest. See recent issues in the
amateur radio
magazine, QST, for one example of a radio using a digital signal
processor for
signal management and overall control.]

Transcribed by Martin Minow, <[EMAIL PROTECTED]>






Bruce Schneier Interviewed on Slashdot.

1999-10-30 Thread Martin Minow

You will find an interesting and informative interview with
Bruce Schneier on cryptography at Slashdot:

<http://slashdot.org/interviews/99/10/29/0832246.shtml>

Martin Minow
[EMAIL PROTECTED]





Re: having source code for your CPU chip -- NOT

1999-09-23 Thread Martin Minow



"Steven M. Bellovin" wrote:
"Steven M. Bellovin" <[EMAIL PROTECTED]> wrote:
> 
> In message <v04210102b40f68d103d7@[63.193.122.223]>, Martin Minow writes:
> 
> >
> > Yeah, but 370 Assembler H had a very extensive macro facility and
> > you could hide all kinds of weird stuff in 370 code. Not too many
> > folk left around who can read it.
> 
> And those of us who once could no longer remember how to -- for me, it's at
> least 20 years (more like 25, actually) since I touched the stuff...

It's been 30 for me and I still have some listings lying around but
haven't the foggiest idea what some of the macros do (same for my
7090 assembler).

> That isn't the real problem -- most crypto routines per se are small enough
> that one could verify the machine code without too much effort.  The problem
> is the environment they're embedded in.  By that I mean not just the
> crypto-using application, but the entire operating system.  By example, I
> could verify the machine code for IDEA, but not PGP and certainly not your
> favorite version of UNIX.

Why run crypto code on Unix? You could write a tiny microkernel
(semaphores, interrupt redirection, static memory allocation, no
memory management or protection) for a PDP-11 (or a similar "modern"
computer such as a 68HC11) in about 1000 lines of C and 200 lines of
assembler. (Or buy one ready-made from any of a half-dozen vendors)
Add a minimal IP stack and web server and you have enough of an
environment to write a complete "crypto machine" that can be verified
with a line-by-line code walk-through. Put the "crypto machine" in a
bullet-proof (and Tempest proof) container and "drive" it with HTML.
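
[For concreteness, here is a hedged sketch of what the dispatch loop of such
a "crypto machine" might look like. Every name in it (net_read_request(),
net_write_reply(), idea_encrypt_block(), idea_decrypt_block()) is a
hypothetical placeholder for the minimal IP stack and the hand-verified
cipher core; the point is only that the entire control path stays small
enough for a line-by-line walk-through.]

/* Hedged sketch of a verifiable "crypto machine" main loop. All external
 * routines are hypothetical placeholders, not a real implementation. */
#include <stddef.h>

struct request {
    unsigned char opcode;        /* 1 = encrypt, 2 = decrypt */
    unsigned char block[8];      /* one 64-bit block */
};

extern int  net_read_request(struct request *r);                /* hypothetical */
extern void net_write_reply(const unsigned char *b, size_t n);  /* hypothetical */
extern void idea_encrypt_block(unsigned char *b);               /* verified core */
extern void idea_decrypt_block(unsigned char *b);               /* verified core */

void crypto_machine_loop(void)
{
    struct request r;

    for (;;) {
        if (net_read_request(&r) != 0)
            continue;                        /* drop malformed requests */
        switch (r.opcode) {
        case 1: idea_encrypt_block(r.block); break;
        case 2: idea_decrypt_block(r.block); break;
        default: continue;                   /* unknown opcode: ignore */
        }
        net_write_reply(r.block, sizeof r.block);
    }
}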

While you can't validate the Dallas Semiconductor TINI operating system,
it could serve as a test platform for a Java-based design. The crypto
secrets would stay on an iButton while the TINI provides the network
front-end. Both are programmed in Java.

Martin.
ps: I found Decus C on ftp://ftp.update.uu.se/pub/pdp11/decusc.tar.Z
It looks complete.



Re: having source code for your CPU chip -- NOT

1999-09-23 Thread Martin Minow

At 9:26 AM -0700 9/22/99, Bill Frantz wrote:
>
>My own approach would be to audit the generated code.  In KeyKOS/370, we
>"solved" the problem by using an assembler which was written before KeyKOS
>was designed.  (N.B. KeyKOS/370 was written in 370 Assembler H).
>

Yeah, but 370 Assembler H had a very extensive macro facility and
you could hide all kinds of weird stuff in 370 code. Not too many
folk left around who can read it.

I have a copy of Decus C (Open Source PDP-11 C) lying around and
wrote enough of its compiler and code generator to know what it can
and cannot do, in case anyone is interested. The entire source code
of the C compiler is small enough to sight-verify in about a man-month.
A "Small C" compiler (see early issues of Dr. Dobbs) can be implemented
in about 3 man months and ought to be good enough for crypto work.

Martin Minow
[EMAIL PROTECTED]


[And then how do you trust your assembler? Or the compiler and
assembler you compiled the C compiler on? And the linker? If you
really try hard enough on all this, you find yourself smack dab in
front of Kurt Goedel's door, and he tends to have unpleasant news for
visitors who come to him looking for solace.

And of course, once you've done all this lovely work, the NSA comes in
and puts a microscopic bug into your keyboard cable in the night, or
replaces your hand verified assembler executables, or...

I suggest that in practical terms, one has to set some reasonable
limits on what one is willing to do to overcome risk. Paranoia is a
potential source of infinite work, but there is only a finite amount
of work one can do in a given lifetime. That is not to say that *some*
paranoia isn't of value, but perfect paranoia results in a perfect
absence of progress on one's projects.

   --Perry]



Re: Why did White House change its mind on crypto?

1999-09-17 Thread Martin Minow


On Fri, Sep 17, 1999 at 11:05:37AM -0400, Russell Nelson wrote:
> What's the difference between that, and someone claiming that a
> certain piece of text decrypts to a sinister message?

What's the difference between this and claiming that a certain
drop of blood has DNA characteristics that match a particular
person? In the O.J. Simpson trial, the government took over
a month to explain to the jury the similarities between the
blood collected from the crime scene and the defendant; and
the defense lawyers rebutted the evidence by claiming that
it may have been contaminated or planted by the police.

Since my only legal education was from watching that trial, it
seems to me that only a jury can decide whether a particular
message was written by a particular individual and that it
is the government's responsibility to provide evidence "beyond
a reasonable doubt" to that effect.

I don't see how the government can take this responsibility
away from the jury.

Martin Minow
[EMAIL PROTECTED]



Re: plausible CAPI recovery designs (Re: FW: Cryptonym...)

1999-09-09 Thread Martin Minow

Adam Back wrote:
> 
> This general area of discussion -- software modification
> authentication -- is a bit fuzzy: if you can modify the software you
> can patch out the check of the signature (a correctly placed NOP is
> known to do it).
> 

The "PowerTalk" release of Macintosh System 7 (around 1993) allowed anyone
with an RSA public key to sign any Macintosh file, and any user to validate
a signed file. This worked for both applications and data files.

Recall that Macintosh files may contain both pure data and "resources"
(essentially a set of key/value pairs). Files were signed by adding a
specific resource with a signed MD5 hash. The signature format was
standardized, but I don't recall the specifics. The hash was taken over
the data area and a specific subset of resources to allow certain
resources to be modified without invalidating the hash: the set of
unchecked resources could be controlled by the signer and the set was,
itself, signed. While it would be trivial to remove a signature resource,
this would be apparent to the end user. A missing or invalid signature
check could also be made within the program itself.
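
[From memory, and certainly not the real PowerTalk data formats: a hedged
sketch of the verification side would hash the data fork plus the
signer-designated "checked" resources and verify the stored signature over
that digest. Every routine named below (read_data_fork, next_checked_resource,
md5_*, rsa_verify) is a hypothetical placeholder, not an actual Mac OS API.]

/* Hedged sketch of PowerTalk-style file signature checking. The real
 * resource layout and signature format differed; all declarations here
 * are hypothetical placeholders. */
#include <stddef.h>

typedef struct { unsigned char digest[16]; } md5_t;

extern void md5_init(md5_t *m);                              /* hypothetical */
extern void md5_update(md5_t *m, const void *p, size_t n);   /* hypothetical */
extern void md5_final(md5_t *m);                             /* hypothetical */

extern size_t read_data_fork(const char *file, unsigned char **buf);
extern int    next_checked_resource(const char *file, unsigned char **buf,
                                    size_t *len);   /* skips exempt resources */
extern int    rsa_verify(const unsigned char digest[16],
                         const unsigned char *sig, size_t siglen);

int file_signature_ok(const char *file, const unsigned char *sig, size_t siglen)
{
    md5_t m;
    unsigned char *buf;
    size_t len;

    md5_init(&m);

    len = read_data_fork(file, &buf);                /* hash the data fork first */
    md5_update(&m, buf, len);

    while (next_checked_resource(file, &buf, &len))  /* then checked resources only */
        md5_update(&m, buf, len);

    md5_final(&m);
    return rsa_verify(m.digest, sig, siglen);        /* nonzero = signature valid */
}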

Regrettably, PowerTalk (which had to run on a 4 MB 68020 and provided
integrated e-mail, directory services, authenticated and encrypted
networking and a slew of other features and mis-features) was far too
ambitious and, ultimately, unsuccessful in the marketplace. Some of
its capabilities are reported to be included in future Apple software
releases.

Martin Minow
[EMAIL PROTECTED]



Re: s/w radios & secure modules

1999-08-22 Thread Martin Minow

At 2:39 PM -0700 8/21/99, David Honig wrote:
>In the Aug 16 '99 EETimes, there are several articles
>about software radios.  These have analog front ends,
>and after down-conversion are digital.  This lets you
>deal with complex back-compatability/protocol/DSP improvement/legal issues
>flexibly.
>

The September issue of the amateur radio magazine, QST,
has the circuit diagram and construction details for a
2 meter (144 MHz) transceiver (receiver and transmitter).
The software for the DSP and radio control, source included,
is promised for the October issue.

Martin Minow
[EMAIL PROTECTED]





Re: The Beer Bottle Cipher (some fun summer reading for you...)

1999-06-30 Thread Martin Minow

At 12:07 -0400 1999.06.30, Ron Rivest described the Beer Bottle Cypher,
asking:
>
>The actual security of this cipher seems to be an open question... Can it
>be broken?
>

Have you tried getting an export license for it?

Martin Minow
[EMAIL PROTECTED]





Re: MacOS 8.7 Security

1999-05-18 Thread Martin Minow

At 3:57 PM -0700 5/17/99, Dave Del Torto wrote:
>
>If this is based on the "speaker independent" voice recognition in
>PlainTalk,

My understanding is that it is not based on speaker-independent
technology and, speaking as a former linguist, I am not convinced that
speaker-independent recognition offers information useful for identifying
a particular speaker.

>
>I'm sure no-one on this list needs me to tell them that this is a
>VERY BAD IDEA: the marketing weasel who came up with it should be
>soundly thrashed (pun intended). Stick with typed passphrases.

My understanding is that this augments typed pass phrases.

Remember, the security model is "keep little brother out of big
sister's secret diary," not "protect the nuclear bomb codes."
Products like PGP Disk still have their place.


Martin, not speaking for Apple.





Did the court publish cryptography on the web?

1999-05-07 Thread Martin Minow

The appeals court decision, at the web at


contains the source of the core of Bernstein's "Snuffle" program. (Search
for "Hash512" if you want to see just the naughty bits.)

Isn't publishing this on a web page what Bernstein wasn't allowed
to do because of ITAR/EAR?

Martin.





NY Times article on Shamir's public key breaking machine

1999-05-01 Thread Martin Minow

The article gives a very superficial overview of a paper to be presented Tuesday
that describes a hardware attack on public key cryptography.

"Researchers said that if his machine worked it would mean that
cryptographic systems with keys of 512 bits or less -- that is, keys less
than about 150 digits long -- would be vulnerable in the future, an
exposure that would have seemed unthinkable only five years ago. The longer
1,024-bit keys that are available today would not be vulnerable at present."

Martin Minow
[EMAIL PROTECTED]







Re: Cryptoprocessors and reverse engineering

1999-01-29 Thread Martin Minow

At 8:01 PM -0800 1/28/99, John Gilmore wrote:
>
>The opportunity to reverse-engineer in order to get past a deliberate
>software monopoly lock-up is critical.  Remove this from computer
>architectures at your peril.
>

About a year ago, someone from Intel presented their thinking about
security at an Oakland Cypherpunks meeting. My take from his presentation
is that, in the medium to long term, a chip manufacturer can put a key
and the necessary cryptographic infrastructure in the instruction decoder
of a microcomputer (and, if they did it right, there would be no significant
performance impact). This would let a vendor bind their application to
a specific CPU: the application would have limited "demo" functionality
when it was purchased. To unlock the full application, you would
connect to the vendor's server, provide the chip ID, and receive
the necessary modules compiled for your particular chip -- and only
that specific chip. This essentially puts the "dongle" inside
the processor.
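
[A hedged software-level sketch of the binding idea, purely to make the flow
concrete: the vendor's server derives a per-chip key from the CPU identifier
and ships the unlocked modules encrypted under it, so they decrypt only on
that one processor. read_chip_id(), derive_key(), and decrypt_module() are
hypothetical names; the scheme described at the meeting would live in the
instruction decoder, not in application code like this.]

/* Sketch of per-CPU module binding ("dongle in the processor").
 * All external routines are hypothetical placeholders. */
#include <stddef.h>
#include <string.h>

extern void read_chip_id(unsigned char id[16]);               /* hypothetical */
extern void derive_key(const unsigned char id[16],
                       unsigned char key[16]);                 /* hypothetical */
extern int  decrypt_module(unsigned char *module, size_t len,
                           const unsigned char key[16]);       /* hypothetical */

/* Returns 0 if the vendor-supplied module decrypts and can be installed
 * on this CPU; nonzero if it was built for a different chip. */
int unlock_module(unsigned char *module, size_t len)
{
    unsigned char id[16], key[16];

    read_chip_id(id);            /* unique per-chip identity */
    derive_key(id, key);         /* same derivation the vendor's server uses */

    int ok = decrypt_module(module, len, key);
    memset(key, 0, sizeof key);  /* don't leave the chip key lying around */
    return ok ? 0 : -1;
}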

Mathematica does something similar, using other "system-unique"
information to bind the application to a specific machine.

This is, of course, another argument *for* open source software.

Martin Minow
[EMAIL PROTECTED]







Re: Hardware Random Number Generators

1999-01-28 Thread Martin Minow

>This I got from computer historian, Simon Lavington.
>
>The (Manchester) Ferranti Mark I had a hardware random number generator.
>This was specified by Alan Turing - (A copy of his original
>Internal Report, dated 1949 I believe, still exists.)  ...

For what it's worth, Illiac 1 (also a first-generation computer) had,
unintentionally, a hardware random number generator. Recall that
Illiac memory was based on a Williams Tube. This was an electrostatic
display tube. To write a 1-bit to memory, the computer directed the
electron beam to the bit's x-y coordinate. To read memory, it directed
the beam (at a lower energy?) to that spot. If a bit was previously
set, a pulse could be detected by a copper plate affixed to the front
of the tube. (This is from memory: I apologize for any errors).

Williams tube memory suffered from charge leakage. Imagine a tic-tac-toe
configuration. Set the center bit to zero. Now, if you write 1-bits into
the 8 exterior locations, there was a low probability of incorrectly
reading a 1-bit from the center. On Illiac, about 400 cycles would be
needed before the center bit changed. This was not considered good
programming practice (!) but could be used to generate a few
random bits by recovering the precise number of cycles.
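
[A toy simulation of the trick, not historical Illiac code: clear the center
bit, keep writing 1-bits into its eight neighbors, and count the write cycles
until leakage flips the center; the low bit of that count serves as the
random bit. The 1-in-400 flip chance below is only a rough stand-in for the
behavior described above.]

/* Toy simulation of harvesting a random bit from Williams-tube charge
 * leakage. Not historical code; the probabilities are illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static int leakage_random_bit(void)
{
    long cycles = 0;
    int center = 0;                    /* center bit starts cleared */

    while (center == 0) {
        cycles++;                      /* one pass writing the eight neighbors */
        if (rand() % 400 == 0)         /* rare spurious read of the center bit */
            center = 1;
    }
    return (int)(cycles & 1);          /* low bit of the cycle count */
}

int main(void)
{
    srand((unsigned)time(NULL));
    for (int i = 0; i < 16; i++)
        printf("%d", leakage_random_bit());
    printf("\n");
    return 0;
}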

Martin. (walking repository of Illiac lore: it was my first computer)
[EMAIL PROTECTED]