RE: Steven Levy buys Microsoft's bullshit hook, line, and sinker

2002-06-24 Thread Lucky Green

Bram wrote: 
> http://www.msnbc.com/news/770511.asp?cp1=1
> 
> Of course, the TCPA has nothing to do with security or
> privacy, since those are OS-level things. All it can really 
> do is ensure you're running a particular OS. 
> 
> It's amazing the TCPA isn't raising all kinds of red flags at
> the justice department already - it's the most flagrant 
> attempt to stifle competition I've ever seen.

[Bram is correct, stifling competition is one of the many features TCPA
will enable. In more ways than one. And for more players than just
Microsoft].

Coincidentally, Steven Levy's article that Bram is citing also helps
answer Mr. Anonymous's question with which he challenged Ross and me
earlier today.

First, however, I must apologize to the reader for my earlier, now
incorrect, statement that TCPA member companies would deny that DRM is
an objective of the TCPA. I had been unaware that, as evidenced by the
publication of the Newsweek article, the public phase of the TCPA effort
had already begun. What a bizarre coincidence for this phase, after all
the years the TCPA effort and its predecessors have been underway (the
design, and in fact the entire architecture, has morphed substantially
over the years), to be kicked off the very day of my post.

[Tim: do you recall when we had the discussion about the upcoming
"encrypted op code chips" at a Cypherpunks meeting in a Stanford lecture
hall? Was that 1995 or 1996? It cannot have been later; I know that I
was still working for DigiCash at the time because I remember giving a
talk on compact endorsement signatures at the same meeting].

From Levy's article:
"Palladium [Microsoft's TCPA-based technology - LG] is being offered to
the studios and record labels as a way to distribute music and film with
"digital rights management" (DRM). This could allow users to exercise
"fair use" (like making personal copies of a CD) and publishers could at
least start releasing works that cut a compromise between free and
locked-down. But a more interesting possibility is that Palladium could
help introduce DRM to business and just plain people. "It's a funny
thing," says Bill Gates. "We came at this thinking about music, but then
we realized that e-mail and documents were far more interesting
domains."

Another paragraph of the Newsweek article has this to say:

"In 1997, Peter Biddle, a Microsoft manager who used to run a paintball
arena, was the company's liaison to the DVD-drive world. Naturally, he
began to think of ways to address Hollywood's fear of digital copying.
He hooked up with [...] researchers Paul England and John Manferdelli,
and they set up a skunkworks operation, stealing time from their regular
jobs to pursue a preposterously ambitious idea: creating virtual vaults
in Windows to protect information. They quickly understood that the
problems of intellectual property were linked to problems of security
and privacy.
They also realized that if they wanted to foil hackers and
intruders, at least part of the system had to be embedded in silicon,
not software."

Well, now that Bill Gates himself is being quoted stating that DRM was a
driver behind the technology the TCPA is enabling (Microsoft is one of
the companies that founded the TCPA and should be in a position to
know), does Mr. Anonymous consider this sufficient "evidence that the
TCPA is being designed for the support of digital rights management
(DRM) applications"? Or does Anonymous continue to believe Ross and
Lucky are making this stuff up out of whole cloth?

To answer Anonymous's question as to whether "the TCPA [is] really,
as [Ross and Lucky] claim, a secretive effort to get DRM hardware into
consumer PCs?", I am not sure I would exactly call this fact a secret at
this point. (Though by no means are all cards already on the table).

DRM is a significant objective of some of the TCPA's member companies,
among them Microsoft.

There are of course other objectives: some that Ross published, some
that I mentioned, some that Steven Levy has published (though he
largely fell for the designated bait and missed the numerous hooks),
some that Bram has realized, and some that have yet to be talked
about. Some are desirable, some questionable, and a lot of them
downright scary.

Sincerely,
--Lucky Green




Re: Ross's TCPA paper

2002-06-24 Thread R. A. Hettinga

--- begin forwarded text


Status:  U
Date: Sun, 23 Jun 2002 12:53:42 -0700
From: Paul Harrison <[EMAIL PROTECTED]>
Subject: Re: Ross's TCPA paper
To: "R. A. Hettinga" <[EMAIL PROTECTED]>
User-Agent: Microsoft-Outlook-Express-Macintosh-Edition/5.02.2022

on 6/23/02 6:50 AM, R. A. Hettinga at [EMAIL PROTECTED] wrote:

>
> --- begin forwarded text
>
>
> Status:  U
> From: "Lucky Green" <[EMAIL PROTECTED]>
> To: <[EMAIL PROTECTED]>
> Cc: <[EMAIL PROTECTED]>
> Subject: RE: Ross's TCPA paper
> Date: Sat, 22 Jun 2002 23:01:12 -0700
> Sender: [EMAIL PROTECTED]
>

> None of these obstacles are impossible to overcome, but not by Joe
> Computer User, not by even the most talented 16-year old hacker, and not
> even by many folks in the field. Sure, I know some that could overcome
> it, but they may not be willing to do the time for what by then will be
> a crime. Come to think of it, doing so already is a crime.
>
> --Lucky Green
>
> --- end forwarded text
>
The discussion of TCPA has a tendency to avoid serious discussion of what I
feel is the core security issue:  ownership of the platform.  Comments such
as Lucky's:

"TPM will make it near impossible for the owner of that motherboard to
access supervisor mode on the CPU without their knowledge"

obfuscate this.  The Trusted Computing Platform includes the TPM, the
motherboard and the CPU, all wired together with some amount of tamper
resistance.  It is meaningless to speak of different "owners" of different
parts.  The owner of a TCP might be a corporate IT department (for employee
machines), a cable company (for set-top boxen), or an individual.  The
important question is not whether trusted platforms are a good idea, but
who will own them.  Purchasing a TCP without the keys to the TPM is like
buying property without doing a title search.  Of course it is possible to
_rent_ property from a title holder, and in some cases this is desirable.

I would think a TCP _with_ ownership of the TPM would be every paranoid
cypherpunk's wet dream.  A box which would tell you if it had been tampered
with either in hardware or software?  Great.  Someone else's TCP is more
like a rental car:  you want the rental company to be completely responsible
for the safety of the vehicle.  This is the economic Achilles heel of using
TCPA for DRM.  Who is going to take financial responsibility for the proper
operation of the platform?  It can work for a set top box, but it won't fly
for a general purpose computer.

--- end forwarded text


-- 
-
R. A. Hettinga 
The Internet Bearer Underwriting Corporation 
44 Farquhar Street, Boston, MA 02131 USA
"... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'




Re: Ross's TCPA paper

2002-06-24 Thread Mike Rosing

> Date: Sun, 23 Jun 2002 12:53:42 -0700
> From: Paul Harrison <[EMAIL PROTECTED]>
> Subject: Re: Ross's TCPA paper
> I would think a TCP _with_ ownership of the TPM would be every paranoid
> cypherpunk's wet dream.  A box which would tell you if it had been tampered
> with either in hardware or software?  Great.  Someone else's TCP is more
> like a rental car:  you want the rental company to be completely responsible
> for the safety of the vehicle.  This is the economic Achilles heel of using
> TCPA for DRM.  Who is going to take financial responsibility for the proper
> operation of the platform?  It can work for a set top box, but it won't fly
> for a general purpose computer.

Exactly my point - economically it can't work for the "nightmare" scenario.

The whole DRM concept is seriously flawed, and the fact it's being
pushed by a guy who used to run a paint-ball arena is really no
surprise.

There's a large group of academics working on DRM concepts for access
to university facilities, including libraries and computers.  They
use secure platforms, but they still have to worry about who gets
physical access to the platform.

And I also don't think "conspiracy" is the right term.  The article
Lucky quoted from indicated that use of the trusted platform for
DRM was an afterthought, and that's much more believable.  A bunch
of sharks looking for money all swim around the same target.  It has
to do with where the money is, not any collusion between the players.

S.2048 is not likely to see the light of day.  The automotive
industry is bigger than the entertainment industry, and they have
more sway in Washington when it comes to how much some bill is
going to cost them.  S.2048 makes cars way too expensive, and when
union workers find out that a) they will have fewer jobs and b)
they won't be able to watch videos when they get home, the shit
will hit the fan big time.

Definitely write a letter to your congress critter to let them know
the whole thing is stupid.  But don't call it a conspiracy, that
gives the morons thinking this whole thing up a bit too much
intellect.

Patience, persistence, truth,
Dr. mike




RE: DOJ proposes US data-rentention law.

2002-06-24 Thread Trei, Peter

I tried sending this last week, but it did not seem to go through:

Two points:

1. According to Poulsen, the DOJ proposal never 
discussed just what would be logged. Poulsen 
compared it to the European Big Brother legislation, 
which required storage of Web browsing 
histories and email header data (NOT email body text
or IP traffic).

2. After I posted the same info to /.
http://slashdot.org/articles/02/06/19/1724216.shtml?tid=103
(I'm the 'Anonymous Coward' in this case), Kevin updated
his article. The new version may be found at:
http://online.securityfocus.com/news/489

The relevant portions read:

- start quote -

U.S. Denies Data Retention Plans

The Justice Department disputes claims that Internet service 
providers could be forced to spy on their customers as part 
of the U.S. strategy for securing cyberspace.
By Kevin Poulsen, Jun 19 2002 12:24PM

[...]

But a Justice Department source said Wednesday that data 
retention is mentioned in the strategy only as an industry 
concern -- ISPs and telecom companies oppose the costly idea -- 
and does not reflect any plan by the department or the White 
House to push for a U.S. law. 

[...]

- end quote -

Peter Trei




sins of the fathers (brothers, sisters, etc)

2002-06-24 Thread Major Variola (ret)

On Israel's decision to deport families of martyrs:

A Palestinian legislator, Salah Tamari, called deporting families "an
illegal, unlawful and inhuman measure. Why should somebody be accountable
for someone else's actions?"
http://www.news.scotsman.com/international.cfm?id=685972002

Someone needs to clue Tamari in on the US court-approved practices
in welfare cages: families are tossed for the indiscretions of
family members, if the indiscretions involve pharmaceuticals.

Pharms, nitrates, whatever.

But then, the US is becoming Israel, why not a bit of the reverse?

-
And the rockets' red glare, the bombs bursting in air,
Gave proof thro' the night that Osama was still there.




Re: Ross's TCPA paper

2002-06-24 Thread Derek Atkins

I, for one, can vouch for the fact that TCPA could absolutely
be applied to a DRM application.  In a previous life I actually
designed a DRM system (the company has since gone under).  In
our research and development in '96-98, we decided that you need
at least some trusted hardware at the client to perform any DRM,
but if you _did_ have some _minimal_ trusted hardware, that would
provide a large hook to a fairly secure DRM system.

Check the archives of, IIRC, coderpunks... I started a thread entitled
"The Black Box Problem."  The issue is that in a DRM system you (the
content provider) want to verify the operation of the client, even
though the client is not under your control.  We developed an online
interactive protocol with a sandbox environment to protect content,
but it would certainly be possible for someone to crack it.  Our
threat model was that we didn't want people to be able to use a hacked
client against our distribution system.

We discovered that if we had some trusted hardware that had a few key
functions (I don't recall the few key functions offhand, but it was
more than just encrypt and decrypt) we could increase the
effectiveness of the DRM system astoundingly.  We thought about using
cryptodongles, but the Black Box problem still applies.  The trusted
hardware must be a core piece of the client machine for this to work.
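
A minimal sketch of the shape of such a protocol (not the actual system
described above; an HMAC with a shared key stands in for a signature by a
key that never leaves the trusted hardware, and all names are illustrative):

import hashlib, hmac, secrets

# Toy sketch of the "minimal trusted hardware" idea: the content server
# challenges the client with a fresh nonce, and a key held only inside the
# trusted hardware vouches for a measurement of the client software.

DEVICE_KEY = secrets.token_bytes(32)   # held by the hardware (and, here, the verifier)

def measure(client_binary: bytes) -> bytes:
    """Hash of the client software, as the hardware would report it."""
    return hashlib.sha256(client_binary).digest()

def hardware_quote(nonce: bytes, measurement: bytes) -> bytes:
    """What the trusted hardware returns: a keyed digest over the challenge."""
    return hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()

def server_verify(nonce: bytes, measurement: bytes, quote: bytes, approved: set) -> bool:
    expected = hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()
    return hmac.compare_digest(expected, quote) and measurement in approved

client_binary = b"official client v1.0"
approved = {measure(client_binary)}
nonce = secrets.token_bytes(16)
quote = hardware_quote(nonce, measure(client_binary))
assert server_verify(nonce, measure(client_binary), quote, approved)

# A hacked client produces a different measurement and is refused content.
assert not server_verify(nonce, measure(b"hacked client"),
                         hardware_quote(nonce, measure(b"hacked client")), approved)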

Like everything else in the technical world, TCPA is a tool.  It is
neither good nor bad; that distinction comes in how we humans apply
the technology.

-derek

"Lucky Green" <[EMAIL PROTECTED]> writes:

> Anonymous writes:
> > Lucky Green writes regarding Ross Anderson's paper at: 
> > Ross and Lucky should justify their claims to the community 
> > in general and to the members of the TCPA in particular.  If 
> > you're going to make accusations, you are obliged to offer 
> > evidence.  Is the TCPA really, as they claim, a secretive 
> > effort to get DRM hardware into consumer PCs? Or is it, as 
> > the documents on the web site claim, a general effort to 
> > improve the security in systems and to provide new 
> > capabilities for improving the trustworthiness of computing platforms?
> 
> Anonymous raises a valid question. To hand Anonymous additional rope, I
> will even assure the reader that when questioned directly, the members
> of the TCPA will insist that their efforts in the context of TCPA are
> concerned with increasing platform security in general and are not
> targeted at providing a DRM solution.
> 
> Unfortunately, and I apologize for having to disappoint the reader, I do
> not feel at liberty to provide the proof Anonymous is requesting myself,
> though perhaps Ross might. (I have no first-hand knowledge of what Ross
> may or may not be able to provide).
> 
> I however encourage readers familiar with the state of the art in PC
> platform security to read the TCPA specifications, read the TCPA's
> membership list, read the Hollings bill, and then ask themselves if they
> are aware of, or can locate somebody who is aware of, any other
> technical solution that enjoys a similar level of PC platform industry
> support, is anywhere as near to wide-spread production as TPM's, and is
> of sufficient integration into the platform to be able to form the
> platform basis for meeting the requirements of the Hollings bill.
> 
> Would Anonymous perhaps like to take this question?
> 
> --Lucky Green
> 
> 
> -
> The Cryptography Mailing List
> Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]

-- 
   Derek Atkins
   Computer and Internet Security Consultant
   [EMAIL PROTECTED] www.ihtfp.com




Re: Ross's TCPA paper

2002-06-24 Thread Ross Anderson

> It's an interesting claim, but there is only one small problem.
> Neither Ross Anderson nor Lucky Green offers any evidence that the TCPA
> (http://www.trustedcomputing.org) is being designed for the support of
> digital rights management (DRM) applications.

Microsoft admits it:

http://www.msnbc.com/news/770511.asp

Intel admitted it to me too. They said that the reason for TCPA was that
their company makes most of its money from the PC microprocessor; they
have most of the market; so to grow the company they need to grow the
overall market for PCs; that means making sure the PC is the hub of the
future home network; and if entertainment's the killer app, and DRM is
the key technology for entertainment, then the PC must do DRM.

Now here's another aspect of TCPA. You can use it to defeat the GPL.

During my investigations into TCPA, I learned that HP has started a
development program to produce a TCPA-compliant version of GNU/linux.
I couldn't figure out how they planned to make money out of this. On
Thursday, at the Open Source Software Economics conference, I figured
out how they might.

Making a TCPA-compliant version of GNU/linux (or Apache, or whatever)
will mean tidying up the code and removing whatever features conflict
with the TCPA security policy. The company will then submit the pruned
code to an evaluator, together with a mass of documentation for the
work that's been done, including a whole lot of analyses showing, for
example, that you can't get root by a buffer overflow.

The business model, I believe, is this. HP will not dispute that the
resulting `pruned code' is covered by the GPL. You will be able to
download it, compile it, check it against the binary, and do what you
like with it. However, to make it into TCPA-linux, to run it on a
TCPA-enabled machine in privileged mode, you need more than the code.
You need a valid signature on the binary, plus a cert to use the TCPA
PKI. That will cost you money (if not at first, then eventually).
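
A rough sketch of the gatekeeping described here, assuming the platform
only enters privileged mode for a binary whose hash carries a valid
signature from an evaluator key in the TCPA PKI; the HMAC below merely
stands in for that RSA signature, and the key name is hypothetical:

import hashlib, hmac

EVALUATOR_KEY = b"evaluator-signing-key"   # held by the evaluator, not by users

def sign_binary(binary: bytes) -> bytes:
    # Signature over the hash of the binary (HMAC as a stand-in for RSA).
    return hmac.new(EVALUATOR_KEY, hashlib.sha256(binary).digest(),
                    hashlib.sha256).digest()

def boot_privileged(binary: bytes, signature: bytes) -> str:
    expected = hmac.new(EVALUATOR_KEY, hashlib.sha256(binary).digest(),
                        hashlib.sha256).digest()
    if hmac.compare_digest(expected, signature):
        return "privileged mode: TCPA features enabled"
    return "untrusted mode: TCPA features withheld"

pruned = b"evaluated TCPA-linux build"
sig = sign_binary(pruned)
print(boot_privileged(pruned, sig))                   # runs privileged
print(boot_privileged(pruned + b" + my patch", sig))  # any modification breaks it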

Anyone will be free to make modifications to the pruned code, but in
the absence of a signature the resulting O/S won't enable users to
access TCPA features. It will of course be open to competitors to try
to re-do the evaluation effort for enhanced versions of the pruned
code, but that will cost money; six figures at least. There will
likely be little motive for commercial competitors to do it, as HP
will have the first mover advantages and will be able to undercut them
on price. There will also be little incentive for philanthropists to
do it, as the resulting product would not really be a GPL version of a
TCPA operating system, but a proprietary operating system that the
philanthropist could give away free. (There are still issues about who
would pay for use of the PKI that hands out user certs.) The need to
go through evaluation with each change is completely incompatible with
the business model of free and open source software.

People believed that the GPL made it impossible for a company to come
along and steal code that was the result of community effort. That 
may have been the case so long as the processor was open, and anyone
could access supervisor mode. But TCPA changes that completely. Once
the majority of PCs on the market are TCPA-enabled, the GPL won't work
as intended any more. There has never been anything to stop people
selling complementary products and services to GPL'ed code; once the
functioning of these products can be tied to a signature on the
binary, the model breaks.

Can anyone from HP comment on whether this is actually their plan?

Ross




Re: Ross's TCPA paper

2002-06-24 Thread Harry Hawk

It seems clear that, at least if DRM is an application, DRM applications would benefit
from the "increased trust," and architecturally that such "trust" would be needed to
enforce/ensure some or all of the requirements of the Hollings bill.

hawk

Lucky Green wrote:

>  other
> technical solution that enjoys a similar level of PC platform industry
> support, is anywhere as near to wide-spread production as TPM's, and is
> of sufficient integration into the platform to be able to form the
> platform basis for meeting the requirements of the Hollings bill.
>
> Would Anonymous perhaps like to take this question?




Re: Ross's TCPA paper

2002-06-24 Thread Adam Shostack

On Mon, Jun 24, 2002 at 08:15:29AM -0400, R. A. Hettinga wrote:
> Status:  U
> Date: Sun, 23 Jun 2002 12:53:42 -0700
> From: Paul Harrison <[EMAIL PROTECTED]>
> Subject: Re: Ross's TCPA paper
> To: "R. A. Hettinga" <[EMAIL PROTECTED]>

> The
> important question is not whether trusted platforms are a good idea, but
> who will own them.  Purchasing a TCP without the keys to the TPM is like
> buying property without doing a title search.  Of course it is possible to
> _rent_ property from a title holder, and in some cases this is desirable.
> 
> I would think a TCP _with_ ownership of the TPM would be every paranoid
> cypherpunk's wet dream.  A box which would tell you if it had been tampered
> with either in hardware or software?  Great.  Someone else's TCP is more
> like a rental car:  you want the rental company to be completely responsible
> for the safety of the vehicle.  This is the economic Achilles heel of using
> TCPA for DRM.  Who is going to take financial responsibility for the proper
> operation of the platform?  It can work for a set top box, but it won't fly
> for a general purpose computer.

In general, I'm very fond of this sort of ownership analysis.  If I
have a TCPA box running my software, and thinking that it's mine, how
do I know there isn't one more layer?  Leave it off, and my analysis
is simpler.

I suspect that verifying ownership of the TPM will be like verifying
ownership of property in modern Russia: There may be a title that
looks clean.  But what does the mafia think?  What about the security
services?  There may even be someone with a pre-Bolshevik title
floating around.  Or a forgery.  Hard to tell.  It's annoying to have
one's transaction costs pushed up that high.

I can get very high quality baseline software today.  What I need for
my cypherpunk wet dreams is ecash, and a nice anonymizing network.
What I also need is that the general purpose computing environment
stay free of control points, in Lessig's sense.


Adam




Re: Ross's TCPA paper

2002-06-24 Thread Pete Chown

Ross Anderson wrote:

> ... that means making sure the PC is the hub of the
> future home network; and if entertainment's the killer app, and DRM is
> the key technology for entertainment, then the PC must do DRM.

Recently there have been a number of articles pointing out how much
money Microsoft is losing on Xbox sales.  To some extent, of course,
console makers expect to lose money on the consoles themselves, making
it up on the games.  However Microsoft seems to be losing more than
anyone else.

Perhaps Microsoft don't care, because the Xbox is one vision they have
of the future.  Gradually it starts running more than just games, but
you still get the ease of use and security of a console.

It's always risky making predictions, but I think that over the next few
years, free software will do in the desktop space what has already
happened in the server space.  There is a kind of economic inevitability
about it; competing with a free product of equivalent quality is
virtually impossible.

Now, Gates isn't stupid, and I'm sure he's aware of this risk.  So we
have various alternative strategies.  One is web services.  The other
strategy is to become more closed at the same time as everyone else is
becoming more open.  That strategy is the Xbox, which may over time
evolve into the kind of tamper resistant system that we have been
talking about.

> During my investigations into TCPA, I learned that HP has started a
> development program to produce a TCPA-compliant version of GNU/linux.
> I couldn't figure out how they planned to make money out of this.

It might simply be useful that it exists.  If people complain that they
can't run Linux on the new systems, it could create all sorts of
anti-trust problems.  However, even if they didn't try to make money out
of the product, it still wouldn't be free in the freedom sense.

A similar problem to this has already come up, albeit in a much less
serious form.  When the Mindterm ssh client is used as an applet, it
needs to be signed in order to be maximally useful.  At one point it was
available under the GPL, but of course if you changed it the signature
was invalidated.  In this case you could at least get your own code
signing key, but there were problems.  Firstly it cost money.  Secondly
by signing code that you didn't write, you would be taking
responsibility for something being secure when you had no easy way of
verifying that.

> You need a valid signature on the binary, plus a cert to use the TCPA
> PKI. That will cost you money (if not at first, then eventually).

I think it would be a breach of the GPL to stop people redistributing
the signature: "You must cause any work that you distribute or publish,
that in whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License."

This doesn't help with your other point, though; people wouldn't be able
to modify the code and have a useful end product.  I wonder if it could
be argued that your private key is part of the source code?

> Anyone will be free to make modifications to the pruned code, but in
> the absence of a signature the resulting O/S won't enable users to
> access TCPA features.

What if the DRM system was cracked by means of something that you were
allowed to do under the GPL?  If they use the DMCA, or the Motherhood
and Apple Pie Promotion Act against you, they have to stop distributing
Linux.  "If you cannot distribute so as to satisfy simultaneously your
obligations under this License and any other pertinent obligations, then
as a consequence you may not distribute the Program at all."

BTW, Ross, does Microsoft Research in Cambridge work on this kind of
technology?

-- 
Pete




Monkeywrenching DRM

2002-06-24 Thread Tim May

On Monday, June 24, 2002, at 01:47  AM, Lucky Green wrote:
>
> [Tim: do you recall when we had the discussion about the upcoming
> "encrypted op code chips" at a Cypherpunks meeting in a Stanford lecture
> hall? Was that 1995 or 1996? It cannot have been later; I know that I
> was still working for DigiCash at the time because I remember giving a
> talk on compact endorsement signatures at the same meeting].

Around that time. Someone (Markoff?) was reporting that Intel was 
devoting a few percent of its transistors in an upcoming CPU to op code 
encrypting. I remember pointing out that Intel had previously released, 
in the early 80s, a "KeyPROM," which was an EPROM with encryption so 
that the internal state could not easily be read. The ostensible market 
was for arcade game makers, who were heavy consumers of EPROMs at the 
time and who wanted ways to not have their games copied by competitors.  
(The product flopped. Left as an exercise is to think about how 
pointless it is to try to make a tamper-proof chip, especially without 
any of the expensive countermeasures being possible. Anyone who can make 
the chip wiggle with a logic analyzer and o-scope could learn a lot. We 
used our Dynamic Fault Imager to image internal microcode states, thus 
bypassing the crypto junk.)

Back to the rumor. The supposed encrypted CPU has not yet appeared.

One theory, one that I find plausible, is that Intel got freaked out by 
the firestorm of derision and protest that met its attempt (around the 
same time) to introduce processor/user ID numbers which companies like 
Microsoft could use.

(As it turns out, there's enough readable state in a PC, with various 
configurations of memory, drives, etc., that Microsoft can do a crude 
registration system which makes it difficult for users to run a product 
on N different machines. The Intel ID system was anticipated to make 
this _much_ more robust than simply counting drives and slots and 
attempting to map to one such configuration...which has the headaches of 
requiring customers to re-register, if they are allowed to, when they 
swap out drives or move cards around.)
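
A toy illustration of that kind of configuration-based registration (the
fields are made up; the point is only that swapping a drive changes the
identifier and forces re-registration, which a burned-in processor ID
would have avoided):

import hashlib

def machine_fingerprint(config: dict) -> str:
    # Fold readable machine state into a single identifier.
    canonical = "|".join(f"{k}={config[k]}" for k in sorted(config))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

box = {"cpu": "PIII-800", "ram_mb": 256,
       "hdd": "IBM-DTLA-307030", "nic_mac": "00:a0:c9:xx:xx:xx"}
print(machine_fingerprint(box))
box["hdd"] = "WDC-WD400BB"          # swap a drive...
print(machine_fingerprint(box))     # ...and the product thinks it is a new machine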

Anyway, a major reason Intel got freaked is that AMD, a competitor of 
course, announced with much publicity that they would NOT, repeat NOT, 
include the processor ID feature!

As an Intel shareholder of many years, I'm not happy that AMD is as 
strong a competitor as it is (which isn't very, to be honest). But in 
other obvious ways I am happy to see them out there, keeping Intel from 
implementing such schemes.

This is the key, no pun intended. Any single vendor, like Intel, who 
imposes such a scheme will face harsh criticism from the rabble like us. 
We will write essays, we will monkeywrench their boxes with "Big Brother 
Inside" stickers, we will laugh at their failures, we will be energized 
to find hacks to defeat them.

So any effort to put "DRM" into hardware will have to be a mandated, 
directed, antitrust-exempted procedure.

(Aside: And possibly unpatented. Rambus is now getting smacked around by 
the courts for participating in JEDEC memory chip standards committees 
without disclosing their patent interests. A standard _can_ involve 
patents, pace Firewire and USB, but the issues get complicated. 
Something to keep your eye on, as a wedge for attack.)

If one vendor doesn't put the DRM in, he has bragging rights a la AMD 
with Intel's processor ID scheme.

For a DRM scheme to have any hope of succeeding, it must happen with all 
vendors of VCRs or PCs or whatever.

And since companies are not allowed (in the U.S. and most statist 
countries) to meet secretly or even quasi-secretly to plan features, the 
DRM planning must be done either publicly, under a guise of "industry 
standards," or exempted by the law, possibly by a secret ruling (e.g., a 
letter from the AG exempting AMD, Intel, Nvidia, and VIA from antitrust 
laws for the purposes of implementing DRM).

In summary:

-- expect more such attempts

-- use laughter, derision, and slogans to monkeywrench the public 
perception

(I talked to a person from Intel at this year's CFP...got the 
confirmation that the firestorm over the chip ID scheme had scared Intel 
badly and that there was little support within Intel for repeating the 
mistake...could be why senior Intel managers have testified in Congress 
against mandated DRM schemes...cf. testimony of Les Vadasz, IIRC.)

>
--Tim May
""Guard with jealous attention the public liberty. Suspect everyone who 
approaches that jewel. Unfortunately, nothing will preserve it but 
downright force. Whenever you give up that force, you are ruined." 
--Patrick Henry




Re: Ross's TCPA paper

2002-06-24 Thread Anonymous

The amazing thing about this discussion is that there are two pieces
of conventional wisdom which people in the cypherpunk/EFF/"freedom"
communities adhere to, and they are completely contradictory.

The first is that protection of copyright is ultimately impossible.
See the analysis in Schneier and Kelsey's "Street Performer Protocol"
paper, http://www.counterpane.com/street_performer.pdf.  Or EFF
columnist Cory Doctorow's recent recitation of the conventional wisdom
at http://boingboing.net/2002_06_01_archive.html#85167215: "providing
an untrusted party with the key, the ciphertext and the cleartext but
asking that party not to make a copy of your message is just silly,
and can't possibly work in a world of Turing-complete computing."

The second is that evil companies are going to take over our computers
and turn us into helpless slaves who can only sit slack-jawed as they
force-feed us whatever content they desire, charging whatever they wish.
The recent outcry over TCPA falls into this category.

Cypherpunks alternate between smug assertions of the first claim and
panicked wailing about the second.  The important point about both of
them, from the average cypherpunk's perspective, is that neither leaves
any room for action.  Both views are completely fatalistic in tone.
In one, we are assured victory; in the other, defeat.  Neither allows
for human choice.

Let's apply a little common sense for a change, and analyze the situation
in the context of a competitive market economy.  Suppose there is no
law forcing people to use DRM-compliant systems, and everyone can decide
freely whether to use one or not.

This is plausible because, if we take the doom-sayers at their word,
the Hollings bill or equivalent is completely redundant and unnecessary.
Intel and Microsoft are already going forward.  The BIOS makers are
on board; TPM chips are being installed.  In a few years there will
be plenty of TCPA compliant systems in use and most new systems will
include this functionality.

Furthermore, inherent to the TCPA concept is that the chip can in
effect be turned off.  No one proposes to forbid you from booting a
non-compliant OS or including non-compliant drivers.  However the TPM
chip, in conjunction with a trusted OS, will be able to know that you
have done so.  And because the chip includes an embedded, certified key,
it will be impossible to falsely claim that your system is running in a
"trusted" mode - only the TPM chip can convincingly make that claim.

This means that whether the Hollings bill passes or not, the situation
will be exactly the same.  People running in "trusted" mode can prove
it; but anyone can run untrusted.  Even with the Hollings bill there
will still be people using untrusted mode.  The legislation would
not change that.  Therefore the Hollings bill would not increase the
effectiveness of the TCPA model.  And it follows, then, that Lucky and
Ross are wrong to claim that this bill is intended to legislate use of
the TCPA.  The TCPA does not require legislation.

Actually the Hollings bill is clearly targeted at the "analog hole", such
as the video cable that runs from your PC to the display, or the audio
cable to your speakers.  Obviously the TCPA does no good in protecting
content if you can easily hook an A/D converter into those connections and
digitize high quality signals.  The only way to remove this capability
is by legislation, and that is clearly what the Hollings bill targets.
So much for the claim that this bill is intended to enforce the TCPA.

That claim is ultimately a red herring.  It doesn't matter if the bill
exists, what matters is that TCPA technology exists.  Let us imagine a
world in which most new PCs have TCPA built-in, Microsoft OS's have been
adapted to support it, maybe some other OS's have been converted as well.

The ultimate goal, according to the doom-sayers, is that digital content
will only be made available to people who are running in "trusted"
mode as determined by the TPM chip built into their system.  This will
guarantee that only an approved OS is loaded, and only approved drivers
are running.  It will not be possible to patch the OS or insert a custom
driver to intercept the audio/video stream.  You won't be able to run
the OS in a virtual mode and provide an emulated environment where you
can tap the data.  Your system will display the data for you, and you
will have no way to capture it in digital form.

Now there are some obvious loopholes here.  Microsoft software has a
track record of bugs, and let's face it, Linux does, too.  Despite the
claims, the TCPA by itself does nothing to reduce the threat of viruses,
worms, and other bug-exploiting software.  At best it includes a set of
checksums of key system components, but you can get software that does
that already.  Bugs in the OS and drivers may be exploitable and allow
for grabbing DRM protected content.  And once acquired, the data can
be made widely available.  No doubt the OS will be bu

Fwd: Re: Fwd: Book Review: Peter Wayner's "Translucent Databases"

2002-06-24 Thread R. A. Hettinga

...More fun and games from the "We're Monkeys, we'll *go*!!!" school of
disputation...

:-).

Cheers,
RAH

--- begin forwarded text


Status:  U
Date: Mon, 24 Jun 2002 07:58:45 +0530
To: Robert Hettinga <[EMAIL PROTECTED]>
From: Udhay Shankar N <[EMAIL PROTECTED]>
Subject: Fwd: Re: Fwd: Book Review: Peter Wayner's "Translucent
  Databases"

Bob,

I forwarded your review of Wayner's book to, among others, David Brin. He
sent this reply, asking me to pass it on. Seems to have touched a nerve!

Udhay

>Uday, thanks for sharing this.
>
>Could you submit the following reply?
>
>---
>
>It is particularly dishonest of a so-called reviewer not only to
>misinterpret and misconvey another person's position, but to abuse
>quotation marks in the way Robert Hettinga has done in his review of
>Translucent Databases By Peter Wayner. Openly and publicly, I defy
>Hettinga to find any place where I used the word "trust" in the fashion or
>meaning he attributes to me.
>
>In fact, my argument is diametrically opposite to the one that he portrays
>as mine.  For him to say that 'Brin seems to want, "trust" of state
>force-monopolists... their lawyers and apparatchiks." demonstrates either
>profound laziness - having never read a word I wrote - or else deliberate
>calumny.  In either event, I now openly hold him accountable by calling it
>a damnable lie.  This is not a person to be trusted or listened-to by
>people who value credibility.
>
>Without intending-to, he laid bare one of the 'false dichotomies" that
>trap even bright people into either-or - or zero-sum - kinds of
>thinking.  For example, across the political spectrum, a "Strong Privacy"
>movement claims that liberty and personal privacy are best defended by
>anonymity and encryption, or else by ornate laws restricting what people
>may know. This approach may seem appealing, but there are no historical
>examples of it ever having worked.
>
>Indeed, those mired in these two approaches seem unable to see outside the
>dichotomy.  Hettinga thinks that, because I am skeptical of the right
>wing's passion for cowboy anonymity, that I am therefore automatically an
>advocate of the left wing's prescription of  "privacy through state
>coercive information management'.  Baloney.  A plague on both houses of
>people who seem obsessed with policing what other people are allowed to know.
>
>Strong Privacy advocates bear a severe burden of proof when they claim
>that a world of secrets will protect freedom... even privacy... better
>than what has worked for us so far - general openness.
>
>Indeed, it's a burden of proof that can sometimes be met!  Certainly there
>are circumstances when/where secrecy is the only recourse... in concealing
>the location of shelters for battered wives, for instance, or in fiercely
>defending psychiatric records.  These examples stand at one end of a
>sliding scale whose principal measure is the amount of harm that a piece
>of information might plausibly do, if released in an unfair manner.  At
>the other end of the scale, new technologies seem to make it likely that
>we'll just have to get used to changes in our definition of privacy.  What
>salad dressing you use may be as widely known as what color sweater you
>wear on the street... and just as harmlessly boring.
>
>The important thing to remember is that anyone who claims a right to keep
>something secret is also claiming a right to deny knowledge to
>others.  There is an inherent conflict! Some kind of criterion must be
>used to adjudicate this tradeoff and most sensible people seem to agree
>that this criterion should be real or plausible harm... not simply whether
>or not somebody likes to keep personal data secret.
>
>
>The modern debate over information, and who controls it, must begin with a
>paradox.
>
>(1) Each of us understands that knowledge can be power. We want to know as
>much as possible about people or groups we see as threatening... and we
>want our opponents to know little about us. Each of us would prescribe
>armor for "the good guys" and nakedness for our worst foes.
>
>(2) Criticism is the best antidote to error. Yet most people, especially
>the mighty, try  to avoid it. Leaders of past civilizations evaded
>criticism by crushing free speech and public access to information. This
>sometimes helped them stay in power... but it also generally resulted in
>horrific blunders in statecraft.
>
>(3) Ours may be the first civilization to systematically avoid this cycle,
>whose roots lie in human nature. We have learned that few people are
>mature enough to hold themselves accountable. But in an open society where
>criticism flows, adversaries eagerly pounce on each others' errors.  We do
>each other the favor of reciprocal criticism (though it seldom personally
>feels like a favor!)
>
>
>Four great social innovations foster our unprecedented wealth and freedom:
>science, justice, democracy & free markets.  Each of these "accountability
>arenas" functions best when all players

Ross TCPA paper

2002-06-24 Thread Larry J. Blunk

   For those who question the use of the TCPA spec as part of a DRM
system, I refer you to the following article where the author
interviewed Jim Ward of IBM (one of the authors of the TCPA spec) --

http://www.101com.com/solutions/security/article.asp?ArticleID=3266

   In particular, note the following text:

 "The TCPA specifications center on two main areas: trusted reporting and public key
  infrastructure (PKI). The TCPA reporting guidelines create profiles of a machine's
  security settings as the machine boots. Ward says content providers such as
  Bloomberg or Hoover's may take advantage of this feature to ensure users do not
  redistribute content."
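
A rough sketch of the "trusted reporting" idea in that passage: each boot
stage is hashed into a running register (the TPM's extend operation), so
the final value is a profile of everything that ran. The hash choice and
component names below are placeholders, not the actual spec:

import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # New register value = hash(old value || hash(component)).
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

pcr = bytes(32)                      # register starts at zero
for stage in (b"BIOS", b"boot loader", b"OS kernel", b"media player driver"):
    pcr = extend(pcr, stage)

print(pcr.hex())                     # a content provider compares this profile
                                     # against the values it is willing to serve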


  And then there is the DRMOS patent granted to Paul England, et al. of
Microsoft -- http://cryptome.org/ms-drm-os.htm.  Mr. England is also
one of the authors of the TCPA specification.  Note the section
describing the interaction of the DRMOS with the CPU and the
cryptographic/certificate requirements of the CPU.  This is precisely
how the TCPA spec works.  

  The TCPA website avoids mentioning the DRM applications of the spec
precisely because it is so controversial.  However, it's no big
secret that this is clearly one of the intended uses of the spec.


--
Larry J. Blunk
Merit Network, Inc.   Ann Arbor, Michigan




Re: Ross's TCPA paper

2002-06-24 Thread Nomen Nescio

Ross Anderson writes:

> During my investigations into TCPA, I learned that HP has started a
> development program to produce a TCPA-compliant version of GNU/linux.
> I couldn't figure out how they planned to make money out of this. On
> Thursday, at the Open Source Software Economics conference, I figured
> out how they might.
> ...
> The business model, I believe, is this. HP will not dispute that the
> resulting `pruned code' is covered by the GPL. You will be able to
> download it, compile it, check it against the binary, and do what you
> like with it. However, to make it into TCPA-linux, to run it on a
> TCPA-enabled machine in privileged mode, you need more than the code.
> You need a valid signature on the binary, plus a cert to use the TCPA
> PKI. That will cost you money (if not at first, then eventually).

Hmmm. Not clear that this really works to make money.  The GPL
allows everyone to redistribute HP's software verbatim, right?  So a
cert on one copy of the software will work on everyone's.  How can HP
make money on a product that everyone can copy freely, when they can
all share the same cert?

It's true that modified versions of the software would not be able to
use that cert, and it would no doubt be expensive to get a new cert for
the modified software.  But that still gives HP no monopoly on selling
or supporting its own version.  Anyone can step in and do that.

Is the cert itself supposed to be somehow copyrighted?  Kept secret?
Will it be illegal to publish the cert, to share it with someone else?
This seems pretty questionable both in terms of copyright law (since
a cert is a functional component) and in terms of the GPL (which would
arguably cover the cert and forbid restrictively licensing it).

It seems more likely that the real purpose is to bring the benefits of
TCPA to the Linux world.  As an innovator in this technology HP will gain
in reputation and be the source that people turn to for development and
support in this growing area.  The key to making money from open source
is reputation.  Being first makes good economic sense.  You don't need
conspiracy theories.




Re: Ross's TCPA paper

2002-06-24 Thread Mike Rosing

On Mon, 24 Jun 2002, Anonymous wrote:

> The amazing thing about this discussion is that there are two pieces
> of conventional wisdom which people in the cypherpunk/EFF/"freedom"
> communities adhere to, and they are completely contradictory.

Makes for lively conversation, doesn't it? :-)

> Cypherpunks alternate between smug assertions of the first claim and
> panicked wailing about the second.  The important point about both of
> them, from the average cypherpunk's perspective, is that neither leaves
> any room for action.  Both views are completely fatalistic in tone.
> In one, we are assured victory; in the other, defeat.  Neither allows
> for human choice.

A good discussion should alternate.  Certainly it's not the same people.
And both urge the same action - tell your congress critter to butt out!

> This means that whether the Hollings bill passes or not, the situation
> will be exactly the same.  People running in "trusted" mode can prove
> it; but anyone can run untrusted.  Even with the Hollings bill there
> will still be people using untrusted mode.  The legislation would
> not change that.  Therefore the Hollings bill would not increase the
> effectiveness of the TCPA model.  And it follows, then, that Lucky and
> Ross are wrong to claim that this bill is intended to legislate use of
> the TCPA.  The TCPA does not require legislation.

Exactly.  Let the market decide.  This is why it's necessary to
contact your congress critter - they don't need to be involved.

> Lucky, Ross and others who view this as a catastrophe should look at
> the larger picture and reconsider their perspective.  Realize that the
> "trusted" mode of the TCPA will always be only an option, and there
> is no technological, political or economic reason for that to change.
> The TCPA gives people new capabilities without removing any old ones.
> It makes possible a new kind of information processing that cannot be
> accomplished in today's world.  It lets people make binding promises that
> are impossible today.  It makes the world a more flexible place, with
> more opportunities and options.  Somehow that doesn't sound all that bad.

As long as it's not legislated, nobody needs to worry about what
gets fabbed.  The market will decide if DRM makes any economic sense.
I'm betting it doesn't, but I've been wrong before.  Untrusted
platforms will be cheaper than trusted ones, so there has to be some
incentive for customers to buy them.  Economic incentives make far
more sense than legislated ones.

The main point is not the content of the bill, or its purpose.  The
main point is that government is being told to get involved in the market
place, and that, all by itself, is a *bad* idea.  If people want to
build trusted platforms and put them on the market they can go ahead
and do it.  If people don't want to buy them, that's their choice,
and if others do decide it's worth it, they should be allowed to.

As long as TCPA is really an option, the market place is a good way
to sort things out.  But S.2048 needs to die, not for scary reasons,
but just because there's no reason for it in the first place.

Patience, persistence, truth,
Dr. mike




Re: Ross's TCPA paper

2002-06-24 Thread Pete Chown

Anonymous wrote:

> Furthermore, inherent to the TCPA concept is that the chip can in
> effect be turned off.  No one proposes to forbid you from booting a
> non-compliant OS or including non-compliant drivers.

Good point.  At least I hope they don't. :-)

> There is not even social opprobrium; look at how eager
> everyone was to look the other way on the question of whether the DeCSS
> reverse engineering violated the click-through agreement.

Perhaps it did, but the licence agreement was unenforceable.  It's
clearly reverse engineering for interoperability (between Linux and DVD
players) so the legal exemption applies.  You can't escape the exemption
by contract.  Now, you might say that morally he should obey the
agreement he made.  My view is that there is a reason why this type of
contract is unenforceable; you might as well take advantage of the
exemption.

The prosecution was on some nonsense charge that amounted to him
burgling his own house.  A statute that was meant to penalise computer
break-ins was used against someone who owned the computer that he broke
into.

> The TCPA allows you to do something that you can't do today: run your
> system in a way which convinces the other guy that you will honor your
> promises, that you will guard his content as he requires in exchange for
> his providing it to you.

Right, but it has an odd effect too.  No legal system gives people
complete freedom to contract.  Suppose you really, really want to exempt
a shop from liability if your new toaster explodes.  You can't do it;
the legal system does not give you the freedom to contract in that way.

DRM, however, gives people complete freedom to make contracts about how
they will deal with digital content.  Under EU single market rules, a
contract term to the effect that you could pass on your content to
someone in the UK but not the rest of the EU is unenforceable.  No
problem for DRM though...

I think lawyers will hate this.

-- 
Pete




RE: Ross's TCPA paper

2002-06-24 Thread Lucky Green

Pete Chown wrote quoting Ross:
> > You need a valid signature on the binary, plus a cert to use the TCPA
> > PKI. That will cost you money (if not at first, then eventually).
> 
> I think it would be a breach of the GPL to stop people 
> redistributing the signature: "You must cause any work that 
> you distribute or publish, that in whole or in part contains 
> or is derived from the Program or any part thereof, to be 
> licensed as a whole at no charge to all third parties under 
> the terms of this License."

The application or OS vendor can confidently distribute not just the
code, but also the signature and cert. In fact, the application
vendor can distribute absolutely everything they have access to
themselves and you still won't be able to run the application in trusted
mode.

The cert that enables an application to run in trusted mode is tied to a
specific TPM and therefore to a specific motherboard. For this cert to
work on another motherboard without a new and different cert, the
software vendor would need to extract the 2048-bit secret RSA key [1]
from their own motherboard's TPM, make the secret key available for
download, followed by the customer importing the key into their own TPM.
The TPM, for obvious reasons, offers no facilities to export or import
the TPM's internal keys.
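
A minimal sketch of that binding: the cert covers both the application
hash and the identity of one particular TPM, so copying the cert (or the
whole application) to a board with a different TPM buys you nothing. The
CA signature is simulated with an HMAC and all names are illustrative:

import hashlib, hmac, secrets

CA_KEY = secrets.token_bytes(32)     # held by the cert issuer

def issue_cert(app: bytes, tpm_id: bytes) -> bytes:
    # Cert binds the application hash to one specific TPM identity.
    return hmac.new(CA_KEY, hashlib.sha256(app).digest() + tpm_id,
                    hashlib.sha256).digest()

def platform_accepts(app: bytes, local_tpm_id: bytes, cert: bytes) -> bool:
    expected = hmac.new(CA_KEY, hashlib.sha256(app).digest() + local_tpm_id,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, cert)

app = b"some application binary"
tpm_a, tpm_b = secrets.token_bytes(20), secrets.token_bytes(20)
cert_for_a = issue_cert(app, tpm_a)

print(platform_accepts(app, tpm_a, cert_for_a))   # True on the original board
print(platform_accepts(app, tpm_b, cert_for_a))   # False anywhere else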

The GPL cannot possibly require a software author to distribute a
hardware crack with their software or be in violation of the GPL.
Distributing a crack for TPMs is distributing an infringement device
and as such is illegal under US law. Even if the GPL were to be modified
to mandate what is technically near impossible for a software vendor to
achieve, even this layperson knows that contracts that require illegal
acts are unenforceable. Note that I am not referring to acts that might
be illegal in the future under the Hollings bill. Doing the above is
illegal today.

The GPL might be modified to require that the application vendor do
whatever is necessary for a user to utilize an application in the way
the user deems fit (i.e. in privileged mode), but that would put the GPL
into very dangerous, and I believe thoroughly undesirable, territory.
With such modifications, the hypothetical new GPL would mandate, to use
Richard Stallman's terminology, not just freedom of speech, but free
beer as well. That has never been the intent of the GPL.

Furthermore, the certs required to run the OS or application will in many
cases be issued by a party other than the application author or vendor.
To continue using Richard's terminology, to cover this case the GPL
would need to be rewritten to mandate that a third-party provide the
free beer.

I will leave it to the attorneys on this list to elucidate on the legal
deficiencies of such a hypothetical contract. Since I am not an attorney,
I will simply state that I sincerely doubt such a contract would hold up
in litigation.

Of course I do not believe the FSF would make such changes. Which gets
us back to Ross's point that the TCPA threatens the core of the GPL,
from which this discussion started. For completeness I would like to
state that I have no personal stake in the continued enforceability of
the GPL, being a long-time supporter of the BSD licensing scheme myself.

[1] 1024-bit RSA keys were rejected during the design phase of the TPM
by members of the TCPA, which, as Anonymous pointed out in a previous
post, contains several well-known crypto companies. The TCPA's website,
which only makes specs, but not design documents, available to the
public, unfortunately does not provide any documentation of the reasoning
that led to this decision.

--Lucky Green




Re: Fwd: Re: Fwd: Book Review: Peter Wayner's "Translucent Databases"

2002-06-24 Thread Peter Wayner

I think Bob made some great points about my book, but it's clear that 
this debate
is revolving around a few sentences in Bob's review. Perhaps he miscategorizes
Brin, perhaps he doesn't. I haven't read _Transparent Society_ in some time.

Still, it's important to realize that this isn't just a battle 
between the state
and its citizens. Encryption can provide a practical tool and a great option
for the data management engineers. Brin has a good point about the value
of openness, but I'm sure he doesn't extend it to things like people's credit
card numbers. Brin would probably be interested in the book and the way
it leaves some things in the clear. It's all about translucency, 
which is, after
all, partially transparent. The glass is half empty or full. So maybe there's
something in common here?

The right use of encryption (and any anonymity that comes along with it) can
protect businesses, customers, clients, employees and others. I'm sure it
might also be used by a few elites to avoid scrutiny, but that doesn't have
to be the case.




For me, the mathematics of on-line anonymity are essential parts of 
on-line security. While I think that there are plenty of personal and 
emotional reasons to embrace anonymity, one of the best is the higher 
amount of security the systems offer. Simply put, identity-based 
systems are more fragile because identity theft is so easy. Systems 
designed for anonymity avoid that weakness because they're designed, 
a priori, to work without names. So I think they're just bound to be 
a bit safer.

It should be noted that the anonymous techniques developed by Chaum, 
Brands and others do not have to be used to avoid scrutiny. You can 
always tack on your true name in an additional field. To me, the 
systems just avoid relying on the name field to keep people 
honest.

I'm glad Bob sees the resonance between _Translucent Databases_ and 
the world of cypherpunk paranoia, but I would like to avoid a strong 
connection. It's not that there's no relationship. There is. But the 
book is meant to be much more practical. It explores how to use the 
right amount of encryption to lock up the personal stuff in a 
database without scrambling all of it. In the right situations, the 
results can be fast, efficient, and very secure. So the techniques 
are good for the paranoids as well as the apolitical DBAs who just 
want to do a good job.
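
A minimal sketch of the translucent idea (my own illustration, not code
from the book): the sensitive column is stored only as a salted hash, the
working data stays in the clear, and lookups hash the identifier the same
way, so a break-in yields order data but no names:

import hashlib, sqlite3

SALT = b"per-deployment salt"

def opaque(name: str) -> str:
    # One-way token for the sensitive identifier; the clear name is never stored.
    return hashlib.sha256(SALT + name.encode()).hexdigest()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_token TEXT, item TEXT, total REAL)")
db.execute("INSERT INTO orders VALUES (?, ?, ?)",
           (opaque("Alice Example"), "book", 29.95))

# The application can still find Alice's orders without ever storing her name:
rows = db.execute("SELECT item, total FROM orders WHERE customer_token = ?",
                  (opaque("Alice Example"),)).fetchall()
print(rows)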









>  >
>>It is particularly dishonest of a so-called reviewer not only to
>>misinterpret and misconvey another person's position, but to abuse
>>quotation marks in the way Robert Hettinga has done in his review of
>>Translucent Databases By Peter Wayner. Openly and publicly, I defy
>>Hettinga to find any place where I used the word "trust" in the fashion or
>>meaning he attributes to me.
>>
>>In fact, my argument is diametrically opposite to the one that he portrays
>>as mine.  For him to say that 'Brin seems to want, "trust" of state
>>force-monopolists... their lawyers and apparatchiks." demonstrates either
>>profound laziness - having never read a word I wrote - or else deliberate
>>calumny.  In either event, I now openly hold him accountable by calling it
>>a damnable lie.  This is not a person to be trusted or listened-to by
>>people who value credibility.
>>
>>Without intending-to, he laid bare one of the 'false dichotomies" that
>>trap even bright people into either-or - or zero-sum - kinds of
>>thinking.  For example, across the political spectrum, a "Strong Privacy"
>>movement claims that liberty and personal privacy are best defended by
>>anonymity and encryption, or else by ornate laws restricting what people
>>may know. This approach may seem appealing, but there are no historical
>>examples of it ever having worked.
>>
>>INdeed, those mired in these two approaches seem unable to see outside the
>>dichotomy.  Hettinga thinks that, because I am skeptical of the right
>>wing's passion for cowboy anonymity, that I am therefore automatically an
>  >advocate of the left wing's prescription of  "privacy through state
>>coercive information management'.  Baloney.  A plague on both houses of
>>people who seem obsessed with policing what other people are allowed to know.
>>
>>Strong Privacy advocates bears a severe burden of proof when they claim
>  >that a world of secrets will protect freedom... even privacy... better
>  >than what has worked for us so far - general openness.
>  >
>  >Indeed, it's a burden of proof that can sometimes be met!  Certainly there
>>are circumstances when/where secrecy is the only recourse... in concealing
>>the location of shelters for battered wives, for instance, or in fiercely
>>defending psychiatric records.  These examples stand at one end of a
>>sliding scale whose principal measure is the amount of harm that a piece
>>of information might plausibly do, if released in an unfair manner.  At
>>the other end of the scale, new technologies seem to make it likely that
>>we'll just have to get used to cha

Re: Fwd: Re: Fwd: Book Review: Peter Wayner's "Translucent Databases"

2002-06-24 Thread R. A. Hettinga

--- begin forwarded text


Status:  U
Date: Mon, 24 Jun 2002 17:02:52 -0400
To: "R. A. Hettinga" <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
From: Peter Wayner <[EMAIL PROTECTED]>
Subject: Re: Fwd: Re: Fwd: Book Review: Peter Wayner's
 "Translucent  Databases"

>

I think Bob made some great points about my book, but it's clear that
this debate
is revolving around a few sentences in Bob's review. Perhaps he miscategorizes
Brin, perhaps he doesn't. I haven't read _Transparent Society_ in some time.

Still, it's important to realize that this isn't just a battle
between the state
and its citizens. Encryption can provide a practical tool and a great option
for the data management engineers. Brin has a good point about the value
of openness, but I'm sure he doesn't extend it to things like people's credit
card numbers. Brin would probably be interested in the book and the way
it leaves some things in the clear. It's all about translucency,
which is, after
all, partially transparent. The glass is half empty or full. So maybe there's
something in common here?

The right use of encryption (and any anonymity that comes along with it) can
protect businesses, customers, clients, employees and others. I'm sure it
might also be used by a few elites to avoid scrutiny, but that doesn't have
to be the case.




For me, the mathematics of on-line anonymity are essential parts of
on-line security. While I think that there are plenty of personal and
emotional reasons to embrace anonymity, one of the best is the higher
amount of security the systems offer. Simply put, identity-based
systems are more fragile because identity theft is so easy. Systems
designed for anonymity avoid that weakness because they're designed,
a priori, to work without names. So I think they're just bound to be
a bit safer.

It should be noted that the anonymous techniques developed by Chaum,
Brands and others do not have to be used to avoid scrutiny. You can
always tack on your true name in an additional field. To me, the
systems just avoid relying on the name field to keep people
honest.

I'm glad Bob sees the resonance between _Translucent Databases_ and
the world of cypherpunk paranoia, but I would like to avoid a strong
connection. It's not that there's no relationship. There is. But the
book is meant to be much more practical. It explores how to use the
right amount of encryption to lock up the personal stuff in a
database without scrambling all of it. In the right situations, the
results can be fast, efficient, and very secure. So the techniques
are good for the paranoids as well as the apolitical DBAs who just
want to do a good job.









>  >
>>It is particularly dishonest of a so-called reviewer not only to
>>misinterpret and misconvey another person's position, but to abuse
>>quotation marks in the way Robert Hettinga has done in his review of
>>Translucent Databases By Peter Wayner. Openly and publicly, I defy
>>Hettinga to find any place where I used the word "trust" in the fashion or
>>meaning he attributes to me.
>>
>>In fact, my argument is diametrically opposite to the one that he portrays
>>as mine.  For him to say that 'Brin seems to want, "trust" of state
>>force-monopolists... their lawyers and apparatchiks." demonstrates either
>>profound laziness - having never read a word I wrote - or else deliberate
>>calumny.  In either event, I now openly hold him accountable by calling it
>>a damnable lie.  This is not a person to be trusted or listened-to by
>>people who value credibility.
>>
>>Without intending-to, he laid bare one of the 'false dichotomies" that
>>trap even bright people into either-or - or zero-sum - kinds of
>>thinking.  For example, across the political spectrum, a "Strong Privacy"
>>movement claims that liberty and personal privacy are best defended by
>>anonymity and encryption, or else by ornate laws restricting what people
>>may know. This approach may seem appealing, but there are no historical
>>examples of it ever having worked.
>>
>>INdeed, those mired in these two approaches seem unable to see outside the
>>dichotomy.  Hettinga thinks that, because I am skeptical of the right
>>wing's passion for cowboy anonymity, that I am therefore automatically an
>  >advocate of the left wing's prescription of  "privacy through state
>>coercive information management'.  Baloney.  A plague on both houses of
>>people who seem obsessed with policing what other people are allowed to know.
>>
>>Strong Privacy advocates bears a severe burden of proof when they claim
>  >that a world of secrets will protect freedom... even privacy... better
>  >than what has worked for us so far - general openness.
>  >
>  >Indeed, it's a burden of proof that can sometimes be met!  Certainly there
>>are circumstances when/where secrecy is the only recourse... in concealing
>>the location of shelters for battered wives, for instance, or in fiercely
>>defending psychiatric records.  These examples stand at one end of a
>>sliding scale wh