Re: Schneier on Palladium and the TCPA

2002-08-17 Thread Anonymous

Bruce Schneier wrote about Palladium:

> Basically, Pd is Microsoft's attempt to build a trusted computer, much as I
> discussed the concept in "Secrets and Lies" (pages 127-130; read it for
> background).

Actually his discussion in the book is about traditional "secure OS"
concepts such as Multics.  Trusted computing attempts to go considerably
beyond this.

> The idea is that different users on the system have
> limitations on their abilities, and are walled off from each other.

That was the idea for secure OS's.  For trusted computing, the idea is
more that you can place trust in an application running on a remote
system: that it is what it claims to be, and that it has a certain
degree of immunity from being compromised.

> Pd provides protection against two broad classes of attacks.  Automatic
> software attacks (viruses, Trojans, network-mounted exploits) are contained
> because an exploited flaw in one part of the system can't affect the rest
> of the system.  And local software-based attacks (e.g., using debuggers to
> pry things open) are protected because of the separation between parts of
> the system.

It's interesting that Bruce sees it in terms of attacks like this.  As he
is now in the managed security business, it makes sense that he would
look at Palladium in terms of how much security it can add to a system.

As far as viruses and such, the protection Palladium offers would seem to
be that if you load a trusted component, and it has been infected by a
virus since the last time you ran it, its hash will change.  This means
that it will no longer be able to access sealed data - it won't be able
to get into the "virtual vault" because it is no longer the same program.
Likewise it would not be able to participate in any trusted networking
because the fact of its compromise would be remotely observable (due to
the hash change).
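
The remote side of this can be sketched in a few lines of Python.  This is a toy model, not the Palladium design: the key names and the use of an HMAC in place of a real signature are my own inventions, but it shows how a hash change makes compromise remotely observable.

```python
import hashlib
import hmac

# Hypothetical secret held inside the tamper-resistant chip and never
# exposed to software; a real design would use a signing key instead.
ATTESTATION_KEY = b"chip-held attestation secret"

def quote(program: bytes, nonce: bytes) -> bytes:
    """The chip 'quotes' the hash (measurement) of the running program,
    bound to a fresh nonce from the remote verifier."""
    measurement = hashlib.sha256(program).digest()
    return hmac.new(ATTESTATION_KEY, measurement + nonce,
                    hashlib.sha256).digest()

def verify(expected_program: bytes, nonce: bytes, q: bytes) -> bool:
    """The remote party recomputes the quote for the program it expects
    and compares.  Any change to the binary changes the hash, so an
    infected program cannot produce a matching quote."""
    expected = hashlib.sha256(expected_program).digest()
    good = hmac.new(ATTESTATION_KEY, expected + nonce,
                    hashlib.sha256).digest()
    return hmac.compare_digest(good, q)

clean = b"banking-app-v1"
nonce = b"fresh-challenge"
assert verify(clean, nonce, quote(clean, nonce))
assert not verify(clean, nonce, quote(clean + b"+virus", nonce))
```

The nonce matters: without it, an attacker could replay an old quote taken before the infection.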

This is not an all-purpose defense against viruses and such; it would
be restricted to the "trusted" parts of applications and it would only
work specifically with sealed data and trusted networking.  But for some
purposes it could be quite useful.  Imagine a banking app which keeps
your account access info sealed in a virtual vault; then no other app
can get to the data, so you are immune to virus attacks elsewhere in
the system; and if even the banking app itself is compromised, it will
no longer be able to get into its own vault.
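
The "virtual vault" behavior can be modeled the same way.  Again a toy sketch with invented names, not any real Palladium API: sealed data is filed under the hash of the sealing program, so only a byte-identical program can read it back.

```python
import hashlib

class VirtualVault:
    """Toy model of sealed storage: data is indexed by the hash of the
    program that sealed it, and unsealing requires the same hash."""

    def __init__(self):
        self._store = {}

    def seal(self, program: bytes, data: bytes) -> None:
        self._store[hashlib.sha256(program).digest()] = data

    def unseal(self, program: bytes) -> bytes:
        key = hashlib.sha256(program).digest()
        if key not in self._store:
            raise PermissionError("program hash does not match sealer")
        return self._store[key]

vault = VirtualVault()
bank_v1 = b"banking-app binary"
vault.seal(bank_v1, b"account access info")
assert vault.unseal(bank_v1) == b"account access info"

# An infected copy hashes differently, so it is locked out of the
# vault it itself created before the infection.
try:
    vault.unseal(bank_v1 + b" + virus payload")
except PermissionError:
    pass
```

A real implementation would encrypt the data under a key derived from the machine secret and the program hash rather than keep a plaintext table, but the access rule is the same.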

> There are security features that tie programs and data to CPU and to user,
> and encrypt them for privacy.  This is probably necessary to make Pd work,
> but has a side-effect that I'm sure Microsoft is thrilled with.  Like books
> and furniture and clothing, the person who currently buys new software can
> resell it when he's done with it.  People have a right to do this -- it's
> called the "First Sale Doctrine" in the United States -- but the software
> industry has long claimed that software is not sold, but licensed, and
> cannot be transferred.  When someone sells a Pd-equipped computer, he is
> likely to clear his keys so that his identity can't be used or files can't
> be read.  This will also serve to erase all the software he purchased.  The
> end result might be that people won't be able to resell software, even if
> they wanted to.

This is a pretty far-fetched scenario, for several reasons.  First,
according to Peter Biddle, Palladium is designed to protect content and
not programs.  Sure, maybe you don't believe him, but at least he's on
record as saying it.  And what is known of the Palladium architecture
is consistent with his claim.  The limited architectural diagrams in
the Palladium white paper don't show any mechanism for locking code to
a computer.

But there are other problems with Bruce's scenario.  It assumes
(apparently) that you aren't copying your programs to your replacement
computer when you get rid of the old one!  That doesn't make sense.
You have an investment of hundreds or thousands of dollars in software.
You'll want to copy it over, and certainly Palladium will allow that.

So what's his objection in that case: that you can't sell an illegal
copy of your old software once you've installed it on the new system?
What's the "First Sale Doctrine" got to do with that?  It doesn't allow
for you to both keep a copy of your software and to sell it.  If he's
objecting that Palladium won't let you break the law in some ways you can
today, let him say so openly.  But as it is he is claiming that Palladium
will compromise the First Sale Doctrine, and that interpretation doesn't
hold water.

It's also not at all clear why you would want to wipe your keys like
this.  It should be enough to just delete your data files from the disk.
It's not like the trusted computing chip will hold kilobytes of sensitive
personal data.  All it has is a few keys, so if you get rid of the
data, the keys don't matter.  And then, how different is that from
what you do today?  If you sell an old computer, you should clear out
the sensitive data files.

Schneier on Palladium and the TCPA (was Re: CRYPTO-GRAM, August 15, 2002)

2002-08-15 Thread R. A. Hettinga

At 3:53 PM -0500 on 8/15/02, Bruce Schneier wrote:


>  Palladium and the TCPA
>
> There's been more written about Microsoft's Palladium security initiative
> than about anything else in computer security in a very long time.  My URL
> list of comments, analysis, and opinions goes on for quite a while.  Which
> is interesting, because we really don't know anything about the details of
> what it is or how it works.  Much of this is based on reading between the
> lines in the various news reports, conversations I've had with Microsoft
> people (none of them under NDA), and conversations with people who've had
> conversations.  But since I don't know anything for sure, all of this could
> be wrong.
>
> Palladium (like chemists, Microsoft calls it "Pd" for short) is Microsoft's
> implementation of the TCPA spec, sort of.  ("Sort of" depends on who you
> ask.  Some say it's related.  Some say they do similar things, but are
> unrelated.  Some say that Pd is, in fact, Microsoft's attempt to preempt
> the TCPA spec.)  TCPA is the Trusted Computing Platform Alliance, an
> organization with just under 200 corporate members (an impressive list,
> actually) trying to build a trusted computer.  The TCPA 1.1 spec has been
> published, and you can obtain the 1.2 spec under NDA.  Pd doesn't follow
> the spec exactly, but it's along those lines, sort of.
>
> Pd has been in development for a long time, since at least 1997.  The best
> technical description is the summary of a meeting with Microsoft engineers
> by Seth Schoen of the EFF (URL below).  I'm not going to discuss the
> details, because systems with an initial version of Pd aren't going to ship
> until 2004 -- at least -- and the details are all likely to change.
>
> Basically, Pd is Microsoft's attempt to build a trusted computer, much as I
> discussed the concept in "Secrets and Lies" (pages 127-130; read it for
> background).  The idea is that different users on the system have
> limitations on their abilities, and are walled off from each other.  This
> is impossible to achieve using only software; and Pd is a combination
> hardware/software system.  In fact, Pd affects the CPU, the chip set on the
> motherboard, the input devices (keyboard, mouse, etc.), and the video
> output devices (graphics processor, etc.).  Additionally, a new chip is
> required: a tamper-resistant secure processor.
>
> Microsoft readily acknowledges that Pd will not be secure against hardware
> attacks.  They spend some effort making the secure processor annoying to
> pry secrets out of, but not a whole lot of effort.  They assume that the
> tamper-resistance will be defeated.  It is their intention to design the
> system so that hardware attacks do not result in class breaks: that
> breaking one machine doesn't help you break any others.
>
> Pd provides protection against two broad classes of attacks.  Automatic
> software attacks (viruses, Trojans, network-mounted exploits) are contained
> because an exploited flaw in one part of the system can't affect the rest
> of the system.  And local software-based attacks (e.g., using debuggers to
> pry things open) are protected because of the separation between parts of
> the system.
>
> There are security features that tie programs and data to CPU and to user,
> and encrypt them for privacy.  This is probably necessary to make Pd work,
> but has a side-effect that I'm sure Microsoft is thrilled with.  Like books
> and furniture and clothing, the person who currently buys new software can
> resell it when he's done with it.  People have a right to do this -- it's
> called the "First Sale Doctrine" in the United States -- but the software
> industry has long claimed that software is not sold, but licensed, and
> cannot be transferred.  When someone sells a Pd-equipped computer, he is
> likely to clear his keys so that his identity can't be used or files can't
> be read.  This will also serve to erase all the software he purchased.  The
> end result might be that people won't be able to resell software, even if
> they wanted to.
>
> Pd is inexorably tied up with Digital Rights Management.  Your computer
> will have several partitions, each of which will be able to read and write
> its own data.  There's nothing in Pd that prevents someone else (MPAA,
> Disney, Microsoft, your boss) from setting up a partition on your computer
> and putting stuff there that you can't get at.  Microsoft has repeatedly
> said that they are not going to mandate DRM, or try to control DRM systems,
> but clearly Pd was designed with DRM in mind.
>
> There seem to be good privacy controls, over and above what I would have
> expected.  And Microsoft has claimed that they will make the core code
> public, so that it can be reviewed and evaluated.  It's about time they
> realized that lots of people are willing to do their security work for free.
>
> It's hard to sort out the antitrust implications of Pd.  Lots of people
> have written about it.  Will