Bruce Schneier wrote about Palladium:

> Basically, Pd is Microsoft's attempt to build a trusted computer, much as I
> discussed the concept in "Secrets and Lies" (pages 127-130; read it for
> background).

Actually, his discussion in the book is about traditional "secure OS"
concepts such as Multics.  Trusted computing attempts to go considerably
beyond this.

> The idea is that different users on the system have
> limitations on their abilities, and are walled off from each other.

That was the idea for secure OS's.  For trusted computing, the idea is
rather that you can have trust in an application running on a remote
system: that it is what it claims to be, and that it has a certain
degree of immunity from being compromised.
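The "it is what it claims" part rests on remote attestation: the trusted
hardware reports a hash (measurement) of the running code in a form a remote
party can check against a known-good value.  A toy Python sketch follows; all
names are made up, and a shared MAC key stands in for what would really be a
public-key signature under a certified chip key:

```python
import hashlib
import hmac

# Illustrative stand-in for the chip's attestation key.  A real design
# would use a public-key signature with a certificate on the chip's key,
# not a secret shared with the verifier.
CHIP_KEY = b"secret-shared-with-verifier"

def attest(program: bytes) -> tuple[str, str]:
    """The chip measures the running code and authenticates the measurement."""
    measurement = hashlib.sha256(program).hexdigest()
    quote = hmac.new(CHIP_KEY, measurement.encode(), hashlib.sha256).hexdigest()
    return measurement, quote

def verify(measurement: str, quote: str, expected: str) -> bool:
    """Remote party checks the quote is genuine AND matches the code it expects."""
    genuine = hmac.compare_digest(
        quote,
        hmac.new(CHIP_KEY, measurement.encode(), hashlib.sha256).hexdigest())
    return genuine and measurement == expected
```

The point is that the verifier learns two separate things: the measurement
was produced by real trusted hardware, and the measured code is the code it
expected.  A compromised program fails the second check even though the
hardware honestly attests to it.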

> Pd provides protection against two broad classes of attacks.  Automatic
> software attacks (viruses, Trojans, network-mounted exploits) are contained
> because an exploited flaw in one part of the system can't affect the rest
> of the system.  And local software-based attacks (e.g., using debuggers to
> pry things open) are protected because of the separation between parts of
> the system.

It's interesting that Bruce sees it in terms of attacks like this.  As he
is now in the managed security business, it makes sense that he would
look at Palladium in terms of how much security it can add to a system.

As for viruses and such, the protection Palladium offers would seem to
be this: if you load a trusted component and it has been infected by a
virus since the last time you ran it, its hash will change.  This means
that it will no longer be able to access sealed data - it won't be able
to get into the "virtual vault" because it is no longer the same program.
Likewise it would not be able to participate in any trusted networking
because the fact of its compromise would be remotely observable (due to
the hash change).

This is not an all-purpose defense against viruses and such; it would
be restricted to the "trusted" parts of applications and it would only
work specifically with sealed data and trusted networking.  But for some
purposes it could be quite useful.  Imagine a banking app which keeps
your account access info sealed in a virtual vault; then no other app
can get to the data, so you are immune to virus attacks elsewhere in
the system; and even if the banking app itself is compromised, it will
no longer be able to get into its own vault.
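The "virtual vault" behavior can be sketched as sealed storage keyed to the
program's hash: the chip derives the storage key from its own secret plus the
measurement of the requesting code, so a modified program simply cannot open
the vault.  This is a toy model, not the actual Palladium protocol (which the
white paper does not specify); every name here is illustrative:

```python
import hashlib
import hmac

def measure(program: bytes) -> str:
    """A program's 'measurement' is just its hash."""
    return hashlib.sha256(program).hexdigest()

class SealedVault:
    """Toy model of hash-keyed sealed storage."""

    def __init__(self, chip_secret: bytes):
        self._chip_secret = chip_secret  # never leaves the chip

    def _key(self, measurement: str) -> bytes:
        return hmac.new(self._chip_secret, measurement.encode(),
                        hashlib.sha256).digest()

    def seal(self, program: bytes, data: bytes) -> tuple[str, bytes]:
        """Bind short data to the current measurement of `program`."""
        m = measure(program)
        key = self._key(m)
        ciphertext = bytes(b ^ k for b, k in zip(data, key))  # toy cipher
        return m, ciphertext

    def unseal(self, program: bytes, sealed_to: str, ciphertext: bytes) -> bytes:
        """Release plaintext only if the program still hashes the same."""
        if measure(program) != sealed_to:
            raise PermissionError("measurement changed - vault stays shut")
        key = self._key(sealed_to)
        return bytes(b ^ k for b, k in zip(ciphertext, key))
```

In the banking example: the app seals its account credentials, and a
virus-infected copy of the same app (different hash) gets a
`PermissionError` instead of the plaintext.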

> There are security features that tie programs and data to CPU and to user,
> and encrypt them for privacy.  This is probably necessary to make Pd work,
> but has a side-effect that I'm sure Microsoft is thrilled with.  Like books
> and furniture and clothing, the person who currently buys new software can
> resell it when he's done with it.  People have a right to do this -- it's
> called the "First Sale Doctrine" in the United States -- but the software
> industry has long claimed that software is not sold, but licensed, and
> cannot be transferred.  When someone sells a Pd-equipped computer, he is
> likely to clear his keys so that his identity can't be used or files can't
> be read.  This will also serve to erase all the software he purchased.  The
> end result might be that people won't be able to resell software, even if
> they wanted to.

This is a pretty far-fetched scenario, for several reasons.  First,
according to Peter Biddle, Palladium is designed to protect content and
not programs.  Sure, maybe you don't believe him, but at least he's on
record as saying it.  And what is known of the Palladium architecture
is consistent with his claim.  The limited architectural diagrams in
the Palladium white paper don't show any mechanism for locking code to
a computer.

But there are other problems with Bruce's scenario.  It assumes
(apparently) that you aren't copying your programs to your replacement
computer when you get rid of the old one!  That doesn't make sense.
You have an investment of hundreds or thousands of dollars in software.
You'll want to copy it over, and certainly Palladium will allow that.

So what's his objection in that case: that you can't sell an illegal
copy of your old software once you've installed it on the new system?
What's the "First Sale Doctrine" got to do with that?  It doesn't allow
you to both keep a copy of your software and sell it.  If he's
objecting that Palladium won't let you break the law in some ways you can
today, let him say so openly.  But as it is he is claiming that Palladium
will compromise the First Sale Doctrine, and that interpretation doesn't
hold water.

It's also not at all clear why you would want to wipe your keys like
this.  It should be enough to just delete your data files from the disk.
It's not like the trusted computing chip will hold kilobytes of sensitive
personal data.  All it has is a few keys, so if you get rid of the
data, the keys don't matter.  And then, how different is that from
what you do today?  If you sell an old computer, you should clear out
the sensitive data files, even if you leave the applications in place.
There is no reason why Palladium would be any different.

> Pd is inexorably tied up with Digital Rights Management.  Your computer
> will have several partitions, each of which will be able to read and write
> its own data.  There's nothing in Pd that prevents someone else (MPAA,
> Disney, Microsoft, your boss) from setting up a partition on your computer
> and putting stuff there that you can't get at.  Microsoft has repeatedly
> said that they are not going to mandate DRM, or try to control DRM systems,
> but clearly Pd was designed with DRM in mind.

Everyone makes this last point, and maybe it's true.  But at the same
time it's worth noting that Pd does more than is necessary for DRM - and
in fact it is not optimal for DRM.  The fact that Pd is open and useful
for a wide range of other applications is one piece of evidence.  We have
even discussed a Palladiumized Napster (PDster?) which could undercut the
interests of the content companies.

Microsoft didn't have to make Palladium an open system; they could have
kept control over the keys, and required that only signed apps could run
as trusted (as most people still appear to believe; see the discussion
today on slashdot).  Maybe you can argue that Microsoft felt forced to do
an open system just for public relations reasons, that they knew they'd
take too much heat if they produced the closed system they hypothetically
wanted.  Whatever their reasons, the fact is that Pd is a lot more open
than is optimal for DRM, and people should recognize that fact.

> It's hard to sort out the antitrust implications of Pd.  Lots of people
> have written about it.  Will Microsoft jigger Pd to prevent Linux from
> running?  They don't dare.

This piece of sanity is a breath of fresh air.  If only Ross Anderson
and Lucky Green and most of the cypherpunks had a similarly sound grip
on reality, the discussion of these technologies would have been greatly
improved.

> Will it take standard Internet protocols and
> replace them with Microsoft-proprietary protocols?  I don't think so.  Will
> you need a Pd-enabled device -- the system is meant for both
> general-purpose computers and specialized media devices -- in order to view
> copyrighted content?  More likely.  Will Microsoft enforce its Pd patents
> as strongly as it can?  Almost certainly.

Right, I think one of the big issues is whether Microsoft's patents cover
Palladium and TCPA, and whether it will even be possible to make a Linux
version of a trusted computing system.  As I have written before, in some
ways Linux is a much better platform for trusted computing than Windows
(due to its transparency, which matters all the more now that apps can cloak
themselves from users).  But if Microsoft patents block such an effort,
that could be a serious problem.  It is encouraging that HP and perhaps
IBM are going forward with a TCPA-enabled Linux; that suggests that the
Microsoft patents don't cover at least that specific architecture.

> 1.  A "trusted" computer does not mean a computer that is trustworthy.  The
> DoD's definition of a trusted system is one that can break your security
> policy; i.e., a system that you are forced to trust because you have no
> choice.  Pd will have trusted features; the jury is still out as to whether
> or not they are trustworthy.

Ross Anderson makes a similar point, but it is quite misleading.
It implies that trusted computing is in some sense weaker than ordinary
computing because it requires you to trust more systems.  But it misses
the point: trusted computing for the first time gives you grounds
to trust remote systems.  That's what's really new here, the ability to
have some foundation for trust in what a remote system is doing.  And so
I think the word trust is very appropriate here, and it carries its
usual connotations and meaning.  No one is "forced" to trust anything.
Trusted computing will make it more reasonable for people to choose to
trust remote systems.

> 2.  When you think about a secure computer, the first question you should
> ask is: "Secure for whom?"  Microsoft has said that Pd allows the
> computer-owner to prevent others from putting their own secure areas on the
> computer.  But really, what is the likelihood of that really
> happening?  The NSA will be able to buy Pd-enabled computers and secure
> them from all outside influence.  I doubt that you or I could, and still
> enjoy the richness of the Internet.

To a large extent this is already true.  Who knows what is in the data
files and registry entries for all the closed-source Windows apps on
the market?  You already have apps putting crap on your computer and you
have no idea what is there.  Pd lets them wrap it in a secure envelope,
but that doesn't change the fact that data files are already essentially
opaque to the typical user.

> Microsoft really doesn't care about
> what you think; they care about what the RIAA and the MPAA
> think.  Microsoft can't afford to have the media companies not make their
> content available on Microsoft platforms, and they will do what they can to
> accommodate them.

This reasoning is totally backwards.  The RIAA are not Microsoft's
customers.  Microsoft doesn't sell much software to them.  Why can't
Microsoft afford for them not to make content available?  It's because
of the end users.  Those are the people Microsoft cares about!  It is
those people who buy Microsoft software.  The RIAA is only a means to
the end, the end being making end users happy.  Users want to be able
to play music and movies on their computers.  Microsoft is trying to
satisfy this market need.

The implication that Microsoft somehow doesn't care about end users and
is only concerned about the content industry has it totally backwards.
Microsoft cares only about the end users; the problem is that it needs
the content industry's permission in order to make end users happy.
This puts Microsoft between a rock and a hard place: the insatiable thirst
for content on the part of users, and the unreasoning terror which the
content companies feel about making their wares available on the net.

IMO this explains the openness of the Pd design.  Microsoft is not in
the pocket of the content companies, and Pd is not primarily about DRM.
It needs to be sufficient to provide DRM, but at the same time Microsoft
really wants to satisfy its customers, the people who buy PCs.  It is
for them that Microsoft makes Pd open, makes it optional, lets people
run whatever apps they like in trusted space.  Microsoft is gambling
that an open Pd will provide benefits over and above DRM, even if its
openness makes the content companies unhappy.

> 4.  Pay attention to the antitrust angle.  I guarantee you that Microsoft
> believes Pd is a way to extend its market share, not to increase competition.

I agree that the point of Pd is totally to increase Microsoft's market
share.  In fact, as a general principle, everything every business does
is for that specific reason.  IMO it is sufficient to point to the many
benefits Pd can provide to end users to explain why Microsoft is pushing
Palladium.

> There's a lot of good stuff in Pd, and a lot I like about it.

Bruce listed the increased immunity to viruses, trojans and the like;
and the better-than-expected protection of user privacy.  Not much else.
It's not clear whether this counts as a lot to like, or whether there
are other things he likes which he did not mention here.

> There's also
> a lot I don't like, and am scared of.  My fear is that Pd will lead us down
> a road where our computers are no longer our computers, but are instead
> owned by a variety of factions and companies all looking for a piece of our
> wallet.  To the extent that Pd facilitates that reality, it's bad for
> society.  I don't mind companies selling, renting, or licensing things to
> me, but the loss of the power, reach, and flexibility of the computer is
> too great a price to pay.

I can understand that those who have this vision of the future would
oppose Pd.  But I don't see this road laid out before us as so many other
people do.  Pd allows you to run apps that can prove to others that they
are what they say, that can run and store data without being compromised.
I don't see this as a step towards Big Brother.

If Palladium did what many people believe (took over your computer, let
other people run apps on it without your permission, refused to let you
run your own apps, gave other people "root" on your machine and took it
away from you), I'd agree with the concerns Bruce and others have raised.
But it doesn't do these things.  And to the extent that Pd moves in that
direction, it seems to me that PKI's and digital certificates and even
encryption have already put us on the road, creating data that is opaque
to us, data that we hold but are powerless to alter, data which is in
effect owned by someone else even though it rests on our own equipment.
