Re: Challenge to David Wagner on TCPA

2002-08-13 Thread AARG! Anonymous

Brian LaMacchia writes:

 So the complexity isn't in how the keys get initialized on the SCP (hey, it
 could be some crazy little hobbit named Mel who runs around to every machine
 and puts them in with a magic wand).  The complexity is in the keying
 infrastructure and the set of signed statements (certificates, for lack of a
 better word) that convey information about how the keys were generated &
 stored.  Those statements need to be able to represent to other applications
 what protocols were followed and precautions taken to protect the private
 key.  Assuming that there's something like a cert chain here, the root of
 this chain could be an OEM, an IHV, a user, a federal agency, your company,
 etc. Whatever that root is, the application that's going to divulge secrets
 to the SCP needs to be convinced that the key can be trusted (in the
 security sense) not to divulge data encrypted to it to third parties.
 Palladium needs to look at the hardware certificates and reliably tell
 (under user control) what they are. Anyone can decide if they trust the
 system based on the information given; Palladium simply guarantees that it
 won't tell anyone your secrets without your explicit request.

This makes a lot of sense, especially for closed systems like business
LANs and WANs where there is a reasonable centralized authority who can
validate the security of the SCP keys.  I suggested some time back that
since most large businesses receive and configure their computers in
the IT department before making them available to employees, that would
be a time that they could issue private certs on the embedded SCP keys.
The employees' computers could then be configured to use these private
certs for their business computing.
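For illustration, that IT-department provisioning step might look like the
sketch below. This is a toy model, not anything from the TCPA spec: a real
deployment would run an X.509 CA and sign the SCP's public key, but Python's
standard library has no asymmetric signing, so an HMAC under a company-held
key stands in for the CA signature, and all names are invented.

```python
import hashlib
import hmac
import os

COMPANY_CA_KEY = os.urandom(32)  # held only by the IT department

def issue_private_cert(scp_public_key: bytes, machine_id: str) -> bytes:
    """Bind an embedded SCP public key to a machine before it is handed out."""
    record = machine_id.encode() + b"|" + scp_public_key
    return hmac.new(COMPANY_CA_KEY, record, hashlib.sha256).digest()

def verify_private_cert(scp_public_key: bytes, machine_id: str,
                        cert: bytes) -> bool:
    """Check, later on the company LAN, that this key was provisioned by IT."""
    record = machine_id.encode() + b"|" + scp_public_key
    expected = hmac.new(COMPANY_CA_KEY, record, hashlib.sha256).digest()
    return hmac.compare_digest(cert, expected)

# Provisioning in the IT department, verification later on the LAN.
scp_key = os.urandom(32)  # stand-in for the SCP's embedded public key
cert = issue_private_cert(scp_key, "workstation-042")
assert verify_private_cert(scp_key, "workstation-042", cert)
assert not verify_private_cert(os.urandom(32), "workstation-042", cert)
```

The point of the sketch is only that the cert binds a particular key to a
particular machine under a root the business itself controls.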

However the larger vision of trusted computing leverages the global
internet and turns it into what is potentially a giant distributed
computer.  For this to work, for total strangers on the net to have
trust in the integrity of applications on each others' machines, will
require some kind of centralized trust infrastructure.  It may possibly
be multi-rooted but you will probably not be able to get away from
this requirement.

The main problem, it seems to me, is that validating the integrity of
the SCP keys cannot be done remotely.  You really need physical access
to the SCP to be able to know what key is inside it.  And even that
is not enough, if it is possible that the private key may also exist
outside, perhaps because the SCP was initialized by loading an externally
generated public/private key pair.  You not only need physical access,
you have to be there when the SCP is initialized.

In practice it seems that only the SCP manufacturer, or at best the OEM
who (re) initializes the SCP before installing it on the motherboard,
will be in a position to issue certificates.  No other central authorities
will have physical access to the chips on a near-universal scale at the
time of their creation and installation, which is necessary to allow
them to issue meaningful certs.  At least with the PGP web of trust
people could in principle validate their keys over the phone, and even
then most PGP users never got anyone to sign their keys.  An effective
web of trust seems much more difficult to achieve with Palladium, except
possibly in small groups that already trust each other anyway.

If we do end up with only a few trusted root keys, most internet-scale
trusted computing software is going to have those roots built in.
Those keys will be extremely valuable, potentially even more so than
Verisign's root keys, because trusted computing is actually a far more
powerful technology than the trivial things done today with PKI.  I hope
the Palladium designers give serious thought to the issue of how those
trusted root keys can be protected appropriately.  It's not going to be
enough to say it's not our problem.  For trusted computing to reach
its potential, security has to be engineered into the system from the
beginning - and that security must start at the root!




Re: Challenge to David Wagner on TCPA

2002-08-13 Thread lynn . wheeler

actually it is possible to build chips that generate keys as part of
manufacturing power-on/test (while still in the wafer, and the private key
never, ever exists outside of the chip) ... and be at effectively the same
trust level as any other part of the chip (i.e. hard instruction ROM).
such a key pair, which can uniquely authenticate a chip, effectively
becomes as much a part of the chip as the ROM or the chip serial number,
etc. The public/private key pair ... if appropriately protected (with an
evaluated, certified and audited process) ... can then be considered
somewhat more trusted than a straight serial number, aka a straight serial
number can be skimmed and replayed ... where a digital signature on unique
data is harder to replay/spoof. the chips come with a unique
public/private key where the private key is never known.

sometimes this is a difficult concept ... the idea of a public/private key
pair as a form of a difficult-to-spoof chip serial ... when all uses of
public/private key, asymmetric cryptography might have always been
portrayed as equivalent to x.509 identity certificates (it is possible to
show in a large percentage of the systems that public/private key digital
signatures are sufficient for authentication and any possible certificates
are both redundant and superfluous).
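The signature-versus-serial-number point can be sketched as a
challenge-response exchange. This is a toy model, not the AADS design: the
standard library lacks asymmetric signing, so an HMAC under the chip's
never-exported key stands in for the digital signature (with real
public-key crypto the registry would hold only public keys).

```python
import hashlib
import hmac
import os

class Chip:
    def __init__(self):
        self.serial = os.urandom(8).hex()  # readable by anyone, replayable
        self._key = os.urandom(32)         # made at power-on/test, never exported
    def respond(self, challenge: bytes) -> bytes:
        """Sign a fresh challenge; only this chip can produce this value."""
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

chip = Chip()
# Registry built at manufacture time (serial -> verification key; with real
# asymmetric crypto this would hold only public keys, so reading chip._key
# here is purely an artifact of the HMAC stand-in).
registry = {chip.serial: chip._key}

def verify(serial: str, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(registry[serial], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# A live chip answers any fresh challenge.
fresh = os.urandom(16)
assert verify(chip.serial, fresh, chip.respond(fresh))

# An attacker who skimmed an old exchange cannot answer a new challenge,
# whereas a skimmed bare serial number would replay perfectly.
old_response = chip.respond(fresh)
new_challenge = os.urandom(16)
assert not verify(chip.serial, new_challenge, old_response)
```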

misc. ref (aads chip strawman):
http://www.garlic.com/~lynn/index.html#aads
http://www.asuretee.com/



[EMAIL PROTECTED] on 6/13/2002 11:10 am wrote:

This makes a lot of sense, especially for closed systems like business
LANs and WANs where there is a reasonable centralized authority who can
validate the security of the SCP keys.  I suggested some time back that
since most large businesses receive and configure their computers in the IT
department before making them available to employees, that would be a time
that they could issue private certs on the embedded SCP keys. The
employees' computers could then be configured to use these private certs
for their business computing.

However the larger vision of trusted computing leverages the global
internet and turns it into what is potentially a giant distributed
computer.  For this to work, for total strangers on the net to have trust
in the integrity of applications on each others' machines, will require
some kind of centralized trust infrastructure.  It may possibly be
multi-rooted but you will probably not be able to get away from this
requirement.

The main problem, it seems to me, is that validating the integrity of the
SCP keys cannot be done remotely.  You really need physical access to the
SCP to be able to know what key is inside it.  And even that is not enough,
if it is possible that the private key may also exist outside, perhaps
because the SCP was initialized by loading an externally generated
public/private key pair.  You not only need physical access, you have to be
there when the SCP is initialized.

In practice it seems that only the SCP manufacturer, or at best the OEM who
(re) initializes the SCP before installing it on the motherboard, will be
in a position to issue certificates.  No other central authorities will
have physical access to the chips on a near-universal scale at the time of
their creation and installation, which is necessary to allow them to issue
meaningful certs.  At least with the PGP web of trust people could in
principle validate their keys over the phone, and even then most PGP users
never got anyone to sign their keys.  An effective web of trust seems much
more difficult to achieve with Palladium, except possibly in small groups
that already trust each other anyway.

If we do end up with only a few trusted root keys, most internet-scale
trusted computing software is going to have those roots built in. Those
keys will be extremely valuable, potentially even more so than Verisign's
root keys, because trusted computing is actually a far more powerful
technology than the trivial things done today with PKI.  I hope the
Palladium designers give serious thought to the issue of how those trusted
root keys can be protected appropriately.  It's not going to be enough to
say it's not our problem.  For trusted computing to reach its potential,
security has to be engineered into the system from the beginning - and that
security must start at the root!

-
The Cryptography Mailing List
Unsubscribe by sending unsubscribe cryptography to
[EMAIL PROTECTED]




Re: Challenge to David Wagner on TCPA

2002-08-12 Thread Brian A. LaMacchia

I just want to point out that, as far as Palladium is concerned, we really
don't care how the keys got onto the machine. Certain *applications* written
on top of Palladium will probably care, but all the hardware & the security
kernel really care about is making sure that secrets are only divulged to
the code that had them encrypted in the first place.  It's all a big trust
management problem (or a series of trust management problems) --
applications that are going to rely on SCP keys to protect secrets for them
are going to want some assurances about where the keys live and whether
there's a copy outside the SCP.  I can certainly envision potential
applications that would want guarantees that the key was generated on the
SCP & never left, and I can see other applications that want guarantees that
the key has a copy sitting on another SCP on the other side of the building.

So the complexity isn't in how the keys get initialized on the SCP (hey, it
could be some crazy little hobbit named Mel who runs around to every machine
and puts them in with a magic wand).  The complexity is in the keying
infrastructure and the set of signed statements (certificates, for lack of a
better word) that convey information about how the keys were generated &
stored.  Those statements need to be able to represent to other applications
what protocols were followed and precautions taken to protect the private
key.  Assuming that there's something like a cert chain here, the root of
this chain could be an OEM, an IHV, a user, a federal agency, your company,
etc. Whatever that root is, the application that's going to divulge secrets
to the SCP needs to be convinced that the key can be trusted (in the
security sense) not to divulge data encrypted to it to third parties.
Palladium needs to look at the hardware certificates and reliably tell
(under user control) what they are. Anyone can decide if they trust the
system based on the information given; Palladium simply guarantees that it
won't tell anyone your secrets without your explicit request.

--bal

P.S. I'm not sure that I actually *want* the ability to extract the private
key from an SCP after it's been loaded, because presumably if I could ask
for the private key then a third party doing a black-bag job on my PC could
also ask for it.  I think what I want is the ability to zeroize the SCP,
remove all state stored within it, and cause new keys to be generated
on-chip.  So long as I can zero the chip whenever I want (or zero part of
it, or whatever) I can eliminate the threat posed by the manufacturer who
initialized the SCP in the first place.

Lucky Green [EMAIL PROTECTED] wrote:
 Ray wrote:

 From: James A. Donald [EMAIL PROTECTED]
 Date: Tue, 30 Jul 2002 20:51:24 -0700

 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
 both Palladium and TCPA deny that they are designed to restrict
 what applications you run.  The TPM FAQ at
 http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
 

 They deny that intent, but physically they have that capability.

 To make their denial credible, they could give the owner
 access to the private key of the TPM/SCP.  But somehow I
 don't think that jibes with their agenda.

 Probably not surprisingly to anybody on this list, with the exception
 of potentially Anonymous, according to the TCPA's own TPM Common
 Criteria Protection Profile, the TPM prevents the owner of a TPM from
 exporting the TPM's internal key. The ability of the TPM to keep the
 owner of a PC from reading the private key stored in the TPM has been
 evaluated to E3 (augmented). For the evaluation certificate issued by
 NIST, see:

 http://niap.nist.gov/cc-scheme/PPentries/CCEVS-020016-VR-TPM.pdf

 If I buy a lock I expect that by demonstrating ownership I
 can get a replacement key or have a locksmith legally open it.

 It appears the days when this was true are waning. At least in the PC
 platform domain.

 --Lucky






RE: Challenge to David Wagner on TCPA

2002-08-11 Thread Jim Choate

On Sat, 10 Aug 2002, Russell Nelson wrote:

 I agree that it's irrelevant.  So why is he trying to argue from
 authority (always a fallacy anyway) without *even* having any way to
 prove that he is that authority?

What has 'authority' got to do with it? Arguments from authority are
-worthless-. Make up your own mind as to its validity, who cares about
their 'proof'.

-Who- is irrelevant. What damns his argument -is- his appeal to
-authority-. Anyone who bases their argument on 'He said...' has already
lost the discussion and invalidated any point they might make. It's one of
the primary fallacies of (for example) Tim May and his consistent appeal
to who he knows or what 'they' said.

We agree, what I don't understand is why you keep expecting that dead
horse to get up...keep asking those damning questions ;)


 --


  Conform and be dull..J. Frank Dobie

 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Challenge to David Wagner on TCPA

2002-08-11 Thread Ben Laurie

Lucky Green wrote:
 Ray wrote:
 
From: James A. Donald [EMAIL PROTECTED]
Date: Tue, 30 Jul 2002 20:51:24 -0700

On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:

both Palladium and TCPA deny that they are designed to restrict
what applications you run.  The TPM FAQ at 
http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads


They deny that intent, but physically they have that capability.

To make their denial credible, they could give the owner 
access to the private key of the TPM/SCP.  But somehow I 
don't think that jibes with their agenda.
 
 
 Probably not surprisingly to anybody on this list, with the exception of
 potentially Anonymous, according to the TCPA's own TPM Common Criteria
 Protection Profile, the TPM prevents the owner of a TPM from exporting
 the TPM's internal key. The ability of the TPM to keep the owner of a PC
 from reading the private key stored in the TPM has been evaluated to E3
 (augmented). For the evaluation certificate issued by NIST, see:
 
 http://niap.nist.gov/cc-scheme/PPentries/CCEVS-020016-VR-TPM.pdf

Obviously revealing the key would defeat any useful properties of the 
TPM/SCP. However, unless the machine refuses to run stuff unless it's signed 
by some other key, it's a matter of choice whether you run an OS that has 
the aforementioned properties.

Of course, it's highly likely that if you want to watch products of Da 
Mouse on your PC, you will be obliged to choose a certain OS. In order 
to avoid more sinister uses, it makes sense to me to ensure that at 
least one free OS gets appropriate signoff (and no, that does not 
include a Linux port by HP). At least, it makes sense to me if I assume 
that the certain other OS will otherwise become dominant. Which seems 
likely.

Cheers,

Ben.

-- 
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

Available for contract work.

There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit. - Robert Woodruff




Re: Challenge to David Wagner on TCPA

2002-08-10 Thread D.Popkin

-BEGIN PGP SIGNED MESSAGE-

AARG! Anonymous [EMAIL PROTECTED] writes:

 Lucky Green wrote:
  Ray wrote:
   If I buy a lock I expect that by demonstrating ownership I 
   can get a replacement key or have a locksmith legally open it.

  It appears the days when this was true are waning. At least in the PC
  platform domain.

 We have had other systems which work like this for a long while.
 Many consumer devices are sealed such that if you open them you void
 the warranty.  This is to your advantage as a consumer; ...

There is exactly one person in the world qualified to decide what's to
the advantage of that consumer, and it's not AARG! Anonymous.

-BEGIN PGP SIGNATURE-
Version: 2.6.3ia
Charset: noconv

iQBVAwUBPVRO0PPsjZpmLV0BAQEwrQH/eXqkJVmXYmqNtweg6246KMXmCGekK/h6
HNmnd65WeR2A84pJdJFb8jZ2CX6bJ+XrboaDv8klJCo21xTkFxWIuA==
=DL2o
-END PGP SIGNATURE-




RE: Challenge to David Wagner on TCPA

2002-08-04 Thread AARG! Anonymous

Mike Rosing wrote:
 On Fri, 2 Aug 2002, AARG! Anonymous wrote:

  You don't have to send your data to Intel, just a master storage key.
  This key encrypts the other keys which encrypt your data.  Normally this
  master key never leaves your TPM, but there is this optional feature
  where it can be backed up, encrypted to the manufacturer's public key,
  for recovery purposes.  I think it is also in blinded form.

 In other words, the manufacturer has access to all your data because
 they have the master storage key.

 Why would everyone want to give one manufacturer that much power?

It's not quite that bad.  I mentioned the blinding.  What happens is
that before the master storage key is encrypted, it is XOR'd with a
random value, which is also output by the TPM along with the encrypted
recovery blob.  You save them both, but only the encrypted blob gets
sent to the manufacturer.  So when the manufacturer decrypts the data,
he doesn't learn your secrets.

The system is cumbersome, but not an obvious security leak.
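The blinding step described above can be sketched in a few lines. This is a
toy model of the idea, not the actual TCPA maintenance protocol: the
manufacturer's public-key encryption layer is omitted so only the XOR
blinding is visible.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length strings."""
    return bytes(x ^ y for x, y in zip(a, b))

master_key = os.urandom(32)  # SRK, normally never leaves the TPM
pad = os.urandom(32)         # random blinding value, kept only by the owner

# What actually gets encrypted to the manufacturer is the blinded value.
blob = xor(master_key, pad)

# The manufacturer decrypts its (omitted) RSA layer and sees only `blob`,
# which without the pad reveals nothing about the master key.
assert blob != master_key

# The owner (or the new TPM) combines the blob with the locally held pad
# to recover the master storage key.
assert xor(blob, pad) == master_key
```

So the manufacturer can perform the recovery service without ever seeing
the key it is recovering, which is the property claimed above.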




Re: Challenge to David Wagner on TCPA

2002-08-04 Thread Roy M.Silvernail

On Saturday 03 August 2002 05:12 pm, Morlock Elloi wrote:

 UUCP will work as long as people can talk over telephone and there are
 modems available. The harder and more inconvenient it becomes to connect,
 the higher the average IQ of participants will be.

 There is hope.

 Just imagine the absence of short-attention span morons that find uucp too
 complicated. Ask around.

But if WorldCom dissolves in bankruptcy, will UUNet still be the center of the 
bang-path universe?

More seriously, I think many of us old-timers long for the time when a 
certain level of wizardry was required to get on the net. (before Prodigy and 
the September that Never Ended)
-- 
Roy M. Silvernail [ ] [EMAIL PROTECTED] 
(formerly uunet!comcon!cybrspc!roy)
DNRC Minister Plenipotentiary of All Things Confusing, Software Division
PGP Key 0x1AF39331 :  71D5 2EA2 4C27 D569  D96B BD40 D926 C05E
 Key available from [EMAIL PROTECTED]
I charge to process unsolicited commercial email




RE: Challenge to David Wagner on TCPA

2002-08-04 Thread Eugen Leitl

On Sat, 3 Aug 2002, Morlock Elloi wrote:

 Ah, the computers. Well, those that want computers will have them.
 They may not be as cheap as today and there will not be as many of
 them, but I think that all people *I* deal with will have them, so I
 don't really care.

Sure, people will have computers. However, if we merrily slide down the
slippery slope, the authentication might move into the network layer
eventually. You will be on the network, yet you will not be on the
network. 

One might be able to fab computers at small scale (FPGA, organic
transistors via inkjet, whatever), but it will be tough to create global
networks using just overlapping patches of wireless, especially if rogue
wireless ends up being illegal.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread James A. Donald

--
On 2 Aug 2002 at 14:36, Trei, Peter wrote:
 OK, It's 2004, I'm an IT Admin,
 and I've converted my corporation over to TCPA/Palladium machines. My
 Head of Marketing has his TCPA/Palladium desktop's hard drive
 jam-packed with corporate confidential documents he's been actively
 working on - sales projections,  product plans, pricing schemes.
 They're all sealed files.

 His machine crashes - the MB burns out.
 He wants to recover the data.

 HoM:  I want to recover my data.
 Me:   OK: We'll pull the HD, and get the data off it.
 HoM:  Good - mount it as a secondary HD in my new system.
 Me:   That isn't going to work now we have TCPA and Palladium.
 HoM:  Well, what do you have to do?
 Me:   Oh, it's simple. We encrypt the data under Intel's TPME key,
  and send it off to Intel. Since Intel has all the keys, they can
  unseal all your data to plaintext, copy it, and then re-seal it for
  your new system. It only costs $1/Mb.
 HoM:  Let me get this straight - the only way to recover this data is
  to let Intel have a copy, AND pay them for it?
 Me:   Um... Yes. I think MS might be involved as well, if you were
  using Word.
 HoM:  You are *so* dead.

Obviously it is insane to use keys that you do not yourself control 
to keep secrets.  That, however, is not the purpose of TCPA/Palladium 
as envisaged by Microsoft.

The intent is that Peter can sell Paul software or content that will 
only run on ONE computer for ONE time period.

When the motherboard emits blue smoke, or the time runs out, 
whichever happens first, Paul has to buy new software.  If prices are 
lowered accordingly, this might be acceptable.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 4Mqj1ia6DD0EYpdLMEd7al35eTYefnvhcFesBlMz
 25n9obdfhvRVxEkY4YtWw7BuFxrOKgTtfI1Dp8uAA




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Albion Zeglin

Quoting Jay Sulzberger [EMAIL PROTECTED]:


 b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
 separate dongle?  Because, of course, the Englobulators proceed here on
 principle.  The principle being that only the Englobulators have a right to
 own printing presses/music studios/movie and animation studios.
 

A separate dongle can't verify the integrity of the processor.  The important
part is that the processor's state (including initial RAM load) is verifiable.
Without this the OS could be virtualized and modified after the integrity check.

Just imagine running Windows Media Player on a virtual machine, trapping the 
calls to the audio card and thus being able to copy content perfectly.  A 
dongle can't prevent this.

Eventually, for TCPA to be effective against hardware hacks such as memory
probes, not only will the hard drive storage be sealed, but RAM must be
sealed as well. Once TCPA moves on-processor, I expect encrypted RAM will
be next.

Albion.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread AARG! Anonymous

Peter Trei writes:

 It's rare enough that when a new anonym appears, we know
 that the poster made a considered decision to be anonymous.

 The current poster seems to have parachuted in from nowhere, 
 to argue a specific position on a single topic. It's therefore 
 reasonable  to infer that the nature of that position and topic has 
 some bearing on the decision to be anonymous.


Yes, my name is AARG!.  That was the first thing my mother said after
I was born, and the name stuck.

Not really.  For Peter's information, the name associated with a
message through an anonymous remailer is simply the name of the
last remailer in the chain, whatever that remailer operator chose
to call it.  AARG is a relatively new remailer, but if you look at
http://anon.efga.org/Remailers/TypeIIList you will see that it is very
reliable and fast.  I have been using it as an exit remailer lately
because other ones that I have used often produce inconsistent results.
It has not been unusual to have to send a message two or three times
before it appears.  So far that has not been a problem with this one.

So don't read too much into the fact that a bunch of anonymous postings
have suddenly started appearing from one particular remailer.  For your
information, I have sent over 400 anonymous messages in the past year
to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
of them on TCPA related topics).




Re: CDR: RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Alif The Terrible


On Fri, 2 Aug 2002, AARG! Anonymous wrote:

  I have sent over 400 anonymous messages in the past year
 to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
 of them on TCPA related topics).

I see you are not too worried about traffic analysis?

-- 
Yours, 
J.A. Terranson
[EMAIL PROTECTED]

If Governments really want us to behave like civilized human beings, they
should give serious consideration towards setting a better example:
Ruling by force, rather than consensus; the unrestrained application of
unjust laws (which the victim-populations were never allowed input on in
the first place); the State policy of justice only for the rich and 
elected; the intentional abuse and occasional destruction of entire
populations merely to distract an already apathetic and numb electorate...
This type of demagoguery must surely wipe out the fascist United States
as surely as it wiped out the fascist Union of Soviet Socialist Republics.

The views expressed here are mine, and NOT those of my employers,
associates, or others.  Besides, if it *were* the opinion of all of
those people, I doubt there would be a problem to bitch about in the
first place...






RE: Challenge to David Wagner on TCPA

2002-08-03 Thread AARG! Anonymous

Peter Trei envisions data recovery in a TCPA world:

 HoM:  I want to recover my data.
 Me:   OK: We'll pull the HD, and get the data off it.
 HoM:  Good - mount it as a secondary HD in my new system.
 Me:   That isn't going to work now we have TCPA and Palladium.
 HoM:  Well, what do you have to do?
 Me:   Oh, it's simple. We encrypt the data under Intel's TPME key,
  and send it off to Intel. Since Intel has all the keys, they can
  unseal all your data to plaintext, copy it, and then re-seal it for
  your new system. It only costs $1/Mb.
 HoM:  Let me get this straight - the only way to recover this data is
  to let Intel have a copy, AND pay them for it?
 Me:   Um... Yes. I think MS might be involved as well, if you were
  using Word.
 HoM:  You are *so* dead.

It's not quite as bad as all this, but it is still pretty bad.

You don't have to send your data to Intel, just a master storage key.
This key encrypts the other keys which encrypt your data.  Normally this
master key never leaves your TPM, but there is this optional feature
where it can be backed up, encrypted to the manufacturer's public key,
for recovery purposes.  I think it is also in blinded form.

Obviously you'd need to do this backup step before the TPM crashed;
afterwards is too late.  So maybe when you first get your system it
generates the on-chip storage key (called the SRK, storage root key),
and then exports the recovery blob.  You'd put that on a floppy or some
other removable medium and store it somewhere safe.  Then when your
system dies you pull out the disk and get the recovery blob.

You communicate with the manufacturer, give him this recovery blob, along
with the old TPM key and the key to your new TPM in the new machine.
The manufacturer decrypts the blob and re-encrypts it to the TPM in the
new machine.  It also issues and distributes a CRL revoking the cert on
the old TPM key so that the old machine can't be used to access remote
TCPA data any more.  (Note, the CRL is not used by the TPM itself, it is
just used by remote servers to decide whether to believe client requests.)

The manufacturer sends the data back to you and you load it into the TPM
in your new machine, which decrypts it and stores the master storage key.
Now it can read your old data.
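The recovery flow above can be walked through in code. This is purely a toy
model of who can open what: "encrypt to X" is represented as a tagged pair
rather than real cryptography, and the names are illustrative, not taken
from the TCPA spec.

```python
import os

def encrypt(recipient_key: str, payload: bytes) -> dict:
    """Toy encryption: only the named recipient may open the blob."""
    return {"to": recipient_key, "payload": payload}

def decrypt(my_key: str, blob: dict) -> bytes:
    assert blob["to"] == my_key, "wrong recipient"
    return blob["payload"]

srk = os.urandom(32)  # storage root key, generated on-chip

# 1. While the machine still works, the owner exports the recovery blob,
#    encrypted to the manufacturer, and stores it on removable media.
recovery_blob = encrypt("manufacturer", srk)

# 2. The old machine dies; the owner sends the blob plus the new TPM's
#    key name to the manufacturer, who opens its layer and re-encrypts
#    to the new TPM. (It also issues a CRL revoking the old TPM's cert,
#    used by remote servers rather than by the TPM itself.)
inner = decrypt("manufacturer", recovery_blob)
reissued = encrypt("new-tpm", inner)

# 3. The new TPM loads the blob and recovers the master storage key.
assert decrypt("new-tpm", reissued) == srk
```

Note that the blinding described earlier would sit inside step 1, so even
this trusted middleman never sees the key in the clear.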

Someone asked if you'd have to go through all this if you just upgraded
your OS.  I'm not sure.  There are several secure registers on the
TPM, called PCRs, which can hash different elements of the BIOS, OS,
and other software.  You can lock a blob to any one of these registers.
So in some circumstances it might be that upgrading the OS would keep the
secure data still available.  In other cases you might have to go through
some kind of recovery procedure.
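The PCR mechanism described here can be sketched as an extend-and-compare
loop. This mimics only the general idea (a running hash of measured
components, with unsealing gated on an exact match); the register size,
hash choice, and extend formula below are assumptions, not the spec's
exact definitions.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """Fold a new measurement into the register; order-sensitive, one-way."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def boot(*components: bytes) -> bytes:
    """Measure each boot component in turn, starting from a zeroed register."""
    pcr = b"\x00" * 32
    for c in components:
        pcr = extend(pcr, c)
    return pcr

def unseal(current_pcr: bytes, sealed_to: bytes, secret: bytes) -> bytes:
    """Release the secret only if the software stack matches seal time."""
    if current_pcr != sealed_to:
        raise PermissionError("PCR mismatch: software stack changed")
    return secret

sealed_pcr = boot(b"bios-v1", b"os-v1")  # value recorded when sealing
secret = b"master storage key"

# Same BIOS and OS: the blob unseals.
assert unseal(boot(b"bios-v1", b"os-v1"), sealed_pcr, secret) == secret

# Upgrading the OS changes the measurement, so the blob no longer unseals,
# which is exactly why an upgrade may force the recovery procedure.
try:
    unseal(boot(b"bios-v1", b"os-v2"), sealed_pcr, secret)
    raise AssertionError("should not unseal")
except PermissionError:
    pass
```

Which register a blob is locked to thus determines how disruptive a given
software change is, as the paragraph above suggests.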

I think this recovery business is a real Achilles heel of the TCPA
and Palladium proposals.  They are paranoid about leaking sealed data,
because the whole point is to protect it.  So they can't let you freely
copy it to new machines, or decrypt it from an insecure OS.  This anal
protectiveness is inconsistent with the flexibility needed in an imperfect
world where stuff breaks.

My conclusion is that the sealed storage of TCPA will be used sparingly.
Ross Anderson and others suggest that Microsoft Word will seal all of
its documents so that people can't switch to StarOffice.  I think that
approach would be far too costly and risky, given the realities I have
explained above.  Instead, I would expect that only highly secure data
would be sealed, and that there would often be some mechanism to recover
it from elsewhere.  For example, in a DRM environment, maybe the central
server has a record of all the songs you have downloaded.  Then if your
system crashes, rather than go through a complicated crypto protocol to
recover, you just buy a new machine, go to the server, and re-download
all the songs you were entitled to.

Or in a closed environment, like a business which seals sensitive
documents, the data could be backed up redundantly to multiple central
file servers, each of which seal it.  Then if one machine crashes,
the data is available from others and there is no need to go through
the recovery protocol.

So there are solutions, but they will add complexity and cost.  At the
same time they do add genuine security and value.  Each application and
market will have to find its own balance of the costs and benefits.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Jay Sulzberger

On Fri, 2 Aug 2002, Albion Zeglin wrote:

 Quoting Jay Sulzberger [EMAIL PROTECTED]:


  b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
  separate dongle?  Because, of course, the Englobulators proceed here on
  principle.  The principle being that only the Englobulators have a right to
  own printing presses/music studios/movie and animation studios.
 


 A separate dongle can't verify the integrity of the processor.  The
 important part is that the processor's state (including initial RAM load)
 is verifiable.

But if you just want to show movies securely you need not use my general
purpose and today untrammeled computer.  You can either show movies in
movie houses, or use some slightly trammeled version of a cable ready TV,
or the variant product mentioned earlier, the donglified monitor/speaker.

There is no need for the MPAA to verify the integrity of the processor if
all the MPAA wants to do is sell me tickets to movies.

 Without this the OS could be virtualized and modified after the integrity
 check.

What does the enforcement of the laws against copyright infringement have
to do with my general purpose and today untrammeled computer?  There is no
relation of the sort you, and all the mass media, implicitly assume here.
Indeed no OS at all should be involved in the secure showing of movies.
It is like using the standard C libraries to write secure code!


 Just imagine running Windows Media Player on a virtual machine, trapping
 the calls to the audio card and thus being able to copy content
 perfectly.  A dongle can't prevent this.

My donglified monitor/speakers combination, of course, offers greater
assurance.  Here is part of my argument: the explanation of my proposed
protocols can actually be understood.


 Eventually for TCPA to be effective against hardware hacks such as memory
 probes, not only will the harddrive storage be sealed, but RAM must be
 sealed as well.
 Once TCPA moves onprocessor, I expect encrypted RAM will be next.

 Albion.

The dilemma "Either give over all the computers in the world to the
Englobulators, or never get to see another big budget Hollywood movie" is
a false dichotomy.

oo--JS.




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread James A. Donald

 --
On 2 Aug 2002 at 0:36, David Wagner wrote:
 For instance, suppose that, thanks to TCPA/Palladium, Microsoft 
 could design Office 2005 so that it is impossible for StarOffice 
 and other clones to read files created in Office 2005.  Would 
 some users object?

In an anarchic society, or under a government that did not define 
and defend IP, TCPA/Palladium would probably give roughly the 
right amount of protection to intellectual property by technical 
means in place of legal means.

Chances are that the thinking behind Palladium is not "Let us sell 
out to the Hollywood lobby" but rather "Let us make those !@#$$%^ 
commie chinese pay for their *^%$## software."

Of course, in a society with both legal and technical protection 
of IP, the likely outcome is oppressive artificial monopolies 
sustained both by technology and state power.

I would certainly much prefer TCPA/Palladium in place of existing
IP law.  What I fear is that instead legislation and technology
will each reinforce the other. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 R66NXPp5xZNDYn98jcVqH5q22ikRRFR3evv5xfwF
 2PNka92tYm9+/iBKaR+IcOoDA8BwXZlwcPD18Ogw8




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread David G. Koontz

Jon Callas wrote:
 On 8/1/02 1:14 PM, Trei, Peter [EMAIL PROTECTED] wrote:
 
 
So my question is: What is your reason for shielding your identity?
You do so at the cost of people assuming the worst about your
motives.
 
 
 Is this a tacit way to suggest that the only people who need anonymity or
 pseudonymity are those with something to hide?
 



RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Trei, Peter

 Jon Callas[SMTP:[EMAIL PROTECTED]]
 
 
 On 8/1/02 1:14 PM, Trei, Peter [EMAIL PROTECTED] wrote:
 
  So my question is: What is your reason for shielding your identity?
  You do so at the cost of people assuming the worst about your
  motives.
 
 Is this a tacit way to suggest that the only people who need anonymity or
 pseudonymity are those with something to hide?
 
 Jon
 
Not really. However, in today's actual environment, it is frequently 
true that those with something to hide use anonymity. 

While some people have maintained nyms for many years (I can't
think of anyone maintaining explicit strong anonymity right now,
actually - remember Sue D. Nym?), and used them to talk about 
a variety of issues, it's pretty rare.

It's rare enough that when a new anonym appears, we know
that the poster made a considered decision to be anonymous.

The current poster seems to have parachuted in from nowhere, 
to argue a specific position on a single topic. It's therefore 
reasonable  to infer that the nature of that position and topic has 
some bearing on the decision to be anonymous.

Since the position argued involves nothing which would invoke the
malign interest of government powers or corporate legal departments, 
it's not that. I can only think of two reasons why our correspondent
may have decided to go undercover... 

1. If we knew who he/she/them were, it would weaken the argument
(for example, by making it clear that the poster has a vested interest
in the position maintained, or that 'AARG!' is the group effort of an
astroturf campaign).

2. If the true identity of the poster became known, he/she/them
fears some kind of retribution:
* The ostracism and detestation of his peers.
* The boycotting of his employer. 
* His employer objecting to his wasting company time on 
  Internet mailing lists.

Our correspondent has not given us any reason not to 
infer the worst motives. This is, after all, a discipline where
paranoia and suspicion are job requirements.

Peter Trei
Disclaimer: The above represents my private, personal 
opinions only; do not misconstrue them to represent the 
opinions of others.




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread Jon Callas

On 8/1/02 1:14 PM, Trei, Peter [EMAIL PROTECTED] wrote:

 So my question is: What is your reason for shielding your identity?
 You do so at the cost of people assuming the worst about your
 motives.

Is this a tacit way to suggest that the only people who need anonymity or
pseudonymity are those with something to hide?

Jon




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread James A. Donald

--
On 2 Aug 2002 at 10:43, Trei, Peter wrote:
 Since the position argued involves nothing which would invoke
 the malign interest of government powers or corporate legal
 departments, it's not that. I can only think of two reasons why
 our correspondent may have decided to go undercover...

I can think of two innocuous reasons, though the real reason is
probably something else altogether:

1.  Defending copyright enforcement is extremely unpopular because
it seemingly puts you on the side of the hollywood cabal, but in
fact TCPA/Palladium, if it works as described, and if it is not
integrated with legal enforcement, does not overreach in the
fashion that most recent intellectual property legislation, and
most recent policy decisions by the patent office, overreach.

2.  Legal departments are full of people who are, among their
many other grievous faults, technologically illiterate.
Therefore when an insider is talking about something, they cannot
tell when he is leaking inside information or not, and tend to
have kittens, because they have to trust him (being unable to tell
if he is leaking information covered by NDA), and are
constitutionally incapable of trusting anyone. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 Alf9R2ZVGqWkLhwWX2H6TBqHOunrj2Fbxy+U0ORV
 2uPGI4gMDt1fTQkV1820PO3xWmAWPiaS0DqrbmobN




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Jay Sulzberger

On Fri, 2 Aug 2002, James A. Donald wrote:

 --
 On 2 Aug 2002 at 10:43, Trei, Peter wrote:
  Since the position argued involves nothing which would invoke
  the malign interest of government powers or corporate legal
  departments, it's not that. I can only think of two reasons why
  our correspondent may have decided to go undercover...

 I can think of two innocuous reasons, though the real reason is
 probably something else altogether:

 1.  Defending copyright enforcement is extremely unpopular because
 it seemingly puts you on the side of the hollywood cabal, but in
 fact TCPA/Palladium, if it works as described, and if it is not
 integrated with legal enforcement, does not overreach in the
 fashion that most recent intellectual property legislation, and
 most recent policy decisions by the patent office, overreach.

a. TCPA/Palladium must be integrated with laws which give to the
Englobulators absolute legal cudgel powers, such as the DMCA.  So far I
have not seen any proposal by the Englobulators to repeal the DMCA and
cognate laws, so if TCPA/Palladium is imposed, the DMCA will be used, just
as HP threatened to use it a couple of days ago.  And, of course, today
there is no imposed TCPA/Palladium, so the situation will be much worse
when there is.

b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
separate dongle?  Because, of course, the Englobulators proceed here on
principle.  The principle being that only the Englobulators have a right to
own printing presses/music studios/movie and animation studios.


 2.  Legal departments are full of people who are, among their
 many other grievous faults, technologically illiterate.
 Therefore when an insider is talking about something, they cannot
 tell when he is leaking inside information or not, and tend to
 have kittens, because they have to trust him (being unable to tell
 if he is leaking information covered by NDA), and are
 constitutionally incapable of trusting anyone.

 --digsig

There is a business, not yet come into existence, of providing standard
crypto services to law offices.

oo--JS.




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread AARG! Anonymous

Sampo Syreeni writes:

 On 2002-08-01, AARG! Anonymous uttered to [EMAIL PROTECTED],...:

 It does this by taking hashes of the software before transferring
 control to it, and storing those hashes in its internal secure
 registers.

 So, is there some sort of guarantee that the transfer of control won't be
 stopped by a check against cryptographic signature within the executable
 itself, in the future? That sort of thing would be trivial to enforce via
 licencing terms, after all, and would allow for the introduction of a
 strictly limited set of operating systems to which control would be
 transferred.

TCPA apparently does not have licensing terms per se.  They say,
in their FAQ, http://www.trustedcomputing.org/docs/Website_TCPA%20FAQ_0703021.pdf,
"The TCPA spec is currently set up as a 'just publish' IP model."
So there are no licensing terms to enforce, and no guarantees that
people won't do bad things outside the scope of the spec.  Of course,
you realize that the same thing is true with PCs today, right?  There are
few guarantees in this life.

If you think about it, TCPA doesn't actually facilitate the kind of
crypto-signature-checking you are talking about.  You don't need all
this fancy hardware and secure hashes to do that.  Your worrisome
signature checking would be applied on the software which *hasn't
yet been loaded*, right?  All the TCPA hardware will give you is a
secure hash on the software which has already loaded before you ran.
That doesn't help you; in fact your code can pretty well predict the
value of this, given that it is running.  Think about this carefully,
it is a complicated point but you can get it if you take your time.

In short, to implement a system where only signed code can run, TCPA is
not necessary and not particularly helpful.
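To see why running code can predict its own measurement, consider a toy model of TCPA-style measured boot: each stage hashes the next stage before transferring control, extending a register rather than overwriting it. This is a sketch under my own naming, not code from the TCPA spec.

```python
# Toy model of measured boot: hashes taken before transfer of control,
# accumulated in a register.  Illustrative only; not the TCPA API.
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """Register extend: new value = H(old value || H(measurement))."""
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

# Boot chain: firmware measures the loader, the loader measures the OS.
pcr = b"\x00" * 20                       # register starts at zero on reset
pcr = extend(pcr, b"boot loader code")
pcr = extend(pcr, b"operating system image")

# The final value depends on every stage in order; software that is
# already running was measured before it ran, so it can compute (but
# not alter) the record of how it was loaded.
tampered = extend(extend(b"\x00" * 20, b"patched boot loader"),
                  b"operating system image")
assert pcr != tampered
```

The point of the sketch: the register only ever records what *has* loaded, which is why it is no help for refusing to load unsigned code.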


 I'm having a lot of trouble seeing the benefit in TCPA
 without such extra measures, given that open source software would likely
 evolve which circumvented any protection offered by the more open ended
 architecture you now describe.

I don't follow what you are getting at with the open source.  Realize that
when you boot a different OS, the TCPA attestation features will allow
third parties to detect this.  So your open source OS cannot masquerade
as a different one and fool a third party server into downloading data
to your software.  And likewise, data which was sealed (encrypted)
under a secure OS cannot be unsealed once a different OS boots, because
the sealing/unsealing is all done on-chip, and the chip uses the secure
hash registers to check if the unsealing is allowed.
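A toy sketch of the sealing behavior described above: data is bound to the register value present at seal time, and a different boot yields a different value, so the unseal fails. A real TPM does this on-chip with keys that never leave the part; here a module-level secret and an insecure XOR cipher stand in, purely for illustration.

```python
# Toy seal/unseal bound to a platform measurement.  CHIP_SECRET is a
# hypothetical stand-in for on-chip key material; the XOR "cipher" is
# NOT secure and exists only to keep the sketch short.
import hashlib
import hmac

CHIP_SECRET = b"never leaves the chip"

def seal(data: bytes, pcr_at_seal: bytes) -> dict:
    key = hmac.new(CHIP_SECRET, pcr_at_seal, hashlib.sha256).digest()
    ct = bytes(d ^ k for d, k in zip(data, key))
    return {"ct": ct, "pcr": pcr_at_seal}

def unseal(blob: dict, pcr_now: bytes) -> bytes:
    # The chip refuses unless the current measurement matches seal time.
    if pcr_now != blob["pcr"]:
        raise PermissionError("platform state differs from seal time")
    key = hmac.new(CHIP_SECRET, pcr_now, hashlib.sha256).digest()
    return bytes(c ^ k for c, k in zip(blob["ct"], key))

blob = seal(b"secret", hashlib.sha1(b"trusted OS").digest())
assert unseal(blob, hashlib.sha1(b"trusted OS").digest()) == b"secret"
try:
    unseal(blob, hashlib.sha1(b"other OS").digest())
except PermissionError:
    pass  # a different boot cannot recover the data
```

Note that nothing here stops the "other OS" from running; it simply cannot read data sealed under a different environment.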


 Then, when the data is decrypted and unsealed, the hash is compared to
 that which is in the TPM registers now.  This can make it so that data
 which is encrypted when software system X boots can only be decrypted
 when that same software boots.

 Again, such values would be RE'd and reported by any sane open source OS
 to the circuitry, giving access to whatever data there is. If this is
 prevented, one can bootstrap an absolutely secure platform where whatever
 the content provider says is the Law, including a one where every piece of
 runnable OS software actually enforces the kind of control over
 permissible signatures Peter is so worried about. Where's the guarantee
 that this won't happen, one day?

Not sure I follow this here... the sealed data cannot be reported by an
open source OS because the secret keys never leave the chip without being
themselves encrypted.  As for your second proposal, you are suggesting
that you could write an OS which would only run signed applications?
And run it on a TCPA platform?  Sure, I guess you could.  But you wouldn't
need TCPA features to do it.  See the comments above: any OS today could
be modified to only run apps that were signed with some special key.
You shouldn't blame TCPA for this.


 In answer to your question, then, for most purposes, there is no signing
 key that your TPM chip trusts, so the issue is moot.

 At the hardware level, yes.

TCPA is a hardware spec.  Peter was asking about TCPA, and I gave him the
answer.  You can hypothesize all the fascist software you want, but you
shouldn't blame these fantasies on TCPA.

 At the software one, it probably won't be,
 even in the presence of the above considerations. After you install your
 next Windows version, you will be tightly locked in with whatever M$
 throws at you in their DLL's,

Doesn't Microsoft already sign their system DLLs in NT?

 and as I pointed out, there's absolutely no
 guarantee that Linux et al. won't be shut out by extra features, in the
 future. In the end what we get is an architecture, which may not embody
 Peter's concerns right now, but which is built from the ground up to bring
 them into being, later.

Again, you are being entirely hypothetical here.  Please describe exactly
how either attestation or secure storage would assist in creating a boot
loader that would refuse to run Linux, or whatever other horrible disaster
you envision.


Re: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

James Donald writes:
 TCPA and Palladium give someone else super root privileges on my
 machine, and TAKE THOSE PRIVILEGES AWAY FROM ME.  All claims that
 they will not do this are not claims that they will not do this,
 but are merely claims that the possessor of super root privilege
 on my machine is going to be a very very nice guy, unlike my
 wickedly piratical and incompetently trojan horse running self.

What would be an example of a privilege that you fear would be taken
away from you with TCPA?  It will boot any software that you want, and
can provide a signed attestation of a hash of what you booted.  Are you
upset because you can't force the chip to lie about what you booted?
Of course they could have designed the chip to allow you to do that, but
then the functionality would be useless to everyone; a chip which could
be made to lie about its measurements might as well not exist, right?




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread David Wagner

James A. Donald wrote:
According to Microsoft, the end user can turn the palladium 
hardware off, and the computer will still boot.  As long as that 
is true, it is an end user option and no one can object.

Your point is taken.  That said, even if you could turn off TCPA 
Palladium and run some outdated version of Windows, whether users
would object is not entirely obvious.  For instance, suppose that,
thanks to TCPA/Palladium, Microsoft could design Office 2005 so that it
is impossible for StarOffice and other clones to read files created in
Office 2005.  Would some users object?  I don't know.  For many users,
being unable to read documents created in a recent version of Office
is simply not an option.  However, in any case we should consider in
advance the possible implications of this technology.




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread James A. Donald

--
On 31 Jul 2002 at 23:45, AARG! Anonymous wrote:
 So TCPA and Palladium could restrict which software you could 
 run. They aren't designed to do so, but the design could be 
 changed and restrictions added.

Their design, and the institutions and software to be designed 
around them, is disturbingly similar to what would be needed to 
restrict what software we could run.  TCPA institutions and 
infrastructure are much the same as SSSCA institutions and 
infrastructure.

According to Microsoft, the end user can turn the palladium 
hardware off, and the computer will still boot.  As long as that 
is true, it is an end user option and no one can object.

But this is not what the content providers want.  They want that 
if you disable the Fritz chip, the computer does not boot.  What 
they want is that it shall be illegal to sell a computer capable 
of booting if the Fritz chip is disabled.

If I have to give superroot powers to Joe in order to run Joe's 
software or play Joe's content, fair enough.  But the hardware and 
institutions to implement this are disturbingly similar to the 
hardware and institutions needed to implement the rule that I have 
to give superroot powers to Joe in order to play Peter's software 
or content. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 FQhKMpDHys7gyFWenHCK9p7+Xfh1DwpaqGKcztxk
 20jFdJDiigV/b1fmHBudici59omqc/Ze0zXBVvQLk




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread Eric Murray

On Thu, Aug 01, 2002 at 02:33:43PM -0700, James A. Donald wrote:

 According to Microsoft, the end user can turn the palladium 
 hardware off, and the computer will still boot.  As long as that 
 is true, it is an end user option and no one can object.
 
 But this is not what the content providers want.  They want that 
 if you disable the Fritz chip, the computer does not boot.  What 
 they want is that it shall be illegal to sell a computer capable 
 of booting if the Fritz chip is disabled.

Nope.  They care that the Fritz chip is enabled whenever
their content is played.  There's no need to make it a legal
requirement if the market makes it a practical requirement.
The Linux folks just won't be able to watch the latest
Maria Lopez or Jennifer Carey DVDs.  But who cares about a few
geeks?  Only weirdos install alternative OSs anyhow, they can be
ignored.  Most of them will probably have second systems
with the Fritz chip enabled anyhow.

Eric




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread R. Hirschfeld

 From: James A. Donald [EMAIL PROTECTED]
 Date: Tue, 30 Jul 2002 20:51:24 -0700

 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
  both Palladium and TCPA deny that they are designed to restrict 
  what applications you run.  The TPM FAQ at 
  http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
  
 
 They deny that intent, but physically they have that capability. 

To make their denial credible, they could give the owner access to the
private key of the TPM/SCP.  But somehow I don't think that jibes with
their agenda.

If I buy a lock I expect that by demonstrating ownership I can get a
replacement key or have a locksmith legally open it.




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

Eric Murray writes:
 TCPA (when it isn't turned off) WILL restrict the software that you
 can run.  Software that has an invalid or missing signature won't be
 able to access sensitive data[1].   Meaning that unapproved software
 won't work.

 [1] TCPAmain_20v1_1a.pdf, section 2.2

We need to look at the text of this in more detail.  This is from
version 1.1b of the spec:

: This section introduces the architectural aspects of a Trusted Platform
: that enable the collection and reporting of integrity metrics.
:
: Among other things, a Trusted Platform enables an entity to determine
: the state of the software environment in that platform and to SEAL data
: to a particular software environment in that platform.
:
: The entity deduces whether the state of the computing environment in
: that platform is acceptable and performs some transaction with that
: platform. If that transaction involves sensitive data that must be
: stored on the platform, the entity can ensure that that data is held in
: a confidential format unless the state of the computing environment in
: that platform is acceptable to the entity.
:
: To enable this, a Trusted Platform provides information to enable the
: entity to deduce the software environment in a Trusted Platform. That
: information is reliably measured and reported to the entity. At the same
: time, a Trusted Platform provides a means to encrypt cryptographic keys
: and to state the software environment that must be in place before the
: keys can be decrypted.

What this means is that a remote system can query the local TPM and
find out what software has been loaded, in order to decide whether to
send it some data.  It's not that unapproved software won't work,
it's that the remote guy can decide whether to trust it.

Also, as stated earlier, data can be sealed such that it can only be
unsealed when the same environment is booted.  This is the part above
about encrypting cryptographic keys and making sure the right software
environment is in place when they are decrypted.
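The remote side of this can be sketched in a few lines: the server compares the reported measurement against the environments it is willing to trust and decides whether to release data. The names and hashes here are hypothetical; the point is that the decision lives with the remote party, not in the hardware.

```python
# Sketch of the remote trust decision: the server declines to serve
# unrecognized environments, but nothing prevents them from running.
# Illustrative names only; not a TCPA protocol implementation.
import hashlib

TRUSTED_ENVIRONMENTS = {
    hashlib.sha1(b"OS build 1.0 + player 2.3").hexdigest(),
}

def server_releases_data(reported_hash: str) -> bool:
    """Release sensitive data only to software environments we recognize."""
    return reported_hash in TRUSTED_ENVIRONMENTS

assert server_releases_data(hashlib.sha1(b"OS build 1.0 + player 2.3").hexdigest())
assert not server_releases_data(hashlib.sha1(b"homebrew OS").hexdigest())
```

An unrecognized OS still boots and runs; it merely gets no data from this particular server.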

 Ok, technically it will run but can't access the data,
 but that it a very fine hair to split, and depending on the nature of
 the data that it can't access, it may not be able to run in truth.

 If TCPA allows all software to run, it defeats its purpose.
 Therefore Wagner's statement is logically correct.

But no, the TCPA does allow all software to run.  Just because a remote
system can decide whether to send it some data doesn't mean that software
can't run.  And just because some data may be inaccessible because it
was sealed when another OS was booted, also doesn't mean that software
can't run.

I think we agree on the facts, here.  All software can run, but the TCPA
allows software to prove its hash to remote parties, and to encrypt data
such that it can't be decrypted by other software.  Would you agree that
this is an accurate summary of the functionality, and not misleading?

If so, I don't see how you can get from this to saying that some software
won't run.  You might as well say that encryption means that software
can't run, because if I encrypt my files then some other programs may
not be able to read them.

Most people, as you may have seen, interpret this part about "software
can't run" much more literally.  They think it means that software needs
a signature in order to be loaded and run.  I have been going over and
over this on sci.crypt.  IMO the facts as stated two paragraphs up are
completely different from such a model.

 Yes, the spec says that it can be turned off.  At that point you
 can run anything that doesn't need any of the protected data or
 other TCPA services.   But, why would a software vendor that wants
 the protection that TCPA provides allow his software to run
 without TCPA as well, abandoning those protections?

That's true; in fact if you ran it earlier under TCPA and sealed some
data, you will have to run under TCPA to unseal it later.  The question
is whether the advantages of running under TCPA (potentially greater
security) outweigh the disadvantages (greater potential for loss of
data, less flexibility, etc.).

 I doubt many would do so, the majority of TCPA-enabled
 software will be TCPA-only.  Perhaps not at first, but eventually
 when there are enough TCPA machines out there.  More likely, spiffy
 new content and features will be enabled if one has TCPA and is
 properly authenticated, disabled otherwise.  But as we have seen
 time after time, today's spiffy new content is tomorrow's
 virtual standard.

Right, the strongest case will probably be for DRM.  You might be able
to download all kinds of content if you are running an OS and application
that the server (content provider) trusts.  People will have a choice of
using TCPA and getting this data legally, or avoiding TCPA and trying to
find pirated copies as they do today.

 This will require the majority of people to run with TCPA turned on
 if they want the content.  TCPA doesn't need to be required by law,

Re: Challenge to David Wagner on TCPA

2002-07-31 Thread James A. Donald

--


On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
 both Palladium and TCPA deny that they are designed to restrict 
 what applications you run.  The TPM FAQ at 
 http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
 

They deny that intent, but physically they have that capability. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 ElmZA5NX6jAmhPu1EDT8Zl7D+IeQTSI/z1oo4lSn
 2qoSIC6KSr2LFLWyxZEETG/27dEy3yOWEnRtXzHy9




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread AARG! Anonymous

James Donald wrote:
 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
  both Palladium and TCPA deny that they are designed to restrict 
  what applications you run.  The TPM FAQ at 
  http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads

 They deny that intent, but physically they have that capability. 

Maybe, but the point is whether the architectural spec includes that
capability.  After all, any OS could restrict what applications you
run; you don't need special hardware for that.  The question is whether
restrictions on software are part of the design spec.  You should be
able to point to something in the TCPA spec that would restrict or limit
software, if that is the case.

Or do you think that when David Wagner said, "Both Palladium and TCPA
incorporate features that would restrict what applications you could run,"
he meant that they *could* restrict what applications you run?  They *could*
impose restrictions, just like any OS could impose restrictions.

But to say that they *would* impose restrictions is a stronger
statement, don't you think?  If you claim that an architecture would
impose restrictions, shouldn't you be able to point to somewhere in the
design document where it explains how this would occur?

There's an enormous amount of information in the TCPA spec about how to
measure the code which is going to be run, and to report those measurement
results so third parties can know what code is running.  But there's not
one word about preventing software from running based on the measurements.
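The distinction drawn above can be made concrete with a minimal sketch: a measured launch records what runs, but nothing in the recording step gates whether it runs. The function names are hypothetical, not TCPA calls.

```python
# Minimal illustration: measurement records the code, then runs it
# unconditionally.  No signature check, no refusal path.
import hashlib

measurement_log = []

def measured_launch(image: bytes):
    # Record a hash of the code *before* transferring control to it...
    measurement_log.append(hashlib.sha1(image).hexdigest())
    # ...then execute it regardless of what the hash turned out to be.
    exec(image.decode())

measured_launch(b"x = 1 + 1")
assert len(measurement_log) == 1   # the run was recorded, not prevented
```

Any policy that *blocks* software would have to be added on top, by an OS or a remote party, which is exactly the dispute in this thread.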




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Declan McCullagh

I imagine there's a world of difference between will and would.

-Declan


On Mon, Jul 29, 2002 at 03:35:32PM -0700, AARG! Anonymous wrote:
 Can you find anything in this spec that would do what David Wagner says
 above, restrict what applications you could run?  Despite studying this
 spec for many hours, no such feature has been found.
 
 So here is the challenge to David Wagner, a well known and justifiably
 respected computer security expert: find language in the TCPA spec to
 back up your claim above, that TCPA will restrict what applications
 you can run.  Either that, or withdraw the claim, and try to get Declan
 McCullagh to issue a correction.  (Good luck with that!)
 
 And if you want, you can get Ross Anderson to help you.  His reports are
 full of claims about Palladium and TCPA which seem equally unsupported
 by the facts.  When pressed, he claims secret knowledge.  Hopefully David
 Wagner will have too much self-respect to fall back on such a convenient
 excuse.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Jay Sulzberger

On Tue, 30 Jul 2002, James A. Donald wrote:

 --


 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
  both Palladium and TCPA deny that they are designed to restrict
  what applications you run.  The TPM FAQ at
  http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
  

 They deny that intent, but physically they have that capability.

 --digsig
  James A. Donald

If they do not restrict what programs I may run, then presumably, under
TCPA, I might run a cracking program on an encrypted file I obtained via
TCPA handshake+transmittal?

The claim that TCPA, Palladium, etc. do not give root to the Englobulators
is, on its face, ridiculous.  Their main design criterion is to do so.

oo--JS.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Nicko van Someren

On Wednesday, July 31, 2002, at 04:51  am, James A. Donald wrote:
 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
 both Palladium and TCPA deny that they are designed to restrict
 what applications you run.  The TPM FAQ at
 http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
 

 They deny that intent, but physically they have that capability.

And all kitchen knives are murder weapons.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Peter Fairbrother

 AARG! Anonymous wrote:

 James Donald wrote:
 On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
 both Palladium and TCPA deny that they are designed to restrict
 what applications you run.  The TPM FAQ at
 http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
 
 They deny that intent, but physically they have that capability.
 
 Maybe, but the point is whether the architectural spec includes that
 capability.  After all, any OS could restrict what applications you
 run; you don't need special hardware for that.  The question is whether
 restrictions on software are part of the design spec.  You should be
 able to point to something in the TCPA spec that would restrict or limit
 software, if that is the case.
 
 Or do you think that when David Wagner said, "Both Palladium and TCPA
 incorporate features that would restrict what applications you could run,"
 he meant that they *could* restrict what applications you run?  They *could*
 impose restrictions, just like any OS could impose restrictions.
 
 But to say that they *would* impose restrictions is a stronger
 statement, don't you think?  If you claim that an architecture would
 impose restrictions, shouldn't you be able to point to somewhere in the
 design document where it explains how this would occur?
 
 There's an enormous amount of information in the TCPA spec about how to
 measure the code which is going to be run, and to report those measurement
 results so third parties can know what code is running.  But there's not
 one word about preventing software from running based on the measurements.
 

The wise general will plan his defences according to his opponent's
capabilities, not according to his opponent's avowed intentions.

However, in this case the intention to attack with all available weapons has
not been well hidden. There may be some dupes who honestly profess that no
attack is planned, and some naifs who cannot or will not see the wood, but
they will reap the whirlwind.

My humble opinion,

-- Peter Fairbrother




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread James A. Donald

--
29 Jul 2002 at 15:35, AARG! Anonymous wrote:
   both Palladium and TCPA deny that they are designed to
   restrict what applications you run.

James A. Donald:
  They deny that intent, but physically they have that
  capability.

 On 31 Jul 2002 at 16:10, Nicko van Someren wrote:
 And all kitchen knives are murder weapons.

No problem if I also have a kitchen knife.

TCPA and Palladium give someone else super root privileges on my
machine, and TAKE THOSE PRIVILEGES AWAY FROM ME.  All claims that
they will not do this are not claims that they will not do this,
but are merely claims that the possessor of super root privilege
on my machine is going to be a very very nice guy, unlike my
wickedly piratical and incompetently trojan horse running self.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 XQHdtzqDInBFsDcorfDvqJYRHTRhEBsM9eMJIH+w
 2+o4WjsTSV8RDUO7k3c71T9v9JQKwZGZC54BqW6DQ