Re: Challenge to David Wagner on TCPA

2002-08-13 Thread lynn . wheeler

actually it is possible to build chips that generate keys as part of
manufacturing power-on/test (while still in the wafer, and the private key
never, ever exists outside of the chip)  ... and be at effectively the same
trust level as any other part of the chip (i.e. hard instruction ROM).
such a key pair, which can uniquely authenticate a chip,
effectively becomes as much a part of the chip as the ROM or the chip
serial number, etc. The public/private key pair, if appropriately
protected (with an evaluated, certified and audited process), can then be
considered somewhat more trusted than a straight serial number, since a
straight serial number can be skimmed and replayed ... whereas a digital
signature on unique data is harder to replay/spoof.  the chips come with a
unique public/private key pair where the private key is never known.
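
the skim-and-replay distinction can be sketched as follows (a toy model
only: an HMAC stands in for the chip's on-die signature, and all names
here are hypothetical; in practice the chip would sign with an asymmetric
private key that never leaves the die):

```python
import hashlib
import hmac
import secrets

class Chip:
    """Toy model of a chip whose secret is generated on-die and never exported."""
    def __init__(self):
        self._private = secrets.token_bytes(32)  # never leaves the "chip"
        self.serial = "SN-0001"                  # readable, hence skimmable

    def respond(self, challenge: bytes) -> bytes:
        # Stand-in for signing the verifier's challenge with the on-die key.
        return hmac.new(self._private, challenge, hashlib.sha256).digest()

chip = Chip()

# A bare serial number can be skimmed once and replayed forever.
skimmed = chip.serial
serial_replay_succeeds = (skimmed == chip.serial)

# A signature over verifier-chosen fresh data cannot: a captured response
# to yesterday's challenge does not verify against today's challenge.
old_challenge = secrets.token_bytes(16)
captured = chip.respond(old_challenge)
new_challenge = secrets.token_bytes(16)
signature_replay_succeeds = hmac.compare_digest(captured, chip.respond(new_challenge))

print(serial_replay_succeeds, signature_replay_succeeds)
```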

sometimes this is a difficult concept ... the idea of a public/private key
pair as a form of a "difficult to spoof" chip serial   when all uses of
public/private key, asymmetric cryptography might have always been portrayed
as equivalent to x.509 identity certificates (it is possible to show, in a
large percentage of systems, that public/private key digital signatures
are sufficient for authentication and any possible certificates are both
redundant and superfluous).

misc. ref (aads chip strawman):
http://www.garlic.com/~lynn/index.html#aads
http://www.asuretee.com/



[EMAIL PROTECTED] on 6/13/2002 11:10 am wrote:

This makes a lot of sense, especially for "closed" systems like business
LANs and WANs where there is a reasonable centralized authority who can
validate the security of the SCP keys.  I suggested some time back that
since most large businesses receive and configure their computers in the IT
department before making them available to employees, that would be a time
that they could issue private certs on the embedded SCP keys. The
employees' computers could then be configured to use these private certs
for their business computing.

However the larger vision of trusted computing leverages the global
internet and turns it into what is potentially a giant distributed
computer.  For this to work, for total strangers on the net to have trust
in the integrity of applications on each others' machines, will require
some kind of centralized trust infrastructure.  It may possibly be
multi-rooted but you will probably not be able to get away from this
requirement.

The main problem, it seems to me, is that validating the integrity of the
SCP keys cannot be done remotely.  You really need physical access to the
SCP to be able to know what key is inside it.  And even that is not enough,
if it is possible that the private key may also exist outside, perhaps
because the SCP was initialized by loading an externally generated
public/private key pair.  You not only need physical access, you have to be
there when the SCP is initialized.

In practice it seems that only the SCP manufacturer, or at best the OEM who
(re) initializes the SCP before installing it on the motherboard, will be
in a position to issue certificates.  No other central authorities will
have physical access to the chips on a near-universal scale at the time of
their creation and installation, which is necessary to allow them to issue
meaningful certs.  At least with the PGP "web of trust" people could in
principle validate their keys over the phone, and even then most PGP users
never got anyone to sign their keys.  An effective web of trust seems much
more difficult to achieve with Palladium, except possibly in small groups
that already trust each other anyway.

If we do end up with only a few trusted root keys, most internet-scale
trusted computing software is going to have those roots built in. Those
keys will be extremely valuable, potentially even more so than Verisign's
root keys, because trusted computing is actually a far more powerful
technology than the trivial things done today with PKI.  I hope the
Palladium designers give serious thought to the issue of how those trusted
root keys can be protected appropriately.  It's not going to be enough to
say "it's not our problem".  For trusted computing to reach its potential,
security has to be engineered into the system from the beginning - and that
security must start at the root!

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to
[EMAIL PROTECTED]




Re: Challenge to David Wagner on TCPA

2002-08-13 Thread Jay Sulzberger

On Tue, 13 Aug 2002, AARG!Anonymous wrote:

< ... />

>
> However the larger vision of trusted computing leverages the global
> internet and turns it into what is potentially a giant distributed
> computer.  For this to work, for total strangers on the net to have
> trust in the integrity of applications on each others' machines, will
> require some kind of centralized trust infrastructure.  It may possibly
> be multi-rooted but you will probably not be able to get away from
> this requirement.

No.  Safe distributed computing can be attained without any such
centralized control system.  Just as thermodynamic behavior needs no
centralized system of control of atomic behavior, but rather proceeds by
way of statistical mechanics, so safe mass computations may be accomplished
by application of one engineering branch of statistical mechanics, called
information theory.  The main publications are from the Fifties and
Sixties.

oo--JS.




Re: Challenge to David Wagner on TCPA

2002-08-13 Thread AARG! Anonymous

Brian LaMacchia writes:

> So the complexity isn't in how the keys get initialized on the SCP (hey, it
> could be some crazy little hobbit named Mel who runs around to every machine
> and puts them in with a magic wand).  The complexity is in the keying
> infrastructure and the set of signed statements (certificates, for lack of a
> better word) that convey information about how the keys were generated &
> stored.  Those statements need to be able to represent to other applications
> what protocols were followed and precautions taken to protect the private
> key.  Assuming that there's something like a cert chain here, the root of
> this chain could be an OEM, an IHV, a user, a federal agency, your company,
> etc. Whatever that root is, the application that's going to divulge secrets
> to the SCP needs to be convinced that the key can be trusted (in the
> security sense) not to divulge data encrypted to it to third parties.
> Palladium needs to look at the hardware certificates and reliably tell
> (under user control) what they are. Anyone can decide if they trust the
> system based on the information given; Palladium simply guarantees that it
> won't tell anyone your secrets without your explicit request.

This makes a lot of sense, especially for "closed" systems like business
LANs and WANs where there is a reasonable centralized authority who can
validate the security of the SCP keys.  I suggested some time back that
since most large businesses receive and configure their computers in
the IT department before making them available to employees, that would
be a time that they could issue private certs on the embedded SCP keys.
The employees' computers could then be configured to use these private
certs for their business computing.

However the larger vision of trusted computing leverages the global
internet and turns it into what is potentially a giant distributed
computer.  For this to work, for total strangers on the net to have
trust in the integrity of applications on each others' machines, will
require some kind of centralized trust infrastructure.  It may possibly
be multi-rooted but you will probably not be able to get away from
this requirement.

The main problem, it seems to me, is that validating the integrity of
the SCP keys cannot be done remotely.  You really need physical access
to the SCP to be able to know what key is inside it.  And even that
is not enough, if it is possible that the private key may also exist
outside, perhaps because the SCP was initialized by loading an externally
generated public/private key pair.  You not only need physical access,
you have to be there when the SCP is initialized.

In practice it seems that only the SCP manufacturer, or at best the OEM
who (re) initializes the SCP before installing it on the motherboard,
will be in a position to issue certificates.  No other central authorities
will have physical access to the chips on a near-universal scale at the
time of their creation and installation, which is necessary to allow
them to issue meaningful certs.  At least with the PGP "web of trust"
people could in principle validate their keys over the phone, and even
then most PGP users never got anyone to sign their keys.  An effective
web of trust seems much more difficult to achieve with Palladium, except
possibly in small groups that already trust each other anyway.

If we do end up with only a few trusted root keys, most internet-scale
trusted computing software is going to have those roots built in.
Those keys will be extremely valuable, potentially even more so than
Verisign's root keys, because trusted computing is actually a far more
powerful technology than the trivial things done today with PKI.  I hope
the Palladium designers give serious thought to the issue of how those
trusted root keys can be protected appropriately.  It's not going to be
enough to say "it's not our problem".  For trusted computing to reach
its potential, security has to be engineered into the system from the
beginning - and that security must start at the root!




Re: Challenge to David Wagner on TCPA

2002-08-12 Thread Brian A. LaMacchia

I just want to point out that, as far as Palladium is concerned, we really
don't care how the keys got onto the machine. Certain *applications* written
on top of Palladium will probably care, but all the hardware & the security
kernel really care about is making sure that secrets are only divulged to
the code that had them encrypted in the first place.  It's all a big trust
management problem (or a series of trust management problems) --
applications that are going to rely on SCP keys to protect secrets for them
are going to want some assurances about where the keys live and whether
there's a copy outside the SCP.  I can certainly envision potential
applications that would want guarantees that the key was generated on the
SCP & never left, and I can see other applications that want guarantees that
the key has a copy sitting on another SCP on the other side of the building.

So the complexity isn't in how the keys get initialized on the SCP (hey, it
could be some crazy little hobbit named Mel who runs around to every machine
and puts them in with a magic wand).  The complexity is in the keying
infrastructure and the set of signed statements (certificates, for lack of a
better word) that convey information about how the keys were generated &
stored.  Those statements need to be able to represent to other applications
what protocols were followed and precautions taken to protect the private
key.  Assuming that there's something like a cert chain here, the root of
this chain could be an OEM, an IHV, a user, a federal agency, your company,
etc. Whatever that root is, the application that's going to divulge secrets
to the SCP needs to be convinced that the key can be trusted (in the
security sense) not to divulge data encrypted to it to third parties.
Palladium needs to look at the hardware certificates and reliably tell
(under user control) what they are. Anyone can decide if they trust the
system based on the information given; Palladium simply guarantees that it
won't tell anyone your secrets without your explicit request.

--bal

P.S. I'm not sure that I actually *want* the ability to extract the private
key from an SCP after it's been loaded, because presumably if I could ask
for the private key then a third party doing a black-bag job on my PC could
also ask for it.  I think what I want is the ability to zeroize the SCP,
remove all state stored within it, and cause new keys to be generated
on-chip.  So long as I can zero the chip whenever I want (or zero part of
it, or whatever) I can eliminate the threat posed by the manufacturer who
initialized the SCP in the first place.

Lucky Green <[EMAIL PROTECTED]> wrote:
> Ray wrote:
>>
>>> From: "James A. Donald" <[EMAIL PROTECTED]>
>>> Date: Tue, 30 Jul 2002 20:51:24 -0700
>>
>>> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
 both Palladium and TCPA deny that they are designed to restrict
 what applications you run.  The TPM FAQ at
 http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
 
>>>
>>> They deny that intent, but physically they have that capability.
>>
>> To make their denial credible, they could give the owner
>> access to the private key of the TPM/SCP.  But somehow I
>> don't think that jibes with their agenda.
>
> Probably not surprisingly to anybody on this list, with the exception
> of potentially Anonymous, according to the TCPA's own TPM Common
> Criteria Protection Profile, the TPM prevents the owner of a TPM from
> exporting the TPM's internal key. The ability of the TPM to keep the
> owner of a PC from reading the private key stored in the TPM has been
> evaluated to E3 (augmented). For the evaluation certificate issued by
> NIST, see:
>
> http://niap.nist.gov/cc-scheme/PPentries/CCEVS-020016-VR-TPM.pdf
>
>> If I buy a lock I expect that by demonstrating ownership I
>> can get a replacement key or have a locksmith legally open it.
>
> It appears the days when this was true are waning. At least in the PC
> platform domain.
>
> --Lucky
>
>
> -
> The Cryptography Mailing List
> Unsubscribe by sending "unsubscribe cryptography" to
> [EMAIL PROTECTED]




RE: Challenge to David Wagner on TCPA

2002-08-11 Thread Russell Nelson

Jim Choate writes:
 > 
 > On Mon, 5 Aug 2002, Russell Nelson wrote:
 > 
 > > AARG!Anonymous writes:
 > >  > So don't read too much into the fact that a bunch of anonymous postings
 > >  > have suddenly started appearing from one particular remailer.  For your
 > >  > information, I have sent over 400 anonymous messages in the past year
 > >  > to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
 > >  > of them on TCPA related topics).
 > > 
 > > We have, of course, no way to verify this fact, since your messages
 > > are not cryptographically signed.  For someone who claims to be
 > > knowledgeable about cryptography, this seems like a suspicious omission.
 > 
 > Bullshit Russ, plausible deniability alone justifies such behaviour.
 > 
 > Who sent them is irrelevant except to cultists of personality (eg CACL
 > adherents).

I agree that it's irrelevant.  So why is he trying to argue from
authority (always a fallacy anyway) without *even* having any way to
prove that he is that authority?  Fine, let him desire plausible
deniability.  I plausibly deny his appeal to (self-)authority as being
completely without merit.

-- 
-russ nelson  http://russnelson.com |
Crynwr sells support for free software  | PGPok | businesses persuade
521 Pleasant Valley Rd. | +1 315 268 1925 voice | governments coerce
Potsdam, NY 13676-3213  | +1 315 268 9201 FAX   |




Re: Challenge to David Wagner on TCPA

2002-08-11 Thread Ben Laurie

Lucky Green wrote:
> Ray wrote:
> 
>>>From: "James A. Donald" <[EMAIL PROTECTED]>
>>>Date: Tue, 30 Jul 2002 20:51:24 -0700
>>
>>>On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
>>>
both Palladium and TCPA deny that they are designed to restrict
what applications you run.  The TPM FAQ at 
http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads

>>>
>>>They deny that intent, but physically they have that capability.
>>
>>To make their denial credible, they could give the owner 
>>access to the private key of the TPM/SCP.  But somehow I 
>>don't think that jibes with their agenda.
> 
> 
> Probably not surprisingly to anybody on this list, with the exception of
> potentially Anonymous, according to the TCPA's own TPM Common Criteria
> Protection Profile, the TPM prevents the owner of a TPM from exporting
> the TPM's internal key. The ability of the TPM to keep the owner of a PC
> from reading the private key stored in the TPM has been evaluated to E3
> (augmented). For the evaluation certificate issued by NIST, see:
> 
> http://niap.nist.gov/cc-scheme/PPentries/CCEVS-020016-VR-TPM.pdf

Obviously revealing the key would defeat any useful properties of the 
TPM/SCP. However, unless the machine refuses to run stuff unless signed 
by some other key, it's a matter of choice whether you run an OS that has 
the aforementioned properties.

Of course, it's highly likely that if you want to watch products of Da 
Mouse on your PC, you will be obliged to choose a certain OS. In order 
to avoid more sinister uses, it makes sense to me to ensure that at 
least one free OS gets appropriate signoff (and no, that does not 
include a Linux port by HP). At least, it makes sense to me if I assume 
that the certain other OS will otherwise become dominant. Which seems 
likely.

Cheers,

Ben.

-- 
http://www.apache-ssl.org/ben.html   http://www.thebunker.net/

Available for contract work.

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff




RE: Challenge to David Wagner on TCPA

2002-08-11 Thread Jim Choate

On Sat, 10 Aug 2002, Russell Nelson wrote:

> I agree that it's irrelevant.  So why is he trying to argue from
> authority (always a fallacy anyway) without *even* having any way to
> prove that he is that authority?

What has 'authority' got to do with it? Arguments from authority are
-worthless-. Make up your own mind as to its validity, who cares about
their 'proof'.

-Who- is irrelevant. What damns his argument -is- his appeal to
-authority-. Anyone who bases their argument on 'He said...' has already
lost the discussion and invalidated any point they might make. It's one of
the primary fallacies of (for example) Tim May and his consistent appeal
to who he knows or what 'they' said.

We agree, what I don't understand is why you keep expecting that dead
horse to get up...keep asking those damning questions ;)


 --


  Conform and be dull..J. Frank Dobie

 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






RE: Challenge to David Wagner on TCPA

2002-08-11 Thread Jim Choate

On Mon, 5 Aug 2002, Russell Nelson wrote:

> AARG!Anonymous writes:
>  > So don't read too much into the fact that a bunch of anonymous postings
>  > have suddenly started appearing from one particular remailer.  For your
>  > information, I have sent over 400 anonymous messages in the past year
>  > to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
>  > of them on TCPA related topics).
> 
> We have, of course, no way to verify this fact, since your messages
> are not cryptographically signed.  For someone who claims to be
> knowledgeable about cryptography, this seems like a suspicious omission.

Bullshit Russ, plausible deniability alone justifies such behaviour.

Who sent them is irrelevant except to cultists of personality (eg CACL
adherents).

Base your analysis on facts and experiment.


 --


  Conform and be dull..J. Frank Dobie

 [EMAIL PROTECTED] www.ssz.com
 [EMAIL PROTECTED]  www.open-forge.org






Re: Challenge to David Wagner on TCPA

2002-08-10 Thread D.Popkin

-BEGIN PGP SIGNED MESSAGE-

AARG! Anonymous <[EMAIL PROTECTED]> writes:

> Lucky Green wrote:
> > Ray wrote:
> > > If I buy a lock I expect that by demonstrating ownership I 
> > > can get a replacement key or have a locksmith legally open it.

> > It appears the days when this was true are waning. At least in the PC
> > platform domain.

> We have had other systems which work like this for a long while.
> Many consumer devices are sealed such that if you open them you void
> the warranty.  This is to your advantage as a consumer; ...

There is exactly one person in the world qualified to decide what's to
the advantage of that consumer, and it's not AARG! Anonymous.

-BEGIN PGP SIGNATURE-
Version: 2.6.3ia
Charset: noconv

iQBVAwUBPVRO0PPsjZpmLV0BAQEwrQH/eXqkJVmXYmqNtweg6246KMXmCGekK/h6
HNmnd65WeR2A84pJdJFb8jZ2CX6bJ+XrboaDv8klJCo21xTkFxWIuA==
=DL2o
-END PGP SIGNATURE-




RE: Challenge to David Wagner on TCPA

2002-08-04 Thread Eugen Leitl

On Sat, 3 Aug 2002, Morlock Elloi wrote:

> Ah, the computers. Well, those that want computers will have them.
> They may not be as cheap as today and there will not be as many of
> them, but I think that all people *I* deal with will have them, so I
> don't really care.

Sure, people will have computers. However, if we merrily slide down the
slippery slope the authentication might move into the network layer
eventually. You will be on the network, yet you will not be on the
network.

One might be able to fab computers at small scale (FPGA, organic
transistors via inkjet, whatever), but it will be tough to create global
networks using just overlapping patches of wireless. Especially, if rogue
wireless will be rather illegal.




Re: Challenge to David Wagner on TCPA

2002-08-04 Thread Roy M.Silvernail

On Saturday 03 August 2002 05:12 pm, Morlock Elloi wrote:

> UUCP will work as long as people can talk over telephone and there are
> modems available. The harder and more inconvenient it becomes to connect,
> the higher the average IQ of participants will be.
>
> There is hope.
>
> Just imagine the absence of short-attention span morons that find uucp too
> complicated. Ask around.

But if WorldCom dissolves in bankruptcy, will UUNet still be the center of the 
bang-path universe?

More seriously, I think many of us old-timers long for the time when a 
certain level of wizardry was required to get on the net. (before Prodigy and 
the September that Never Ended)
-- 
Roy M. Silvernail [ ] [EMAIL PROTECTED] 
(formerly uunet!comcon!cybrspc!roy)
DNRC Minister Plenipotentiary of All Things Confusing, Software Division
PGP Key 0x1AF39331 :  71D5 2EA2 4C27 D569  D96B BD40 D926 C05E
 Key available from [EMAIL PROTECTED]
I charge to process unsolicited commercial email




RE: Challenge to David Wagner on TCPA

2002-08-04 Thread Morlock Elloi

> One might be able to fab computers at small scale (FPGA, organic
> transistors via inkjet, whatever), but it will be tough to create global
> networks using just overlapping patches of wireless. Especially, if rogue
> wireless will be rather illegal.

UUCP will work as long as people can talk over telephone and there are modems
available. The harder and more inconvenient it becomes to connect, the higher
the average IQ of participants will be.

There is hope.

Just imagine the absence of short-attention span morons that find uucp too
complicated. Ask around.










RE: Challenge to David Wagner on TCPA

2002-08-04 Thread AARG! Anonymous

Mike Rosing wrote:
> On Fri, 2 Aug 2002, AARG! Anonymous wrote:
>
> > You don't have to send your data to Intel, just a master storage key.
> > This key encrypts the other keys which encrypt your data.  Normally this
> > master key never leaves your TPM, but there is this optional feature
> > where it can be backed up, encrypted to the manufacturer's public key,
> > for recovery purposes.  I think it is also in blinded form.
>
> In other words, the manufacturer has access to all your data because
> they have the master storage key.
>
> Why would everyone want to give one manufacturer that much power?

It's not quite that bad.  I mentioned the blinding.  What happens is
that before the master storage key is encrypted, it is XOR'd with a
random value, which is also output by the TPM along with the encrypted
recovery blob.  You save them both, but only the encrypted blob gets
sent to the manufacturer.  So when the manufacturer decrypts the data,
he doesn't learn your secrets.

The system is cumbersome, but not an obvious security leak.
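
The blinding step described above can be sketched like this (an
illustrative model only: the real TPM blob format and the public-key
wrapping to the manufacturer are glossed over, and all names are
hypothetical):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

master_storage_key = secrets.token_bytes(32)  # lives inside the TPM
blinding_pad = secrets.token_bytes(32)        # random value output to the owner

# What actually goes into the recovery blob (which then gets encrypted to
# the manufacturer's public key -- that wrapping is omitted here):
blinded = xor_bytes(master_storage_key, blinding_pad)

# After decrypting the blob, the manufacturer sees only the blinded value,
# which on its own is statistically independent of the real key.
manufacturer_view = blinded

# The owner, who kept the pad, can strip the blinding back off:
recovered = xor_bytes(manufacturer_view, blinding_pad)
assert recovered == master_storage_key
```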




RE: Challenge to David Wagner on TCPA

2002-08-04 Thread Morlock Elloi

The principal philosophical issue here is that the ownership of the "computer"
terminates.

So far most people owned their computers in the sense that they could make
transistors inside do anything they liked, provided they had some
easily-obtainable knowledge. Content/software vendors had their stuff executed
on enemy's territory with all imaginable consequences.

TCPA-ed computer is actually a single-seat movie theatre teleported to your
house. It's operated and owned by one or more corporations - what you pay when
"buying" the computer are up-front installation costs of the franchise.
Remember that theatres are enclosed spaces with entertainment with doors that
you need a ticket to go through.

Sheeple will get more entertainment. The only problem seems to be that small
independent producers will not get their stuff played there. Tough shit. If
small producers want to fuck with all world's theatres, they need to get
better. Parasiting is over. There is no natural right to program others'
machines. When I go to the theatre I don't want unwashed activists flashing
their stuff on the screen. At least not dumb ones.

Ah, the computers. Well, those that want computers will have them. They may not
be as cheap as today and there will not be as many of them, but I think that
all people *I* deal with will have them, so I don't really care.











RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Jay Sulzberger

On Fri, 2 Aug 2002, Albion Zeglin wrote:

> Quoting Jay Sulzberger <[EMAIL PROTECTED]>:
>
>
> > b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
> > separate dongle?  Because, of course, the Englobulators proceed here on
> > principle.  The principle being that only the Englobulators have a right to
> > own printing presses/music studios/movie and animation studios.
> >
>

> A separate dongle can't verify the integrity of the processor.  The
> important part is that the processor's state (including initial RAM load)
> is verifiable.

But if you just want to show movies "securely" you need not use my general
purpose and today untrammeled computer.  You can either show movies in
movie houses, or use some slightly trammeled version of a "cable ready TV",
or the variant product mentioned earlier, the "donglified monitor/speaker".

There is no need for the MPAA to "verify the integrity of the processor" if
all the MPAA wants to do is sell me tickets to movies.

> Without this the OS could be virtualized and modified after the integrity
> check.

What does the enforcement of the laws against copyright infringement have
to do with my general purpose and today untrammeled computer?  There is no
relation of the sort you, and all the mass media, implicitly assume here.
Indeed no OS at all should be involved in the "secure showing of movies".
It is like using the standard C libraries to write "secure code"!

>
> Just imagine running Windows Media Player on a virtual machine, trapping
> the calls to the audio card and thus being able to copy content
> perfectly.  A dongle can't prevent this.

My donglified monitor/speakers combination, of course, offers greater
assurance.  Here is part of my argument: the explanation of my proposed
protocols can actually be understood.

>
> Eventually for TCPA to be effective against hardware hacks such as memory
> probes, not only will the harddrive storage be sealed, but RAM must be
> sealed as well.
> Once TCPA moves onprocessor, I expect encrypted RAM will be next.
>
> Albion.

The dilemma "Either give over all the computers in the world to the
Englobulators, or never get to see another big budget Hollywood movie." is
a false dichotomy.

oo--JS.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread AARG! Anonymous

Peter Trei envisions data recovery in a TCPA world:

> HoM:  I want to recover my data.
> Me:   OK: We'll pull the HD, and get the data off it.
> HoM:  Good - mount it as a secondary HD in my new system.
> Me:   That isn't going to work now we have TCPA and Palladium.
> HoM:  Well, what do you have to do?
> Me:   Oh, it's simple. We encrypt the data under Intel's TPME key,
>  and send it off to Intel. Since Intel has all the keys, they can
>  unseal all your data to plaintext, copy it, and then re-seal it for
>  your new system. It only costs $1/Mb.
> HoM:  Let me get this straight - the only way to recover this data is
>  to let Intel have a copy, AND pay them for it?
> Me:   Um... Yes. I think MS might be involved as well, if you were
>  using Word.
> HoM:  You are *so* dead.

It's not quite as bad as all this, but it is still pretty bad.

You don't have to send your data to Intel, just a master storage key.
This key encrypts the other keys which encrypt your data.  Normally this
master key never leaves your TPM, but there is this optional feature
where it can be backed up, encrypted to the manufacturer's public key,
for recovery purposes.  I think it is also in blinded form.

Obviously you'd need to do this backup step before the TPM crashed;
afterwards is too late.  So maybe when you first get your system it
generates the on-chip storage key (called the SRK, storage root key),
and then exports the recovery blob.  You'd put that on a floppy or some
other removable medium and store it somewhere safe.  Then when your
system dies you pull out the disk and get the recovery blob.

You communicate with the manufacturer, give him this recovery blob, along
with the old TPM key and the key to your new TPM in the new machine.
The manufacturer decrypts the blob and re-encrypts it to the TPM in the
new machine.  It also issues and distributes a CRL revoking the cert on
the old TPM key so that the old machine can't be used to access remote
TCPA data any more.  (Note, the CRL is not used by the TPM itself, it is
just used by remote servers to decide whether to believe client requests.)

The manufacturer sends the data back to you and you load it into the TPM
in your new machine, which decrypts it and stores the master storage key.
Now it can read your old data.
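
That round trip can be sketched as follows (keys are simulated with
one-time XOR pads purely for illustration; a real implementation would use
the manufacturer's and TPMs' public keys, and the variable names here are
hypothetical):

```python
import secrets

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # Toy "encryption": XOR with a same-length pad (its own inverse).
    return bytes(a ^ b for a, b in zip(key, data))

srk = secrets.token_bytes(32)          # storage root key inside the old TPM
mfr_key = secrets.token_bytes(32)      # manufacturer's key
new_tpm_key = secrets.token_bytes(32)  # key of the TPM in the new machine

# At first power-on: export the recovery blob and stash it on a floppy.
recovery_blob = xor_crypt(mfr_key, srk)

# After the old machine dies: send the blob to the manufacturer, who
# decrypts it and re-encrypts the SRK to the new TPM (the CRL revoking the
# old TPM's cert is not modeled here).
reencrypted_blob = xor_crypt(new_tpm_key, xor_crypt(mfr_key, recovery_blob))

# The new TPM loads the blob, recovers the SRK, and can read the old data.
recovered_srk = xor_crypt(new_tpm_key, reencrypted_blob)
assert recovered_srk == srk
```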

Someone asked if you'd have to go through all this if you just upgraded
your OS.  I'm not sure.  There are several secure registers on the
TPM, called PCRs, which can hash different elements of the BIOS, OS,
and other software.  You can lock a blob to any one of these registers.
So in some circumstances it might be that upgrading the OS would keep the
secure data still available.  In other cases you might have to go through
some kind of recovery procedure.
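
The PCR mechanism can be sketched roughly like this (a simplified model of
the extend-and-seal idea; the hash construction and register handling are
illustrative assumptions, not the exact TPM encoding):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new_pcr = H(old_pcr || H(measurement))
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

ZERO = bytes(32)

# Boot 1: measure BIOS then OS into a PCR, then seal a secret to its value.
pcr = extend(extend(ZERO, b"BIOS v1"), b"OS v1")
sealed_to = pcr

def unseal(current_pcr: bytes, expected: bytes, secret: bytes):
    # Release the secret only if the platform state matches seal time.
    return secret if current_pcr == expected else None

# Same software stack: the PCR recomputes to the same value, unseal works.
pcr_same = extend(extend(ZERO, b"BIOS v1"), b"OS v1")
assert unseal(pcr_same, sealed_to, b"top secret") == b"top secret"

# Upgraded OS: the measurement differs, the PCR differs, unseal refuses --
# hence the recovery question raised above.
pcr_upgraded = extend(extend(ZERO, b"BIOS v1"), b"OS v2")
assert unseal(pcr_upgraded, sealed_to, b"top secret") is None
```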

I think this recovery business is a real Achilles heel of the TCPA
and Palladium proposals.  They are paranoid about leaking sealed data,
because the whole point is to protect it.  So they can't let you freely
copy it to new machines, or decrypt it from an insecure OS.  This anal
protectiveness is inconsistent with the flexibility needed in an imperfect
world where stuff breaks.

My conclusion is that the sealed storage of TCPA will be used sparingly.
Ross Anderson and others suggest that Microsoft Word will seal all of
its documents so that people can't switch to StarOffice.  I think that
approach would be far too costly and risky, given the realities I have
explained above.  Instead, I would expect that only highly secure data
would be sealed, and that there would often be some mechanism to recover
it from elsewhere.  For example, in a DRM environment, maybe the central
server has a record of all the songs you have downloaded.  Then if your
system crashes, rather than go through a complicated crypto protocol to
recover, you just buy a new machine, go to the server, and re-download
all the songs you were entitled to.

Or in a closed environment, like a business which seals sensitive
documents, the data could be backed up redundantly to multiple central
file servers, each of which seal it.  Then if one machine crashes,
the data is available from others and there is no need to go through
the recovery protocol.

So there are solutions, but they will add complexity and cost.  At the
same time they do add genuine security and value.  Each application and
market will have to find its own balance of the costs and benefits.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Mike Rosing

On Fri, 2 Aug 2002, AARG! Anonymous wrote:

> You don't have to send your data to Intel, just a master storage key.
> This key encrypts the other keys which encrypt your data.  Normally this
> master key never leaves your TPM, but there is this optional feature
> where it can be backed up, encrypted to the manufacturer's public key,
> for recovery purposes.  I think it is also in blinded form.

In other words, the manufacturer has access to all your data because
they have the master storage key.

Why would everyone want to give one manufacturer that much power?

Or am I missing something...

> You communicate with the manufacturer, give him this recovery blob, along
> with the old TPM key and the key to your new TPM in the new machine.
> The manufacturer decrypts the blob and re-encrypts it to the TPM in the

and stores the blob in a safe place for future use.

> The manufacturer sends the data back to you and you load it into the TPM
> in your new machine, which decrypts it and stores the master storage key.
> Now it can read your old data.

and so can everyone else who visits the manufacturer's database.

> I think this recovery business is a real Achilles heel of the TCPA
> and Palladium proposals.  They are paranoid about leaking sealed data,
> because the whole point is to protect it.  So they can't let you freely
> copy it to new machines, or decrypt it from an insecure OS.  This anal
> protectiveness is inconsistent with the flexibility needed in an imperfect
> world where stuff breaks.

Seems like an understatement to me :-)  Explaining to every CEO left
standing that one company may have access to all their business data
because Congress wants to make TCPA a law could be a very powerful lobby.

> So there are solutions, but they will add complexity and cost.  At the
> same time they do add genuine security and value.  Each application and
> market will have to find its own balance of the costs and benefits.

Yeah baby, tell those CEOs their costs are going up.  That'll definitely
help TCPA die quickly.  Especially nowadays.

Patience, persistence, truth,
Dr. mike




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Albion Zeglin

Quoting Jay Sulzberger <[EMAIL PROTECTED]>:


> b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
> separate dongle?  Because, of course, the Englobulators proceed here on
> principle.  The principle being that only the Englobulators have a right to
> own printing presses/music studios/movie and animation studios.
> 

A separate dongle can't verify the integrity of the processor.  The important
part is that the processor's state (including initial RAM load) is verifiable.
Without this the OS could be virtualized and modified after the integrity check.

Just imagine running Windows Media Player on a virtual machine, trapping the 
calls to the audio card and thus being able to copy content perfectly.  A 
dongle can't prevent this.

Eventually, for TCPA to be effective against hardware hacks such as memory probes, 
not only will hard-drive storage have to be sealed, but RAM must be sealed as well.
Once TCPA moves on-processor, I expect encrypted RAM will be next.  

Albion.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread AARG! Anonymous

Peter Trei writes:

> It's rare enough that when a new anonym appears, we know
> that the poster made a considered decision to be anonymous.
>
> The current poster seems to have parachuted in from nowhere, 
> to argue a specific position on a single topic. It's therefore 
> reasonable  to infer that the nature of that position and topic has 
> some bearing on the decision to be anonymous.


Yes, my name is "AARG!".  That was the first thing my mother said after
I was born, and the name stuck.

Not really.  For Peter's information, the name associated with a
message through an anonymous remailer is simply the name of the
last remailer in the chain, whatever that remailer operator chose
to call it.  AARG is a relatively new remailer, but if you look at
http://anon.efga.org/Remailers/TypeIIList you will see that it is very
reliable and fast.  I have been using it as an exit remailer lately
because other ones that I have used often produce inconsistent results.
It has not been unusual to have to send a message two or three times
before it appears.  So far that has not been a problem with this one.

So don't read too much into the fact that a bunch of anonymous postings
have suddenly started appearing from one particular remailer.  For your
information, I have sent over 400 anonymous messages in the past year
to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
of them on TCPA related topics).




Re: CDR: RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Alif The Terrible


On Fri, 2 Aug 2002, AARG! Anonymous wrote:

>  I have sent over 400 anonymous messages in the past year
> to cypherpunks, coderpunks, sci.crypt and the cryptography list (35
> of them on TCPA related topics).

I see you are not too worried about traffic analysis?

-- 
Yours, 
J.A. Terranson
[EMAIL PROTECTED]

If Governments really want us to behave like civilized human beings, they
should give serious consideration towards setting a better example:
Ruling by force, rather than consensus; the unrestrained application of
unjust laws (which the victim-populations were never allowed input on in
the first place); the State policy of justice only for the rich and 
elected; the intentional abuse and occasional destruction of entire
populations merely to distract an already apathetic and numb electorate...
This type of demagoguery must surely wipe out the fascist United States
as surely as it wiped out the fascist Union of Soviet Socialist Republics.

The views expressed here are mine, and NOT those of my employers,
associates, or others.  Besides, if it *were* the opinion of all of
those people, I doubt there would be a problem to bitch about in the
first place...






RE: Challenge to David Wagner on TCPA

2002-08-03 Thread Jay Sulzberger

On Fri, 2 Aug 2002, Trei, Peter wrote:

> > AARG! Anonymous[SMTP:[EMAIL PROTECTED]] writes
>   [...]
> > Now, there is an optional function which does use the manufacturer's key,
> > but it is intended only to be used rarely.  That is for when you need to
> > transfer your sealed data from one machine to another (either because you
> > have bought a new machine, or because your old one crashed).  In this
> > case you go through a complicated procedure that includes encrypting
> > some data to the TPME key (the TPM manufacturer's key) and sending it
> > to the manufacturer, who massages the data such that it can be loaded
> > into the new machine's TPM chip.
> >
> > So this function does require pre-loading a manufacturer key into the
> > TPM, but first, it is optional, and second, it frankly appears to be so
> > cumbersome that it is questionable whether manufacturers will want to
> > get involved with it.  OTOH it is apparently the only way to recover
> > if your system crashes.  This may indicate that TCPA is not feasible,
> > because there is too much risk of losing locked data on a machine crash,
> > and the recovery procedure is too cumbersome.  That would be a valid
> > basis on which to criticize TCPA, but it doesn't change the fact that
> > many of the other claims which have been made about it are not correct.
> [...]
>
> While I reserve the right to respond to the rest of the poster's letter,
> I'd like to call out this snippet, which gives a very good reason
> for both corporate and individual users to avoid TCPA as if it were
> weaponized anthrax (Hi NSA!).
> ...
> OK, It's 2004, I'm an IT Admin, and I've converted my corporation
> over to TCPA/Palladium machines. My Head of Marketing has his
> TCPA/Palladium desktop's hard drive jam-packed with corporate
> confidential documents he's been actively working on - sales
> projections,  product plans, pricing schemes. They're all sealed files.
>
> His machine crashes - the MB burns out.
> He wants to recover the data.
>
> HoM:  I want to recover my data.
> Me:   OK: We'll pull the HD, and get the data off it.
> HoM:  Good - mount it as a secondary HD in my new system.
> Me:   That isn't going to work now that we have TCPA and Palladium.
> HoM:  Well, what do you have to do?
> Me:   Oh, it's simple. We encrypt the data under Intel's TPME key,
>   and send it off to Intel. Since Intel has all the keys, they can
>   unseal all your data to plaintext, copy it, and then re-seal it for
>   your new system. It only costs $1/Mb.
> HoM:  Let me get this straight - the only way to recover this data is to
> let
>   Intel have a copy, AND pay them for it?
> Me:   Um... Yes. I think MS might be involved as well, if you were using
>   Word.
> HoM:  You are *so* dead.
>
> ---
>
> Peter Trei

I think that many managers in this situation would feel reassured that both
Intel and Microsoft would be handling these sensitive documents.  Else why
do lawyers use Microsoft systems to send unencrypted documents between
offices?

ad technicalities: Just one more level of indirection^Wencryption would
answer the objections of those few managers of exquisite sensibilities, who
worry about Intel/Microsoft reading their documents.

oo--JS.




RE: Challenge to David Wagner on TCPA

2002-08-03 Thread James A. Donald

--
On 2 Aug 2002 at 14:36, Trei, Peter wrote:
> OK, It's 2004, I'm an IT Admin,
> and I've converted my corporation over to TCPA/Palladium machines. My
> Head of Marketing has his TCPA/Palladium desktop's hard drive
> jam-packed with corporate confidential documents he's been actively
> working on - sales projections,  product plans, pricing schemes.
> They're all sealed files.
>
> His machine crashes - the MB burns out.
> He wants to recover the data.
>
> HoM:  I want to recover my data.
> Me:   OK: We'll pull the HD, and get the data off it.
> HoM:  Good - mount it as a secondary HD in my new system.
> Me:   That isn't going to work now that we have TCPA and Palladium.
> HoM:  Well, what do you have to do?
> Me:   Oh, it's simple. We encrypt the data under Intel's TPME key,
>  and send it off to Intel. Since Intel has all the keys, they can
>  unseal all your data to plaintext, copy it, and then re-seal it for
>  your new system. It only costs $1/Mb.
> HoM:  Let me get this straight - the only way to recover this data is
> to let
>  Intel have a copy, AND pay them for it?
> Me:   Um... Yes. I think MS might be involved as well, if you were
> using
>  Word.
> HoM:  You are *so* dead.

Obviously it is insane to use keys that you do not yourself control 
to keep secrets.  That, however, is not the purpose of TCPA/Palladium 
as envisaged by Microsoft.

The intent is that Peter can sell Paul software or content that will 
only run on ONE computer for ONE time period.

When the motherboard emits blue smoke, or the time runs out, 
whichever happens first, Paul has to buy new software.  If prices are 
lowered accordingly, this might be acceptable.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 4Mqj1ia6DD0EYpdLMEd7al35eTYefnvhcFesBlMz
 25n9obdfhvRVxEkY4YtWw7BuFxrOKgTtfI1Dp8uAA




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread AARG! Anonymous

Sampo Syreeni writes:

> On 2002-08-01, AARG!Anonymous uttered to [EMAIL PROTECTED],...:
>
> >It does this by taking hashes of the software before transferring
> >control to it, and storing those hashes in its internal secure
> >registers.
>
> So, is there some sort of guarantee that the transfer of control won't be
> stopped by a check against cryptographic signature within the executable
> itself, in the future? That sort of thing would be trivial to enforce via
> licencing terms, after all, and would allow for the introduction of a
> strictly limited set of operating systems to which control would be
> transferred.

TCPA apparently does not have "licensing terms" per se.  They say,
in their FAQ, http://www.trustedcomputing.org/docs/Website_TCPA%20FAQ_0703021.pdf,
"The TCPA spec is currently set up as a 'just publish' IP model."
So there are no licensing terms to enforce, and no guarantees that
people won't do bad things outside the scope of the spec.  Of course,
you realize that the same thing is true with PCs today, right?  There are
few guarantees in this life.

If you think about it, TCPA doesn't actually facilitate the kind of
crypto-signature-checking you are talking about.  You don't need all
this fancy hardware and secure hashes to do that.  Your worrisome
signature checking would be applied on the software which *hasn't
yet been loaded*, right?  All the TCPA hardware will give you is a
secure hash on the software which has already loaded before you ran.
That doesn't help you; in fact your code can pretty well predict the
value of this, given that it is running.  Think about this carefully,
it is a complicated point but you can get it if you take your time.

In short, to implement a system where only signed code can run, TCPA is
not necessary and not particularly helpful.


> I'm having a lot of trouble seeing the benefit in TCPA
> without such extra measures, given that open source software would likely
> evolve which circumvented any protection offered by the more open ended
> architecture you now describe.

I don't follow what you are getting at with the open source.  Realize that
when you boot a different OS, the TCPA attestation features will allow
third parties to detect this.  So your open source OS cannot masquerade
as a different one and fool a third party server into downloading data
to your software.  And likewise, data which was sealed (encrypted)
under a secure OS cannot be unsealed once a different OS boots, because
the sealing/unsealing is all done on-chip, and the chip uses the secure
hash registers to check if the unsealing is allowed.
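The attestation check described here can be sketched roughly as below. This is an illustrative model only: an HMAC stands in for the TPM's public-key signature, so the verifying server in this sketch shares the key, whereas in the real protocol the server would hold only the public half and the private key would never leave the chip.

```python
import hashlib
import hmac
import os

aik = os.urandom(32)  # stand-in for the TPM's attestation key

def tpm_quote(pcr: bytes, nonce: bytes) -> bytes:
    # The TPM signs the current PCR value plus the server's fresh nonce.
    return hmac.new(aik, pcr + nonce, hashlib.sha256).digest()

def server_accepts(reported_pcr: bytes, nonce: bytes, sig: bytes,
                   expected_pcr: bytes) -> bool:
    # The remote server verifies the signature, then decides whether
    # the reported software environment is one it trusts.
    expected_sig = hmac.new(aik, reported_pcr + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected_sig) and reported_pcr == expected_pcr

trusted_os_pcr = hashlib.sha256(b"approved OS image").digest()
other_os_pcr = hashlib.sha256(b"substitute OS image").digest()
nonce = os.urandom(16)

# A machine running the approved OS passes the check.
assert server_accepts(trusted_os_pcr, nonce,
                      tpm_quote(trusted_os_pcr, nonce), trusted_os_pcr)

# A different OS cannot masquerade: the TPM itself reports the PCR,
# so lying about it breaks the signature.
sig2 = tpm_quote(other_os_pcr, nonce)
assert not server_accepts(other_os_pcr, nonce, sig2, trusted_os_pcr)
print("attestation distinguishes the two environments")
```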


> >Then, when the data is decrypted and "unsealed", the hash is compared to
> >that which is in the TPM registers now.  This can make it so that data
> >which is encrypted when software system X boots can only be decrypted
> >when that same software boots.
>
> Again, such values would be RE'd and reported by any sane open source OS
> to the circuitry, giving access to whatever data there is. If this is
> prevented, one can bootstrap an absolutely secure platform where whatever
> the content provider says is the Law, including a one where every piece of
> runnable OS software actually enforces the kind of control over
> permissible signatures Peter is so worried about. Where's the guarantee
> that this won't happen, one day?

Not sure I follow this here... the sealed data cannot be reported by an
open source OS because the secret keys never leave the chip without being
themselves encrypted.  As for your second proposal, you are suggesting
that you could write an OS which would only run signed applications?
And run it on a TCPA platform?  Sure, I guess you could.  But you wouldn't
need TCPA features to do it.  See the comments above: any OS today could
be modified to only run apps that were signed with some special key.
You shouldn't blame TCPA for this.
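The point that signed-code enforcement needs no TCPA hardware can be illustrated in a few lines of OS-level logic. The vendor key and loader here are hypothetical, and an HMAC again stands in for a real public-key signature; the point is simply that the policy lives entirely in software.

```python
import hashlib
import hmac

vendor_key = b"hypothetical vendor signing key"  # baked into the OS

def sign(binary: bytes) -> bytes:
    return hmac.new(vendor_key, binary, hashlib.sha256).digest()

def loader_admits(binary: bytes, signature: bytes) -> bool:
    # OS-level policy: refuse to run unsigned code.  No TPM involved.
    return hmac.compare_digest(signature, sign(binary))

app = b"\x7fELF...approved application"
assert loader_admits(app, sign(app))
assert not loader_admits(b"unsigned application", b"\x00" * 32)
print("only signed code admitted")
```

Any OS could ship such a check today, which is the author's point: TCPA's measured-boot hashes record what has already run, and add nothing to this kind of forward-looking signature gate.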


> >In answer to your question, then, for most purposes, there is no signing
> >key that your TPM chip trusts, so the issue is moot.
>
> At the hardware level, yes.

TCPA is a hardware spec.  Peter was asking about TCPA, and I gave him the
answer.  You can hypothesize all the fascist software you want, but you
shouldn't blame these fantasies on TCPA.

> At the software one, it probably won't be,
> even in the presence of the above considerations. After you install your
> next Windows version, you will be tightly locked in with whatever M$
> throws at you in their DLL's,

Doesn't Microsoft already sign their system DLLs in NT?

> and as I pointed out, there's absolutely no
> guarantee Linux et al. might well be shut out by extra features, in the
> future. In the end what we get is an architecture, which may not embody
> Peter's concerns right now, but which is built from the ground up to bring
> them into being, later.

Again, you are being entirely hypothetical here.  Please describe exactly
how either attestation or secure storage would assist in creating a boot
loader that would refuse to run Li

RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Mike Rosing

On Fri, 2 Aug 2002, Jay Sulzberger wrote:

> To deal with the tiny bit of truth in the claims of AARG! that some
> capabilities of DRM might be beneficial to me: Yes, of course, there are
> few things that have zero benefits.  But this is hardly relevant.  A more
> relevant question here is: Can we get the benefits in a better way?  And of
> course, we can.  For the purposes of this narrow and hypothetical
> discussion, DRM might just be considered as a dongle forced on every home
> computer in the world.  The claims of benefit depend on this dongle being
> usable by me to make sure that you do not do certain things with my
> program/data when it is running on your computer, e.g., distribute the
> movie I send you.  Well, why must the dongle be on the whole computer
> system?  Why cannot it be simply a dongle that goes in a slot in a special
> TV screen/speaker system?  Now this is a "product"!, why we'll sell 'em the
> screens and we'll sell the dongle separately, etc..  Of course, the
> Englobulators have no interest in making and selling such dongles.  Indeed,
> were Phillips to start making and selling such, somehow a legal cause of
> action against Phillips would be discovered and the suits would commence.

I think this is what it boils down to.  If I want a dongle for
an arbitrary suite of products I should be able to go to some
store and buy it.  There's no reason it has to be built into
the motherboard.  The Microsoft X-box can have a built-in dongle
chip; its purpose is to ensure that only MS-certified games
run on the box.  I don't see any problem with that.  And I don't
see any problem with Hollywood (or Bollywood either) selling HDTV's
with their own dongles.

As an argument to congress we need to stress that TCP's are fine
as isolated devices for specific purposes.  There is *NO NEED* to
make general purpose computers TCP's.  Where there is a market for
TCP's, I'd expect companies to want the ability to put their own
keys into the dongle, not some outside manufacturer who they might
not trust.

TCP's and DRM is useful to some people, and those people should
be able to buy it.  But there's really no need to force it on
everyone, and that's the point we need to get congress to
understand.

Patience, persistence, truth,
Dr. mike




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Jay Sulzberger

On Fri, 2 Aug 2002, James A. Donald wrote:

> --
> On 2 Aug 2002 at 10:43, Trei, Peter wrote:
> > Since the position argued involves nothing which would invoke
> > the malign interest of government powers or corporate legal
> > departments, it's not that. I can only think of two reasons why
> our correspondent may have decided to go undercover...
>
> I can think of two innocuous reasons, though the real reason is
> probably something else altogether:
>
> 1.  Defending copyright enforcement is extremely unpopular because
> it seemingly puts you on the side of the hollywood cabal, but in
> fact TCPA/Palladium, if it works as described, and if it is not
> integrated with legal enforcement, does not over reach in the
> fashion that most recent intellectual property legislation, and
> most recent policy decisions by the patent office over reach.

a. TCPA/Palladium must be integrated with laws which give to the
Englobulators absolute legal cudgel powers, such as the DMCA.  So far I
have not seen any proposal by the Englobulators to repeal the DMCA and
cognate laws, so if TCPA/Palladium is imposed, the DMCA will be used, just
as HP threatened to use it a couple of days ago.  And, of course, today
there is no imposed TCPA/Palladium, so the situation will be much worse
when there is.

b. Why must TCPA/Palladium be a dongle on the whole computer?  Why not a
separate dongle?  Because, of course, the Englobulators proceed here on
principle.  The principle being that only the Englobulators have a right to
own printing presses/music studios/movie and animation studios.

>
> 2.  Legal departments are full of people who are, among their
> many other grievous faults, technologically illiterate.
> Therefore when an insider is talking about something, they cannot
> tell when he is leaking inside information or not, and tend to
> have kittens, because they have to trust him (being unable to tell
> if he is leaking information covered by NDA), and are
> constitutionally incapable of trusting anyone.
>
> --digsig

There is a business, not yet come into existence, of providing standard
crypto services to law offices.

oo--JS.




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Jay Sulzberger

On Fri, 2 Aug 2002, Wall, Kevin wrote:

> First off, let me say that in general, I am against almost everything
> that the DMCA stands for and am no fan of DRM either. But I do think that
> we will lose credibility if we can't substantiate our claims, and part of
> that means recognizing and acknowledging what appears to be legitimate
> claims from the TCPA side.

Please forgive me for being too short in my indication of what a better
longer response from me would look like, which better longer response I
hope to include in a formal submission to the Department of Commerce
taskforce on DRM.

There is nothing to be said in favor of DRM.  DRM is simply the name for
the system under which the Englobulators would have root on every home and
small business computer on Earth.

ad propaganda: If we admit the principle that it is reasonable to outlaw
the sale of computers to individuals and to outlaw the private use of
computers, we place ourselves in a false posture, and a strategically
weaker position.  The present situation is not that of twenty-five years
ago when the VCR was coming to be used in private homes.  The struggles of
those days were about trammels on limited purpose devices.  DRM is not one
trammel on a limited device, nor is it even a set of trammels on several
different special purpose devices.

In the above paragraph I use the word "computer" to mean computers of the
sort we have today, that is, computers which have no wiretaps and no remote
control machinery in them.

ad my repeated rhetorical question "Claimed advantage to me here?": It was
an error of rhetoric to put these questions in my response to AARG!.
These questions require consideration of indirect effects, which may only
be roughly estimated, if we wish to be precise at the two nines level.  But
in each case, when one runs down the game/rhetoric tree, one sees that
there is never any benefit to me in the claimed useful-to-all capabilities
of DRM.  I will not be able to force my wiretaps and my remote controls on
RIAA-MPAA-AAP.  As pointed out, section 4.12 of the Final Report of the
BPDG, simply specifies that, when DRM is forced on the world, Englobulator
machines will have no TCPA/Palladium/wiretaps/remote-controls in them.

To deal with the tiny bit of truth in the claims of AARG! that some
capabilities of DRM might be beneficial to me: Yes, of course, there are
few things that have zero benefits.  But this is hardly relevant.  A more
relevant question here is: Can we get the benefits in a better way?  And of
course, we can.  For the purposes of this narrow and hypothetical
discussion, DRM might just be considered as a dongle forced on every home
computer in the world.  The claims of benefit depend on this dongle being
usable by me to make sure that you do not do certain things with my
program/data when it is running on your computer, e.g., distribute the
movie I send you.  Well, why must the dongle be on the whole computer
system?  Why cannot it be simply a dongle that goes in a slot in a special
TV screen/speaker system?  Now this is a "product"!, why we'll sell 'em the
screens and we'll sell the dongle separately, etc..  Of course, the
Englobulators have no interest in making and selling such dongles.  Indeed,
were Phillips to start making and selling such, somehow a legal cause of
action against Phillips would be discovered and the suits would commence.

oo--JS.


>
> Having said that, let me plunge right in and proceed to mark a complete
> fool of myself. Besides, so what if another hundred spambots harvest
> my e-mail address for breast enlargement ads (stupid spambots--think
> they could at least use my name to determine my sex and send me the
> herbal Viagra ads instead. ;-)
>
> Note that I'm interpreting Jay's reiterated question of
> "Claimed advantage to me here?" in the more general sense of
> advantage to anyone rather than to Jay personally. Not knowing
> him, the latter would be a rather difficult assessment to make.
>
> So, on with it already. Open mouth, insert foot... (yumm..
> filet of sole)...
>
> Jay Sulzberger writes...
>
> > On Thu, 1 Aug 2002, AARG!Anonymous wrote:
> >
> > > Eric Murray writes:
> > > > TCPA (when it isn't turned off) WILL restrict the software that you
> > > > can run.  Software that has an invalid or missing signature won't be
> > > > able to access "sensitive data"[1].   Meaning that unapproved software
> > > > won't work.
> > > >
> > > > [1] TCPAmain_20v1_1a.pdf, section 2.2
> > >
> > > We need to look at the text of this in more detail.  This is from
> > > version 1.1b of the spec:
> > >
> > > : This section introduces the architectural aspects of a Trusted
> > > : Platform that enable the collection and reporting of integrity
> > > : metrics.
> > > :
> > > : Among other things, a Trusted Platform enables an entity to
> > > : determine the state of the software environment in that platform
> > > : and to SEAL data to a particular software environment in that
> > > : platform.
> >
> >

RE: Challenge to David Wagner on TCPA

2002-08-02 Thread James A. Donald

--
On 2 Aug 2002 at 10:43, Trei, Peter wrote:
> Since the position argued involves nothing which would invoke
> the malign interest of government powers or corporate legal
> departments, it's not that. I can only think of two reasons why
> our correspondent may have decided to go undercover...

I can think of two innocuous reasons, though the real reason is
probably something else altogether:

1.  Defending copyright enforcement is extremely unpopular because
it seemingly puts you on the side of the Hollywood cabal, but in
fact TCPA/Palladium, if it works as described, and if it is not
integrated with legal enforcement, does not overreach in the
fashion that most recent intellectual-property legislation, and
most recent policy decisions by the patent office, do.

2.  Legal departments are full of people who are, among their
many other grievous faults, technologically illiterate.
Therefore when an insider is talking about something, they cannot
tell when he is leaking inside information or not, and tend to
have kittens, because they have to trust him (being unable to tell
if he is leaking information covered by NDA), and are
constitutionally incapable of trusting anyone. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 Alf9R2ZVGqWkLhwWX2H6TBqHOunrj2Fbxy+U0ORV
 2uPGI4gMDt1fTQkV1820PO3xWmAWPiaS0DqrbmobN




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread Jon Callas

On 8/1/02 1:14 PM, "Trei, Peter" <[EMAIL PROTECTED]> wrote:

> So my question is: What is your reason for shielding your identity?
> You do so at the cost of people assuming the worst about your
> motives.

Is this a tacit way to suggest that the only people who need anonymity or
pseudonymity are those with something to hide?

Jon




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Wall, Kevin

First off, let me say that in general, I am against almost everything that
the DMCA stands for and am no fan of DRM either. But I do think that
we will lose credibility if we can't substantiate our claims, and part
of that means recognizing and acknowledging what appears to be legitimate
claims from the TCPA side.

Having said that, let me plunge right in and proceed to mark a complete
fool of myself. Besides, so what if another hundred spambots harvest
my e-mail address for breast enlargement ads (stupid spambots--think
they could at least use my name to determine my sex and send me the
herbal Viagra ads instead. ;-)

Note that I'm interpreting Jay's reiterated question of
"Claimed advantage to me here?" in the more general sense of
advantage to anyone rather than to Jay personally. Not knowing
him, the latter would be a rather difficult assessment to make.

So, on with it already. Open mouth, insert foot... (yumm..
filet of sole)...

Jay Sulzberger writes...

> On Thu, 1 Aug 2002, AARG!Anonymous wrote:
> 
> > Eric Murray writes:
> > > TCPA (when it isn't turned off) WILL restrict the software that you
> > > can run.  Software that has an invalid or missing signature won't be
> > > able to access "sensitive data"[1].   Meaning that unapproved software
> > > won't work.
> > >
> > > [1] TCPAmain_20v1_1a.pdf, section 2.2
> >
> > We need to look at the text of this in more detail.  This is from
> > version 1.1b of the spec:
> >
> > : This section introduces the architectural aspects of a Trusted
> > : Platform that enable the collection and reporting of integrity
> > : metrics.
> > :
> > : Among other things, a Trusted Platform enables an entity to
> > : determine the state of the software environment in that platform
> > : and to SEAL data to a particular software environment in that
> > : platform.
> 
> 
> Claimed advantage to me here?

If you produce copyrighted materials that you don't want others to
illegally copy, it can protect your assets. Might also be useful in
protecting state secrets, but general crypto is sufficient for
that. (Don't need it at the hardware level unless you are worried
that some TLA gov't agency is out to get you.)

The advantage depends on whether one is a producer of goods, or merely
a consumer. I shall not make a judgement call as to which is more
important. Suffice it to say that both need each other.

[more from TCPA spec]
> > :
> > : The entity deduces whether the state of the computing environment in
> > : that platform is acceptable and performs some transaction with that
> > : platform. If that transaction involves sensitive data that must be
> > : stored on the platform, the entity can ensure that that data is held
> > : in a confidential format unless the state of the computing environment
> > : in that platform is acceptable to the entity.
> 
> Claimed advantage to me here?

One could use this to detect virus-infected systems, systems infected
with root kits, etc., could one not? Also, the uses alluded to above
come to mind.

> > :
> > : To enable this, a Trusted Platform provides information to enable
> > : the entity to deduce the software environment in a Trusted Platform.
> > : That information is reliably measured and reported to the entity.
> > : At the same time, a Trusted Platform provides a means to encrypt
> > : cryptographic keys and to state the software environment that must
> > : be in place before the keys can be decrypted.
> >
> > What this means is that a remote system can query the local TPM and
> > find out what software has been loaded, in order to decide whether to
> > send it some data.  It's not that unapproved software "won't work",
> > it's that the remote guy can decide whether to trust it.
> 
> Claimed advantage to me here?

Well, here's one place that I can see a potential value to consumers.
I've thought a lot about how one can secure peer-to-peer (P2P) systems.

Sure, if I want to allow my box to be a P2P host, I can use a sandboxing
technique to control and restrict (at least in theory) what rights I
give other programs to run. [I'm thinking of a sense similar to the Java
sandbox used for running applets.]

However, the more interesting, and I believe more challenging, piece is
what guarantees you can give *ME* as a user of P2P services if I write
some important code that I wish to run on some generic P2P service.
Unless I want to pay for P2P or grid computing services from
some company that I might happen to trust, be it IBM, HP, or whomever,
I'll probably use some (future?) P2P services that are open-sourced freeware
that typical home users might host out of the generosity of their hearts
(whereby they allow others to use some of their spare cycles). While this
is all well and good, my level of trust would likely not be at the same
level it would be if I paid a company to use their services. The feeling
being if I buy a service from a reputable company and they intentionally
do something malicious such as steal private dat

RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Wall, Kevin

Mr AARG! writes...

> Eric Murray writes:
> > Yes, the spec says that it can be turned off.  At that point you
> > can run anything that doesn't need any of the protected data or
> > other TCPA services.   But, why would a software vendor that wants
> > the protection that TCPA provides allow his software to run
> > without TCPA as well, abandoning those protections?
> 
> That's true; in fact if you ran it earlier under TCPA and sealed some
> data, you will have to run under TCPA to unseal it later.  The question
> is whether the advantages of running under TCPA (potentially greater
> security) outweigh the disadvantages (greater potential for loss of
> data, less flexibility, etc.).

and in another reply to Peter Trei, Mr. AARG! also writes...

> Now, there is an optional function which does use the manufacturer's key, 
> but it is intended only to be used rarely.  That is for when you need to 
> transfer your sealed data from one machine to another (either because you 
> have bought a new machine, or because your old one crashed).  In this 
> case you go through a complicated procedure that includes encrypting 
> some data to the TPME key (the TPM manufacturer's key) and sending it 
> to the manufacturer, who massages the data such that it can be loaded 
> into the new machine's TPM chip. 
> 
> So this function does require pre-loading a manufacturer key into the 
> TPM, but first, it is optional, and second, it frankly appears to be so 
> cumbersome that it is questionable whether manufacturers will want to 
> get involved with it.  OTOH it is apparently the only way to recover 
> if your system crashes.  This may indicate that TCPA is not feasible, 
> because there is too much risk of losing locked data on a machine crash, 
> and the recovery procedure is too cumbersome.  That would be a valid 
> basis on which to criticize TCPA, but it doesn't change the fact that 
> many of the other claims which have been made about it are not correct. 

Correct me if I'm wrong (I'm sure you all will :), but wouldn't you also
possibly have to go through this exercise with the TPME key and send
your system to the manufacturer when you wanted to, say, upgrade your
operating system or switch to a completely different OS? That will go
over like a lead balloon. (Gee... must be getting late. I almost wrote
"like a bag of dirt". Duh! Can't even remember cliches at my age.)

-kevin wall
P.S.- Please excuse the sh*t formatting. We use Lookout! and MS Exstrange
  where I work.




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread rsedc

On Mon, Jul 29, 2002 at 03:35:32PM -0700, AARG! Anonymous wrote:
> Declan McCullagh writes at
> http://zdnet.com.com/2100-1107-946890.html:
> 
>"The world is moving toward closed digital rights management systems
>where you may need approval to run programs," says David Wagner,
>an assistant professor of computer science at the University of
>California at Berkeley.  "Both Palladium and TCPA incorporate features
>that would restrict what applications you could run."
> 
> But both Palladium and TCPA deny that they are designed to restrict what
> applications you run.  The TPM FAQ at
> http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads, in
> answer #1:
> 
> : The TPM can store measurements of components of the user's system, but
> : the TPM is a passive device and doesn't decide what software can or
> : can't run on a user's system.
> 
> An apparently legitimate but leaked Palladium White Paper at
> http://www.neowin.net/staff/users/Voodoo/Palladium_White_Paper_final.pdf
> says, on the page shown as number 2:
> 
> : A Palladium-enhanced computer must continue to run any existing
> : applications and device drivers.



> Can you find anything in this spec that would do what David Wagner says
> above, restrict what applications you could run?  Despite studying this
> spec for many hours, no such feature has been found.
> 
> So here is the challenge to David Wagner, a well known and justifiably
> respected computer security expert: find language in the TCPA spec to
> back up your claim above, that TCPA will restrict what applications
> you can run.  Either that, or withdraw the claim, and try to get Declan
> McCullagh to issue a correction.  (Good luck with that!)

'Applications', as used in Wagner's statement, can mean actions
or computer programs that accomplish the desired tasks for the
users/owners.

>From Webster's Revised Unabridged Dictionary (1913) [web1913]:

  Application \Ap`pli*ca"tion\, n. [L. applicatio, fr. applicare:
 cf. F. application. See {Apply}.]

 3. The act of applying as a means; the employment of means to
accomplish an end; specific use.

>From WordNet (r) 1.7 [wn]:

 3: a program that gives a computer instructions that provide
the user with tools to accomplish a task;

Both involve using the term 'accomplish'.
Whereas from WordNet (r) 1.7 [wn]:

  software
   n : (computer science) written programs or procedures or rules
   and associated documentation pertaining to the operation
   of a computer system and that are stored in read/write
   memory;

As you can see, 'application' differs from 'software' in that an
'application' needs to 'accomplish' the desired tasks.

If as you said later,

On Thu, Aug 01, 2002 at 04:45:15PM -0700, AARG! Anonymous wrote:
> But no, the TCPA does allow all software to run.  Just because a remote
> system can decide whether to send it some data doesn't mean that software
> can't run.  And just because some data may be inaccessible because it
> was sealed when another OS was booted, also doesnt mean that software
> can't run.
> 
> I think we agree on the facts, here.  All software can run, but the TCPA
> allows software to prove its hash to remote parties, and to encrypt data
> such that it can't be decrypted by other software.  Would you agree that
> this is an accurate summary of the functionality, and not misleading?

that the desired tasks cannot be accomplished, then the software might run
but the 'application' does not.

Note that the TPM FAQ quoted is correct in using the term 'software', but
that is not the term used by Wagner. The sentence where the term
'application' is used in the alleged Palladium White Paper might appear
to be self-contradictory.

Therefore I do not think that Wagner needs to withdraw his claim.


David Chia
--
What do you call a boomerang that does not come back?  A Stick.




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Trei, Peter

> Jon Callas[SMTP:[EMAIL PROTECTED]]
> 
> 
> On 8/1/02 1:14 PM, "Trei, Peter" <[EMAIL PROTECTED]> wrote:
> 
> > So my question is: What is your reason for shielding your identity?
> > You do so at the cost of people assuming the worst about your
> > motives.
> 
> Is this a tacit way to suggest that the only people who need anonymity or
> pseudonymity are those with something to hide?
> 
> Jon
> 
Not really. However, in today's actual environment, it is frequently
true that those with something to hide use anonymity.

While some people have maintained nyms for many years (I can't
think of anyone maintaining explicit strong anonymity right now,
actually - remember Sue D. Nym?), and used them to talk about
a variety of issues, it's pretty rare.

It's rare enough that when a new anonym appears, we know
that the poster made a considered decision to be anonymous.

The current poster seems to have parachuted in from nowhere,
to argue a specific position on a single topic. It's therefore
reasonable to infer that the nature of that position and topic has
some bearing on the decision to be anonymous.

Since the position argued involves nothing which would invoke the
malign interest of government powers or corporate legal departments, 
it's not that. I can only think of two reasons why our correspondent
may have decided to go undercover...

1. If we knew who he/she/they were, it would weaken the argument
(for example, by making it clear that the poster has a vested interest
in the position maintained, or that 'AARGH!' is the group effort of an
astroturf campaign).

2. If the true identity of the poster became known, he/she/they
fear some kind of retribution:
* The ostracism and detestation of his peers.
* The boycotting of his employer. 
* His employer objecting to his wasting company time on 
  Internet mailing lists.

Our correspondent has not given us any reason not to
infer the worst motives. This is, after all, a discipline where
paranoia and suspicion are job requirements.

Peter Trei
Disclaimer: The above represents my private, personal
opinions only; do not misconstrue them to represent the
opinions of others.




RE: Challenge to David Wagner on TCPA

2002-08-02 Thread Trei, Peter

> AARG! Anonymous[SMTP:[EMAIL PROTECTED]] writes
[...]
> Now, there is an optional function which does use the manufacturer's key,
> but it is intended only to be used rarely.  That is for when you need to
> transfer your sealed data from one machine to another (either because you
> have bought a new machine, or because your old one crashed).  In this
> case you go through a complicated procedure that includes encrypting
> some data to the TPME key (the TPM manufacturer's key) and sending it
> to the manufacturer, who massages the data such that it can be loaded
> into the new machine's TPM chip.
> 
> So this function does require pre-loading a manufacturer key into the
> TPM, but first, it is optional, and second, it frankly appears to be so
> cumbersome that it is questionable whether manufacturers will want to
> get involved with it.  OTOH it is apparently the only way to recover
> if your system crashes.  This may indicate that TCPA is not feasible,
> because there is too much risk of losing locked data on a machine crash,
> and the recovery procedure is too cumbersome.  That would be a valid
> basis on which to criticize TCPA, but it doesn't change the fact that
> many of the other claims which have been made about it are not correct.
[...]

While I reserve the right to respond to the rest of the poster's letter, 
I'd like to call out this snippet, which gives a very good reason
for both corporate and individual users to avoid TCPA as if it were 
weaponized anthrax (Hi NSA!).
...
OK, It's 2004, I'm an IT Admin, and I've converted my corporation 
over to TCPA/Palladium machines. My Head of Marketing has his
TCPA/Palladium desktop's hard drive jam-packed with corporate 
confidential documents he's been actively working on - sales 
projections,  product plans, pricing schemes. They're all sealed files.

His machine crashes - the MB burns out.
He wants to recover the data.

HoM:I want to recover my data.
Me: OK: We'll pull the HD, and get the data off it. 
HoM:Good - mount it as a secondary HD in my new system.
Me: That isn't going to work now we have TCPA and Palladium.
HoM:Well, what do you have to do?
Me: Oh, it's simple. We encrypt the data under Intel's TPME key, 
and send it off to Intel. Since Intel has all the keys, they can 
unseal all your data to plaintext, copy it, and then re-seal it for 
your new system. It only costs $1/Mb.
HoM:Let me get this straight - the only way to recover this data is to
let Intel have a copy, AND pay them for it?
Me: Um... Yes. I think MS might be involved as well, if you were using
Word.
HoM:You are *so* dead.

---

Peter Trei




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread James A. Donald

 --
On 2 Aug 2002 at 0:36, David Wagner wrote:
> For instance, suppose that, thanks to TCPA/Palladium, Microsoft 
> could design Office 2005 so that it is impossible for StarOffice 
> and other clones to read files created in Office 2005.  Would 
> some users object?

In an anarchic society, or under a government that did not define 
and defend IP, TCPA/Palladium would probably give roughly the 
right amount of protection to intellectual property by technical 
means in place of legal means.

Chances are that the thinking behind Palladium is not "Let us sell 
out to the Hollywood lobby" but rather "Let us make those !@#$$%^& 
commie chinese pay for their *&^%$##@ software".

Of course, in a society with both legal and technical protection 
of IP, the likely outcome is oppressive artificial monopolies 
sustained both by technology and state power.

I would certainly much prefer TCPA/Palladium in place of existing
IP law.  What I fear is that instead legislation and technology
will each reinforce the other. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 R66NXPp5xZNDYn98jcVqH5q22ikRRFR3evv5xfwF
 2PNka92tYm9+/iBKaR+IcOoDA8BwXZlwcPD18Ogw8




Re: Challenge to David Wagner on TCPA

2002-08-02 Thread David G. Koontz

Jon Callas wrote:
> On 8/1/02 1:14 PM, "Trei, Peter" <[EMAIL PROTECTED]> wrote:
> 
> 
>>So my question is: What is your reason for shielding your identity?
>>You do so at the cost of people assuming the worst about your
>>motives.
> 
> 
> Is this a tacit way to suggest that the only people who need anonymity or
> pseudonymity are those with something to hide?
> 



RE: Challenge to David Wagner on TCPA

2002-08-02 Thread James A. Donald

--
On 2 Aug 2002 at 3:31, Sampo Syreeni wrote:
> More generally, as long as we have computers which allow data to
> be addressed as code and vice versa, the ability to control use
> of data will necessarily entail ability to control use of code.
> So, either we will get systems where circumventing copyright
> controls is trivial or ones where you cannot compile your own
> code. All the rest is just meaningless syntax.

The announced purpose of TCPA/Palladium is to introduce some
intermediate cases.  For example you could compile your own code,
and then encrypt it so that it can only run on a specific target
computer.
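The idea in the paragraph above (compiled code bound to one target machine) can be sketched in a few lines. This is a toy illustration under my own assumptions, not the TCPA mechanism: the machine secret, the XOR keystream, and the function name are all invented for the example.

```python
import hashlib

# Toy illustration (not TCPA itself): "encrypt" a compiled program
# with a keystream derived from a secret unique to one machine, so
# the image is useless on any machine holding a different secret.
def bind_to_machine(code: bytes, machine_secret: bytes) -> bytes:
    # Derive a keystream by hashing the secret with a block counter.
    stream = b"".join(
        hashlib.sha256(machine_secret + bytes([i])).digest()
        for i in range(len(code) // 32 + 1)
    )
    return bytes(a ^ b for a, b in zip(code, stream))

# XOR is its own inverse, so the same call unbinds on the target machine.
```

A vendor would run this once per customer machine; only the holder of the matching secret recovers a runnable image.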

As someone who sells code, I would think this would be a great
idea, were it not for the excesses we have been seeing from the IP
lobbyists.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 iB5WVaGfx+zq5Dani1KQGdZIU5Kl21LDrc7w4e1m
 2PoKhj2EuUKqjKlZ/RN3VXdP0TFKxmpO/rR69KupZ




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread Jay Sulzberger

On Thu, 1 Aug 2002, AARG!Anonymous wrote:

> Eric Murray writes:
> > TCPA (when it isn't turned off) WILL restrict the software that you
> > can run.  Software that has an invalid or missing signature won't be
> > able to access "sensitive data"[1].   Meaning that unapproved software
> > won't work.
> >
> > [1] TCPAmain_20v1_1a.pdf, section 2.2
>
> We need to look at the text of this in more detail.  This is from
> version 1.1b of the spec:
>
> : This section introduces the architectural aspects of a Trusted Platform
> : that enable the collection and reporting of integrity metrics.
> :
> : Among other things, a Trusted Platform enables an entity to determine
> : the state of the software environment in that platform and to SEAL data
> : to a particular software environment in that platform.

Claimed advantage to me here?

> :
> : The entity deduces whether the state of the computing environment in
> : that platform is acceptable and performs some transaction with that
> : platform. If that transaction involves sensitive data that must be
> : stored on the platform, the entity can ensure that that data is held in
> : a confidential format unless the state of the computing environment in
> : that platform is acceptable to the entity.

Claimed advantage to me here?

> :
> : To enable this, a Trusted Platform provides information to enable the
> : entity to deduce the software environment in a Trusted Platform. That
> : information is reliably measured and reported to the entity. At the same
> : time, a Trusted Platform provides a means to encrypt cryptographic keys
> : and to state the software environment that must be in place before the
> : keys can be decrypted.
>
> What this means is that a remote system can query the local TPM and
> find out what software has been loaded, in order to decide whether to
> send it some data.  It's not that unapproved software "won't work",
> it's that the remote guy can decide whether to trust it.

Claimed advantage to me here?

>
> Also, as stated earlier, data can be sealed such that it can only be
> unsealed when the same environment is booted.  This is the part above
> about encrypting cryptographic keys and making sure the right software
> environment is in place when they are decrypted.

Claimed advantage to me here?

>
> > Ok, technically it will run but can't access the data,
> > but that is a very fine hair to split, and depending on the nature of
> > the data that it can't access, it may not be able to run in truth.
> >
> > If TCPA allows all software to run, it defeats its purpose.
> > Therefore Wagner's statement is logically correct.
>
> But no, the TCPA does allow all software to run.  Just because a remote
> system can decide whether to send it some data doesn't mean that software
> can't run.  And just because some data may be inaccessible because it
> was sealed when another OS was booted, also doesn't mean that software
> can't run.

Claimed advantage to me here?

>
> I think we agree on the facts, here.  All software can run, but the TCPA
> allows software to prove its hash to remote parties, and to encrypt data
> such that it can't be decrypted by other software.  Would you agree that
> this is an accurate summary of the functionality, and not misleading?

Of course we do not agree.  Under the DRM/TCPA regime I cannot legally do
the following thing:

Spoof your handshake and then run my cracker on the encrypted data you send me.

So some software will not legally run under DRM/TCPA.

>
> If so, I don't see how you can get from this to saying that some software
> won't run.  You might as well say that encryption means that software
> can't run, because if I encrypt my files then some other programs may
> not be able to read them.

See above.  Please be precise in your response.

>
> Most people, as you may have seen, interpret this part about "software
> can't run" much more literally.  They think it means that software needs
> a signature in order to be loaded and run.  I have been going over and
> over this on sci.crypt.  IMO the facts as stated two paragraphs up are
> completely different from such a model.

No.  They are exactly "software needs to be signed to run".  Otherwise why
can't I run cp on the movie that Time-Warner-AOL sends me?

>
> > Yes, the spec says that it can be turned off.  At that point you
> > can run anything that doesn't need any of the protected data or
> > other TCPA services.   But, why would a software vendor that wants
> > the protection that TCPA provides allow his software to run
> > without TCPA as well, abandoning those protections?
>
> That's true; in fact if you ran it earlier under TCPA and sealed some
> data, you will have to run under TCPA to unseal it later.  The question
> is whether the advantages of running under TCPA (potentially greater
> security) outweigh the disadvantages (greater potential for loss of
> data, less flexibility, etc.).

Ah, so much for your claim that all software that 

Re: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

Eric Murray writes:
> TCPA (when it isn't turned off) WILL restrict the software that you
> can run.  Software that has an invalid or missing signature won't be
> able to access "sensitive data"[1].   Meaning that unapproved software
> won't work.
>
> [1] TCPAmain_20v1_1a.pdf, section 2.2

We need to look at the text of this in more detail.  This is from
version 1.1b of the spec:

: This section introduces the architectural aspects of a Trusted Platform
: that enable the collection and reporting of integrity metrics.
:
: Among other things, a Trusted Platform enables an entity to determine
: the state of the software environment in that platform and to SEAL data
: to a particular software environment in that platform.
:
: The entity deduces whether the state of the computing environment in
: that platform is acceptable and performs some transaction with that
: platform. If that transaction involves sensitive data that must be
: stored on the platform, the entity can ensure that that data is held in
: a confidential format unless the state of the computing environment in
: that platform is acceptable to the entity.
:
: To enable this, a Trusted Platform provides information to enable the
: entity to deduce the software environment in a Trusted Platform. That
: information is reliably measured and reported to the entity. At the same
: time, a Trusted Platform provides a means to encrypt cryptographic keys
: and to state the software environment that must be in place before the
: keys can be decrypted.

What this means is that a remote system can query the local TPM and
find out what software has been loaded, in order to decide whether to
send it some data.  It's not that unapproved software "won't work",
it's that the remote guy can decide whether to trust it.
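That decision can be sketched as verifier-side logic. Everything here is invented for illustration (the trusted-hash set and the function name appear nowhere in the spec); the point is only that the gate sits on the remote side.

```python
import hashlib

# Hypothetical verifier: the remote party keeps a set of software
# hashes it trusts and only releases data to a platform that quotes
# one of them.  Nothing here prevents other software from running;
# an unrecognized hash just means this peer withholds its data.
TRUSTED = {hashlib.sha1(b"known-good OS image").hexdigest()}

def release_data(quoted_hash, payload):
    return payload if quoted_hash in TRUSTED else None
```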

Also, as stated earlier, data can be sealed such that it can only be
unsealed when the same environment is booted.  This is the part above
about encrypting cryptographic keys and making sure the right software
environment is in place when they are decrypted.

> Ok, technically it will run but can't access the data,
> but that is a very fine hair to split, and depending on the nature of
> the data that it can't access, it may not be able to run in truth.
>
> If TCPA allows all software to run, it defeats its purpose.
> Therefore Wagner's statement is logically correct.

But no, the TCPA does allow all software to run.  Just because a remote
system can decide whether to send it some data doesn't mean that software
can't run.  And just because some data may be inaccessible because it
was sealed when another OS was booted, also doesn't mean that software
can't run.

I think we agree on the facts, here.  All software can run, but the TCPA
allows software to prove its hash to remote parties, and to encrypt data
such that it can't be decrypted by other software.  Would you agree that
this is an accurate summary of the functionality, and not misleading?

If so, I don't see how you can get from this to saying that some software
won't run.  You might as well say that encryption means that software
can't run, because if I encrypt my files then some other programs may
not be able to read them.

Most people, as you may have seen, interpret this part about "software
can't run" much more literally.  They think it means that software needs
a signature in order to be loaded and run.  I have been going over and
over this on sci.crypt.  IMO the facts as stated two paragraphs up are
completely different from such a model.

> Yes, the spec says that it can be turned off.  At that point you
> can run anything that doesn't need any of the protected data or
> other TCPA services.   But, why would a software vendor that wants
> the protection that TCPA provides allow his software to run
> without TCPA as well, abandoning those protections?

That's true; in fact if you ran it earlier under TCPA and sealed some
data, you will have to run under TCPA to unseal it later.  The question
is whether the advantages of running under TCPA (potentially greater
security) outweigh the disadvantages (greater potential for loss of
data, less flexibility, etc.).

> I doubt many would do so, the majority of TCPA-enabled
> software will be TCPA-only.  Perhaps not at first, but eventually
> when there are enough TCPA machines out there.  More likely, spiffy
> new content and features will be enabled if one has TCPA and is
> properly authenticated, disabled otherwise.  But as we have seen
> time after time, today's spiffy new content is tomorrows
> virtual standard.

Right, the strongest case will probably be for DRM.  You might be able
to download all kinds of content if you are running an OS and application
that the server (content provider) trusts.  People will have a choice of
using TCPA and getting this data legally, or avoiding TCPA and trying to
find pirated copies as they do today.

> This will require the majority of people to run with TCPA turned on
> if they want the content.  TCPA does

Re: Challenge to David Wagner on TCPA

2002-08-01 Thread David Wagner

James A. Donald wrote:
>According to Microsoft, the end user can turn the palladium 
>hardware off, and the computer will still boot.  As long as that 
>is true, it is an end user option and no one can object.

Your point is taken.  That said, even if you could turn off TCPA &
Palladium and run some outdated version of Windows, whether users
would object is not entirely obvious.  For instance, suppose that,
thanks to TCPA/Palladium, Microsoft could design Office 2005 so that it
is impossible for StarOffice and other clones to read files created in
Office 2005.  Would some users object?  I don't know.  For many users,
being unable to read documents created in a recent version of Office
is simply not an option.  However, in any case we should consider in
advance the possible implications of this technology.




RE: Challenge to David Wagner on TCPA

2002-08-01 Thread Sampo Syreeni

On 2002-08-01, AARG!Anonymous uttered to [EMAIL PROTECTED],...:

>It does this by taking hashes of the software before transferring
>control to it, and storing those hashes in its internal secure
>registers.

So, is there some sort of guarantee that the transfer of control won't be
stopped by a check against cryptographic signature within the executable
itself, in the future? That sort of thing would be trivial to enforce via
licencing terms, after all, and would allow for the introduction of a
strictly limited set of operating systems to which control would be
transferred. I'm having a lot of trouble seeing the benefit in TCPA
without such extra measures, given that open source software would likely
evolve which circumvented any protection offered by the more open ended
architecture you now describe. Such a development would simply mean that
Peter's concern would be transferred a level up, without losing its
relevance. I'd also contend that this extra level of diversion is
precisely what TCPA, with its purported policy of "no trusted keys" aims
at.

>Then, when the data is decrypted and "unsealed", the hash is compared to
>that which is in the TPM registers now.  This can make it so that data
>which is encrypted when software system X boots can only be decrypted
>when that same software boots.

Again, such values would be RE'd and reported by any sane open source OS
to the circuitry, giving access to whatever data there is. If this is
prevented, one can bootstrap an absolutely secure platform where whatever
the content provider says is the Law, including one where every piece of
runnable OS software actually enforces the kind of control over
permissible signatures Peter is so worried about. Where's the guarantee
that this won't happen, one day?

>In answer to your question, then, for most purposes, there is no signing
>key that your TPM chip trusts, so the issue is moot.

At the hardware level, yes. At the software one, it probably won't be,
even in the presence of the above considerations. After you install your
next Windows version, you will be tightly locked in with whatever M$
throws at you in their DLL's, and as I pointed out, there's absolutely no
guarantee that Linux et al. won't be shut out by extra features, in the
future. In the end what we get is an architecture, which may not embody
Peter's concerns right now, but which is built from the ground up to bring
them into being, later.

More generally, as long as we have computers which allow data to be
addressed as code and vice versa, the ability to control use of data will
necessarily entail ability to control use of code. So, either we will get
systems where circumventing copyright controls is trivial or ones where
you cannot compile your own code. All the rest is just meaningless syntax.
In that light I bet you can guess why people are worried about TCPA and
its ilk.
-- 
Sampo Syreeni, aka decoy - mailto:[EMAIL PROTECTED], tel:+358-50-5756111
student/math+cs/helsinki university, http://www.iki.fi/~decoy/front
openpgp: 050985C2/025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2




RE: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

Peter Trei writes:

> I'm going to respond to AARGH!, our new Sternlight, by asking two questions.
>
> 1. Why can't I control what signing keys the Fritz chip trusts? 
>
> If the point of TCPA is make it so *I* can trust that *my* computer 
> to run the software *I* have approved, and refuse to run something 
> which a virus or Trojan has modifed (and this, btw, is the stated 
> intention of TCPA), then why the hell don't I have full control over 
> the keys? If I did, the thing might actually work to my benefit.
>
> The beneficiary of TCPA when I don't have ultimate root control is
> not I. It is someone else. That is not an acceptable situation.

You might be surprised to learn that under the TCPA, it is not necessary
for the TPM (the so-called "Fritz" chip) to trust *any* signing keys!

The TCPA basically provides two kinds of functionality: first, it can
attest to the software which was booted and loaded.  It does this by
taking hashes of the software before transferring control to it, and
storing those hashes in its internal secure registers.  At a later
time it can output those hashes, signed by its internal signature
key (generated on-chip, with the private key never leaving the chip).
The system also holds a cert issued on this internal key (which is called
the Endorsement key), and this cert is issued by the TPM manufacturer
(also called the TPME).  But this functionality does not require storing
the TPME key, just the cert it issued.
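The hash-chaining described above can be sketched as a PCR-style "extend" operation. TCPA 1.1 used SHA-1 and 20-byte registers; the measured component names and data layout below are illustrative only, not the spec's formats.

```python
import hashlib

# TPM-style "extend": each update folds a new measurement into the
# old register value, so the final value commits to every component
# loaded, in order.
def extend(pcr: bytes, measurement: bytes) -> bytes:
    return hashlib.sha1(pcr + measurement).digest()

pcr = b"\x00" * 20  # registers start zeroed at platform reset
for component in (b"BIOS", b"boot loader", b"OS kernel"):
    pcr = extend(pcr, hashlib.sha1(component).digest())
# Changing any one component (or the order) yields a different final
# value, which is what the signed attestation lets a remote party check.
```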

Second, the TCPA provides for secure storage via a "sealing" function.
The way this works, a key is generated and used to encrypt a data blob.
Buried in the blob can be a hash of the software which was running
at the time of the encryption (the same data which can be reported
via the attestation function).  Then, when the data is decrypted and
"unsealed", the hash is compared to that which is in the TPM registers
now.  This can make it so that data which is encrypted when software
system X boots can only be decrypted when that same software boots.
Again, this functionality does not require trusting anyone's keys.
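A toy version of that sealing behaviour, with my own names rather than the spec's: the blob records the PCR value current at seal time, and unsealing releases the secret only if the platform's PCR value now matches. A real TPM encrypts the blob under a chip-held key; here an HMAC merely stands in for that binding.

```python
import hashlib
import hmac

SRK = b"stand-in for the storage key held inside the chip"

def seal(secret: bytes, pcr: bytes) -> tuple:
    # Toy blob: (sealed-time PCR, secret, integrity tag).  A real
    # blob would be encrypted, not stored in the clear like this.
    tag = hmac.new(SRK, pcr + secret, hashlib.sha256).digest()
    return (pcr, secret, tag)

def unseal(blob: tuple, current_pcr: bytes):
    sealed_pcr, secret, tag = blob
    ok = hmac.compare_digest(
        tag, hmac.new(SRK, sealed_pcr + secret, hashlib.sha256).digest())
    # Release the secret only if the environment matches seal time.
    return secret if ok and current_pcr == sealed_pcr else None
```

Sealing under OS A's register value and then booting OS B (a different value) returns nothing: the software still runs, but the data stays locked, which is the distinction argued over in this thread.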

Now, there is an optional function which does use the manufacturer's key,
but it is intended only to be used rarely.  That is for when you need to
transfer your sealed data from one machine to another (either because you
have bought a new machine, or because your old one crashed).  In this
case you go through a complicated procedure that includes encrypting
some data to the TPME key (the TPM manufacturer's key) and sending it
to the manufacturer, who massages the data such that it can be loaded
into the new machine's TPM chip.

So this function does require pre-loading a manufacturer key into the
TPM, but first, it is optional, and second, it frankly appears to be so
cumbersome that it is questionable whether manufacturers will want to
get involved with it.  OTOH it is apparently the only way to recover
if your system crashes.  This may indicate that TCPA is not feasible,
because there is too much risk of losing locked data on a machine crash,
and the recovery procedure is too cumbersome.  That would be a valid
basis on which to criticize TCPA, but it doesn't change the fact that
many of the other claims which have been made about it are not correct.

In answer to your question, then, for most purposes, there is no signing
key that your TPM chip trusts, so the issue is moot.  I suggest that you
go ask the people who misled you about TCPA what their ulterior motives
were, since you seem predisposed to ask such questions.


> 2. It's really curious that Mr. AARGH! has shown up simultaneously
> on the lists and on sci.crypt, with the single brief of supporting TCPA.
>
> While I totally support his or her right to post anonymously, I can only
> speculate that anonymity is being used to disguise some vested 
> interest in supporting TCPA. In other words, I infer that Mr. AARGH!
> is a TCPA insider, who is embarassed to reveal himself in public.
>
> So my question is: What is your reason for shielding your identity?
> You do so at the cost of people assuming the worst about your
> motives.

The point of being anonymous is that there is no persistent identity to
attribute motives to!  Of course I have departed somewhat from this rule
in the recent discussion, using a single exit remailer and maintaining
continuity of persona over a series of messages.  But feel free to make
whatever assumptions you like about my motives.  All I ask is that you
respond to my facts.


> Peter Trei
>
>  PS: Speculating about the most tyrannical uses to which 
> a technology can be put has generally proved a winning 
> proposition. 

Of course, speculation is entirely appropriate - when labeled as such!
But David Wagner gave the impression that he was talking about facts
when he said,

   "The world is moving toward closed digital rights management systems
   where you may need approval to run programs," says David Wagner,
   an assistant professor of computer science at the University of
   California at Berkeley.

Re: Challenge to David Wagner on TCPA

2002-08-01 Thread Eric Murray

On Wed, Jul 31, 2002 at 11:45:35PM -0700, AARG! Anonymous wrote:
> Peter Trei writes:
> > AARG!, our anonymous Pangloss, is strictly correct - Wagner should have
> > said "could" rather than "would".
> 
> So TCPA and Palladium "could" restrict which software you could run.

TCPA (when it isn't turned off) WILL restrict the software that you
can run.  Software that has an invalid or missing signature won't be
able to access "sensitive data"[1].   Meaning that unapproved software
won't work.  Ok, technically it will run but can't access the data,
but that is a very fine hair to split, and depending on the nature of
the data that it can't access, it may not, in truth, be able to run at all.

If TCPA allows all software to run, it defeats its purpose.
Therefore Wagner's statement is logically correct.


Yes, the spec says that it can be turned off.  At that point you
can run anything that doesn't need any of the protected data or
other TCPA services.   But, why would a software vendor that wants
the protection that TCPA provides allow his software to run
without TCPA as well, abandoning those protections?
I doubt many would do so; the majority of TCPA-enabled
software will be TCPA-only.  Perhaps not at first, but eventually,
when there are enough TCPA machines out there.  More likely, spiffy
new content and features will be enabled if one has TCPA and is
properly authenticated, and disabled otherwise.  But as we have seen
time after time, today's spiffy new content is tomorrow's
virtual standard.

This will require the majority of people to run with TCPA turned on
if they want the content.  TCPA doesn't need to be required by law,
the market will require it.  At some point, running without TCPA
will be as difficult as avoiding MS software in an otherwise all-MS
office: theoretically possible, but difficult in practice.

"TCPA could be required" by the government or MS is, I agree, a red
herring.  It is not outside the realm of possibility; in fact I'd bet
that someone at MS has seriously thought through the implications.  But
to my mind the "requirement by de facto standard" scenario I outline
above is much more likely; in fact it is certain to happen if TCPA
gets into more than, say, 50% of computers.

I worked for a short while on a very early version of TCPA with Geoff
Strongin from AMD.  We were both concerned that TCPA not be able to
be used to restrict users' freedom, and at the time I thought that
"you can always turn it off" was good enough.  Now I'm not so sure.
If someday all the stuff that you do with your computer touches data that can
only be operated on by TCPA-enabled software, what are you going to do?

BTW, what are your credentials?  You seem familiar with the TCPA spec, which
is no mean feat considering that it seems to have been written to
make it as difficult to understand as possible (or perhaps someone
hired an out-of-work ISO standards writer).  I think that Peter's
guess is spot on.  Of course having you participate as a nym
is much preferable to not having you participate at all, so don't
feel as though you have to out yourself or stop posting.


[1] TCPAmain_20v1_1a.pdf, section 2.2


Eric




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread R. Hirschfeld

> From: "James A. Donald" <[EMAIL PROTECTED]>
> Date: Tue, 30 Jul 2002 20:51:24 -0700

> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> > both Palladium and TCPA deny that they are designed to restrict 
> > what applications you run.  The TPM FAQ at 
> > http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
> > 
> 
> They deny that intent, but physically they have that capability. 

To make their denial credible, they could give the owner access to the
private key of the TPM/SCP.  But somehow I don't think that jibes with
their agenda.

If I buy a lock I expect that by demonstrating ownership I can get a
replacement key or have a locksmith legally open it.




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread Eric Murray

On Thu, Aug 01, 2002 at 02:33:43PM -0700, James A. Donald wrote:

> According to Microsoft, the end user can turn the palladium 
> hardware off, and the computer will still boot.  As long as that 
> is true, it is an end user option and no one can object.
> 
> But this is not what the content providers want.  They want that 
> if you disable the Fritz chip, the computer does not boot.  What 
> they want is that it shall be illegal to sell a computer capable 
> of booting if the Fritz chip is disabled.

Nope.  They care that the Fritz chip is enabled whenever
their content is played.  There's no need to make it a legal
requirement if the market makes it a practical requirement.
The Linux folks just won't be able to watch the latest
Maria Lopez or Jennifer Carey DVDs.  But who cares about a few
geeks?  Only weirdos install alternative OSs anyhow; they can be
ignored.  Most of them will probably have second systems
with the Fritz chip enabled anyhow.

Eric




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread James A. Donald

--
On 31 Jul 2002 at 23:45, AARG! Anonymous wrote:
> So TCPA and Palladium "could" restrict which software you could 
> run. They aren't designed to do so, but the design could be 
> changed and restrictions added.

Their design, and the institutions and software to be designed 
around them, is disturbingly similar to what would be needed to 
restrict what software we could run.  TCPA institutions and 
infrastructure are much the same as SSSCA institutions and 
infrastructure.

According to Microsoft, the end user can turn the palladium 
hardware off, and the computer will still boot.  As long as that 
is true, it is an end user option and no one can object.

But this is not what the content providers want.  They want that 
if you disable the Fritz chip, the computer does not boot.  What 
they want is that it shall be illegal to sell a computer capable 
of booting if the Fritz chip is disabled.

If I have to give superroot powers to Joe in order to run Joe's 
software or play Joe's content, fair enough.  But the hardware and 
institutions to implement this are disturbingly similar to the 
hardware and institutions needed to implement the rule that I have 
to give superroot powers to Joe in order to play Peter's software 
or content.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 FQhKMpDHys7gyFWenHCK9p7+Xfh1DwpaqGKcztxk
 20jFdJDiigV/b1fmHBudici59omqc/Ze0zXBVvQLk




RE: Challenge to David Wagner on TCPA

2002-08-01 Thread Trei, Peter

I'm going to respond to AARGH!, our new Sternlight, by asking two questions.

1. Why can't I control what signing keys the Fritz chip trusts? 

If the point of TCPA is to make it so *I* can trust *my* computer 
to run the software *I* have approved, and refuse to run something 
which a virus or Trojan has modified (and this, btw, is the stated 
intention of TCPA), then why the hell don't I have full control over 
the keys? If I did, the thing might actually work to my benefit.

The beneficiary of TCPA when I don't have ultimate root control is
not I. It is someone else. That is not an acceptable situation.

2. It's really curious that Mr. AARGH! has shown up simultaneously
on the lists and on sci.crypt, with the single brief of supporting TCPA.

While I totally support his or her right to post anonymously, I can only
speculate that anonymity is being used to disguise some vested 
interest in supporting TCPA. In other words, I infer that Mr. AARGH!
is a TCPA insider, who is embarrassed to reveal himself in public.

So my question is: What is your reason for shielding your identity?
You do so at the cost of people assuming the worst about your
motives.

Peter Trei

 PS: Speculating about the most tyrannical uses to which 
a technology can be put has generally proved a winning 
proposition. 




Re: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

Peter Trei writes:
> AARG!, our anonymous Pangloss, is strictly correct - Wagner should have
> said "could" rather than "would".

So TCPA and Palladium "could" restrict which software you could run.
They aren't designed to do so, but the design could be changed and
restrictions added.

But you could make the same charge about any software!  The Mac OS could
be changed to restrict what software you can run.  Does that mean that
we should all stop using Macs, and attack them for something that they
are not doing and haven't said they would do?

The point is, we should look critically at proposals like TCPA and
Palladium, but our criticisms should be based in fact and not fantasy.
Saying that they could do something or they might do something is a much
weaker argument than saying that they will have certain bad effects.
The point of the current discussion is to improve the quality of the
criticism which has been directed at these proposals.  Raising a bunch
of red herrings is not only a shameful and dishonest way to conduct the
dispute, it could backfire if people come to realize that the system
does not actually behave as the critics have claimed.

Peter Fairbrother made a similar point:

> The wise general will plan his defences according to his opponent's
> capabilities, not according to his opponent's avowed intentions.

Fine, but note that at least TCPA as currently designed does not have this
specific capability of keeping some software from booting and running.
Granted, the system could be changed to allow only certain kinds of
software to boot, just as similar changes could be made to any OS or
boot loader in existence.

Back to Peter Trei (and again, Peter Fairbrother echoed his concern):

> However, TCPA and Palladium fall into a class of technologies with a
> tremendous potential for abuse. Since the trust model is directed against
> the computer's owner (he can't sign code as trusted, or reliably control 
> which signing keys are trusted), he has ceded ultimate control of what 
> he can and can't do with his computer to another. 

Under TCPA, he can do everything with his computer that he can do today,
even if the system is not turned off.  What he can't do is to use the
new TCPA features, like attestation or sealed storage, in such a way as
to violate the security design of those systems (assuming of course that
the design is sound and well implemented).  This is no more a matter of
turning over control of his computer than is using an X.509 certificate
issued by a CA to prove his identity.  He can't violate the security of
the X.509 cert.  He isn't forced to use it, but if he does, he can't
forge a different identity.  This is analogous to how the attestation
features of TCPA work.  He doesn't have to use it, but if he wants to
prove what software he booted, he doesn't have the ability to forge the
data and lie about it.

> Sure, TCPA can be switched off - until that switch is disabled. It 
> could potentially be permanently disabled by a BIOS update, a 
> security patch, a commercial program which carries signed 
> disabling code as a Trojan, or over the net through a backdoor or 
> vulnerability in any networked software. Or by Congress 
> which could make running a TCPA capable machine with TCPA 
> turned off illegal.

This is why the original "Challenge" asked for specific features in the
TCPA spec which could provide this claimed functionality.  Even if TCPA
is somehow kept turned on, it will not stop any software from booting.

Now, you might say that they can then further change the TCPA so that
it *does* stop uncertified software from booting.  Sure, they could.
But you know what?  They could do that without the TCPA hardware.
They could put in a BIOS that had a cert in it and only signed OS's could
boot.  That's not what TCPA does, and it's nothing like how it works.
A system like this would be a very restricted machine and you might
justifiably complain if the manufacturer tried to make you buy one.
But why criticize TCPA for this very different functionality, which
doesn't use the TCPA hardware, the TCPA design, and the TCPA API?

> With TCPA, I now have to trust that a powerful third party, over
> which I have no control, and which does not necessarily have
> my interests at heart, will not abuse its power. I don't
> want to have to do that.

How could this be true, when there are no features in the TCPA design
to allow this powerful third party to restrict your use of your computer
in any way?

(By the way, does anyone know why these messages are appearing on
cypherpunks but not on the [EMAIL PROTECTED] mailing list,
when the responses to them show up in both places?  Does the moderator of
the cryptography list object to anonymous messages?  Or does he think the
quality of them is so bad that they don't deserve to appear?  Or perhaps
it is a technical problem, that the anonymous email can't be delivered
to his address?  If someone replies to this message, please include this
fin

Re: Challenge to David Wagner on TCPA

2002-08-01 Thread AARG! Anonymous

James Donald writes:
> TCPA and Palladium give someone else super root privileges on my
> machine, and TAKE THOSE PRIVILEGES AWAY FROM ME.  All claims that
> they will not do this are not claims that they will not do this,
> but are merely claims that the possessor of super root privilege
> on my machine is going to be a very very nice guy, unlike my
> wickedly piratical and incompetently trojan horse running self.

What would be an example of a privilege that you fear would be taken
away from you with TCPA?  It will boot any software that you want, and
can provide a signed attestation of a hash of what you booted.  Are you
upset because you can't force the chip to lie about what you booted?
Of course they could have designed the chip to allow you to do that, but
then the functionality would be useless to everyone; a chip which could
be made to lie about its measurements might as well not exist, right?




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread xganon

On Wed, 31 Jul 2002 16:10:26 +0100, you wrote:
>
> On Wednesday, July 31, 2002, at 04:51  am, James A. Donald wrote:
> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> both Palladium and TCPA deny that they are designed to restrict
> what applications you run.  The TPM FAQ at
> http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
> 
>
> They deny that intent, but physically they have that capability.
>
> And all kitchen knives are murder weapons.

TCPA and Palladium can be forced to restrict 90%+ of all applications
at the whim of Bill Gates, or the United States government.

Kitchen knives, on the other hand, are under non-uniform, widely
distributed, uncoordinated control.  Even Bill Gates and the United
States government acting in concert cannot make all knives become
murder weapons, nor make all knives become non-murder weapons.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread James A. Donald

--
29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> > > both Palladium and TCPA deny that they are designed to
> > > restrict what applications you run.

James A. Donald:
> > They deny that intent, but physically they have that
> > capability.

 On 31 Jul 2002 at 16:10, Nicko van Someren wrote:
> And all kitchen knives are murder weapons.

No problem if I also have a kitchen knife.

TCPA and Palladium give someone else super root privileges on my
machine, and TAKE THOSE PRIVILEGES AWAY FROM ME.  All claims that
they will not do this are not claims that they will not do this,
but are merely claims that the possessor of super root privilege
on my machine is going to be a very very nice guy, unlike my
wickedly piratical and incompetently trojan horse running self.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 XQHdtzqDInBFsDcorfDvqJYRHTRhEBsM9eMJIH+w
 2+o4WjsTSV8RDUO7k3c71T9v9JQKwZGZC54BqW6DQ




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Peter Fairbrother

> AARG! Anonymous wrote:

> James Donald wrote:
>> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
>>> both Palladium and TCPA deny that they are designed to restrict
>>> what applications you run.  The TPM FAQ at
>>> http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
>> 
>> They deny that intent, but physically they have that capability.
> 
> Maybe, but the point is whether the architectural spec includes that
> capability.  After all, any OS could restrict what applications you
> run; you don't need special hardware for that.  The question is whether
> restrictions on software are part of the design spec.  You should be
> able to point to something in the TCPA spec that would restrict or limit
> software, if that is the case.
> 
> Or do you think that when David Wagner said, "Both Palladium and TCPA
> incorporate features that would restrict what applications you could run,"
> he meant "that *could* restrict what applications you run"?  They *could*
> impose restrictions, just like any OS could impose restrictions.
> 
> But to say that they *would* impose restrictions is a stronger
> statement, don't you think?  If you claim that an architecture would
> impose restrictions, shouldn't you be able to point to somewhere in the
> design document where it explains how this would occur?
> 
> There's an enormous amount of information in the TCPA spec about how to
> measure the code which is going to be run, and to report those measurement
> results so third parties can know what code is running.  But there's not
> one word about preventing software from running based on the measurements.
> 

The wise general will plan his defences according to his opponent's
capabilities, not according to his opponent's avowed intentions.

However, in this case the intention to attack with all available weapons has
not been well hidden. There may be some dupes who honestly profess that no
attack is planned, and some naifs who cannot or will not see the wood, but
they will reap the whirlwind.

My humble opinion,

-- Peter Fairbrother




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Declan McCullagh

I imagine there's a world of difference between "will" and "would."

-Declan


On Mon, Jul 29, 2002 at 03:35:32PM -0700, AARG! Anonymous wrote:
> Can you find anything in this spec that would do what David Wagner says
> above, restrict what applications you could run?  Despite studying this
> spec for many hours, no such feature has been found.
> 
> So here is the challenge to David Wagner, a well known and justifiably
> respected computer security expert: find language in the TCPA spec to
> back up your claim above, that TCPA will restrict what applications
> you can run.  Either that, or withdraw the claim, and try to get Declan
> McCullagh to issue a correction.  (Good luck with that!)
> 
> And if you want, you can get Ross Anderson to help you.  His reports are
> full of claims about Palladium and TCPA which seem equally unsupported
> by the facts.  When pressed, he claims secret knowledge.  Hopefully David
> Wagner will have too much self-respect to fall back on such a convenient
> excuse.




RE: Challenge to David Wagner on TCPA

2002-07-31 Thread Trei, Peter

> AARG! Anonymous[SMTP:[EMAIL PROTECTED]] writes:
> Declan McCullagh writes at
> http://zdnet.com.com/2100-1107-946890.html:
> 
>"The world is moving toward closed digital rights management systems
>where you may need approval to run programs," says David Wagner,
>an assistant professor of computer science at the University of
>California at Berkeley.  "Both Palladium and TCPA incorporate features
>that would restrict what applications you could run."
> 
> But both Palladium and TCPA deny that they are designed to restrict what
> applications you run. 
> 
[...]

> So here is the challenge to David Wagner, a well known and justifiably
> respected computer security expert: find language in the TCPA spec to
> back up your claim above, that TCPA will restrict what applications
> you can run.  
> 
AARG!, our anonymous Pangloss, is strictly correct - Wagner should have
said "could" rather than "would".

However, TCPA and Palladium fall into a class of technologies with a
tremendous potential for abuse. Since the trust model is directed against
the computer's owner (he can't sign code as trusted, or reliably control 
which signing keys are trusted), he has ceded ultimate control of what 
he can and can't do with his computer to another. 

Sure, TCPA can be switched off - until that switch is disabled. It 
could potentially be permanently disabled by a BIOS update, a 
security patch, a commercial program which carries signed 
disabling code as a Trojan, or over the net through a backdoor or 
vulnerability in any networked software. Or by Congress 
which could make running a TCPA capable machine with TCPA 
turned off illegal.

With TCPA, I now have to trust that a powerful third party, over
which I have no control, and which does not necessarily have
my interests at heart, will not abuse its power. I don't
want to have to do that.

Peter Trei
Disclaimer: The above represents my personal opinion only. Do
not misconstrue it as representing anyone else's.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Nicko van Someren

On Wednesday, July 31, 2002, at 04:51  am, James A. Donald wrote:
> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
>> both Palladium and TCPA deny that they are designed to restrict
>> what applications you run.  The TPM FAQ at
>> http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
>> 
>
> They deny that intent, but physically they have that capability.

And all kitchen knives are murder weapons.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread Jay Sulzberger

On Tue, 30 Jul 2002, James A. Donald wrote:

> --
>
>
> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> > both Palladium and TCPA deny that they are designed to restrict
> > what applications you run.  The TPM FAQ at
> > http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
> > 
>
> They deny that intent, but physically they have that capability.
>
> --digsig
>  James A. Donald

If they do not restrict what programs I may run, then presumably, under
TCPA, I might run a cracking program on an encrypted file I obtained via
TCPA handshake+transmittal?

The claim that TCPA, Palladium, etc. do not give root to the
Englobulators is, on its face, ridiculous.  Their main design criterion
is to do so.

oo--JS.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread AARG! Anonymous

James Donald wrote:
> On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> > both Palladium and TCPA deny that they are designed to restrict 
> > what applications you run.  The TPM FAQ at 
> > http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
>
> They deny that intent, but physically they have that capability. 

Maybe, but the point is whether the architectural spec includes that
capability.  After all, any OS could restrict what applications you
run; you don't need special hardware for that.  The question is whether
restrictions on software are part of the design spec.  You should be
able to point to something in the TCPA spec that would restrict or limit
software, if that is the case.

Or do you think that when David Wagner said, "Both Palladium and TCPA
incorporate features that would restrict what applications you could run,"
he meant "that *could* restrict what applications you run"?  They *could*
impose restrictions, just like any OS could impose restrictions.

But to say that they *would* impose restrictions is a stronger
statement, don't you think?  If you claim that an architecture would
impose restrictions, shouldn't you be able to point to somewhere in the
design document where it explains how this would occur?

There's an enormous amount of information in the TCPA spec about how to
measure the code which is going to be run, and to report those measurement
results so third parties can know what code is running.  But there's not
one word about preventing software from running based on the measurements.




Re: Challenge to David Wagner on TCPA

2002-07-31 Thread James A. Donald

--


On 29 Jul 2002 at 15:35, AARG! Anonymous wrote:
> both Palladium and TCPA deny that they are designed to restrict 
> what applications you run.  The TPM FAQ at 
> http://www.trustedcomputing.org/docs/TPM_QA_071802.pdf reads
> 

They deny that intent, but physically they have that capability. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 ElmZA5NX6jAmhPu1EDT8Zl7D+IeQTSI/z1oo4lSn
 2qoSIC6KSr2LFLWyxZEETG/27dEy3yOWEnRtXzHy9