TCPA/MS

2002-06-30 Thread Dave Howe

Phil Youngblood posted the following to the securecomp server - thought
it might interest people here, given the recent discussion of M$'s DRM
stuff...
--
This from the EULA for the latest Windows Media Player patch.

* Digital Rights Management (Security).  You agree that in order to
protect the integrity of content and software protected by digital rights
management (Secure Content), Microsoft may provide security related
updates to the OS Components that will be automatically downloaded onto
your computer.  These security related updates may disable your ability
to copy and/or play Secure Content and use other software on your computer.
If we provide such a security update, we will use reasonable efforts to
post notices on a web site explaining the update.




Re: maximize best case, worst case, or average case? (TCPA)

2002-06-30 Thread Ryan Lackey

I think dongles (and non-copyable floppies) have been around since the early 
80s at least... maybe the 70s.  Tamper-resistant CPU modules have been around 
since the ATM network, I believe, in the form of PIN processors stored
inside safes.

The fundamental difference between a dongle and a full trusted module 
containing the critical application code is that with a dongle, you can
just patch the application to skip over the checks (although the checks
can be repeated and made relatively arcane).

If the whole application, or at least the non-cloneable parts of the 
application, exist in a sealed module, the rest of the application can't
be patched to just skip over this code.
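The asymmetry is easy to see in miniature. A toy sketch (the function names and the stand-in hardware probe are invented for illustration) of why a check in attacker-controlled code is weak:

```python
# Toy sketch: a dongle check is just a branch inside code the user
# fully controls, so the user can replace the check rather than
# attack the hardware.  The probe below is a stand-in.
def dongle_present():
    return False  # real hardware probe elided

def run_app():
    if not dongle_present():
        raise SystemExit("no dongle found")
    return "critical computation"

# The "patch": swap the check out from under the application.
dongle_present = lambda: True
assert run_app() == "critical computation"
```

If the critical computation itself lived inside a sealed module rather than in `run_app`, there would be nothing useful left to patch.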

Another option for this is a client-server or oracle model, where the really 
sensitive pieces (say, a magic algorithm for finding oil from GIS data,
or a good natural language processor) are stored on centrally located,
vendor-controlled hardware, with only the UI executing on the end user's 
machine.
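A minimal sketch of that split (the oracle function, its threshold, and the sample values are all invented placeholders):

```python
# Oracle model: the sensitive algorithm lives only on the vendor's
# centrally located hardware; the client ships inputs over the wire
# and renders results, so no secret code exists client-side to copy
# or patch.
def vendor_oracle(gis_samples):
    # stand-in for the proprietary oil-finding analysis
    return sum(gis_samples) / len(gis_samples) > 3.0

def client_ui(samples, call_remote):
    # the only code on the end user's machine: input and display
    return "promising site" if call_remote(samples) else "dry"

assert client_ui([2.5, 4.5, 3.5], vendor_oracle) == "promising site"
```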

What I'd really like is a design which accomplishes the good parts of TCPA,
ensuring that when code claims to be executing in a certain form, it really is,
and providing a way to guarantee this remotely -- without making it easy
to implement restrictions on content copying.  It would be nice to have the
good parts of TCPA, and given the resistance to DRM, if security and TCPA 
have their fates bound, they'll probably both die an extended and painful 
death.

I suppose the real difference between a crypto-specific module and a general 
purpose module is how much of the UI is within the trusted platform envelope.
If the module is only used for handling cryptographic keys, as an addition to
an insecure general purpose CPU, with no user I/O, it seems unlikely to be
useful for DRM.  If the entire machine is inside the envelope, it seems 
obviously useful for DRM, and DRM would likely be the dominant application.
If only a limited user I/O is included in the envelope, sufficient for
user authentication and keying, and for loading initially-trusted code
onto the general purpose CPU, but where the user can still run arbitrary,
even uncertified, code on the general purpose CPU alongside the certified
module, it's not really useful for DRM, but still useful for the non-DRM
security applications which are the alleged purpose behind TCPA.

(given that text piracy doesn't seem to be a serious commercial concern,
simply keeping video and audio playback and network communications outside 
the TCPA envelope entirely is good enough in practice... this way, both 
authentication and keying can be done in text mode, and document 
distribution control, privacy of records, etc. can be accomplished, provided 
there is ALSO the ability to do arbitrary text processing and computing 
outside the trusted envelope.)

If it's the user's own data being protected, you don't need to worry about 
the user intentionally circumventing the protections.  Any design which
removes control from the 'superuser' of the machine is fundamentally about
protecting someone other than the user.

This, I think, is the difference between TCPA and smartcards.  Notice
which one has in its short lifetime attracted far more enmity :)


Quoting [EMAIL PROTECTED] [EMAIL PROTECTED]:
 
 
 I remember looking at the possibility of adding a tamper-resistant hardware
 chip to PCs back in the 83 or 84 time frame (aka the TCPA idea for PCs is going
 on at least 20 years old now).  It was the first time I ran into embedding a
 chip in a metal case that would create an electrical discharge, frying the chip
 if the container was breached.
 
 Remember when applications came with their own copy-protection floppy
 disks?  It was possible to build up a library of such disks,
 requiring all sorts of remove, search, insert ... when switching from one
 application to another. They eventually disappeared ... but imagine if they
 had survived into the multitasking era ... when it would have been
 necessary to have multiple different copy protection floppy disks crammed
 into the same drive at the same time. The chip was supposed to provide an
 analog to the CPU serial number used for licensing software on mainframes
 ... dating at least from the original IBM 370s (store cpuid hardware
 instruction).
 
 Some of the higher-end applications still do that with some form of dongle
 (originally in the serial port) that comes with the application ... it
 doesn't quite have the downside of trying to cram multiple floppies into
 the same drive concurrently; the serial port dongles allow them to be
 inline cascaded ... and in theory the serial port can still be used for
 other purposes at the same time.
 
 i believe that there is some statistic some place about how the UK and the US
 are really great ... that in those two countries copyright piracy is
 estimated to be only 50 percent.

-- 
Ryan Lackey [RL7618 RL5931-RIPE][EMAIL PROTECTED]
CTO and Co-founder, HavenCo Ltd.+44 7970 633 

Re: Ross's TCPA paper

2002-06-30 Thread Barney Wolff

A pseudonym that I can give up at will and that can never afterwards
be traced to me is equivalent to an anonym.

I'm not suggesting that anonymity be outlawed, or that every merchant
be required to reject anonymous or pseudonymous customers.  All I'm
suggesting is that small merchants MUST NOT be required to accept
such customers.

On Sun, Jun 30, 2002 at 08:38:29AM -0700, bear wrote:
 
 On Sun, 30 Jun 2002, Barney Wolff wrote:
 
 The trouble I have with this is that I'm not only a consumer, I'm
 also a merchant, selling my own professional services.  And I just
 will not, ever, perform services for an anonymous client.  That's
 my choice, and the gov't will take it away only when they can pry
 it from my cold dead fingers. :)
 
 Are you one of those who makes no distinction between anonymity
 and pseudonymity?  'Cause I've been talking about pseudonymity,
 and all your answers have been talking about anonymity.
 
   Bear

-- 
Barney Wolff
I never met a computer I didn't like.




Re: maximize best case, worst case, or average case? (TCPA)

2002-06-30 Thread lynn . wheeler

security modules are also inside the swipe & pin-entry boxes that you see
at check-out counters.

effectively both smartcards and dongles are forms of hardware tokens ...
the issue would be whether a smartcard form factor might be utilized in a
copy protection scheme similar to the TCPA paradigm ... a single hardware chip
that you register for all your applications ... or in the dongle paradigm
... you get a different smartcard for each application (with the downside
of the floppy copy protection scenario, where a user with a half dozen
active copy-protected applications all wanted their smartcards crammed
into the same smartcard reader simultaneously).

many of the current chipcards ... i believe ... are used in the magnetic
stripe swipe mode for authenticating specific transactions ... most of
the rest are used as a password substitute at login type events. Many of the
chipcards following the straight payment card model result in the end-user
having a large number of different institutional tokens (similar to the
floppy copy protect paradigm).  Following the institutional-specific and/or
application-specific token paradigm starts to become difficult to manage as
the number of tokens increases and the probability that multiple are
required simultaneously increases.

That eventually leads into some sort of person-centric or device-centric
paradigm ... not so much an issue of the form factor (floppy, chipcard,
dongle, etc) ... but an issue of whether there are potentially large
numbers of institutional/application specific objects or small numbers of
person/device specific objects.

So a simple issue is the trade-off between institutional/application
specific objects ... which seem to have some amount of acceptance (payment
cards, chip cards, various dongle forms, etc) but in many instances can
scale poorly ... especially if multiple different such objects have to be
available concurrently ... vis-a-vis switching to a person/device specific
object paradigm (chipcard, dongles, etc, potentially exactly the same
form factor but a different paradigm).




videotaping = liar & cheat?

2002-06-30 Thread Major Variola (ret)

At 08:16 PM 6/29/02 +0200, Anonymous wrote:
When an artist releases a song or some other creative product to the
world, they typically put some conditions on it.  If you want to listen
to and enjoy the song, you are obligated to agree to those conditions.
If you can't accept the conditions, you shouldn't take the creative
work.

The artist is under no obligation to release their work.  It is like a
gift to the world.  They are free to put whatever conditions they like
on that gift, and you are free to accept them or not.

If you take the gift, you are agreeing to the conditions.  If you then
violate the stated conditions, such as by sharing the song with others,
you are breaking your agreement.  You become a liar and a cheat.

First, what's your point?  This list does not require that participants
agree with anyone else's sense of ethics.  This list often considers the
effect of tech on civilization, but you are not required to endorse
(or recognize, or scorn) civilization.  This list often discusses certain
ethics by themselves, but nothing is taken for granted, and the timid/naif
may be a little frightened by this.

Second, it is quite clear, even to contract-law/laissez-faire types like
myself, that some DRM-interested companies are attempting to use the law
to remove some rights from consumers (about gadgets and bits and RF).
Many of us have a maniacally dim view of such manipulation.

Third, if you don't understand why some people are driven
to understand technology, you should probably go back to your TV.


 When someone makes you an offer and you don't find the terms
acceptable, you simply refuse.  You don't take advantage by taking what
they provide and refusing to do your part.

And you don't sue someone for what users of their product do.




Re: Ross's TCPA paper

2002-06-30 Thread bear

On Sun, 30 Jun 2002, Barney Wolff wrote:

The trouble I have with this is that I'm not only a consumer, I'm
also a merchant, selling my own professional services.  And I just
will not, ever, perform services for an anonymous client.  That's
my choice, and the gov't will take it away only when they can pry
it from my cold dead fingers. :)

Are you one of those who makes no distinction between anonymity
and pseudonymity?  'Cause I've been talking about pseudonymity,
and all your answers have been talking about anonymity.

Bear




Re: Diffie-Hellman and MITM

2002-06-30 Thread Marcel Popescu

From: gfgs pedo [EMAIL PROTECTED]

 One solution suggested against the man in the middle
 attack is using the interlock protocol

This is the one I vaguely recalled, thank you.

 All Mallory would have to do is send his half of the
 (n-th) packet when he receives the half of the (n+1)-th
 packet, since the 1st packet was faked by Mallory.

Interesting attack... assuming that a one-block delay doesn't look
suspicious.

What if every message except the very first one has a hash of the previously
received message?

A -> (M ->) B: half 1 of message A1
B -> (M ->) A: half 1 of message B1 | hash (half 1 of message A1)
A -> (M ->) B: half 2 of message A1 | hash (half 1 of message B1)
B -> (M ->) A: half 2 of message B1 | hash (half 2 of message A1)
A -> (M ->) B: half 1 of message A2 | hash (half 2 of message B1)
... and so on

Nah... won't work; since M captures A1 and B1, he can compute the hashes for
both the initial bogus message and the (delayed) genuine ones. Same if they
try hashing all the previous messages.

What if they send the hash of the *other* half? (The program splitting the
messages already has the full ones.)

A -> (M ->) B: half 1 of message A1 | hash (half 2 of message A1)
B -> (M ->) A: half 1 of message B1 | hash (half 2 of message B1)
A -> (M ->) B: half 2 of message A1 | hash (half 1 of message A1)
B -> (M ->) A: half 2 of message B1 | hash (half 1 of message B1)
... and so on

Nope, no good... M fakes the first message in both directions, and then he
always has a good one, so he can compute the hashes.

The only thing that might, as far as I can see, succeed (with a high
probability) would be for everyone to hash the *next* half - meaning that,
together with half 2 of message N, there will be the hash of half one of
message N + 1. However, I don't see how this would be possible for an
interactive communication...
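Mark's objection can be checked mechanically. A small sketch (the hash truncation and message contents are arbitrary choices for illustration) showing that once Mallory holds both complete first messages, every "hash of the previously received half" tag is computable by him:

```python
import hashlib

def tag(half: bytes) -> str:
    # short hash tag carried alongside each protocol half
    return hashlib.sha256(half).hexdigest()[:8]

# Mallory delays traffic by one half-message, so by the time he
# must forward A's half 2 of message A1, he has already captured
# B's half 1 of message B1 in full ...
a1_half2 = b"A1, second half"
b1_half1 = b"B1, first half"

# ... and the scheme asks that half to carry hash(half 1 of B1),
# which Mallory simply recomputes over what he captured:
forged = (a1_half2, tag(b1_half1))
assert forged[1] == tag(b1_half1)  # verifies at B; no protection added
```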

Thanks,
Mark





Re: Ross's TCPA paper

2002-06-30 Thread Barney Wolff

On Sat, Jun 29, 2002 at 10:03:33PM -0700, bear wrote:
 ...
 I won't give up the right NOT to do business with anonymous customers,
 or anyone else with whom I choose not to do business.
 
 A few years ago merchants were equally adamant and believed
 equally in the rightness of maintaining their right to not
 do business with blacks, chicanos, irish, and women.  It'll
 pass as people wake up and smell the coffee.  Unfortunately
 that won't be until after at least a decade of really vicious
 abuses of private data by merchants who believe in their
 god-given right to snoop on their customers.

The trouble I have with this is that I'm not only a consumer, I'm
also a merchant, selling my own professional services.  And I just
will not, ever, perform services for an anonymous client.  That's
my choice, and the gov't will take it away only when they can pry
it from my cold dead fingers. :)  It's not that I hate my govt,
although I liked it a whole lot better before 1/20/01, but I will
not risk aiding and abetting criminality, even if I can pretend I
don't know I'm doing it.

Oh by the way, last time you visited your favorite kinky sex shop,
didn't you notice the surveillance camera in the corner?  And didn't
you see the cashier at your ${house_of_worship} last ${sabbath}?

The right to anonymity seems to be a new one, not a traditional one
that we're about to lose.  It may be a needed defense against the
ever-increasing ability to correlate data.  All I'm really railing
against is the notion that just because I'm selling something I MUST
accept your anonymity.

 ...
 I don't see any way that DRM addresses the privacy concern
 of database linking.  Especially since I expect database
 linking to be done using specialized software that doesn't
 have to get inspected by anybody with a motive to prevent it,

I certainly agree that DRM cannot protect privacy violation by a
user with access rights.

The whole issue of database correlation and anonymity was insightfully
explored by Heinlein in The Moon is a Harsh Mistress in 1966.

-- 
Barney Wolff
I never met a computer I didn't like.




Re: Ross's TCPA paper

2002-06-30 Thread bear

On Sun, 30 Jun 2002, Barney Wolff wrote:

A pseudonym that I can give up at will and that can never afterwards
be traced to me is equivalent to an anonym.

Actually, I don't have a problem with it being traced afterwards,
if a crime has been committed and there's a search warrant or
equivalent to trace it in order to further the investigation of
a specific crime.  And that's a pseudonym, not anonymity.

My problem is that if merchants' information is easily linkable,
or if several merchants have access to the same linkable field,
then privacy is out the window.  It's reasonable for a merchant
to know every deal I've ever done with him (pseudonymity).  It's
not reasonable for a merchant to know nothing at all about my
past dealings with anyone including himself (anonymity) nor for
a merchant to know every deal I've done in my life, with everyone
(marketing databases based on linkable ID's).

Ray




Re: maximize best case, worst case, or average case? (TCPA)

2002-06-30 Thread Ryan Lackey

Quoting [EMAIL PROTECTED] [EMAIL PROTECTED]:
 
 security modules are also inside the swipe & pin-entry boxes that you see
 at check-out counters.

Yep -- anything which handles PINs, specifically, and some non-ATM smartcard
payment systems.
 
 effectively both smartcards and dongles are forms of hardware tokens ...
 the issue would be whether a smartcard form factor might be utilized in a
 copy protection scheme similar to the TCPA paradigm ... a single hardware chip
 that you register for all your applications ... or in the dongle paradigm
 ... you get a different smartcard for each application (with the downside
 of the floppy copy protection scenario, where a user with a half dozen
 active copy-protected applications all wanted their smartcards crammed
 into the same smartcard reader simultaneously).

From a DRM perspective, any system which doesn't put the entire digital stream
and all convenient analog streams inside the trusted, tamperproof boundary
is probably highly imperfect, perhaps to the point where it's really just
a speedbump, no more effective than popping up a dialog box saying please
don't pirate this software with a click-through EULA.

A concrete example is the DVD.  RPC 1 allowed raw access to the encrypted
data; the encryption could be broken through several techniques (disassembly
of software players to recover keys, or as happened, vulnerabilities in the
algorithm).

Then they came out with RPC 2.  Implementation is highly imperfect (for a 
variety of reasons), but in theory, this renders the whole DeCSS issue 
relatively dead -- the drive itself will refuse to output a bitstream of any
kind if the region coding is wrong.

RPC 2 can, in theory, prevent the playback of media on drives without the
right region code.  It doesn't, however, prevent grabbing the bitstream off
a licensed DVD in a correct-region player, turning that into a DivX, and 
distributing it widely.

Any system which uses a tamper-resistant envelope which doesn't encompass the
entire digital playback stream will end up with this same vulnerability.  It
deters casual defeat of the DRM system -- you need to specifically seek
out a pirate copy of the movie in the first place, rather than buying a grey
market import.

In addition, there is the analog hole; even if the digital bitstream is
protected fully, any high-quality analog output can be re-digitized and
turned into a fairly acceptable version.   People even go so far as to 
do telecine of a kind, aiming a video camera at the screen in a theater.

If it is possible for the underground to distribute a worthwhile copy some
hours or days after initial release, any system with digital or analog hole
will suffer.  This is why, for instance, movies are widely divxed or 
illegally VCD'd; movies are still worth seeing a few hours after the first
copies hit the distributors and reviewers (still a few weeks or months ahead
of public release).  However, a live event on pay per view, like a boxing
match or world cup, is much less widely pirated in divx form; even if you can
get a good digital or analog copy of it after the event, who wants to watch it
then?

I think this means, given a constant level of piracy and limitations on DRM,
there is a market incentive to do live and simultaneous global media events,
vs. things which are watchable later for roughly the same value.  Also, 
streaming p2p systems or pirate networks are far easier to detect and shut down
than systems with high inbuilt latency.

If content providers shifted their business model to emphasize these 
ephemeral forms of content, rather than things with lasting value,
they would be able to avoid problems with piracy simply by going after
very large, centralized real-time distributors.  This is ultimately 
far more cost effective and politically viable than trying to lock every
device in the world down.  I think there is already a marketing focus on
making events out of the release of even durable forms of content --
book launches, movie premieres, etc. -- in the future, perhaps, this
initial event will be the source of the majority of revenue, with residuals
after that event wrapped up in the form of service fees for access to 
an unlimited library.  After all, isn't going to an event like Woodstock
worth far more to the average user than a complete audio/video record
of the event after the fact?

 

Re: Uplifting Brin

2002-06-30 Thread R. A. Hettinga

--- begin forwarded text


Status:  U
Date: Sat, 29 Jun 2002 19:59:26 +0200 (CEST)
From: Eugen Leitl [EMAIL PROTECTED]
To: Tom [EMAIL PROTECTED]
cc: R. A. Hettinga [EMAIL PROTECTED], [EMAIL PROTECTED]
Subject: Re: Uplifting Brin

On Sat, 29 Jun 2002, Tom wrote:

 I think Brin has got some cool ideas about opening up the window of
 visibility on the gov but where he falls down is the trust issue.

Brin's ideas only look cool if you think that we're living in a perfect
anarchy, and that there's no intrinsic edge in centralism. Surveillance
technology favours centralism, however, and we all know why anarchy is
unstable with current type of agents, right? So he built the foundations
of his glass castle on a pile of stinky poo.

 Brin forgets the adage that power corrupts and absolute power corrupts
 absolutely.

Brin is confabulating an alternative reality. This reality doesn't give a
flying fuck about what he thinks is going to happen. He's actually doing
damage, because quite a few people find his ideas superficially appealing,
and stagnate in happy complacency. Sheep like the idea of Brinworld.
Goverments love the idea of Brinworld, because it makes selling the
current brand of FUD much easier. Happy sheep don't mind being shorn so
much.

 The Cypherpunks do come off often as being the
 lonegunmen-dressed-in-black -knights-of-the-impossible-cause.but

Dunno, they come off a lot like narcissistic wankers to me. Maybe some of
them are writing code, but they must be damn secretive about it. There are
literally very few people in the world who're good architects and
actually develop stuff.

 that's a very needed role to have in most any society. While .001% of
 the populace is doing that role, they are coming up with the means and
 ways to subvert things should it all get one-way-or-the-highway-esque.

--- end forwarded text


-- 
-
R. A. Hettinga mailto: [EMAIL PROTECTED]
The Internet Bearer Underwriting Corporation http://www.ibuc.com/
44 Farquhar Street, Boston, MA 02131 USA
... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience. -- Edward Gibbon, 'Decline and Fall of the Roman Empire'