Re: Piracy is wrong

2002-07-05 Thread Mikko Särelä

On Thu, 4 Jul 2002 [EMAIL PROTECTED] wrote:
 Let us make a more realistic supposition:
 
 Let us suppose instead he organized an entertainment where a
 lightly clad singer sang and danced, and showed that video on
 television interspersed with advertisements, and I then captured
 that video on my hard disk, deleted the ads, and put it on the
 internet.
 
 In that case, where is my promise?  Doubtless I must have made it
 in the same moment of forgetfulness as I signed the social
 contract. 

Nowadays, nowhere. And that is mostly because of copyright. If there were
no copyright laws, I bet you would have to sign all sorts of things to get
tv channels at home. And yes, it would be quite a pain in the ass to do it
this way 'afterwards', when people already have tv's and expect them to
work without doing anything. 

-- 
Mikko
  One Ring to rule them all,
  One Ring to find them,
  One Ring to bring them all
  And in the Darkness bind them.




Re: Piracy is wrong

2002-07-05 Thread jamesd

--
 On 5 Jul 2002 at 3:10, Nomen Nescio wrote:
 Suppose you know someone who has been working for years on a
 novel. But he lacks confidence in his work and he's never shown
 it to anyone. Finally you persuade him to let you look at a copy
 of his manuscript, but he makes you promise not to show any of
 it to anyone else.

 Hopefully it is clear in this situation that no one is doing
 anything evil.  Even though he is giving you the document with
 conditions beyond those specified in the current regime of
 copyright, he is not taking advantage of you.  Even though you
 hold the bits to his manuscript and he has put limitations on
 what you can do with them, he is not coercing you. You
 voluntarily accepted those conditions as part of the agreement
 under which you received the document.

 It should also be clear that it would be ethically wrong for you
 to take the manuscript and show it to other people.  Even if you
 take an excerpt, as allowed under fair use exemptions to
 copyright protection, and include it in a document for
 commentary or review purposes, that would be a violation of your
 promise.  This example demonstrates that when two people reach a
 mutual agreement about how they will handle some information,
 they are ethically bound by it even beyond the regulations of
 copyright law.

Let us make a more realistic supposition:

Let us suppose instead he organized an entertainment where a
lightly clad singer sang and danced, and showed that video on
television interspersed with advertisements, and I then captured
that video on my hard disk, deleted the ads, and put it on the
internet.

In that case, where is my promise?  Doubtless I must have made it
in the same moment of forgetfulness as I signed the social
contract. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 bAhnMLd4HxDL/1pvlkk6Ga1VpR1eMM5jp1ff+rbD
 2k/NTfC76YawZx8bnVYHGPHiRnNt5axoRlaDUDJP8




Re: Diffie-Hellman and MITM

2002-07-05 Thread Morlock Elloi

 Consider setting up a secure video call with somebody,
 and each of you reading the hash of your DH parameter to the other.
 It's really hard for a MITM to fake that - but if you don't know
 what the other person looks or sounds like, do you know it's really them,
 or did you just have an unbreakably secure call with the wrong person?

Whatever you deploy to define somebody should be used as the authentication
channel. You are exactly as secure as your ability to define somebody. Your
al-Qaeda coworker probably has your never-published public key. Your
online-found busty and wet blonde is probably named Gordon.
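The hash-reading step described in the quote above can be sketched in a few
lines: derive a short phrase from both parties' DH public values and have each
side read it aloud over the video call. A MITM who substitutes his own public
values produces a different phrase on each end. This is only a toy illustration;
the word list, function name, and four-word phrase length are my own invention,
not any standard protocol:

```python
import hashlib

# Hypothetical word list for reading a digest aloud (invented for this sketch).
WORDS = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot",
         "golf", "hotel", "india", "juliet", "kilo", "lima",
         "mike", "november", "oscar", "papa"]

def verification_phrase(pub_a: bytes, pub_b: bytes, n_words: int = 4) -> str:
    # Hash both public values in a fixed (sorted) order so both endpoints
    # compute the same phrase regardless of who initiated the call.
    digest = hashlib.sha256(b"".join(sorted([pub_a, pub_b]))).digest()
    # Map the first n_words digest bytes onto the word list.
    return " ".join(WORDS[b % len(WORDS)] for b in digest[:n_words])
```

Note that, exactly as above, this only authenticates the channel to whoever is
on the other end; it says nothing about whether that person is who you think.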


=
end
(of original message)





Re: Ross's TCPA paper

2002-07-05 Thread jamesd

--
On 5 Jul 2002 at 14:45, AARG! Anonymous wrote:
 Right, and you can boot untrusted OS's as well.  Recently there
 was discussion here of HP making a trusted form of Linux that
 would work with the TCPA hardware.  So you will have options in
 both the closed source and open source worlds to boot trusted
 OS's, or you can boot untrusted ones, like old versions of
 Windows.  The user will have more choice, not less.

Yes he will, but the big expansion of choice is for the seller
of content and software, who will have more choices as to how he
can cripple what he sells you.  For example, he can sell you music
that will only play on a particular music player on your
particular machine.

But that is not enough to give the content industry what it wants,
for someone can still break it on one machine, perhaps by
intercepting the bitstream to the D/A converter, and having broken
it on one machine, can run it on all machines all over the internet.
Break once, run everywhere.

Microsoft has also been talking out of both sides of its mouth,
saying that this will also protect against break once, run
everywhere.  The only way that this can protect against
break-once-run-everywhere is to reduce user choice, making it
mandatory that the user can only run government-trusted software,
and to reduce seller choice, prohibiting sellers from providing
unacceptable software, such as Napster-like software. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 XQJ33SB0W84Cm4Mw0+3lnN4nsUtaB4B6cIa1dP/2
 2s67UXEL+Y5FHrr52MYArwzRuptDlBNVQIJOj/n/8




copyright restrictions are coercive and immoral (Re: Piracy is wrong)

2002-07-05 Thread Adam Back

On Fri, Jul 05, 2002 at 03:10:07AM +0200, Nomen Nescio wrote:
 Suppose you know someone who has been working for years on a novel.
 But he lacks confidence in his work and he's never shown it to anyone.
 Finally you persuade him to let you look at a copy of his manuscript,
 but he makes you promise not to show any of it to anyone else.
 
 [...]

I agree with the Anonymous poster's analysis.

I would further elaborate with regard to current copyright related
laws:

- parties are free to enter into NDA or complex distribution and use
contracts surrounding exchange of content or information generally as
anonymous describes, and this is good and non-coercive

- but that private contract places no burden on other parties if that
agreement is broken and the content distributed anyway.  This is
exactly analogous to the trade secret scenario where once the trade
secret is out, it's tough luck for the previous trade secret owner --
clearly it's no longer a secret.

- where I find current copyright laws at odds with a coercion-free
society is in placing restrictions on people who did not agree to any
NDA contract.  i.e. there are laws which forbid copying or use of
information by people who never entered into any agreement with the
copyright holder, but obtained their copy from a third party.

- in a free society (one without a force monopoly central government)
I don't think copyright would exist -- voluntary agreements -- NDAs of
the form anonymous describes -- would be the only type of contract.

- the only form of generally sanctioned force would be in response to
violence initiated upon oneself.

- if the media cartels chose to hire their own thugs to threaten
violence against people who did not follow the cartels' ideas about
binding people to default contracts they did not voluntarily enter
into, that would be quite analogous to the current situation, where
the media cartels are lobbying government to increase the level of
the threats of violence, and to make more onerous the terms of the
non-voluntary contracts.  

(Also, in a free society individuals would be able to employ the
protection services of security firms to defend themselves from the
media cartels' thugs, as the media cartels would not have the benefit
of a force monopoly that they could bribe, via their lobbying power,
to obtain enforcement subsidies.)

Adam




Re: Ross's TCPA paper

2002-07-05 Thread Seth David Schoen

Hadmut Danisch writes:

 You won't be able to enter a simple shell script through the
 keyboard. If so, you could simply print protected files as
 a hexdump or use the screen (or maybe the sound device or any
 LED) as a serial interface.
 
 Since you could use the keyboard to enter a non-certified
 program, the keyboard is to be considered as a nontrusted
 device. This means that you either
 
 * have to use a certified keyboard which doesn't let 
   you enter bad programs
 
 * don't have a keyboard at all
 
 * or are not able to use shell scripts (at least not in
   trusted context). This means a 
   strict separation between certified software and data.

The latter is closest to what's intended in Palladium.  Individual
programs using Palladium features are able to prevent one another from
reading their executing or stored state.  You can write your own
programs, but somebody else can also write programs which can process
data in a way that your programs can't interact with.

The Palladium security model and features are different from Unix, but
you can imagine by rough analogy a Unix implementation on a system
with protected memory.  Every process can have its own virtual memory
space, read and write files, interact with the user, etc.  But
normally a program can't read another program's memory without the
other program's permission.

The analogy starts to break down, though: in Unix a process running as
the superuser or code running in kernel mode may be able to ignore
memory protection and monitor or control an arbitrary process.  In
Palladium, if a system is started in a trusted mode, not even the OS
kernel will have access to all system resources.  That limitation
doesn't stop you from writing your own application software or scripts.

Interestingly, Palladium and TCPA both allow you to modify any part of
the software installed on your system (though not your hardware).  The
worst thing which can happen to you as a result is that the system
will know that it is no longer trusted, or will otherwise be able to
recognize or take account of the changes you made.  In principle,
there's nothing wrong with running untrusted; particular applications
or services which relied on a trusted feature, including sealed
storage (see below), may fail to operate.

Palladium and TCPA both allow an application to make use of
hardware-based encryption and decryption in a scheme called sealed
storage which uses a hash of the running system's software as part of
the key.  One result of this is that, if you change relevant parts of
the software, the hardware will no longer be able to perform the
decryption step.  To oversimplify slightly, you could imagine that the
hardware uses the currently-running OS kernel's hash as part of this
key.  Then, if you change the kernel in any way (which you're
permitted to do), applications running under it will find that they're
no longer able to decrypt sealed files which were created under the
original kernel.  Rebooting with the original kernel will restore the
ability to decrypt, because the hash will again match the original
kernel's hash.
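The sealed-storage scheme described above can be modeled in a few lines. This is
only a toy sketch of the idea, not the real TCPA/Palladium API: the device
secret, the HMAC-based key derivation, and the XOR cipher are stand-ins of my
own choosing, chosen to keep the example short.

```python
import hashlib, hmac

# Stand-in for the secret burned into the chip (really a protected keypair).
DEVICE_SECRET = b"burned-into-the-chip"

def sealing_key(kernel_image: bytes) -> bytes:
    # The decryption key mixes the device secret with a hash of the
    # currently running software, so data sealed under one kernel
    # cannot be unsealed after the kernel changes.
    kernel_hash = hashlib.sha256(kernel_image).digest()
    return hmac.new(DEVICE_SECRET, kernel_hash, hashlib.sha256).digest()

def seal(plaintext: bytes, kernel_image: bytes) -> bytes:
    key = sealing_key(kernel_image)
    # XOR "encryption" only to keep the sketch short; a real chip would
    # use an authenticated cipher.
    pad = hashlib.sha256(key).digest() * (len(plaintext) // 32 + 1)
    return bytes(p ^ k for p, k in zip(plaintext, pad))

def unseal(sealed: bytes, kernel_image: bytes) -> bytes:
    return seal(sealed, kernel_image)  # XOR is its own inverse
```

Rebooting the original kernel corresponds to calling unseal with the original
kernel image: the derived key matches again and decryption succeeds.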

(I've been reading TCPA specs and recently met with some Microsoft
Palladium team members.  But I'm still learning about both systems and
may well have made some mistakes in my description.)

-- 
Seth Schoen
Staff Technologist                            [EMAIL PROTECTED]
Electronic Frontier Foundation                http://www.eff.org/
454 Shotwell Street, San Francisco, CA 94110  1 415 436 9333 x107




RE: Ross's TCPA paper

2002-07-05 Thread Lucky Green

Hadmut Danisch wrote:
 On Wed, Jul 03, 2002 at 10:54:43PM -0700, Bill Stewart wrote:
  At 12:59 AM 06/27/2002 -0700, Lucky Green wrote:
  I fully agree that the TCPA's efforts offer potentially beneficial
  effects. Assuming the TPM has not been compromised, the TPM should
  enable you to detect whether interested parties have replaced your
  NIC with the rarer, but not unheard of, variant that ships out the
  contents of your operating RAM via DMA and IP padding, outside the
  abilities of your OS to detect.
  
  It can?  I thought that DMA was there to let you avoid bothering
  the CPU.  The Alternate NIC card would need to have a CPU of its
  own to do a good job of this, but that's not hard.
 
 I don't think so. As far as I understood, the 
 bus system (PCI, ...) will be encrypted as well. You'll have
 to use a NIC which is certified and can decrypt the 
 information on the bus. Obviously, you won't get 
 certification for such a network card.

You won't and Bill won't. But those who employ such NICs will have no
difficulty obtaining certification.

 But this implies other problems:
 
 You won't be able to enter a simple shell script through the 
 keyboard. If so, you could simply print protected files as a 
 hexdump or use the screen (or maybe the sound device or any
 LED) as a serial interface.
 
 Since you could use the keyboard to enter a non-certified 
 program, the keyboard is to be considered as a nontrusted 
 device. This means that you either
 
 * have to use a certified keyboard which doesn't let 
   you enter bad programs
 
 * don't have a keyboard at all
 
 * or are not able to use shell scripts (at least not in
   trusted context). This means a 
   strict separation between certified software and data.

Sure you can use shell scripts. Though I don't understand how a shell
script will help you in obtaining a dump of the protected data since
your script has insufficient privileges to read the data. Nor can you
give the shell script those privileges since you don't have supervisor
mode access to the CPU. How does your shell script plan to get past the
memory protection?

What am I missing?
--Lucky




Re: Ross's TCPA paper

2002-07-05 Thread Hadmut Danisch

On Thu, Jul 04, 2002 at 10:54:34PM -0700, Lucky Green wrote:
 
 Sure you can use shell scripts. Though I don't understand how a shell
 script will help you in obtaining a dump of the protected data since
 your script has insufficient privileges to read the data. Nor can you
 give the shell script those privileges since you don't have supervisor
 mode access to the CPU. How does your shell script plan to get past the
 memory protection?
 


That's why I was talking about a shell script (or take any
other program to be interpreted).

What needs to be certified: the shell, or the shell script?
The CPU doesn't recognize the shell script as a program; it
is just some plain data entered through the keyboard, like
writing a letter. A shell script is not a program, it is
data entered at a program's runtime.

This moves things one step forward:

The hardware (Palladium chip, memory management, etc.) can
check the binary program to be loaded. So you won't be able
to run a compiled program to access protected information.

But once certified software is running, it takes input
(reading mouse, keyboard, files, asking DNS, connecting to
servers, ...). This input might cause (by interpretation, by
bug, or however) the certified software to do certain things
which do not comply with DRM requirements.

At this stage, the running binary software itself is the
instance that provides the DRM security, not the Palladium
memory management anymore. 

I agree that this is not yet an open sesame, but it shows
that the game does not play on the binary/memory management
layer only.

But who controls runtime input?

History shows that M$ software is anything but able
to deal with malicious input. That's why the world is
using virus filters. These are nothing but an external
filter that keeps malicious input from an attacker away
from the running software.

By analogy, Palladium might require the same: an input
filter between attacker and running software. Since the
attacker is sitting in front of the computer this time,
this filter has to be applied to the user interface,
keyboard and mouse.

Maybe they'll install a filter between the keyboard and
the software, thus building a certified keyboard, which
filters out any malicious key sequences. And maybe you
can use your keyboard only if you have downloaded the
latest patterns (like your daily virus filter update).

I agree that this depends on the assumption that 
the certified software is not perfect and can't
deal with arbitrary input. But that's reality.

Hadmut




Re: Ross's TCPA paper

2002-07-05 Thread AARG! Anonymous

Seth Schoen writes:
 The Palladium security model and features are different from Unix, but
 you can imagine by rough analogy a Unix implementation on a system
 with protected memory.  Every process can have its own virtual memory
 space, read and write files, interact with the user, etc.  But
 normally a program can't read another program's memory without the
 other program's permission.

 The analogy starts to break down, though: in Unix a process running as
 the superuser or code running in kernel mode may be able to ignore
 memory protection and monitor or control an arbitrary process.  In
 Palladium, if a system is started in a trusted mode, not even the OS
 kernel will have access to all system resources.

Wouldn't it be more accurate to say that a trusted OS will not peek
at system resources that it is not supposed to?  After all, since the
OS loads the application, it has full power to molest that application
in any way.  Any embedded keys or certs in the app could be changed by
the OS.  There is no way for an application to protect itself against
the OS.

And there is no need; a trusted OS by definition does not interfere with
the application's use of confidential data.  It does not allow other
applications to get access to that data.  And it provides no back doors
for root or the system owner or device drivers to get access to the
application data, either.

At http://vitanuova.loyalty.org/2002-07-03.html you provide more
information about your meeting with Microsoft.  It's an interesting
writeup, but the part about the system somehow protecting the app from the
OS can't be right.  Apps don't have that kind of structural integrity.
A chip in the system cannot protect them from an OS virtualizing that
chip.  What the chip does do is to let *remote* applications verify that
the OS is running in trusted mode.  But local apps can never achieve
that degree of certainty, they are at the mercy of the OS which can
twiddle their bits at will and make them believe anything it wants.
Of course a trusted OS would never behave in such an uncouth manner.
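The remote-verification step mentioned above can be sketched as follows. This is
again only a toy model, not the real TCPA protocol: a real chip would sign with
an asymmetric endorsement key rather than a shared MAC key, and the quote format
here is invented for illustration.

```python
import hashlib, hmac

# Stand-in for the chip's attestation key; really an asymmetric keypair
# whose public half the remote verifier trusts.
ATTESTATION_KEY = b"chip-endorsement-key"

def chip_quote(running_os: bytes, nonce: bytes) -> bytes:
    # The chip attests to the hash of the running OS, bound to a
    # verifier-supplied nonce to prevent replay.
    os_hash = hashlib.sha256(running_os).digest()
    return hmac.new(ATTESTATION_KEY, os_hash + nonce, hashlib.sha256).digest()

def remote_verify(quote: bytes, expected_os: bytes, nonce: bytes) -> bool:
    # The remote party recomputes the quote for the OS it expects and
    # compares; a mismatch means a different OS (or a forgery) is running.
    expected = hmac.new(ATTESTATION_KEY,
                        hashlib.sha256(expected_os).digest() + nonce,
                        hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected)
```

The point of the sketch is the asymmetry discussed above: the remote verifier
can trust the quote because the chip produced it, while a local application has
no such independent channel and must take the OS's word for everything.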


 That limitation
 doesn't stop you from writing your own application software or scripts.

Absolutely.  The fantasies which have been floating here of filters
preventing people from typing virus-triggering command lines are utterly
absurd.  What are people trying to prove by raising such nonsensical
propositions?  Palladium needs no such capability.


 Interestingly, Palladium and TCPA both allow you to modify any part of
 the software installed on your system (though not your hardware).  The
 worst thing which can happen to you as a result is that the system
 will know that it is no longer trusted, or will otherwise be able to
 recognize or take account of the changes you made.  In principle,
 there's nothing wrong with running untrusted; particular applications
 or services which relied on a trusted feature, including sealed
 storage (see below), may fail to operate.

Right, and you can boot untrusted OS's as well.  Recently there was
discussion here of HP making a trusted form of Linux that would work with
the TCPA hardware.  So you will have options in both the closed source and
open source worlds to boot trusted OS's, or you can boot untrusted ones,
like old versions of Windows.  The user will have more choice, not less.


 Palladium and TCPA both allow an application to make use of
 hardware-based encryption and decryption in a scheme called sealed
 storage which uses a hash of the running system's software as part of
 the key.  One result of this is that, if you change relevant parts of
 the software, the hardware will no longer be able to perform the
 decryption step.  To oversimplify slightly, you could imagine that the
 hardware uses the currently-running OS kernel's hash as part of this
 key.  Then, if you change the kernel in any way (which you're
 permitted to do), applications running under it will find that they're
 no longer able to decrypt sealed files which were created under the
 original kernel.  Rebooting with the original kernel will restore the
 ability to decrypt, because the hash will again match the original
 kernel's hash.

Yes, your web page goes into somewhat more detail about how this would
work.  This way a program can run under a secure OS and store sensitive
data on the disk, such that booting into another OS will then make it
impossible to decrypt that data.

Some concerns have been raised here about upgrades.  Did Microsoft
discuss how that was planned to work, migrating from one version of a
secure OS to another?  Presumably they have different hashes, but it
is necessary for the new one to be able to unseal data sealed by the
old one.

One obvious solution would be for the new OS to present a cert to the chip
which basically said that its OS hash should be treated as an alias
of the older OS's hash.  So the chip would unseal using the old OS hash
even when the new OS was running, based on the fact that this cert was