At Thu, 31 Aug 2006 14:38:34 -0400, "Jonathan S. Shapiro" <[EMAIL PROTECTED]> wrote:
> I do not believe that the same is true for TPM. The problem with TPM is
> that the one widely publicized application is DRM. In discussions on
> this list, we have identified a number of scenarios where TPM protects
> the interests of the *customer*. TPM per se is merely a mechanism for
> mechanically embedding certain contract terms. Some of those contracts
> are socially bad, some are socially neutral, and some are socially
> positive.
That's the real question, isn't it? The TPM supporters are cherry-picking
the use cases and evaluating the scenarios mostly under the aspect of
protection, plus a narrow set of other interests. So are some of the
critics, I should add. That is why I am aiming at a level of analysis
that transcends the individual use cases. A complete evaluation of the
expected net effect on society is desperately lacking, but the threats
are numerous and have been pointed out by many parties. Reducing the
criticism to the DRM example understates how broad it actually is.

> There does not appear to be any technical means for differentiating
> among these.

There doesn't need to be. We can do it in the old-fashioned, human way.

> But arguing against TPM because of the single example of DRM does not
> strike me as a sound approach. In principle, it is a good thing that
> parties to a contract should be able to verify compliance.

This is one of the claims that I would like to see analyzed further. I
have given several specific, real-world examples where this does not
seem to be the case, but where breaking the terms of a contract is not
only legitimate, but sometimes even a responsibility. The most recent
examples were the Pentagon Papers, emergency action to avert danger,
lying in a job interview, and shrink-wrap licenses.

> DRM is an unfortunate perversion of this technical capability.

My current opinion is that the analysis indicates it is not a perversion
of the technology; rather, the perversion is inherent in the technology,
because of the inherent nature of information as non-proprietarizable.
Interestingly enough, the same argument shows that the technology
fundamentally doesn't work in the long run. However, even if it doesn't
work in principle, its attempted implementation can do a lot of harm in
the meantime.
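(For concreteness, the "mechanical embedding" of terms that TPM provides
rests on measured boot: each component is hashed into a platform
configuration register, and a verifier can replay the expected chain to
check compliance. A minimal sketch of the idea, in illustrative Python
rather than the actual TPM API, with made-up component names:)

```python
import hashlib

def pcr_extend(pcr: bytes, event: bytes) -> bytes:
    """TPM 1.2-style extend: new_PCR = SHA-1(old_PCR || SHA-1(event))."""
    return hashlib.sha1(pcr + hashlib.sha1(event).digest()).digest()

def measure_boot(components) -> bytes:
    """Fold a boot sequence into a single PCR value."""
    pcr = b"\x00" * 20          # PCRs start zeroed at platform reset
    for component in components:
        pcr = pcr_extend(pcr, component)
    return pcr
```

A remote party replays the measurements it expects and compares the
result: the same sequence always yields the same register value, and any
substituted component yields a different one. The point at issue is not
whether this works mechanically, but whose terms it ends up enforcing.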
Thanks,
Marcus

_______________________________________________
L4-hurd mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/l4-hurd
