Bill Joy suggests limits to freedom and research.

2000-03-16 Thread Tim May

At 10:54 PM -0500 3/15/00, Duncan Frissell wrote:
At 04:38 PM 3/15/00 -0500, Trei, Peter wrote:
Bill thinks - and I think he may well be right - that we are approaching the
point where a single individual could build a lethal, virulent disease,
or (somewhat later) an unrestricted nanotech self-replicator. What then?

My problem with the argument is that we were dealing with it on the 
extropian list in '92 and others were dealing with it long before 
that.  The term "singularity" was applied to the case where change 
becomes a vertical asymptote (1/x).  Novels like "Blood Music" have 
discussed it as well.  Why did it take until 1998 for this to occur 
to Bill?  Nuclear annihilation provided lots of fodder for the 
identical discussion in '50s SF and "The Shape of Things to Come" 
deals with it as well.  That's what I meant by it being an old 
discussion.
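[As an aside, the "vertical asymptote" image above can be made concrete: a hyperbolic growth model of the form 1/(T - t) diverges in finite time as t approaches a horizon T, unlike exponential growth, which never blows up at any finite date. A minimal sketch, where the horizon T is purely illustrative and not a prediction:

```python
# Sketch of the "vertical asymptote" (1/x-style) model of the singularity:
# the rate of change 1/(T - t) diverges as t approaches the horizon T.
T = 100.0  # hypothetical horizon date; illustrative only

def hyperbolic_rate(t):
    """Rate of change under the hyperbolic model; diverges as t -> T."""
    return 1.0 / (T - t)

for t in (50, 90, 99, 99.9):
    print(f"t={t}: rate={hyperbolic_rate(t):.1f}")
```

The point of the 1/x picture is exactly this finite-time blow-up, which is why "singularity" was the word chosen for it.]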

This is exactly right. It's exasperating to see neo-journalists pick 
up on recycled discussions as being revealed wisdom. Old beer in new 
bottles.

Bill Joy has done some fine work, but he is no more insightful about 
the future than various Nobel Prize-winning scientists were in the 
50s, 60s, and 70s about nuclear war, one world governments, or 
running out of resources.


No need to waste time on him.

BTW, far more detailed scenarios for nanotech general assemblers were 
covered in exhaustive detail at the "Assembler Multitudes" fora every 
two weeks in the early 90s. Organized by Ted Kaehler, one of the 
original developers of Smalltalk, the gatherings were stimulating, 
detailed, and thoughtful. We had about 15-25 regular participants, 
and many specialists also dropped in. Eric Drexler, Mark Miller, Marc 
Stiegler, Ralph Merkle, Howard Landeman, Markus Krummenacker, and 
many others.

We even covered scenarios about attempting to stifle research...I was 
overjoyed--no pun intended--to describe to them just how pointless 
stifling research would be, because of strong crypto, data havens, 
etc. It was at one of these meetings, in 1993, that I presented 
BlackNet. I did this by literally soliciting members of the Assembler 
Multitudes seminar to sell bootleg nanotech research, untraceably. At 
the actual physical meeting, I revealed that I had done this to 
demonstrate the tip of the iceberg of such approaches.

Bill Joy's essay is just not very deep. As I said, old beer in new bottles.


--Tim May
-- 
-:-:-:-:-:-:-:
Timothy C. May  | Crypto Anarchy: encryption, digital money,
ComSec 3DES:   831-728-0152 | anonymous networks, digital pseudonyms, zero
W.A.S.T.E.: Corralitos, CA  | knowledge, reputations, information markets,
"Cyphernomicon" | black markets, collapse of governments.



Re: Bill Joy suggests limits to freedom and research.

2000-03-15 Thread Duncan Frissell

At 11:20 AM 3/15/00 -0500, Trei, Peter wrote:
I'd like to suggest that people take a serious look at Bill Joy's
"Why the future doesn't need us",  the cover article
in the current Wired magazine. It can be found online at
http://www.wired.com/wired/archive/8.04/joy.html.

I printed it out and will be working my way through it.

Sounds like 50's SF films vs. 50's written (US) SF.  "There are some things 
man was not meant to know" vs. "men as gods".

Traditional fight.

As to future risks to freedom - we weren't thinking of asking permission in 
any case.

DCF

"They can attempt to outlaw weapons but they can't outlaw the Platonic 
Ideal of a weapon and modern technology makes it absolutely trivial to 
convert a Platonic Ideal of a weapon into an actual weapon whenever one 
desires."



Re: Bill Joy suggests limits to freedom and research.

2000-03-15 Thread Richard Thieme

Bill Joy certainly has reputation capital to burn,  but it is in the domain
of computer science (as he himself says) and not philosophical inquiry.
This article contains opinions and book reviews, not deep thinking into the
issues it merely suggests. If it *were* deep thinking, Wired would never
publish it.

The article is a swiftly moving river of name-dropping and a list of "books
I have read." The books are invoked in serial fashion from popular Silicon
Valley culture, but their critical implications are never digested and
integrated into a comprehensive, insightful synthesis. Yes, Bill Joy has
done some great
work, but his critical thinking could use an assist from other disciplines
like the humanities with which he is not very familiar. One added quote
from Nietzsche does not count.

Why? Because an inquiry into what it means to be human requires an
understanding of culture and how symbols define our identities and very
selves, and a historical perspective that shows awareness of how identities
have shifted in the past, how values and cultures function in the human
equation, and how the older word for psyche - "soul" - can still play a
part in illuminating the possibilities for being human. I do not mean that
in any simplistic sense but as a distinction or domain that refers to a
distinctly human field of subjectivity. Joy may have had a drink with John
Searle (in one of the earlier meeting-dropping party-dropping name-dropping
indulgences) but he does not seem to have understood Searle's analysis of
artificial intelligence.

So I disagree with your assessment of this article. It is very shallow and
continues the New Wired tradition of righteous Silicon Valley name-dropping
a la People magazine as a substitute for deep thinking and clear exposition.

Richard Thieme





At 11:20 AM 03/15/2000 -0500, Trei, Peter wrote:
I'd like to suggest that people take a serious look at Bill Joy's 
"Why the future doesn't need us",  the cover article 
in the current Wired magazine. It can be found online at
http://www.wired.com/wired/archive/8.04/joy.html.

Bill (one of the Great Old Men of the Internet, with vi, BSD,
Java, and Jini to his credit) is not a nut. He has reputation
capital to burn. He's talking about the possible imminent end 
of the human species.

Briefly, he argues that current advances in biotech,
computers and robotics are creating such powerful
instrumentalities that either we'll make machines smarter
than ourselves, which will take over, or some nut will unleash
a nanotech self-replicator or an engineered micro-organism
to doom the human race.

Bill suggests that perhaps we need to consider if there are 
technological areas where we should not venture, because 
of the potential danger of the knowledge. 

This article is important, not only for what it says, but also 
how people are going to use it. It is manna from heaven to 
those who would further centralize and tighten control over
people, and will undoubtedly be cited by those who would
restrict privacy and anonymity.

This article is partially a dystopic response to Kurzweil's
"The Age of Spiritual Machines", a book which I found
provocative, if flawed.

Peter Trei



Richard Thieme 


ThiemeWorks ... professional speaking and business consulting:
the impact of computer technology on people in organizations,
helping people stay flexible and effective
during times of accelerated change.

ThiemeWorks
P. O. Box 170737
Milwaukee, Wisconsin 53217-8061
voice: 414.351.2321
fax: 414.351.5779
cell: 414.704.4598

http://www.thiemeworks.com
http://www.richardthieme.com  - for information on Professional Speaking



Re: Bill Joy suggests limits to freedom and research.

2000-03-15 Thread Bram Cohen

On Wed, 15 Mar 2000, Trei, Peter wrote:

 http://www.wired.com/wired/archive/8.04/joy.html.
 
 Briefly, he argues that current advances in biotech,
 computers and robotics are creating such powerful
 instrumentalities that either we'll make machines smarter
 than ourselves, which will take over, or some nut will unleash
 a nanotech self-replicator or an engineered micro-organism
 to doom the human race.

It would have to compete with these already highly competitive nanotech
self-replicators we call 'bacteria'.

-Bram Cohen