On 9/24/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:

 "Anyway, I am curious if anyone would like to share
experiences they've had trying to get Singularitarian
concepts across to ordinary (but let's assume
college-educated) Joes out there. Successful
experiences are valued but also unsuccessful ones. "

The people I have talked to are fairly aware of
exponential tech growth. I have had far more success
talking about AGI than about extended lifespans or
anything similar. It is hard to keep conversations
from becoming theoretical and sidetracked.

Talking about the Novamente approach has always worked
well for bringing people around to the possibility
that a strong AGI could come about. After that, the
conversation turns to either how useful it would be,
or how safe it would be.

 When talking about use, it is easy to explain by
giving examples. When talking about safety, I always
bring in disembodied vs. embodied AGI and the usual
"range of possible minds" debate. If they are still
wary, I talk about the possible inevitability of AGI.
I relate it to the making of the atom bomb during
WWII. Do we want someone who is aware of the danger,
motivated to make it as safe as possible, and working
under standard practice guidelines? Or would we rather
someone with bad intent and recklessness make the
attempt?


