Yes, this is true. Sometimes, though, I think we need to build AGI weapons
ASAP. Why? The human race needs to protect itself from other potentially
aggressive beings. Humans treat animals pretty badly, as an example. The earth
is a sitting duck. How do we defend ourselves? Clumsy nukes? Not good
enough. There need to be new breakthroughs in AGI/nanotech/digital physics
that bring in new weaponry. That's the ugly reality. The alternative is to
say that no other advanced beings exist or, if they do, to assume that they'll
be friendly. Sounds sci-fi-ish, but it is not.

John

From: Edward W. Porter [mailto:[EMAIL PROTECTED] 



John,

 
Robin's original post said

"I've been invited to write an article for an upcoming special issue of IEEE
Spectrum on "Singularity", which in this context means rapid and large
social change from human-level or higher artificial intelligence.  "

I assume he is smart enough to know that superintelligent machines pose
some threats and will have significant social consequences (that's why it is
called the singularity). And certainly such threats have been discussed on
this list many times before.

 
I personally think it is possible AGI could bring in a much better
existence, but only if intelligence augmentation makes us more intelligent
as nations and as a world, if it lets us stay competitive with the machines
we build, and if it causes us to build mainly machines that have been
designed to be compatible with, and hopefully to care for, us.

 
Ed Porter





