Hi,

The reason that so many in the intellectual community see
Singularity discussion as garbage is that there is so little
definitional consensus that it's close to impossible to determine
what's actually being discussed.

I doubt this...

I think the reason that Singularity discussion is disrespected is
that, no matter how you work the specifics of the definition, it all
seems science-fictional to most people...

and we Singularitarians are disrespected for taking sci-fi speculation
too seriously (instead of focusing on money and family, getting a
haircut and getting a real job, etc. etc. ;-)

The basic Singularity concept is incredibly mundane.  In the midst of
all this futuristic excitement, we sometimes forget this.  A single
genetically engineered child born with a substantially
smarter-than-human IQ would constitute a Singularity, because we would
have no ability to predict the specifics of what it would do, whereas
we have a much greater ability to predict the actions of typical
humans.

I think this is not necessarily true.

a)
An extraordinarily smart genetically engineered child could well
waste all its time playing five-dimensional chess, or World of
Warcraft for that matter ... it could then wind up being pretty easy
to predict.

Similarly, the emergence of an Einstein on a planet of human retards
(er, "differently intellectually advantaged" individuals...) would not
necessarily be a Singularity-type event for that planet.  The alien
Einstein might well just stay in the corner meditating and
theorizing....

b)
I find it very unlikely, but I can imagine a Singularity scenario in
which there is strong nanotech plus a host of highly powerful narrow
AI programs, but no artificial general intelligence beyond the human
level.  This could result in massive transformations of reality as we
know it, at an incredibly rapid rate, yet with no superhuman
intelligence.  This would be a Kurzweilian Singularity; whether it
would also be a Vingean Singularity would depend on the
particulars of how one disambiguates the natural-language concept
of "intelligence"...


I happen to think that the emergence of superhuman, rapidly
self-improving AI **is** what is going to characterize the Singularity
... but I don't agree that this is the only species of Singularity
worth talking or thinking about...

-- Ben

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/[EMAIL PROTECTED]