Mike,

On 10/10/06, deering <[EMAIL PROTECTED]> wrote:

Going beyond the definition of the Singularity, we can make some educated guesses
about the most likely conditions under which it will occur.  Due to
technological synergy, the creation of STHI will happen coincident with the
achievement of molecular manufacturing and a complete understanding of basic
biological molecular functions, including gene expression control and
proteomics.  The simultaneous occurrence of all three of these highly
significant events ensures sudden and massive sociological disruption.

Not at all.  It should be theoretically possible to create an AGI that
is superintelligent and has all this knowledge, but uses it in a
nondisruptive way to help humanity.  A Friendly AI could introduce
technologies with feature sets that bring out the cooperative side of
humanity rather than our warlike side.  This might be what both a
normatively altruistic AI and a collective-volition AI would do.

In that case you would have all these significant events without the
accompanying sociological upheaval.  You could even return to a society
superficially similar to the Middle Ages (or, say, a steampunk novel),
except that everyone is perpetually youthful and can fabricate tools
out of thin air.  For more on this:

http://www.acceleratingfuture.com/michael/blog/?p=3

--
Michael Anissimov
Lifeboat Foundation      http://lifeboat.com
http://acceleratingfuture.com/michael/blog

-----
This list is sponsored by AGIRI: http://www.agiri.org/email