Joshua,

The answer to this question is partially determined by how you define
Singularity.

If the Singularity means that everyone survives, then the most
powerful argument against that type of Singularity is that we will
very likely be wiped out first by an AI that uses self-replicating
hardware to maximize some arbitrary, nonhuman pattern of atoms.

If the Singularity means the emergence of smarter-than-human
intelligence, then the most persuasive argument that it never
happens this century is that we wage a nanotech war and set
ourselves back to the Stone Age.

--
Michael Anissimov
Lifeboat Foundation      http://lifeboat.com
http://acceleratingfuture.com/michael/blog
