Samantha Atkins wrote:
> The Singularity won't happen this century because:

> a) Those capable of actually building >human intelligence are too busy with arcane philosophical conundrums, trivia and pointless debates;


While there are a lot of people on these lists wasting time with pointless chatter, many of them lack the capability, or the desire, to build AGI or to work in the field at all. The ones who really are working on AGI are not making this kind of noise and are consequently less visible here. Others on the AGI and SL4 lists have claimed that there are even people in academia working on AGI or related theories who are doing so quietly.

And even if this were not the case, there is little reason to believe it would prevent a singularity from happening in the next 95 years.

> b) We are headed rapidly to the disintegration of freedom and economic vitality in the US leading to a global economic downturn of disastrous proportions and/or global war that kills off too much of the resource base and the freedom/time to use it.

There is a sour administration in the US, but that can always change. This scenario is by no means guaranteed; World War 3 could start tomorrow or never happen at all.

Here is a thought... Suppose that an economic collapse or some other disaster is likely within 35 years, with no obvious way to prevent it except a singularity or some profoundly advanced technology. This would mean that within the next 35 years we will probably either A) face disaster, or B) see a singularity. I don't see any reason to be optimistic or pessimistic one way or the other -about- the long-term future, because the future is not something you can reliably predict. An optimist will always have hope on their side, and a pessimist will never be disappointed, but neither can predict the future.