If you think not, then make your argument for why humans plus computers and
narrow AI are sufficient.

On Fri, Dec 12, 2014 at 2:26 PM, Samantha Atkins <[email protected]> wrote:
>
> It is simple.  Do you believe we have enough effective intelligence on the
> planet to deal with our current and likely next-three-decade problems
> effectively, or not?  If not, then clearly we must produce more.
>
> On Thu, Dec 11, 2014 at 9:55 PM, Piaget Modeler via AGI <[email protected]>
> wrote:
>>
>> To me it seemed like such a long series of implications (A --> B --> C -->
>> D ...) that, in my opinion, it became less and less likely to be true,
>> especially without evidence.
>>
>> It appears to me that, as we evolve as a species, we develop more ability
>> to work cooperatively: for example, the United Nations, the World Court
>> in the Hague, financial markets, and the International Space Station. In
>> fact, there is more coordination now, as war is fought not solely through
>> military might, but also through the media, financial markets, and
>> economic sanctions. This requires increased coordination, which humans
>> have risen to meet.
>>
>> Humans learn by error, so runaway technological acceleration exceeding
>> human capacity is of a kind with the errors of Three Mile Island,
>> Chernobyl, Fukushima, the BP Gulf spill, global warming, and a host of
>> other disasters that have been encountered. If the error is not critical
>> or fatal, then the capacity to learn from and compensate for it is
>> usually within the realm of human capabilities. If the human species
>> fails to adapt, then the species will suffer the consequences.
>> I don't think AGI is necessary at all. Carl Sagan did mention, though,
>> that becoming a spacefaring species is necessary. I'd submit that a
>> serious space program is more important than a serious AGI program.
>>
>> Besides, 100% error avoidance is an impossible goal anyway.  We learn
>> from mistakes.
>> We need them.
>>
>> ~PM
>>
>>
>> ------------------------------
>> Date: Thu, 11 Dec 2014 20:57:55 -0800
>> Subject: Re: [agi] Re: The Need for Cross-Transformational Compressions
>> From: [email protected]
>> To: [email protected]
>>
>>
>> It is a believable postulate based on our own experience.  It does not
>> require direct evidence, and I didn't say it did.  Why did you single out
>> one point, and one that is not even essential?
>>
>> On Thu, Dec 11, 2014 at 5:30 PM, Piaget Modeler via AGI <[email protected]>
>> wrote:
>>
>>
>> >
>> > My view is that AGI is *required* if human beings are going to have any
>> > vibrant future at all, and required within the next few decades. Why?
>> > Humans, and logically any technological species, evolve with certain
>> > species characteristics, including limitations on their effective
>> > intelligence and ability to work cooperatively together. As runaway
>> > technological acceleration occurs, the species eventually hits a point
>> > where its effective intelligence is insufficient, in capacity and
>> > speed, for coming up with timely good-enough solutions to increasingly
>> > complex and rapidly developing problems. Eventually, without either AGI
>> > or significantly rewriting its own nature, the species fails to make
>> > good enough choices, followed through well enough, to avoid calamity.
>> >
>> >
>> > - samantha
>> >
>>
>> Samantha, what evidence is there of any other comparable species'
>> evolution following the scenario you've outlined?  Is this a conjecture
>> on your part, or has this happened before, and if so, when?
>>
>> Kindly advise,
>>
>> ~PM
>>
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
