On 12/28/07, Stan Nilsen <[EMAIL PROTECTED]> wrote:
> the concept of Singularity that raises objection for me is
> "over-estimating" the prediction of strength in Artificial Intelligence.
> A corollary might be the under-estimation of the human brain.
>
> "Intelligence is not linear" may be the same idea, I've not seen the
> full argument.  The idea that I propose is that there is a limitation of
> intelligence.  Essentially this boundary is value related.

"Intelligence is not linear" is a related idea, though not the same one.
It refers to the claim that the term "super-intelligence" is
meaningless, because a calculator is already superintelligent compared
to humans (albeit in a very limited field).

We've added an objection that's more like the one you are proposing. Thanks.


-- 
http://www.saunalahti.fi/~tspro1/ | http://xuenay.livejournal.com/

Organizations worth your time:
http://www.singinst.org/ | http://www.crnano.org/ | http://lifeboat.com/

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&id_secret=80177301-31cd59