The claim is that a program isomorphic to the raindrop positions, fed into
the hypothetical computer, implements a brain. I have a blinky safety light
on the back of my bicycle that flashes on and off once per second. By the
same argument, there exists a hypothetical computer that takes that 1-second
on/off pulse as program instructions and implements my brain. This doesn't
say much, because the hypothetical computer would have to be almost entirely
equivalent to my brain. And where is this hypothetical computer? We still
have to come up with it.

 

But Lanier does scrape the surface of something bigger with all this. He is
pointing to an intelligence in all things, or some structure in all things
that carries a kind of potential intelligence, much as objects carry
potential energy in physics. In other words, the structure means something.


 

And I found it interesting that he said:

 

This means that software packaged as being "non-intelligent" is more likely
to improve, because the designers will receive better critical feedback from
users. The idea of intelligence removes some of the "evolutionary pressure"
from software, by subtly indicating to users it is they, rather than the
software, that should be changing.

 

As it happens, machine decision making is already running our household
finances to a scary degree, but it's doing so with a Wizard of Oz-like
remote authority that keeps us from questioning it. I'm referring to the
machines that calculate our credit ratings. Most of us have decided to
change our habits in order to appeal to these machines. We have simplified
ourselves in order to be comprehensible to simplistic data-bases, making
them look smart and authoritative. Our demonstrated willingness to
accommodate machines in this way is ample reason to adopt a standing bias
against the idea of artificial intelligence.

 

And it is true. There is a herding effect from AI, and from computers in
general, that we should be aware of.

 

John

 

 

 

From: Eric B. Ramsay [mailto:[EMAIL PROTECTED] 
Sent: Friday, February 22, 2008 10:12 AM
To: singularity@v2.listbox.com
Subject: [singularity] Re: Revised version of Jaron Lanier's thought
experiment.

 

I came across an old Discover magazine this morning with yet another article
by Lanier on his rainstorm thought experiment. After reading the article it
occurred to me that what he is saying may be equivalent to:

Imagine a sufficiently large computer that works according to the
architecture of our ordinary PCs. In the space of operating systems (code
interpreters), we can find one that will run the input from the rainstorm
so that it appears identical to a computer running a brain.

If this is true, then functionalism is not affected since we must not forget
to combine program + OS. Thus the rainstorm by itself has no emergent
properties.
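The program + OS point can be made concrete with a small sketch. For any fixed input stream (rainstorm data, a blinking light), one can always construct an "interpreter" that maps it to any target program, because the interpreter is free to ignore the input's content entirely; all of the computational structure then lives in the interpreter, not the input. The names below (blinky_pulse, run_brain, make_interpreter) are illustrative, not from either post:

```python
def run_brain():
    """Stand-in for the target computation (the 'brain')."""
    return "brain output"

def make_interpreter(target):
    """Build an 'OS' that maps ANY input stream to the target program.

    The mapping is trivial: the interpreter consumes the input without
    using it, so the intelligence lives here, not in the input.
    """
    def interpreter(input_stream):
        for _ in input_stream:   # read the 'program instructions'
            pass                 # ...and ignore them
        return target()
    return interpreter

blinky_pulse = [1, 0] * 5             # 1-second on/off safety light
raindrops = [(0.3, 0.7), (0.1, 0.9)]  # arbitrary rainstorm positions

os_for_anything = make_interpreter(run_brain)
# Both inputs "implement a brain" under this OS -- which shows the
# rainstorm itself contributes nothing.
print(os_for_anything(blinky_pulse) == os_for_anything(raindrops))
```

This is exactly why the rainstorm by itself has no emergent properties: the construction works for every possible input, so the input carries no information about the brain.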

Eric B. Ramsay



 

-------------------------------------------
singularity
Archives: http://www.listbox.com/member/archive/11983/=now
RSS Feed: http://www.listbox.com/member/archive/rss/11983/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=4007604&id_secret=96140713-a54b2b
Powered by Listbox: http://www.listbox.com
