--- "Eric B. Ramsay" <[EMAIL PROTECTED]> wrote:

> I don't know when Lanier wrote the following but I would be interested to
> know what the AI folks here think about his critique (or direct me to a
> thread where this was already discussed). Also would someone be able to
> re-state his rainstorm thought experiment more clearly -- I am not sure I
> get it:
> 
>      http://www.jaronlanier.com/aichapter.html

This is a nice proof of the non-existence of consciousness (or qualia).  Here
is another, which I came across on sl4:

  http://youtube.com/watch?v=nx6v30NMFV8

Such reductions to absurdity are possible because the brain is programmed not
to accept the logical result.

Consciousness is hard to define, but you know what it is.  It is what makes
you aware, the "little person inside your head" that observes the world
through your perceptions, that which distinguishes you from a philosophical
zombie.
We normally associate consciousness with human traits such as episodic memory,
response to pleasure and pain, fear of death, language, and a goal of seeking
knowledge through experimentation.  (Imagine a person without any of these
qualities).

These traits are programmed into our DNA because they increase our fitness. 
You cannot change them, which is what accepting these proofs would require.

Unfortunately, this question will have a profound effect on the outcome of a
singularity.  Assuming recursive self-improvement in a competitive
environment, we should expect agents (possibly including our uploads) to
believe in their own consciousness, but there is no evolutionary pressure for
them to also believe in human consciousness.  Even if we successfully
constrain the process so that agents have the goal of satisfying our
extrapolated volition, we should logically expect those agents (knowing what
we cannot know) to conclude that human brains are just computers and that our
existence doesn't matter.  It is ironic that our programmed beliefs lead us to
advance technology to the point where the question can no longer be ignored.


-- Matt Mahoney, [EMAIL PROTECTED]
