--- Niels-Jeroen Vandamme <[EMAIL PROTECTED]> wrote:
> Without consciousness, there could be no perception. I am surely conscious
> right now, and how I am will remain a mystery for many years.
A thermostat perceives the temperature and acts on it. Is it conscious? We think we know what consciousness is: something that every human has, possibly some animals, but no machine. It is only in the context of AI that we realize we don't have a good definition of it. There is no test that detects consciousness. We can only test for properties we normally associate with humans, and that is not the same thing.

When logic conflicts with instinct, instinct wins and the logic gets contorted. The heated discussion of the copy paradox is a perfect example. Your consciousness is transferred to the copy only if the original is destroyed, or only if it is destroyed in certain ways, or only under certain conditions. We discuss this ad infinitum, but it always leads to a contradiction, because we refuse to accept that consciousness does not exist; if you accept that, you die. So the best you can do is hold both contradictory beliefs and leave it at that.

So how do we approach the question of uploading without being led into a contradiction? I suggest we approach it in the context of outside observers simulating competing agents. How will these agents evolve? We would expect agents to produce other agents similar to themselves but not identical, whether through biological reproduction, genetic engineering, or computer technology; the exact mechanism doesn't matter. In any case, those agents will evolve an instinct for self-preservation, because that makes them fitter. They will fear death, and they will act on that fear by using technology to extend their lifespans. Approached this way, we can ask whether they upload, and if so, how, without ever needing to settle whether consciousness exists. The question is not what we should do, but what we are likely to do. (A toy sketch of such a simulation is appended after the quoted thread below.)

> >From: Matt Mahoney <[EMAIL PROTECTED]>
> >Reply-To: [email protected]
> >To: [email protected]
> >Subject: Re: [singularity] critiques of Eliezer's views on AI
> >Date: Mon, 25 Jun 2007 17:19:20 -0700 (PDT)
> >
> >--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> >
> > > On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > > >
> > > > You can only transfer
> > > > consciousness if you kill the original.
> > >
> > > What is the justification for this claim?
> >
> >There is none, which is what I was trying to argue. Consciousness does not
> >actually exist. What exists is a universal belief in consciousness. The
> >belief exists because those who did not have it did not pass on their DNA.
> >
> >-- Matt Mahoney, [EMAIL PROTECTED]

-- Matt Mahoney, [EMAIL PROTECTED]
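Appendix: purely as an illustration (none of this code is from the original post), here is a minimal Python sketch of the "outside observers simulating competing agents" framing. Agents carry a heritable self-preservation trait; hazards cull the population; survivors reproduce with mutation, so the trait spreads whether or not anything we would call consciousness is attached to it. All names, parameters, and the hazard model are invented for the example.

import random

POP_SIZE = 200
GENERATIONS = 30
HAZARD = 0.5          # baseline chance of dying each generation
MUTATION = 0.05       # std. dev. of mutation applied to the trait

def survives(trait):
    # A higher self-preservation trait reduces the effective hazard.
    return random.random() > HAZARD * (1.0 - trait)

def reproduce(parent_trait):
    # Offspring inherit the trait with a small random mutation, clamped to [0, 1].
    child = parent_trait + random.gauss(0.0, MUTATION)
    return min(1.0, max(0.0, child))

# Start with agents that mostly ignore threats (trait near 0).
population = [random.random() * 0.2 for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    survivors = [t for t in population if survives(t)]
    if not survivors:
        break
    # Refill the population from randomly chosen survivors.
    population = [reproduce(random.choice(survivors)) for _ in range(POP_SIZE)]
    print(f"generation {gen:2d}: mean self-preservation = "
          f"{sum(population) / len(population):.3f}")

Run it and the mean trait climbs toward 1. The point is only that an instinct for self-preservation needs no justification beyond differential survival; the outside observer never has to decide whether the agents are conscious.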
