--- John Ku <[EMAIL PROTECTED]> wrote:
> On 2/17/08, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > Nevertheless we can make similar reductions to absurdity with respect to
> > qualia, that which distinguishes you from a philosophical zombie. There is
> > no experiment to distinguish whether you actually experience redness when
> > you see a red object, or simply behave as if you do. Nor is there any
> > aspect of this behavior that could not (at least in theory) be simulated
> > by a machine.
>
> You are relying on a partial conceptual analysis of qualia or
> consciousness by Chalmers that maintains that there could be an exact
> physical duplicate of you that is not conscious (a philosophical
> zombie). While he is in general a great philosopher, I suspect his
> arguments here ultimately rely too much on moving from, "I can create
> a mental image of a physical duplicate and subtract my image of
> consciousness from it," to therefore, such things are possible.
My interpretation of Chalmers is the opposite. He seems to say that either
machine consciousness is possible or human consciousness is not.

> At any rate, a functionalist would not accept that analysis. On a
> functionalist account, consciousness would reduce to something like
> certain representational activities which could be understood in
> information processing terms. A physical duplicate of you would have
> the same information processing properties, hence the same
> consciousness properties. Once we understand the relevant properties
> it would be possible to test whether something is conscious or not by
> seeing what information it is or is not capable of processing. It is
> hard to test right now because we have at the moment only very
> incomplete conceptual analyses.

It seems to me the problem is defining consciousness, not testing for it.
What computational property would you use?

For example, one might ascribe consciousness to the presence of episodic
memory. (If you don't remember something happening to you, then you must have
been unconscious.) But in that case any machine that records a time sequence
of events (for example, a chart recorder) could be said to be conscious. Or
you might ascribe consciousness to entities that learn, seek pleasure, and
avoid pain. But then I could write a simple program like
http://www.mattmahoney.net/autobliss.txt with these properties; a sketch of
such a program follows.
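To illustrate, here is a minimal Python sketch of such a program (an
illustration only, not the actual autobliss code): it learns a two-input
logic function by repeating actions that brought reward ("pleasure") and
avoiding actions that brought punishment ("pain").

import random

# One weight per (input pair, output) choice; the program "prefers"
# actions whose weights grew through past reward.
weights = {(a, b, out): 0.0 for a in (0, 1) for b in (0, 1) for out in (0, 1)}

def act(a, b):
    # Pick the output with the higher learned weight; break ties randomly.
    w0, w1 = weights[(a, b, 0)], weights[(a, b, 1)]
    if w0 == w1:
        return random.randint(0, 1)
    return 0 if w0 > w1 else 1

def reinforce(a, b, out, reward):
    # Positive reward is "pleasure", negative is "pain".
    weights[(a, b, out)] += reward

# Teach it AND by rewarding correct answers and punishing wrong ones.
for _ in range(1000):
    a, b = random.randint(0, 1), random.randint(0, 1)
    out = act(a, b)
    reinforce(a, b, out, 1.0 if out == (a & b) else -1.0)

print([act(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1] once trained

A couple of dozen lines suffice to meet the stated criteria, which is exactly
the problem with using them as a test.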
It seems to me that any other testable property would have the same problem.

-- Matt Mahoney, [EMAIL PROTECTED]