On Sun, Jun 1, 2008 at 2:15 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- On Sat, 5/31/08, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>
>> But in future, there could be impostor agents that act like
>> they have humanlike subjective experience but don't ... and we
>> could uncover them by analyzing their internals...
>
> What internal properties of a Turing machine distinguish one that has 
> subjective experiences from an equivalent machine (implementing the same 
> function) that only pretends to have subjective experience?


You're asking a different question.

What I said was that internal properties could distinguish

a) a machine having HUMANLIKE subjective experiences

from

b) a machine just claiming to have HUMANLIKE subjective experiences,
but not really having them

I don't care to get into the stupid argument as to whether "subjective
experience" is a meaningful and useful concept or not.  If you find
that it is not, feel free not to use it; but please don't pester those
of us who **do** find it meaningful and useful ;-)

I wrote up my views on subjective vs. objective reality quite explicitly
in The Hidden Pattern, if you're curious.  I view them as separate
realms of being which nonetheless have interesting correlations between
them.  In this sense I'm a panpsychist, and I would say that every
program has some qualia attached to it.  But not necessarily
human-qualia-like qualia... that's a different story.

-- Ben G


-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=103754539-40ed26
Powered by Listbox: http://www.listbox.com