On 10/12/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> Imagine going through the amount of change in the human life course
> (infant --> child --> teen --> young adult --> middle aged adult -->
> old person) within, say, a couple days.  Your self model wouldn't
> really have time to catch up.  You'd have no time to be a stable
> "you."  Even if there were (as intended e.g. in Friendly AI designs) a
> stable core of supergoals throughout all the changes

On the other hand, just because an intelligence is changing its
self-perception at an increased rate doesn't necessarily mean it won't
have a self-identity.

It may seem to us slow-to-adapt humans as if an AI's behaviour is
completely at odds with its previous self-definition, purely because
we can't conceive of the thought process leading to its next phase of
self-identity. Or at least, we won't be able to conceive of it fast
enough to catch up before the AI's self-identity morphs once more.

I think that whether an AI has a "self" will depend on whether it is
programmed to have one. More specifically, it will depend on whether it
makes an attempt to preserve its self-identity while undergoing large
amounts of structural and systemic change.
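
To make that last point a bit more concrete, here is a toy sketch in
Python of what "making an attempt to preserve self-identity under
change" might look like mechanically: a self-model kept as a feature
vector, plus a drift check that refuses revisions which would change
the self-model too abruptly in one step. Everything here (the SelfModel
class, the trait vector, the threshold) is invented for illustration,
not a claim about how any real AGI design does it.

# Toy sketch (hypothetical): a self-model as a trait vector, with a drift
# check that rejects self-modifications whose immediate effect on the
# self-model exceeds a threshold. Names and numbers are invented here.

from dataclasses import dataclass, field
from math import sqrt

@dataclass
class SelfModel:
    traits: dict[str, float] = field(default_factory=dict)  # e.g. {"curiosity": 0.8}
    max_drift: float = 0.2  # largest change to the self-model accepted per step

    def drift(self, proposed: dict[str, float]) -> float:
        """Euclidean distance between current traits and a proposed revision."""
        keys = set(self.traits) | set(proposed)
        return sqrt(sum((self.traits.get(k, 0.0) - proposed.get(k, 0.0)) ** 2
                        for k in keys))

    def try_update(self, proposed: dict[str, float]) -> bool:
        """Accept the revision only if it keeps the self-model's change gradual."""
        if self.drift(proposed) <= self.max_drift:
            self.traits = dict(proposed)
            return True
        return False  # too abrupt: preserve the current self-identity for now

# Small revisions are absorbed; large ones are rejected (or, in a richer
# version, broken into a sequence of smaller steps).
me = SelfModel(traits={"curiosity": 0.8, "caution": 0.5})
print(me.try_update({"curiosity": 0.85, "caution": 0.5}))  # True  (gradual)
print(me.try_update({"curiosity": 0.1, "caution": 0.9}))   # False (abrupt)

An AI built with something like this constraint would still change, but
its self-model would change continuously rather than discontinuously,
which is one way of cashing out "having a stable you" across rapid
structural change.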

--
-Joel

"Wish not to seem, but to be, the best."
               -- Aeschylus

