On 12/20/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
(...)
For example, to encourage the storytelling/empathy connection to exist
in an AI system, one might want to give the system an explicit
cognitive process of hypothetically "putting itself in someone else's
place."  So, when it hears a story about character X, it creates
internally a fabricated story in which it takes the place of character
X.  There is no reason to think this kind of strategy would come
naturally to an AI, particularly given its intrinsic dissimilarity to
humans.  But there is also no reason that kind of strategy couldn't be
forced, with the impact of causing the system to understand humans
better than it might otherwise.
(...)
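
Just to make that "put yourself in character X's place" step concrete
for myself, here is a toy Python sketch. Everything in it is my own
invention (the function name, the example story); a real system would
presumably re-simulate the episode through its own world-model rather
than just edit the surface text:

    import re

    def retell_as_self(story: str, character: str) -> str:
        # Fabricate an internal, first-person retelling of the story
        # with the agent substituted for `character`. Here that is only
        # a crude textual rewrite; the interesting part in a real AI
        # would be re-running its story-comprehension machinery on the
        # rewritten episode.
        return re.sub(rf"\b{re.escape(character)}\b", "I", story)

    story = "Elijah heard the door open, and Elijah froze."
    print(retell_as_self(story, "Elijah"))
    # -> "I heard the door open, and I froze."

The substitution by itself buys nothing, of course; the point is that
the fabricated first-person episode can then be fed back through the
same process the system uses to understand its own experience.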

I half-remember quotes from Asimov's now somewhat quaint sci-fi book
"The Robots of Dawn". In it, the character Dr. Fastolfe justifies his
creation of "humaniform robots" with lines like "there is no mind
without a body and no body without mind" and "an inhuman body develops
an inhuman mind".

Your assertion about storytelling putting ourselves in someone else's
skin makes me wonder whether those fictional claims are true. Perhaps,
in order to put "her" self into the skin of a human by means of
storytelling, an AI first needs to have a human model of herself.
Perhaps a virtual body, a simulation; but even so, something that
would serve to "anchor" her to the point of view of humans.

By the way, I wholeheartedly agree with the storytelling-induced
"self-transference" idea. When I read a story, especially one told in
the first person, I feel as though I am "incarnated" in the character.
