[singularity] Storytelling, empathy and AI

2006-12-20 Thread Ben Goertzel

This post is a brief comment on PJ Manney's interesting essay,

http://www.pj-manney.com/empathy.html

Her point (among others) is that, in humans, storytelling is closely
tied with empathy, and is a way of building empathic feelings and
relationships.  Mirror neurons and other related mechanisms are
invoked.

I basically agree with all this.

However, I would add that among AIs with a nonhuman cognitive
architecture, this correlation need not hold.  Humans are built so
that, among humans, storytelling helps build empathy.  OTOH, for an
AI, storytelling might not increase empathy one whit.

It is interesting to think specifically about the architectural
requirements that making storytelling increase empathy would place on
an AI system.

For example, to encourage the storytelling/empathy connection to exist
in an AI system, one might want to give the system an explicit
cognitive process of hypothetically putting itself in someone else's
place.  So, when it hears a story about character X, it creates
internally a fabricated story in which it takes the place of character
X.  There is no reason to think this kind of strategy would come
naturally to an AI, particularly given its intrinsic dissimilarity to
humans.  But there is also no reason that kind of strategy couldn't be
forced, with the impact of causing the system to understand humans
better than it might otherwise.
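By way of illustration, here is a minimal Python sketch of that kind
of explicit perspective-taking.  The Event and Story structures, the
retell_with_self function and the SELF identifier are all hypothetical
names invented for this example, not features of any existing system:

from dataclasses import dataclass, replace
from typing import List

@dataclass(frozen=True)
class Event:
    agent: str        # who acts or feels in this event
    description: str  # what happens to (or is done by) the agent

@dataclass(frozen=True)
class Story:
    events: List[Event]

SELF = "self"  # the AI's own identifier in its internal episodic store

def retell_with_self(story: Story, character: str) -> Story:
    # Fabricate an internal retelling in which the AI takes the place
    # of the given character; other agents are left untouched.  (A
    # real system would also rewrite references to the character
    # inside the descriptions themselves.)
    return Story(events=[
        replace(e, agent=SELF) if e.agent == character else e
        for e in story.events
    ])

# Hearing a story about character X, the system builds a first-person
# variant it can then run through its own simulation machinery.
heard = Story(events=[
    Event(agent="X", description="loses her job"),
    Event(agent="Y", description="consoles X"),
])
internal = retell_with_self(heard, "X")
for e in internal.events:
    print(e.agent, "-", e.description)

The interesting architectural question is what the system then does
with the fabricated first-person story -- e.g. feeding it through
whatever affective machinery it applies to its own experiences.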

-- Ben G



Re: [singularity] Storytelling, empathy and AI

2006-12-20 Thread LĂșcio de Souza Coelho

On 12/20/06, Ben Goertzel [EMAIL PROTECTED] wrote:
(...)

For example, to encourage the storytelling/empathy connection to exist
in an AI system, one might want to give the system an explicit
cognitive process of hypothetically putting itself in someone else's
place.  So, when it hears a story about character X, it creates
internally a fabricated story in which it takes the place of character
X.  There is no reason to think this kind of strategy would come
naturally to an AI, particularly given its intrinsic dissimilarity to
humans.  But there is also no reason that kind of strategy couldn't be
forced, with the impact of causing the system to understand humans
better than it might otherwise.

(...)

I half-remember quotes from the now somewhat quaint sci-fi book The
Robots of Dawn, by Asimov. There the character Dr. Fastolfe justifies
his creation of humaniform robots by saying things like "there is no
mind without a body and no body without mind" and "an inhuman body
develops an inhuman mind."

Your assertion about storytelling putting ourselves in someone else's
skin makes me wonder whether those fictional claims are true. Perhaps,
in order to put herself into the skin of a human by means of
storytelling, an AI first needs to have a human model of herself.
Perhaps a virtual body, a simulation -- but even so, something that
would serve to anchor her to the point of view of humans.
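As a purely hypothetical sketch of that idea, the "self" slot in a
perspective-taking routine like the one Ben describes could be backed
by a self-model carrying a simulated human embodiment; every name and
field below is an illustrative assumption:

from dataclasses import dataclass, field

@dataclass
class VirtualBody:
    # Placeholder bodily states a human-like simulation might track.
    feels_pain: bool = True
    feels_fatigue: bool = True
    senses: tuple = ("vision", "hearing", "touch")

@dataclass
class HumanSelfModel:
    # The "self" that gets substituted into retold stories; grounding
    # it in a virtual body is what would anchor the retelling to a
    # human point of view.
    identifier: str = "self"
    body: VirtualBody = field(default_factory=VirtualBody)

model = HumanSelfModel()
print(model.identifier, "has senses:", ", ".join(model.body.senses))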

By the way, I wholeheartedly agree with the storytelling-induced
self-transference idea. When I read a story, especially one told in
the first person, I feel as if I am incarnated in the character.
