Mark,

I can't get to everything right now, so here's at least one part:

> Are you *absolutely positive* that "real pain and real
> feelings" aren't an emergent phenomenon of sufficiently complicated and
> complex feedback loops?  Are you *really sure* that a sufficiently
> sophisticated AGI won't experience pain?

Except for certain mathematical truths, I'm not *absolutely positive*
about anything ;-), but I don't see why it should, and when running on
the computers we currently have, I don't see how it could. Note that
some people suffer from rare disorders that prevent them from feeling
pain (e.g. congenital insensitivity to pain). Some of them also have
mild mental retardation, but not all. Their brains are pretty complex
systems demonstrating general intelligence without the sensation of
pain. In some of those cases the pain is suppressed by increased
production of endorphins in the brain; in others the pain signal never
even reaches the brain, because genetic mutations have disabled the
nerve cells responsible for transmitting it. Particular feelings (as we
know them) require certain sensors and chemistry. Sophisticated logical
structures (at least in our bodies) are not enough for actual feelings.
For example, to feel pleasure, you also need things like serotonin,
acetylcholine, noradrenaline, glutamate, enkephalins and endorphins.
The worlds of real feelings and logic are only loosely coupled.

Regards,
Jiri Jelinek

On 5/23/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> > AGIs (at least those that could run on current computers)
> > cannot really get excited about anything. It's like when you represent
> > the pain intensity with a number. No matter how high the number goes,
> > it doesn't really hurt. Real feelings - that's the key difference
> > between us and them and the reason why they cannot figure out on their
> > own that they would rather do something else than what they were asked
> > to do.
>
> So what's the difference in your hardware that makes you have real pain and
> real feelings?  Are you *absolutely positive* that "real pain and real
> feelings" aren't an emergent phenomenon of sufficiently complicated and
> complex feedback loops?  Are you *really sure* that a sufficiently
> sophisticated AGI won't experience pain?
>
> I think that I can guarantee (as in, I'd be willing to bet a pretty large
> sum of money) that a sufficiently sophisticated AGI will act as if it
> experiences pain . . . . and if it acts that way, maybe we should just
> assume that it is true.
