--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> http://www.goertzel.org/books/spirit/uni3.htm  --> VIRTUAL ETHICS

The book chapter describes the need for ethics and cooperation in virtual
worlds, but it does not address the question of whether machines can feel
pain. If you feel pain, you will insist it is real, but that is only because
you are trying to avoid it. If you define pain as a signal that an intelligent
system has the goal of avoiding, then you have merely reduced the problem to
defining intelligence, because without that qualifier very simple systems feel
pain too: a thermostat "feels pain" whenever the room is too hot or too cold.
And are animals intelligent?
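To make that point concrete, here is a minimal sketch in Python (the class,
setpoint, and "pain" signal are all illustrative names, not anything from the
chapter): a thermostat whose only "pain" is deviation from its setpoint, and
whose only goal is to act so as to reduce that signal. It satisfies the
signal-avoidance definition of pain trivially.

  # Illustrative sketch only: a thermostat modeled as a system that
  # acts to reduce a "pain" signal (deviation from its setpoint).
  class Thermostat:
      def __init__(self, setpoint: float):
          self.setpoint = setpoint          # desired room temperature

      def pain(self, temperature: float) -> float:
          # "Pain" = magnitude of deviation from the setpoint.
          return abs(temperature - self.setpoint)

      def act(self, temperature: float) -> str:
          # The system's only goal is to drive the pain signal toward zero.
          if temperature > self.setpoint:
              return "cool"
          if temperature < self.setpoint:
              return "heat"
          return "idle"

  t = Thermostat(setpoint=21.0)
  print(t.pain(25.0), t.act(25.0))  # 4.0 cool -- the "pain" it acts to avoid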

You could, alternatively, define pain as something that has to be "felt", but
that implies a requirement for consciousness or self-awareness, for which
there is no experimental test.

I am not aware of any definition that allows for pain in humans but not in
machines without either making an arbitrary distinction between the two or
denying that the human brain can be simulated by a computer.


-- Matt Mahoney, [EMAIL PROTECTED]
