--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:

> On 6/13/07, Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> > On 6/13/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > >
> > > If yes, then how do you define pain in a machine?
> > >
> > A pain in a machine is a state of the machine that a person
> > empathizing with the machine would avoid putting it into, other
> > things being equal (that is, when there is no higher goal served
> > by going through the pain).
> >
> To clarify:
> (1) there exists a person empathizing with that machine
> (2) this person would avoid putting the machine into the state of pain

I would avoid deleting all the files on my hard disk, but it has nothing to do
with pain or empathy.

Let us separate pain from ethics.  They raise two independent questions:

1. What mental or computational states correspond to pain?
2. When is it ethical to cause a state of pain?

One possible definition of pain is any signal that an intelligent system has
the goal of avoiding.  For example:

- negative reinforcement in any animal capable of reinforcement learning.
- the negative of the "reward" signal received by an AIXI agent.
- excess heat or cold to a thermostat.
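As an illustrative sketch (not from the original post), the first two items
can be made concrete with a minimal reinforcement learner: a negative
reinforcement signal counts as "pain" under this definition exactly because
the learner adjusts its behavior to avoid the action that produces it.  All
names and parameters below are hypothetical.

```python
import random

def train(episodes=2000, alpha=0.1, epsilon=0.1, seed=0):
    """A two-action bandit learner.  Action 1 delivers a negative
    reinforcement signal ("pain" under the definition above); the
    learner comes to avoid it, other things being equal."""
    random.seed(seed)
    value = {0: 0.0, 1: 0.0}  # estimated reward per action
    for _ in range(episodes):
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.choice([0, 1])
        else:
            a = max(value, key=value.get)
        reward = 0.0 if a == 0 else -1.0  # action 1 is "painful"
        # incremental update toward the observed reward
        value[a] += alpha * (reward - value[a])
    return value

v = train()
# After learning, the "painful" action has the lower estimated value,
# so the greedy policy avoids it.
assert v[1] < v[0]
```

The point of the sketch is that nothing in it mentions suffering: "pain" is
just the signal whose expected value the system drives its policy away from.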

I think pain by any reasonable definition exists independently of ethics. 
Ethics is more complex.  Humans might decide, for example, that it is OK to
inflict pain on a mosquito but not a butterfly, or a cow but not a cat, or a
programmable logic gate but not a video game character.  The issue here is not
pain, but our perception of resemblance to humans.


-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
