--- Richard Loosemore <[EMAIL PROTECTED]> wrote:

> Matt Mahoney wrote:
> > --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
> >> I have to say that this is only one interpretation of what it would mean 
> >> for an AGI to experience something, and I for one believe it has no 
> >> validity at all.  It is purely a numeric calculation that makes no 
> >> reference to what "pain" (or any other kind of subjective experience) 
> >> actually is.
> > 
> > I would like to hear your definition of pain and/or negative
> > reinforcement.  Can you answer the question of whether a machine (say,
> > an AGI or an uploaded human brain) can feel pain?
> 
> When I get a chance to finish my consciousness paper.  The question of 
> what it is is quite complex.  I'll get back to this later.
> 
> But most people are agreed that just having an algorithm avoid a state 
> is not equivalent to pain.

Call it utility if you like, but it is clearly a numeric quantity.  If you
prefer A to B and B to C, then clearly you will prefer A to C.  You can make
rational choices between, say, two of A and one of B.
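
(A rough sketch in Python, purely my own illustration with made-up numbers:
once each outcome has a single number attached to it, transitive preferences
and trade-offs between bundles follow automatically.)

    # Illustration only: hypothetical utilities for outcomes A, B, C.
    utility = {'A': 3.0, 'B': 2.0, 'C': 1.0}

    def prefers(x, y):
        # Prefer whichever outcome has the higher numeric utility.
        return utility[x] > utility[y]

    # A > B and B > C implies A > C, with no extra machinery.
    assert prefers('A', 'B') and prefers('B', 'C') and prefers('A', 'C')

    # "Two of A or one of B": compare the total utility of each bundle.
    print(2 * utility['A'], 'vs', 1 * utility['B'])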

You could relate utility to money, but money is a nonlinear scale.  A dollar
will make some people happier than others, and a million dollars will not make
you a million times happier than one dollar.  Money also has no utility to
babies, animals, and machines, all of which can be trained through
reinforcement learning.  So if you can propose an alternative to bits as a
measure of utility, I am interested to hear about it.
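
(Again only a sketch, and the logarithmic curve is an assumed model rather
than anything established here; it is just enough to show that a million
dollars is nowhere near a million times the utility of one dollar.)

    import math

    def utility_of_wealth(dollars):
        # Assumed model: diminishing returns via a logarithm.
        return math.log(dollars + 1)   # +1 so that utility(0) == 0

    print(utility_of_wealth(1))          # ~0.69
    print(utility_of_wealth(1000000))    # ~13.8, not ~690,000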

I don't believe that the ability to feel pleasure and pain depends on
consciousness.  That is just a circular definition. 
http://en.wikipedia.org/wiki/Philosophical_zombie


-- Matt Mahoney, [EMAIL PROTECTED]
