Matt Mahoney wrote:
--- Richard Loosemore <[EMAIL PROTECTED]> wrote:
I have to say that this is only one interpretation of what it would mean for an AGI to experience something, and I for one believe it has no validity at all. It is purely a numeric calculation that makes no reference to what "pain" (or any other kind of subjective experience) actually is.

I would like to hear your definition of pain and/or negative reinforcement. Can you answer the question of whether a machine (say, an AGI or an uploaded
human brain) can feel pain?

I will answer that when I get a chance to finish my consciousness paper. The question of what pain actually is is quite complex, so I'll get back to this later.

But most people agree that merely having an algorithm avoid a state is not equivalent to pain.
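To make that concrete, here is a deliberately trivial sketch (my own illustration, not anything from Matt's post, assuming a toy two-state setup) of what "an algorithm avoiding a state" amounts to. It is nothing but arithmetic on reward numbers; nowhere does it refer to what pain actually is.

```python
# Toy sketch: an agent that learns to avoid a state purely by arithmetic
# on reward numbers. The "-1.0" is just a number, not an experience.

import random

STATES = ["safe", "hot_plate"]               # hypothetical toy state space
values = {s: 0.0 for s in STATES}            # learned value of each state
reward = {"safe": 0.0, "hot_plate": -1.0}    # negative number labelled "pain"
alpha = 0.1                                  # learning rate

# Update the value estimates from random experience.
for _ in range(1000):
    s = random.choice(STATES)
    values[s] += alpha * (reward[s] - values[s])

# Policy: prefer the state with the higher learned value.
chosen = max(STATES, key=lambda s: values[s])
print(values)   # e.g. {'safe': 0.0, 'hot_plate': ~-1.0}
print(chosen)   # 'safe' -- the agent "avoids" hot_plate, numerically
```

Whether anything in that loop could ever constitute an experience of pain is exactly the question at issue.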


Richard Loosemore
