Richard> Mark Waser wrote:
>> AGIs (at least those that could run on current computers)
>> cannot really get excited about anything. It's like when you
>> represent the pain intensity with a number. No matter how high
>> the number goes, it doesn't really hurt. Real feelings - that's
>> the key difference between us and them and the reason why they
>> cannot figure out on their own that they would rather do
>> something else than what they were asked to do.
>>
>> So what's the difference in your hardware that makes you have real
>> pain and real feelings?  Are you *absolutely positive* that "real
>> pain and real feelings" aren't an emergent phenomenon of
>> sufficiently complicated and complex feedback loops?  Are you
>> *really sure* that a sufficiently sophisticated AGI won't
>> experience pain?
>> 
>> I think that I can guarantee (as in, I'd be willing to bet a pretty
>> large sum of money) that a sufficiently sophisticated AGI will act
>> as if it experiences pain . . . . and if it acts that way, maybe we
>> should just assume that it is true.

Richard> Jiri,

Richard> I agree with Mark's comments here, but would add that I think
Richard> we can do more than just take a hands-off Turing attitude to
Richard> such things as pain: I believe that we can understand why a
Richard> system built in the right kind of way *must* experience
Richard> feelings of exactly the sort we experience.

Richard> I won't give the whole argument here (I presented it at the
Richard> Consciousness conference in Tucson last year, but have not
Richard> yet had time to write it up as a full paper).

What is Thought? argues the same thing (Chapter 14). I'd be curious
to see if your argument is different.

Richard> I think it is a serious mistake for anyone to say that
Richard> machines cannot in principle experience real feelings.  Sure,
Richard> if they are too simple they will not, but none of our
Richard> discussions on this list are about those kinds of too-simple
Richard> systems.

Richard> Having said that: there are some conventional approaches to
Richard> AI that are so crippled that I don't think they will ever
Richard> become AGI, let alone have feelings.  If you were criticizing
Richard> those specifically, rather than just AGI in general, I'm on
Richard> your side!  :-)


Richard> Richard Loosemore

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=e9e40a7e
