Brent Meeker writes:

> OK, an AI needs at least motivation if it is to do anything, and we
> could call motivation a feeling or emotion. Also, some sort of hierarchy
> of motivations is needed if it is to decide that saving the world has
> higher priority than putting out the garbage. But what reason is there
> to think that an AI apparently frantically trying to save the world
> would have anything like the feelings a human would under similar
> circumstances? It might just calmly explain that saving the world is at
> the top of its list of priorities, and it is willing to do things which
> are normally forbidden it, such as killing humans and putting itself at
> risk of destruction, in order to attain this goal. How would you add
> emotions such as fear, grief, regret to this AI, given that the external
> behaviour is going to be the same with or without them because the
> hierarchy of motivation is already fixed?

You are assuming the AI doesn't have to exercise judgement about secondary
objectives - judgement that may well involve conflicts of values that have to
be resolved before acting. If the AI is saving the world it might, for example,
raise its CPU voltage and clock rate in order to compute faster - electronic
adrenaline. It might cut off some peripheral functions, like running the
printer. Afterwards it might "feel regret" when it cannot recover some
functions. Although there would be more conjecture in attributing these
feelings to the AI than to a person acting in the same situation, I think the
principle is the same. We think the person's emotions are part of the function
- so why not the AI's too?

Do you not think it is possible to exercise judgement with just a hierarchy of 
motivation? Alternatively, do you think a hierarchy of motivation will 
automatically result in emotions? For example, would something that the AI is 
strongly motivated to avoid necessarily cause it a negative emotion, and if so, 
what would determine whether that negative emotion is pain, disgust, loathing, 
or something completely different that no biological organism has ever experienced?
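To make the question concrete, here is a minimal sketch of what "just a
hierarchy of motivation" could look like mechanically: a fixed priority
ordering that resolves conflicts with no emotion variables anywhere. All the
goal names and priority numbers below are illustrative assumptions, not a
claim about how any real AI is built.

```python
# Sketch of judgement by a bare motivation hierarchy (hypothetical goals).
# Conflicts are resolved purely by fixed priority; nothing here encodes
# fear, regret, or any other affective state.

MOTIVATIONS = [
    ("save_the_world", 100),
    ("avoid_harming_humans", 90),
    ("preserve_self", 80),
    ("take_out_garbage", 10),
]

def choose_action(applicable_goals):
    """Return the applicable goal with the highest fixed priority."""
    ranked = [(name, pri) for name, pri in MOTIVATIONS
              if name in applicable_goals]
    if not ranked:
        return None
    return max(ranked, key=lambda g: g[1])[0]

# The externally observed behaviour: a normally low-ranked chore loses
# whenever a higher motivation applies, calmly and without any "feeling".
print(choose_action({"take_out_garbage", "save_the_world"}))  # save_the_world
```

The point of the sketch is that the same outward behaviour falls out of the
ordering alone, which is exactly why it is unclear what adding emotions to
such a system would change.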

Stathis Papaioannou
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---
